Linear Algebra II¶
week 1-1 (3/2/2020)¶
- review
- linear independence
- basis and dimension
- dimension theorem
- direct sum
- matrix representation : $$ Rep_{\alpha,\beta}(T) = \big(Rep_\beta(T(\vec{\alpha_1}))\ \cdots\ Rep_\beta(T(\vec{\alpha_n}))\big) $$
- change of basis $$ Rep_\alpha(T) = Rep_{\beta,\alpha}(id)Rep_\beta(T)Rep_{\alpha,\beta}(id) $$
- determinant
- Invariant Subspace
- a subspace \(W \subseteq V\) such that \(T(w)\in W\) for all \(w \in W\)
- direct sum of Invariant Subspace
- Eigenspace
- def: \(E(\lambda) = \{v \in V \mid T(v)=\lambda v \}\)
- for a matrix \(A\): \(E(\lambda) = ker(A-\lambda I)\)
- attendance 10%
- next quiz hint:
- matrix representation + diagonalization
- lecturing
- \(T:V\rightarrow V\) be linear transformation
- For \(\vec{v} \in V\) what is the smallest T-invariant subspace?
- Theorem : ::: info let \(W = span\{v, T(v), T^2(v), T^3(v)...\}\) then
- \(W\) is a T-invariant subspace
- let \(dim(W) = m\), then \(\alpha = \{ \vec{v} , T(\vec{v}), T^2(\vec{v}), ..., T^{m-1}(\vec{v})\}\) is a basis of \(W\)
- \(T^m(\vec{v}) = \sum_{i=0}^{m-1} a_iT^i(\vec{v})\)
- \(Rep_{\alpha}(T) =\) $$ \begin{bmatrix} 0 & 0 & \cdots & a_0 \\ 1 & 0 & \cdots & a_1 \\ 0 & 1 & \cdots & \vdots \\ 0 & 0 & \cdots & a_{m-1} \end{bmatrix} $$
- determinant and characteristic polynomial by mathematical induction
- \(f_{T\mid _W}(x) = ?\) ::: ::: info Proof:
- By definition : \(W = span\{w, T(w), T^2(w), T^3(w)...\}\)
- \(\exists \ k \ni T^k(w) \in span\{w, T(w), T^2(w), ..., T^{k-1}(w)\}\), since \(V\) is finite-dimensional
- \(T^k(w)\) is a linear combination of \(\alpha\)
- by induction, \(T^{n}(w)\) with \(n > k\) is too
- then \(T(\vec{w}) \in W\) is trivial :::
- asked the teacher for concept confirmation
- a maximal linearly independent set = a generating set in this case, but the teacher emphasized that the concepts are different
- uses \(F\) instead of \(R\) for generality
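A minimal numpy sketch of the theorem above (my own illustration; the matrix `A` and vector `v` are hypothetical, not from the lecture): grow the cyclic basis \(\{v, Av, A^2v, \dots\}\) until it becomes dependent, then read off the companion-style representation with 1's on the subdiagonal and the coefficients \(a_i\) in the last column.

```python
import numpy as np

# Hypothetical matrix and starting vector (assumptions, not from the lecture).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])
v = np.array([1.0, 1.0, 1.0])

# Grow the cyclic basis {v, Av, A^2 v, ...} until A^m v becomes dependent.
basis = [v]
while True:
    w = A @ basis[-1]
    B = np.column_stack(basis)
    coeffs, *_ = np.linalg.lstsq(B, w, rcond=None)
    if np.allclose(B @ coeffs, w):   # A^m v is a combination of the basis
        break
    basis.append(w)

m = len(basis)
# Companion-style representation: subdiagonal 1's, last column = the a_i.
Rep = np.zeros((m, m))
Rep[1:, :-1] = np.eye(m - 1)
Rep[:, -1] = coeffs
print(m, Rep, sep="\n")
```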
week 1-2(3/4/2020)¶
- matrix transformation self review
- http://www.taiwan921.lib.ntu.edu.tw/mypdf/math02.pdf
- isomorphic linear transformation <=> invertible linear transformation
- standard representation
- change of coordinate
- find eigenvalues and vectors
- \(A = P^{-1}QP\)
- course
- invariant subspace is useful for analyzing
week 2-1(March/9th/2020)¶
- 3/10/2020 asked the teacher
- the teacher explained in detail and intuitively
- What's the problem?
- Why invariant subspace?
- to create more 0
- Why Annihilator?
- to get blocks of invariant subspaces
- Usage of Cayley-Hamilton Theorem?
- find such an \(f(x)\) quickly
- The next will be Jordan form
- what to fill in blocks?
- btw, diagonalizable and invertible are independent properties
!T-invariant subspaces¶
- to find a basis of \(W\)
- and extend it to a basis \(\alpha\) of \(V\)
- then the matrix representation of T : V->V is $$ \begin{pmatrix} T|_W & A \\ O & B \end{pmatrix} $$
- if we get a direct sum of T-invariant subspaces: $$ \begin{pmatrix} T|_{W_1} & O \\ O & T|_{W_2} \end{pmatrix} $$
Annihilator - Ann(T)¶
- \(L(V,V) \rightarrow\) set of polynomials
- for \(T \in L(V,V)\), \(Ann(T) = \{f(x)\in F[x]\mid f(T) = 0\}\) (the zero transformation)
- a way to get a decomposition of \(V\) into a direct sum of T-invariant subspaces
!Cayley-Hamilton Theorem¶
- \(f_T(T) \in Ann(T)\)
- in this case, the usage is to quickly find an \(f(x)\) with \(f(T) = 0\)
- proof: \(\forall v \in V\), let the T-invariant subspace generated by \(v\) be \(W\), and let \(V=W+W^{'}\). Since \([T]_{W+W^{'}}\) can be expressed as $$ \begin{pmatrix} T|_{W} & A \\ O & B \end{pmatrix} $$ we have \(f_{T}(x) = g(x)f_{T|_W}(x)\); note that \(g(x)\) is the \(det(\lambda I-B)\) part. Then \(f_{T|_W}(T)(v) = 0\) by the corollary that follows directly from the definition of \(T|_W\). \(Q.E.D.\)
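A quick numerical sanity check of Cayley-Hamilton (my own sketch, hypothetical matrix): `np.poly` returns the characteristic polynomial's coefficients, and a Horner evaluation with matrix products should give the zero matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])   # hypothetical example

coeffs = np.poly(A)  # characteristic polynomial coefficients, highest degree first

# Horner's method with matrix multiplication (np.polyval would multiply entrywise).
f_A = np.zeros_like(A)
for c in coeffs:
    f_A = f_A @ A + c * np.eye(2)

print(f_A)  # ~ the zero matrix, i.e. f_A(A) = 0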
!(General Eigenspace)Decomposition of V¶
- goal : \(V=ker_\infty(T) \bigoplus Im_\infty(T)\)
- goal2 : \(V=\bigoplus E_\infty(\lambda)\)
- part 1
  - Note: hint for T-invariance - exchangeable (commuting) operations
- part 2
  - Note: in part 2, if \(m\) is the infimum of all \(m\)'s satisfying the prerequisite, then \(V = ker(T^{m-1}) \bigoplus Im(T^{m-1})\) does not hold in general (confirmed by teacher)
- part 3
- part 4
  - Note (Theorem 4):
    - case minimal \(m = 0\) => invertible (rank \(n\)) => trivial
    - case minimal \(m \geq 1\) => is true
    - above notes are from Ming-Hsuan Kang
  - proof of diagonalizability
Next Jordan Form¶
- what's in the block exactly
What Linear Algebra studies¶
- real world problem, natural functions
- how to express in good basis(change of basis)
- fourier, Laplace...
- meaningful, easy to compute
- linear transformations
- differential
- integral
- etc.
- it's quite different from Abstract Algebra, according to the teacher
- whereas I think the way of thinking is similar
- LA just emphasizes linearity, dimension, etc. more
Week 2-2¶
quiz problem¶
- pA : calculation of T-invariant subspace
- pB : let \(T : \mathbb{R}^3 \to \mathbb{R}^3\) be a reflection transformation; show T is diagonalizable
- hint: \(T^2 = I\) + HW
- I was the first one to finish
other material¶
- [Invariant Subspace](https://math.okstate.edu/people/binegar/4063-5023/4063-5023-l18.pdf)
Week 3-1¶
Nilpotent LT¶
- definition
- T is a nilpotent LT
- \(V = ker_{\infty}(T)\)
- \(T^k = 0\) for some \(k\)
- use similar technique as T-invariant subspace
- find a matrix rep of T $$ \begin{pmatrix} 0 & 0 & 0 & \cdots & 0 \\ 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & 0 \end{pmatrix} $$
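A small numpy sketch (my own, assuming a single nilpotent block like the one above): the nilpotency index is the smallest \(k\) with \(T^k = 0\), and the growing kernel chain \(dim\,ker(T) \le dim\,ker(T^2) \le \dots\) is what the dot diagram records.

```python
import numpy as np

N = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # one cyclic block, as in the matrix above

def nullity(M):
    return M.shape[1] - np.linalg.matrix_rank(M)

P = np.eye(3)
for k in range(1, 4):
    P = P @ N
    print(f"dim ker(T^{k}) = {nullity(P)}")   # 1, 2, 3
# index = smallest k with T^k = 0; here it is 3
```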
Theorem¶
Week 3-2 skipped, 3-3(self study@Saturday)¶
!Th. - V can be decomposed under a Nilpotent LT (on V)¶
Proof sketch of part 3 (2020/3/23, corrected by teacher)¶
:::info let T be a nilpotent LT on V of index k+1; choose \(v_i\) s.t. the \(T^{k}(v_i)\) form a basis of \(Im(T^{k})\)
Objective : \(V = W\bigoplus cyclic(v_i)\) for some T-invariant subspace \(W\)
Proof:
- key : must have the dot diagram in mind
- induction on k+1, the index of T
- \(W\) is the part left after removing the higher-dimensional cyclic subspaces
- extension of basis is like finding the current longest given the past longest
  - \(v_i\) : past longest
  - \(u_i\) : current longest
- when \(k = 0, T = 0, T^0 = I\), holds trivially
- when k holds
- \(V = ker(T^k) \bigoplus Fv_i\)
- divide V according to whether \(T^k\) is 0 or non-zero
- \(V^{'} = ker(T^k)\), \(T^{'} = T|_{V^{'}}\)
- \(T^{'}\) is nilpotent on \(V^{'}\) of index k
- Now we find a basis of \(Im({T^{'}}^{k-1})\), a subspace of \(ker(T^k)\)
- part original:
- from \(v_i\) to \(T(v_i)\)
- \({T^{'}}^{k-1} (T(v_i))\) = \(T^k(v_i)\)
- a l.i. subset of \(Im({T^{'}}^{k-1})\)
- part extended:
- \(u_i\)
- part original:
- \(ker(T^k) = W_0 \bigoplus cyclic(T(v_i)) \bigoplus cyclic(u_i)\)
- by the induction hypothesis
- the \(T(v_i)\) together with the \(u_i\) form a basis of \(Im({T^{'}}^{k-1})\)
- \(V = ker(T^k) \bigoplus Fv_i\)
- \(V = W_0 \bigoplus cyclic(u_i) \bigoplus cyclic(T(v_i)) \bigoplus Fv_i\)
- \(V = W \bigoplus cyclic(v_i)\)
- since \(cyclic(u_i)\) is T-invariant
- \(V = ker(T^k) \bigoplus Fv_i\)
- by induction, Q.E.D.
- from the proof steps one can actually see that \(W\) is a direct sum of cyclic subspaces :::
Illustration Diagram (omitted)
HW3¶
Week 4-1(2020/3/23)¶
!Jordan Form¶
part 1¶
- for a LT T
- \(V = \bigoplus ker_\infty(T-\lambda_i I)\)
- \(V = \bigoplus E_\infty(\lambda_i)\)
part 2¶
- \(T-\lambda_i I\) is nilpotent on \(E_\infty(\lambda_i)\)
- \(E_\infty(\lambda_i) = \bigoplus cyclic(v_i)\)
part 3¶
- these \(cyclic(v_i)\)s have a representation of
$$ \begin{pmatrix} 0 & 0 & 0 & \cdots & 0 \\ 1 & 0 & 0 & \cdots & 0 \\ 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & 0 \end{pmatrix} $$
Conclusion¶
- Jordan Form
Uniqueness?¶
- General Eigenspace Decomposition (YES)
- Cyclic Subspace Decomposition (No)
- but the dimensions are unique
- generally, the matrix rep. can be said to be unique up to the ordering of the blocks
Steps to find a Jordan form matrix rep.¶
- find \(f_A(x)\)
- find dot diagram for each \(\lambda\)
- by observing the nullity of \((T-\lambda I)^k\); ex: \(0 \leftarrow T(v_2) \leftarrow v_2\), \(0 \leftarrow v_1\)
Steps to find a Jordan basis¶
- just solve it from a basis of \(E_\infty(\lambda)\)
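For checking hand computations, a hedged sympy sketch (my own; the matrix is a hypothetical example): `Matrix.jordan_form()` returns \(P\) and \(J\) with \(A = PJP^{-1}\), so the columns of \(P\) form a Jordan basis.

```python
from sympy import Matrix

# Hypothetical example with a repeated eigenvalue.
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

P, J = A.jordan_form()        # A == P * J * P**(-1)
print(J)                       # a 2x2 Jordan block for 2, and a 1x1 block for 3
print(A == P * J * P.inv())    # True; columns of P give a Jordan basis
```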
Case Study¶
- 1. $$ \begin{pmatrix} 5 & 7 & 1 & 1 & 5 \\ -2 & -3 & -1 & -1 & -5 \\ -1 & -3 & 1 & 1 & -3 \\ 0 & 0 & 3 & 0 & 1 \\ 3 & 3 & 1 & 0 & 6 \end{pmatrix} $$
- 2. let V be a subspace of real functions spanned by \(\alpha = \{x^2e^x, xe^x, e^x\}\); let D be the differential operator; find the Jordan form of D and its Jordan basis
Jordan-Chevalley Decomposition¶
\(A = P^{-1}(D+N)P = P^{-1}DP + P^{-1}NP\), where \(P^{-1}DP\) is the semi-simple part and \(P^{-1}NP\) is the nilpotent part
- \(\forall A \in M_n(\mathbb{C})\) there exist unique \(D,N \in M_n(\mathbb{C})\)
- A = D + N
- DN = ND
- D is diagonalizable
- N is nilpotent
Advantage of Jordan Form¶
power of matrix¶
- by \((D+N)^k\) with the fact that N is nilpotent
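A numpy sketch of this (my own, assuming a single 2x2 Jordan block): since \(D\) and \(N\) commute and \(N^2 = 0\), the binomial expansion of \((D+N)^k\) collapses to two terms.

```python
import numpy as np
from math import comb

D = np.diag([3.0, 3.0])                   # hypothetical diagonal part
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])                # nilpotent part, N^2 = 0

k = 10
# (D+N)^k = sum_j C(k,j) D^(k-j) N^j; terms with j >= 2 vanish.
Jk = np.linalg.matrix_power(D, k) + comb(k, 1) * np.linalg.matrix_power(D, k - 1) @ N

print(np.allclose(Jk, np.linalg.matrix_power(D + N, k)))  # True
```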
Release of HW1 Quiz1 Quiz2¶
- HW1 30/30
- Quiz1 16/20
- Quiz2 20/20
Week 4-2¶
- Hw and test
- approximate Jordan form with a diagonalizable matrix
Week 5 - next topic - Inner Product Space¶
Week 5-1 - Inner Product Space¶
- Definition over \(\mathbb{R}^N\) and \(\mathbb{C}^N\)
- Definition of orthogonal
- General definition when \(\mathbb{F}\) is \(\mathbb{R}\) or \(\mathbb{C}\)
- Inner Product of
- Continuous Functions
- Discrete Signals
- Discrete Fourier Transform (DFT)
- meaningful basis
- easy-to-compute basis
- Theorem:
- Every inner product is induced by some basis
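A numpy sketch of what "induced by a basis" means (my own illustration, hypothetical basis \(B\)): declare the columns of \(B\) orthonormal; the induced inner product is the dot product of the \(B\)-coordinates, and the columns of \(B\) indeed come out orthonormal under it.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # hypothetical basis, as columns
Binv = np.linalg.inv(B)

def inner(u, v):
    # <u, v> = dot product of the coordinate vectors of u and v in basis B
    return (Binv @ u) @ (Binv @ v)

# The basis vectors are orthonormal under the induced inner product:
print(inner(B[:, 0], B[:, 0]), inner(B[:, 0], B[:, 1]))  # 1.0 0.0
```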
Week 5-2 - Normal LT¶
Symmetric and Hermitian¶
Adjoint¶
definition¶
- defined on a vector space with an inner product
- defined without a basis => more intrinsic
self-adjoint¶
- All eigenvalues of T are real.
- T admits a set of eigenvectors which forms an orthonormal basis of V. (Especially, T is diagonalizable.)
- Under an orthonormal basis, the conjugate transpose of the matrix representation of T is equal to the matrix representation of T∗
Normal and Ker/Im¶
- proof of 5
Normal <=> Diagonalizable under an orthonormal basis¶
- proof
- <=
- trivial
- =>
- key : general eigenspaces = eigenspaces
HW5¶
Week 6 - fill in week 5¶
- diagonalization of symmetric matrices
- Gram-Schmidt and projection
- norm of \(\mathbb{C}^2\)
- 2020/04/07 making up HW5
- Q: relationship of \(\mathbb{R}^n\) and \(\mathbb{C}^n\)
- Q: inner product induced by a basis
- over a finite field, "\(> 0\)" is not well defined
- positive definite
- comparison relation
- inner product (\(\geq 0\))
- a finite field only has "\(= 0\)"
Week 6-1 - Orthogonal/Unitary¶
Definition¶
- preserve inner product
Equivalences¶
Theorems revisited¶
Isometric¶
- isometric + \(f(\vec{0}) = \vec{0}\) \(\iff\) orthogonal
- must be LT
- proof: theorem
- proof: LT
general isometry¶
- affine map(LT + bias)
Isometric 2-D case study¶
- all orthogonal matrices in \(\mathbb{R}^2\)
- rotation/reflection by parametric method
Det and Eigenvalues of Orthogonal Matrices¶
- \(det(A) = \pm1\), since \(det(A)^2 = det(AA^T) = det(I) = 1\)
- if \(T\) is an orthogonal LT and \(W\) is T-invariant, then \(W^\perp\) is too
- proof is exercise
Orthogonal 3-D case¶
- \(det(A) = \pm1\)
- there exists \(\lambda = \pm1\) => a rotation axis or reflection axis
- by the direct sum of T-invariant subspaces (exercise theorem)
- the remaining part is a 2x2 orthogonal matrix with det = 1, which must be a rotation
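A numpy sketch of the 3-D analysis (my own; the rotation matrix is a hypothetical example): for \(det = 1\) the axis is an eigenvector for \(\lambda = 1\), and the rotation angle follows from \(tr(R) = 1 + 2\cos\theta\).

```python
import numpy as np

# Hypothetical rotation: 90 degrees about the z-axis.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

print(np.isclose(np.linalg.det(R), 1.0))   # True => orientation preserving

vals, vecs = np.linalg.eig(R)
axis = np.real(vecs[:, np.isclose(vals, 1)]).ravel()
print(axis / np.linalg.norm(axis))          # [0, 0, 1]: the rotation axis

theta = np.arccos((np.trace(R) - 1) / 2)    # tr(R) = 1 + 2 cos(theta)
print(np.degrees(theta))                    # 90.0
```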
Questions¶
- Q: relationship of \(\mathbb{R}^n\) and \(\mathbb{C}^n\)
- Q: inner product induced by a basis
- Q: orthogonal - normal - symmetric (\(A^*=A\)) relations
HW6¶
TA hour by MC kang 2020/4/14, learned a lot¶
- HW6 3-2
- a more analytic way: discuss \(det=+1,-1\), only in 3D
n-reflections theorem¶
more on Orientation(advanced topic)¶
- isometric LTs come in 2 types!
- continuously reachable isometries (rotations) vs. not (reflections)
- add a dimension: what happens?
- spin by dimension
- orientation-preserving orthogonal LT
- \(det = \pm1\)
- maintains isometry during a continuous changing process!
- there are 2 orientations for all \(\mathbb{R}^n\), since 2 reflections = 1 rotation
- may not be covered in class :(
Week 7 - self study - Review Concepts¶
Inner product¶
Q: induced basis and induced inner product¶
self-adjoint, Hermitian¶
conjugate transpose¶
self-adjoint Th¶
- All eigenvalues of T are real.
- T admits a set of eigenvectors which forms an orthonormal basis of V. (Especially, T is diagonalizable.)
self-adjoint, Hermitian¶
- Under an orthonormal basis, the conjugate transpose of the matrix representation of T is equal to the matrix representation of T∗
- A symmetric/Hermitian matrix is a matrix representation of a self-adjoint linear transform under an orthonormal basis.
Normal LT¶
def¶
T∗ and T commute
Properties¶
Theorem¶
A complex linear transformation is diagonalizable under some orthonormal basis if and only if it is normal.
Orthogonal and unitary¶
def¶
\(T^* = T^{-1}\)
n-reflections¶
Week 7 quiz¶
- prove that one cannot find a continuous family of isometries realizing a reflection
- first show the bias \(b\) is irrelevant
- by "continuous composed with continuous is continuous" (\(det \circ F(t)\))
Week 7 - Real Canonical Form¶
analysis of real matrix A¶
- pairs of conjugate roots of \(f_A(x)\)
- pairs of eigenvalues
- conclusion
- for an eigenvalue \(\lambda=a-bi\), let \(\vec{v}\) be the corresponding eigenvector
- let \(v = v_1+iv_2\)
- \(A(v_1+iv_2) = (av_1+bv_2) + i(-bv_1 + av_2)\)
- notice that \(v_1, v_2\) must be l.i.
- otherwise the eigenvalue would be a real number
- the complex block is $$ \begin{pmatrix} a & -b \\ b & a \end{pmatrix} $$
analysis of orthogonal matrix A¶
- \(\lambda_i = e^{-i\theta_i}=\cos\theta_i-i\sin\theta_i\)
- the complex block is $$ \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} $$
- thus we can view an orthogonal matrix as reflections + 2D rotations
- notice that \(\beta\) (the change of basis) can be chosen to be orthonormal
next topic, quadratic form¶
Week 7 - Quadratic Form¶
Quadratic Form¶
Taylor Series Revisited¶
- k-th order approximation
- the 1st + 2nd-order terms can be used to determine a local max/min
Example¶
Case: 2 variable, Binary Quadratic form¶
Case: Ternary¶
calculation example
General Case: N¶
Key¶
- a quadratic form corresponds to a real symmetric matrix
- diagonalizable (over \(\mathbb{R}\)) with an orthonormal basis
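A numpy sketch of the key fact (my own, hypothetical symmetric matrix): `eigh` produces an orthonormal eigenbasis for a real symmetric \(A\), so after the change of variable \(y = P^tx\) the form becomes \(\sum_i \lambda_i y_i^2\).

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # hypothetical real symmetric matrix

lams, P = np.linalg.eigh(A)          # columns of P: an orthonormal eigenbasis

x = np.array([1.0, 2.0])
y = P.T @ x                          # change of variable

print(np.isclose(x @ A @ x, np.sum(lams * y**2)))  # True: Q(x) = sum_i lambda_i y_i^2
```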
Week 7 - Conic Sections¶
Purpose of this chapter¶
- zero set
Term¶
- G: \(ax^2 + bxy + cy^2 + dx + ey + f\)
- Q: \(ax^2 + bxy + cy^2 + dxz + eyz + fz^2\)
- H: \(ax^2 + bxy + cy^2\)
Zero Sets of binary Quadratic Form¶
- key: the signs of two eigenvalues
- \(sign(\lambda_1) = sign(\lambda_2)\) => \(\{(0, 0)\}\), a single point
- \(sign(\lambda_1) \neq sign(\lambda_2)\) => two lines
- one of them zero => one line
deal with the terms below the quadratic ones¶
zero set of non-degenerate ternary quadratic form¶
Conic sections¶
Conclusion¶
- let H be \(ax^2 + bxy + cy^2\)
- with \(Z(G) = Z(Q)\cap Z(z-1)\) ~= \(Z(Q)\cap Z(z=0) = Z(H)\)
- we can judge G by H if non-degenerate
Week 7 - Equivalent Quadratic Forms, Signature¶
Equivalent Quadratic Form¶
- def: two quadratic forms are called equivalent iff
- can obtain each other by change of basis
- but since we want the signature => the change of basis need not be orthonormal
- Diagonal Form
- note the symbol usage
- \(Q(\vec{x}^t)\)
- more common to use a row vector to represent the variables
Review, change of variable to diagonal form¶
- note that \(\vec{y}^tD\vec{y} = \sum_i\lambda_iy_i^2\)
use orthogonal instead of orthonormal basis¶
Standard form : the above, with the diagonal entries rescaled to \(\pm1\)¶
Signature of Real Quadratic Forms¶
- Note: the trace of the standard form (i.e., the signature) + the rank determine the standard form
Signature <=>(1-to-1) Eq Quadratic Forms¶
- pf: coming chapters
Example is trivial¶
Q: Why signature
HW 7¶
Applications of Quadratic Forms (binary and ternary)¶
Taylor expansion and extreme values¶
Zero set, discussion on degenerate form¶
Week 8 - Office hour 2020/4/21¶
Taylor Expansion for determining local min/max¶
- terms of a polynomial in the \(dx_i\)
- higher order terms are dominated by lower order ones
- products of infinitesimals
- we use the diagonalization technique to make things easy
- change of variables so that
- only the square terms are non-zero
about Trace¶
- \(tr(AB) = tr(BA)\), proof by direct calculation
- for more, ex: ABC
- treat AB as D or BC as D and reduce to 2 matrix case
- \(tr(ABC) \neq tr(BAC)\) in general
- \(tr(A) = tr(PDP^{-1}) = tr(DP^{-1}P) = tr(D)\) in this case
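A quick numpy check of these trace facts (my own sketch, random matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

print(np.isclose(np.trace(A @ B), np.trace(B @ A)))          # True
print(np.isclose(np.trace(A @ B @ C), np.trace(B @ C @ A)))  # True: a cyclic shift
print(np.isclose(np.trace(A @ B @ C), np.trace(B @ A @ C)))  # generally False: a swap
```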
Week 8 - Positive Definite Quadratic Form¶
Equivalences and proof¶
- Q is positive definite()
- A is positive definite(eigenvalue)
- Q has a unique minimum at 0
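A numpy sketch checking two equivalent tests numerically (my own, hypothetical matrix), anticipating Sylvester's criterion below: all eigenvalues positive, and all leading principal minors positive.

```python
import numpy as np

A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])    # hypothetical symmetric matrix

# all eigenvalues positive <=> positive definite
print(np.all(np.linalg.eigvalsh(A) > 0))                 # True

# Sylvester's criterion: all leading principal minors positive
minors = [np.linalg.det(A[:k, :k]) for k in range(1, 4)]
print(minors, all(m > 0 for m in minors))                # [2.0, 3.0, 4.0] True
```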
Case of semi¶
Theorem of n1 is¶
Sylvester's criterion¶
Induction Proof¶
extreme values at 0?¶
Example¶
- Q: not positive definite, but could it be positive semi-definite?
- should we check higher orders? (the conclusion of "not a local extreme" is too fast IMO)
- A (myself): bad Q, this is not a discussion of derivatives (Hessian)
Week 8 - Bilinear Form¶
Definition¶
Matrix Representation¶
- Note that ternary and higher forms have no matrix representation
Coordinate-Free Quadratic Form¶
- isomorphism between Quadratic form and symmetric bilinear form
Relation with Quadratic form and inner product¶
Non-degenerate¶
Multilinear Form, Tensor Product, Theorem (Extra)¶
Week 9 - 2020/4/27 practice exam+ review¶
Midterm hint¶
- True & False
- Jordan Form
- calculations
- minimal polynomial and possible Jordan forms
- Quadratic form
- calculations
- ternary + binary
- local extreme
- zero set discussion
- positive definite proof
- Symmetric, Normal, etc.
- proof of diagonalizability
- proof of eigenvalues all real(by inner product and real symmetric)
- when a diagonal form exists (normal; a polynomial with no repeated roots vanishes at it)
- Q: poly
My Q¶
- Quantum Computing learning path
- proof of Jordan form
- about the nilpotent LT's cyclic decomposition
- about the general eigenspace direct sum
Week 9 - individual office hour @ 2020/4/28 15:00-16:00¶
What does Quantum Computing study¶
- Q: are Lie groups/algebras related?
- Yes, brief introduction
- A: to make "good" universal gates
- traditional bits and gates (\(\{0,1\}^N \to \{0,1\}^M\))
- now we want universal approximation of wave functions
- Me: and applying them efficiently is the algorithm part
- Q: What should I study?
- a variety of fields
- no need to go deep, but know the essential concepts
- because there are many isomorphic relations between fields
- think of the problem on the ball and design a way to go to higher dimensions
Main clarifications¶
- inner product space and basis
- when is an inner product well defined
- minimal polynomial and its relation with Jordan form
- eigenspace decomposition (multiplicity of blocks)
- restated some important facts
- intuition about multilinear forms
Summary Before Midterm¶
Jordan Form¶
- general eigenspace decomposition of LT
- T-invariant subspace
- cyclic subspace decomposition of Nilpotent LT
- nilpotent LT
- Cayley-Hamilton Theorem Revisited
- Real Canonical Form
- Topic:
- Generalized kernel
- Eigen Space discussion
- minimal polynomial and possible Jordan forms
Inner Product Space¶
- Inner Product is defined on vector spaces that
- F is R or C (or at least not finite, to make \(\geq 0\) well defined)
- \(v^t\bar{w}\)
- 3 main concepts, their matrix representations, and theorems
- \(A^*\), (inner product space)adjoint - (matrix) conjugate transpose
- Self-Adjoint - \(A = A^*\)
- Normal - \(A, A^*\) commute
- Unitary - \(A^*A = I\)
- Isometric
- Self-adjoint
- Real symmetric or hermitian
- implies:
- real eigenvalues
- diagonalizable under an orthonormal basis
- existence and uniqueness
- Normal
- A complex linear transformation is diagonalizable under some orthonormal basis if and only if it is normal
- proofs are important and interesting
Bilinear Form¶
- Quadratic from
- discussion of zero set
- conic curve
- discussion of extreme value
- Taylor, gradient, Hessian
- EQ quadratic form, signature
- discussion of zero set
- 1-to-1 relation of symmetric bilinear form with Quadratic form
- Inner product as "symmetric" and "positive-definite" "bilinear form"
- identify if it is induced by a basis
- multilinear form and tensor product
- example illustration
- positive definite bilinear form
- Sylvester's criterion
Trick¶
- prove \(v = 0\) by \(\langle v,v \rangle = 0\)
- prove \(u = w\) by proving \(u-w = 0\)
- prove space \(V = W\) by \(W \subset V\) and \(dim~W = dim~V\)
- proof of practice exam 5.a-b
- induction on dimension
- consider diagonal matrix first!
Pictures¶
self study + Q for night office hour¶
- Hermitian => normal
- eigenvalues all real?
- link: equivalent definitions of normal
- normal := A* and A commute
- normal <=> A* is poly of A
- normal <=> A and A∗ can be simultaneously diagonalized
- Actually, \(A^* = P^t\bar{D}P\) by the proof in the pdf?
- commute <=> Simultaneous Diagonalizability
- Thm. 5.1
- also Thm. 4.11 for equivalence of diagonalizability
online TA hour¶
- representing one diagonal matrix as a polynomial of another <=> a polynomial sending n values to n values needs degree n-1
- Lagrange construction
- if A is to be represented as poly(B)
- if B's entries at positions i, j are the same, A's entries at i, j have to be the same
- the degree can be smaller
- commute and both diagonalizable <=> Simultaneous Diagonalizability
- the stated AB relation <=> A can be a polynomial of B
- "T* and T commute iff T* is a polynomial of T" is true
- consider the form after diagonalizing (always possible, since normal)
- then \(T^*\) is just \(\bar{T}\)
- then use the Lagrange construction
- proof trick
- 5-a
- about projection
- pairwise product is zero transformation
- lecture note is wrong
- sum is the identity
- pairwise product is zero transformation
- matrix congruence, quadratic form change of variable
- https://en.wikipedia.org/wiki/Matrix_congruence
- is under "orthogonal basis"
Remaining Q¶
- Taylor expansion for extreme values in general
- my guess: 0, +-, 0, +- ...
Week 13 - SVD¶
Best fit subspace¶
motivation and eq defs¶
left singular values¶
best fit subspace and SVD¶
- proof by induction on k
Note on details¶
- \(rank(A^tA) = rank(AA^t) = rank(A)\)
- proof by definition + norm def
- https://math.stackexchange.com/questions/349738/prove-operatornamerankata-operatornameranka-for-any-a-in-m-m-times-n
- \(A^tA\) is symmetric
- \(A^tA\) is semi-positive definite
- \(A^tA\) has orthonormal eigen decomposition
relation with right singular value¶
- \(Av_i = \sqrt{\lambda_i}u_i\)
- can prove these will be the left eigenbasis for \(A^t\)
SVD - the decomposition¶
- standard basis => \(\alpha\) (the \(v\)'s) => singular value diagonal matrix (\(m\times n\)) => \(\beta\) => standard basis
- \(U\Sigma V^t\)
- \(Av_i = \sqrt{\lambda_i}u_i\)
- \(A = \sum _{i=1}^r \sqrt{\lambda_i}u_iv_i^t\)
- obs: sum of rank one matrices
compact SVD¶
- use only \(m\times r, r\times r, r\times n\)
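A numpy sketch (my own, random matrix) of the full vs. compact decomposition and the rank-one-sum view:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

U, s, Vt = np.linalg.svd(A)                  # full SVD: U is 4x4, Vt is 3x3
r = int(np.sum(s > 1e-12))                   # numerical rank

# Compact SVD: keep only the first r columns/rows (m*r, r*r, r*n).
A_compact = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# Rank-one sum: A = sum_i s_i * u_i * v_i^t.
A_rank1 = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(r))

print(np.allclose(A, A_compact), np.allclose(A, A_rank1))    # True True
```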
confusion¶
- not familiar yet
SVD view of best fit k-subspace¶
Compression with SVD¶
PCA - best fit affine subspace¶
- max variance subspace
SVD vs PCA¶
- whether the mean is taken into account or not
HW 9 @ Week 13¶
- took a leave last week
- https://hackmd.io/Pcp80W3CT26eKqhUwIxzpw
Week 13 - Spectral Drawing¶
- each vertex => an n-dimensional standard basis vector \(e_i\)
- Now want to project to subspace W
- Minimal "Edge Energy Function"
- define as the sum of square of distance of edges in subspace W
- If want minimal => take eigenspaces of Laplance matrix
- can show by definition trivially
- each connected component having the same projection is optimal => eigenspaces
- used on similarity graph => spectral clustering
- can show by definition trivially
- if we want the minimum + orthogonality to the eigenspace of the (connected) graph
- this is spectral drawing
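A minimal numpy sketch of spectral drawing (my own, hypothetical 6-cycle graph): take the Laplacian eigenvectors for the 2nd and 3rd smallest eigenvalues as 2-D coordinates, skipping the constant kernel vector.

```python
import numpy as np

n = 6
edges = [(i, (i + 1) % n) for i in range(n)]   # hypothetical cycle graph

# Graph Laplacian L = D - A.
Adj = np.zeros((n, n))
for i, j in edges:
    Adj[i, j] = Adj[j, i] = 1
L = np.diag(Adj.sum(axis=1)) - Adj

vals, vecs = np.linalg.eigh(L)    # eigenvalues in ascending order
coords = vecs[:, 1:3]             # column 0 is the constant vector (eigenvalue 0)
print(coords)                     # a regular hexagon, up to rotation/scale
```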
HW 10 - Spectral Drawing¶
Week 14 - Matrix Exponential¶
Motivation¶
- differential equation solution
Definition¶
- \(exp(At) = lim_{n\to\infty} \sum_{m=0}^{n}\frac{A^mt^m}{m!}\)
Matrix Limit¶
- Convergence, Abs Convergence, Complex Convergence
- Def: Matrix Limit is Entry-wise
- Product of Matrix Limit
- by elementwise discussion
Discussion of Jordan Block (exp(Jt))¶
Solution of linear system of DE¶
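A hedged scipy sketch (my own, hypothetical system): `scipy.linalg.expm` computes the matrix exponential, and \(x(t) = exp(At)\,x_0\) solves \(x' = Ax\).

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # hypothetical system: harmonic oscillator
x0 = np.array([1.0, 0.0])

t = np.pi / 2
x_t = expm(A * t) @ x0
print(x_t)   # ~ [0, -1]: here exp(At) acts as rotation by -t
```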
Week 14 - Discrete-Time Markov Chain¶
- the lecture
- captures the eigen-related features of the transition matrix
- discusses the asymptotic trend when applying the same P infinitely many times
Positive Matrix¶
- defined by entry
P(Probability) vector and Transition Matrix¶
- P-vector
- entry sum to 1
- T-matrix
- each column is a P-vector
Eigenvalues of Positive Transition Matrix abs "<= 1"¶
- proof
The eigenvalue "1"¶
- 1 is the only eigenvalue (among complex numbers) with abs 1
- triangle inequality
Geometric Multiplicity = 1¶
- reminder: geometric multiplicity = number of Jordan blocks = dimension of the eigenspace
- proof
- to make it = 1
- requires \(|v_i| = |v_j|~\forall i,j\)
- refer to the last section (proof that all eigenvalues have abs <= 1)
- in order to replace j with i
- since P is positive
- in order to take the abs inside
- all \(v_i\) have the same sign
- HW explicitly proves this
- conclusion
- \(v = v_i(1,1,1,...,1) = v_i\vec{1}\) is the only eigenvector of abs 1 eigenvalue
All Jordan blocks of eigenvalue one are of size 1¶
- reminder: dot diagram
- proof by contradiction
Asymptotic behavior of \(\vec{\pi}^{(k)}\)¶
\(P^t~and~P\) share the same characteristic poly¶
Decompose the Space into a Direct Sum with the Jordan form result¶
abs(Eigenvalue) < 1 => go to 0¶
conclusion¶
Conclusion¶
- take the kernel of \(P-I\) to get the only stationary p-vector!
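A numpy sketch of the conclusion (my own, hypothetical column-stochastic matrix): the stationary vector is the eigenvector for eigenvalue 1, normalized to sum 1, and \(P^k\) converges to the rank-one matrix built from it.

```python
import numpy as np

P = np.array([[0.9, 0.2],
              [0.1, 0.8]])   # hypothetical positive transition matrix (columns sum to 1)

vals, vecs = np.linalg.eig(P)
v = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = v / v.sum()                         # stationary p-vector: P @ pi = pi
print(pi)                                # [2/3, 1/3]

print(np.linalg.matrix_power(P, 50))     # each column ~ pi: the rank-one limit
```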
Week 14 : Q on transition matrix¶
- what happens if P is "stuck"?
- A, B, C state
- A to A, B to C, C to B
- start A => A
- start B/C => 0.5B + 0.5C
- ans : "positive" => rank = 1
- will the eigenvalues of P all be positive? or 0?
- 0 means a non-trivial nullspace exists
HW 11 - if \(P^k\) is positive, how about P¶
- can use positive transition result
- hint: \(\lambda\) an eigenvalue of P => \(\lambda^k\) an eigenvalue of \(P^k\)
Week 15 - Page Rank¶
- trivial
Week 15 - Commuting LT¶
Eigenspace of T1 is T2-invariant subspace¶
- proof by \(\forall v \in E_{T_1}(\lambda), T_1T_2v = T_2T_1v=T_2\lambda v = \lambda T_2v\)
- \(E_{T_1}(\lambda)\) is T2 invariant
Simutaneously Diagonalizable¶
- restriction of a diagonalizable LT on an invariant subspace is still diagonalizable
What about Jordan¶
- nilpotent counter example
Spectral drawing and symmetric¶
- let \(\sigma : V \to V\) be an LT that permutes the vertices while keeping the same graph
- then \(L\) (the Laplacian) and \(\sigma\) commute
- pick the eigenspaces as a whole => symmetric!
- since the picked eigenspaces are \(\sigma\)-invariant
Week 15 - Dual Vector Spaces¶
Dual Vector Space¶
Dual basis¶
- note that all isomorphisms are based on a certain basis
Extra structures can be provided by dual space¶
- Q this
Dual space of inner product space has natural dual basis¶
- use \(f_v(w) = \langle w, v \rangle\)
- if the \(v_i\) form an orthonormal basis
pullback¶
- let \(T\) be a map \(V \to W\), and \(f \in W^*\) a map \(W \to F\)
- get a map \(V \to F\) by composing \(V \to W\) and \(W \to F\)
adjoint¶
- generalized to arbitrary \(V, W\)
- \(T: V \to W\) to \(T^*: W \to V\) in a natural way
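A small numpy check (my own sketch) that under the standard inner products the adjoint's matrix is the conjugate transpose, i.e. \(\langle Tv, w\rangle_W = \langle v, T^*w\rangle_V\), using the document's convention \(\langle x, y\rangle = x^t\bar{y}\):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))  # T : C^2 -> C^3
v = rng.standard_normal(2) + 1j * rng.standard_normal(2)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)

def inner(x, y):
    return np.vdot(y, x)        # <x, y> = sum x_i * conj(y_i)

T_adj = T.conj().T              # candidate adjoint: the conjugate transpose

print(np.isclose(inner(T @ v, w), inner(v, T_adj @ w)))   # True
```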