3.2 Norm, Dot Product, and Distance in Rn
- norm : length of a vector, ||v||
- unit vector : a vector of length 1; normalizing v gives (1/||v||)v
- distance between u and v : ||u-v||
- Dot Product : Inner Product! > u • v = ||u|| ||v|| cosθ (0 ≤ θ ≤ π)
- u • v > 0 : acute (less than 90˚)
- u • v < 0 : obtuse (greater than 90˚)
- u • v = 0 : orthogonal (90˚)
- u • v = u1v1 + u2v2 + ••• + unvn
- Euclidean inner product
- u • v = uᵀv = vᵀu (for column vectors u, v)
- cosθ = u • v / (||u|| ||v||)
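These formulas are easy to spot-check numerically; a minimal NumPy sketch (the vectors are my own arbitrary choices):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

norm_u = np.linalg.norm(u)               # ||u|| = sqrt(1 + 4 + 4) = 3
dist = np.linalg.norm(u - v)             # distance between u and v, ||u - v||
dot = float(np.dot(u, v))                # u1v1 + u2v2 + u3v3 = 3 + 0 + 8 = 11
cos_theta = dot / (norm_u * np.linalg.norm(v))   # cosθ = u•v / (||u|| ||v||)
theta = np.arccos(cos_theta)             # angle between u and v, in [0, π]
```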
3.3 Orthogonality
- Orthogonal Projection
- Decompose : u = w1 + w2 = w1 + (u - w1), where w1 is the projection of u onto a
- vector component of u along a
- proj_a u = (u•a / ||a||²) a
- vector component of u orthogonal to a
- u - proj_a u = u - (u•a / ||a||²) a
- norm of projection : ||proj_a u|| = |u•a| / ||a||
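The decomposition u = w1 + w2 can be verified directly; a short sketch with made-up vectors:

```python
import numpy as np

u = np.array([6.0, 2.0])
a = np.array([3.0, 4.0])

w1 = (np.dot(u, a) / np.dot(a, a)) * a   # proj_a u = (u•a / ||a||²) a
w2 = u - w1                              # component of u orthogonal to a
```

By construction w1 + w2 = u, and w2 is perpendicular to a (their dot product is zero).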
Chapter 4 : General Vector Spaces
Section 4.1 : Real Vector Spaces
- Vector Space V
- u , v in V > u + v in V (closed under addition)
- u + v = v + u
- (u+v) + w = u + (v+w)
- u+0 = 0 +u =u
- u + (-u) = 0
- k : any scalar , u in V > ku in V (closed under scalar multiplication)
- k(u+v) = ku + kv
- (k+m)u = ku + mu
- k(mu) = (km)u
- 1u = u
Section 4.2 : Subspaces
- Subspace : a subset W of V that is itself a vector space under the operations defined on V
- Subspace Test : check only Axioms 1 and 6 (closure under addition and scalar multiplication); the rest are inherited from V
- closed : combining vectors in W stays inside W
- Solution Spaces of Homogeneous Systems : the solution set of Ax = 0 is a subspace of Rn
Section 4.3 : Spanning Sets
- the standard unit vectors Span Rn
- Testing for spanning (n vectors in Rn, placed as columns of A)
- det(A) ≠ 0 : the vectors span Rn (if det(A) = 0, they do not span)
- Ax = 0 has only the trivial solution x = 0
- Ax = b is consistent for every b
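The determinant test above can be sketched in NumPy; the three vectors below are a hypothetical example of my own:

```python
import numpy as np

# candidate spanning set for R^3, one vector per column (made-up example)
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

d = np.linalg.det(A)
spans = not np.isclose(d, 0.0)           # det(A) ≠ 0  =>  columns span R^3
```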
Section 4.4 : Linear Independence
- Linear Independence : no vector in the set can be written as a linear combination of the others.
- nontrivial solution of Ax = 0 (det(A) = 0) > linear dependence
- Linear Independence of the Standard Unit Vectors in Rn
- k1i + k2j + k3k =0 >> (k1,k2,k3) = (0,0,0)
- if a nontrivial solution existed (e.g. k1 = -k2 with k2 ≠ 0), the vectors would NOT be independent
- a basis for R3?
1. linear independence? > k1i + k2j + k3k =0 : only trivial solution
2. span? det(A) ≠ 0
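Independence can also be tested with the rank instead of the determinant (which works even when the stack of vectors is not square); the columns below are a made-up dependent example:

```python
import numpy as np

# columns: v1, v2, v3 with v3 = v1 + v2, so the set is linearly dependent
V = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])

# independent  <=>  Vx = 0 has only the trivial solution  <=>  rank = number of columns
independent = np.linalg.matrix_rank(V) == V.shape[1]
```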
Section 4.5 : Coordinates and Basis
- S is a basis for V :
1. S spans V
2. S is linearly independent
- Uniqueness of Basis Representation
- v = c1v1 + c2v2 + ••• + cnvn
- the n-tuple (c1, c2, •••, cn) is unique!
- coordinates of v relative to S : (c1, c2, •••, cn)
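Finding the coordinates (c1, ..., cn) is just solving a linear system; a sketch with a hypothetical basis for R² of my own choosing:

```python
import numpy as np

# basis S = {v1, v2} for R^2, stored as the columns of B (made-up example)
B = np.array([[1.0, 1.0],
              [1.0, -1.0]])
v = np.array([3.0, 1.0])

# v = c1*v1 + c2*v2  <=>  B c = v; since S is a basis, the solution c is unique
c = np.linalg.solve(B, v)
```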
Section 4.6 : Dimension
- All bases for a finite-dimensional vector space have the same number of vectors.
- V : finite-dimensional vector space with a basis of n vectors
- any set with more than n vectors : linearly dependent
- any set with fewer than n vectors : does not span V
- dimension : number of vectors in a basis of V
- Plus / Minus Theorem
- if v is not in span(S), then S ∪ {v} is still linearly independent
- if v in S is a linear combination of the other vectors, span(S) = span(S - {v}) : removing v from S still spans!
Section 4.8 : Row Space, Column Space, and Null Space
- Row space of A : subspace of Rn spanned by the row vectors
- Column space of A : subspace of Rm spanned by the column vectors
- Null space of A : solution space of Ax = 0, written null(A)
- b is in the column space of A <=> Ax = b is consistent!
- Elementary row operations can change the column space of a matrix.
- Elementary row operations don't change dependency relationships between column vectors.
- the number of basis vectors
- the number of row-space basis vectors = the number of column-space basis vectors
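That the row space and column space have the same dimension can be spot-checked numerically (the matrix below is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [2.0, 4.0, 1.0]])

row_rank = np.linalg.matrix_rank(A)      # dim of the row space
col_rank = np.linalg.matrix_rank(A.T)    # dim of the column space
```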
Section 4.9 : Rank, Nullity, and Fundamental Matrix Spaces
The row space and the column space of a matrix A have the same dimension.
- common dimension (number of basis vectors) : rank => rank(A)
- dimension of null space : nullity => nullity(A)
- m×n matrix
- rank(A) ≤ min(m, n) for an m×n matrix A
- rank(A) + nullity(A) = n (= number of columns)
- [number of leading variables] + [number of free variables] = n
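The rank-nullity identity can be checked by counting nonzero singular values (an SVD gives the nullity without row reduction); the rank-1 matrix below is a made-up example:

```python
import numpy as np

# 2x3 matrix whose second row is twice the first, so rank(A) = 1
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

n = A.shape[1]                           # number of columns
rank = np.linalg.matrix_rank(A)

_, s, _ = np.linalg.svd(A)
# nullity = number of columns minus number of nonzero singular values
nullity = n - int(np.sum(s > 1e-10))
```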
- Equivalent statements for an n×n matrix A : A is invertible <=> det(A) ≠ 0 <=> the columns of A span Rn <=> Ax = 0 has only the trivial solution <=> rank(A) = n <=> nullity(A) = 0
Chapter 5 : Eigenvalues and Eigenvectors
Section 5.1 : Eigenvalues and Eigenvectors
- eigenvalue λ, eigenvector x : Ax = λx (x ≠ 0)
- (λI - A)x = 0
- characteristic equation : det(λI - A) = 0
- features of eigenvalues
- ∑λᵢ = trace(A)
- ∏λᵢ = det(A) = |A|
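Both identities are easy to verify with NumPy; the symmetric 2×2 matrix below (eigenvalues 3 and 1) is my own example:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)      # eigenvalues 3 and 1 here

# check Ax = λx for the first eigenpair; then ∑λ = trace(A), ∏λ = det(A)
ok = np.allclose(A @ eigvecs[:, 0], eigvals[0] * eigvecs[:, 0])
```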
Section 5.2 : Diagonalization
- B is similar to A
- B = P⁻¹AP (for some invertible P)
- Fact
1. det(P⁻¹AP) = det(P⁻¹)det(A)det(P) = det(A) = ∏λ
2. Invertibility : A is invertible <=> P⁻¹AP is invertible
3. rank(A) = rank(P⁻¹AP)
4. nullity(A) = nullity(P⁻¹AP)
5. trace(A) = trace(P⁻¹AP) = ∑λ
6. Eigenvalues : A and P⁻¹AP have the same eigenvalues.
7. Eigenspace dimension : corresponding eigenspaces of A and P⁻¹AP have the same dimension
- Diagonalization : a square matrix A is diagonalizable if there is an invertible P with D = P⁻¹AP diagonal <=> AP = PD
- An n×n matrix with n distinct eigenvalues (λ1 ≠ λ2 ≠ ••• ≠ λn) is diagonalizable.
- columns of P : the eigenvectors {p1, p2, •••, pn}; diagonal entries of D : the eigenvalues
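The steps above can be sketched numerically; the 2×2 matrix below (distinct eigenvalues 5 and 2) is a hypothetical example:

```python
import numpy as np

# 2x2 matrix with distinct eigenvalues, so it is diagonalizable
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)            # columns of P: eigenvectors p1, p2
D = np.linalg.inv(P) @ A @ P             # D = P⁻¹AP should be diagonal
```

The check AP = PD is equivalent and avoids the explicit inverse.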
Chapter 7 : Diagonalization and Quadratic Forms
Section 7.1 : Orthogonal Matrices
- orthogonal : its transpose is the same as its inverse!
- A⁻¹ = Aᵀ, i.e. AAᵀ = AᵀA = I
- A is orthogonal
- the row vectors of A form an orthonormal set (pairwise inner products 0, each of norm 1)
- the column vectors of A form an orthonormal set
- transpose A > orthogonal
- inverse of A > orthogonal
- product of orthogonal matrices is orthogonal
- det(A) = 1 or -1
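These properties can be confirmed on a rotation matrix, the standard example of an orthogonal matrix (the angle is an arbitrary choice):

```python
import numpy as np

theta = 0.3                              # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Qᵀ = Q⁻¹  <=>  QQᵀ = QᵀQ = I, and det(Q) = ±1
```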
Section 7.2 : Orthogonal Diagonalization
- orthogonally similar : B = PᵀAP with P orthogonal
- A : n × n matrix => D = PᵀAP diagonal
- A is orthogonally diagonalizable
- <=> A has an orthonormal set of n eigenvectors
- <=> A is symmetric
- Spectral Decomposition
- A = ∑λᵢuᵢuᵢᵀ (λᵢ : eigenvalues, uᵢ : orthonormal eigenvectors)
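The spectral decomposition can be rebuilt term by term; a sketch with a symmetric 2×2 matrix of my own:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])               # symmetric, so orthogonally diagonalizable

eigvals, U = np.linalg.eigh(A)           # eigh returns orthonormal eigenvectors for symmetric A

# spectral decomposition: A = ∑ λᵢ uᵢuᵢᵀ, summing one rank-1 piece per eigenpair
A_rebuilt = sum(lam * np.outer(u, u) for lam, u in zip(eigvals, U.T))
```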