Matrix addition: (A + B)ij = Aij + Bij
Matrix multiplication: (AB)ij = Σ(Aik * Bkj) for k = 1 to n
Matrix transpose: (A^T)ij = Aji
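As a quick illustration (not part of the original notes), these three operations map directly onto NumPy; the matrices below are arbitrary examples.

```python
import numpy as np

# Arbitrary 2x2 example matrices.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A + B)   # entrywise sum: (A + B)ij = Aij + Bij
print(A @ B)   # matrix product: (AB)ij = Σ(Aik * Bkj)
print(A.T)     # transpose: (A^T)ij = Aji
```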
Determinant: For a 2x2 matrix A = [a b; c d]:
|A| = ad - bc
For a 3x3 matrix A = [a b c; d e f; g h i]:
|A| = a(ei - fh) - b(di - fg) + c(dh - eg)
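A small NumPy sketch, with arbitrarily chosen matrices, checking the 2x2 and 3x3 cofactor formulas against numpy.linalg.det:

```python
import numpy as np

A2 = np.array([[1.0, 2.0],
               [3.0, 4.0]])
a, b, c, d = A2.ravel()
print(a * d - b * c, np.linalg.det(A2))   # both give -2.0

A3 = np.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0],
               [7.0, 8.0, 10.0]])
a, b, c, d, e, f, g, h, i = A3.ravel()
cofactor = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
print(cofactor, np.linalg.det(A3))        # both give -3.0
```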
Matrix inverse: A A^(-1) = A^(-1) A = I (identity matrix); A^(-1) exists only when |A| ≠ 0.
For a 2x2 matrix A = [a b; c d]:
A^(-1) = (1/|A|) * [d -b; -c a]
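A minimal sketch of the 2x2 inverse formula in NumPy, assuming an invertible example matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
a, b, c, d = A.ravel()
det = a * d - b * c                          # must be nonzero for A^(-1) to exist
A_inv = (1.0 / det) * np.array([[d, -b],
                                [-c, a]])
print(np.allclose(A @ A_inv, np.eye(2)))     # True: A A^(-1) = I
print(np.allclose(A_inv, np.linalg.inv(A)))  # matches NumPy's inverse
```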
Trace: tr(A) = Σ Aii for i = 1 to n
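For example, with an illustrative matrix, the explicit diagonal sum agrees with NumPy's built-in trace:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
print(sum(A[i, i] for i in range(A.shape[0])))  # 5, the sum of the diagonal entries
print(np.trace(A))                              # same result via NumPy
```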
Rank of a matrix: The rank of a matrix A, denoted by rank(A), is the maximum number of linearly independent rows or columns in the matrix.
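For example, using numpy.linalg.matrix_rank on a matrix whose third row is the sum of the first two (an illustrative choice, not from the original notes):

```python
import numpy as np

# Third row = first row + second row, so only 2 rows are independent.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])
print(np.linalg.matrix_rank(A))  # 2
```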
Eigenvalues and eigenvectors: If A is a square matrix, a scalar λ is an eigenvalue of A with corresponding nonzero eigenvector x if:
A x = λ x
The eigenvalues can be found by solving the characteristic equation:
|A - λ * I| = 0
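A short NumPy check (example matrix chosen arbitrarily) that each eigenpair returned by numpy.linalg.eig satisfies A x = λ x:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)     # columns of eigvecs are the eigenvectors
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))  # True: A x = λ x
```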
Orthogonal matrix: A square matrix A is orthogonal if its transpose equals its inverse:
A^T = A^(-1)
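For example, a 2D rotation matrix is orthogonal; a quick NumPy check (angle chosen arbitrarily):

```python
import numpy as np

theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation matrices are orthogonal
print(np.allclose(Q.T, np.linalg.inv(Q)))        # True: Q^T = Q^(-1)
print(np.allclose(Q.T @ Q, np.eye(2)))           # equivalently, Q^T Q = I
```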
Linear independence: A set of vectors {v1, v2, ..., vn} is linearly independent if no vector can be expressed as a linear combination of the others.
Basis and dimension: A basis of a vector space is a set of linearly independent vectors that span the space. The dimension of the vector space is the number of vectors in its basis.
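A sketch, assuming NumPy, that tests linear independence and reads off the dimension of the span via the rank; the column vectors are illustrative:

```python
import numpy as np

# Columns of V: the third column is v1 + v2, so {v1, v2, v3} is dependent.
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
r = np.linalg.matrix_rank(V)
print(r == V.shape[1])  # False: the columns are not linearly independent
print(r)                # 2 = dimension of the space the columns span
```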
Gram-Schmidt orthogonalization: A method to convert a set of linearly independent vectors into an orthogonal or orthonormal basis.
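A minimal Gram-Schmidt sketch in NumPy; the helper gram_schmidt and its tolerance are assumptions for illustration, not a library routine:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given vectors (illustrative helper)."""
    basis = []
    for v in vectors:
        # Subtract the components of v along the vectors already in the basis.
        w = v - sum(np.dot(v, q) * q for q in basis)
        norm = np.linalg.norm(w)
        if norm > 1e-12:          # skip (near-)dependent vectors
            basis.append(w / norm)
    return np.array(basis)

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0])]
Q = gram_schmidt(vs)
print(np.allclose(Q @ Q.T, np.eye(len(Q))))  # True: the returned vectors are orthonormal
```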
Singular Value Decomposition (SVD): A factorization of a matrix A into the product of three matrices U, Σ, and V^T, where U and V are orthogonal matrices and Σ is a diagonal matrix containing the singular values of A:
A = U Σ V^T
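For example, numpy.linalg.svd returns U, the singular values, and V^T, and their product reconstructs A (example matrix chosen arbitrarily):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)  # s holds the singular values
Sigma = np.diag(s)
print(np.allclose(U @ Sigma @ Vt, A))             # True: A = U Σ V^T
```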
Linear transformation: A map T between vector spaces is linear if, for all vectors u, v and scalars c:
T(u + v) = T(u) + T(v)
T(c u) = c T(u)
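A quick numerical check of both conditions for the map T(v) = M v, with an arbitrary example matrix M and random test vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])
T = lambda v: M @ v                        # every matrix defines a linear map T(v) = M v

u, v = rng.standard_normal(2), rng.standard_normal(2)
c = 2.5
print(np.allclose(T(u + v), T(u) + T(v)))  # additivity
print(np.allclose(T(c * u), c * T(u)))     # homogeneity
```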