List of important formulas in Linear Algebra

  1. Matrix addition: If A and B are two matrices of the same order (m x n), then their sum, denoted by A + B, is a matrix of the same order obtained by adding the corresponding elements of A and B.

(A + B)ij = Aij + Bij
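
A minimal NumPy sketch of the element-wise rule (NumPy is assumed to be available, and the values of A and B are arbitrary):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Element-wise: (A + B)[i, j] == A[i, j] + B[i, j]
print(A + B)   # [[ 6  8]
               #  [10 12]]
```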

  2. Matrix multiplication: If A is an m x n matrix and B is an n x p matrix, then their product, denoted by AB, is an m x p matrix, where the elements are given by:

(AB)ij = Σ(Aik * Bkj) for k = 1 to n
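
A short sketch checking both the shape rule and the summation formula, again with arbitrary NumPy arrays:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])          # 2 x 3
B = np.array([[ 7,  8],
              [ 9, 10],
              [11, 12]])           # 3 x 2

C = A @ B                          # 2 x 2 product

# Check one entry against the formula: (AB)[0, 1] = sum over k of A[0, k] * B[k, 1]
assert C[0, 1] == sum(A[0, k] * B[k, 1] for k in range(3))
print(C)   # [[ 58  64]
           #  [139 154]]
```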

  3. Transpose of a matrix: The transpose of a matrix A, denoted by A^T, is obtained by interchanging its rows and columns.

(A^T)ij = Aji
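
A quick check of the index rule (A.T is NumPy's transpose; the matrix is arbitrary):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

T = A.T                            # shape (3, 2)
# Index rule: (A^T)[i, j] == A[j, i]
assert all(T[i, j] == A[j, i] for i in range(3) for j in range(2))
print(T)
```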

  4. Determinant of a square matrix: The determinant of a 2x2 matrix A is given by:

|A| = |a  b| = ad - bc
      |c  d|

For a 3x3 matrix A:

|A| = |a  b  c|
      |d  e  f| = a(ei - fh) - b(di - fg) + c(dh - eg)
      |g  h  i|
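
A sketch comparing both formulas with NumPy's determinant routine (the example matrices are arbitrary):

```python
import numpy as np

A2 = np.array([[3.0, 1.0],
               [4.0, 2.0]])
a, b, c, d = A2.ravel()
# ad - bc = 3*2 - 1*4 = 2, matching the library value up to rounding
assert np.isclose(a * d - b * c, np.linalg.det(A2))

A3 = np.array([[2.0, 0.0, 1.0],
               [1.0, 3.0, 2.0],
               [0.0, 1.0, 1.0]])
# Cofactor expansion along the first row: 2*(3 - 2) - 0 + 1*(1 - 0) = 3
assert np.isclose(np.linalg.det(A3), 3.0)
```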

  5. Inverse of a square matrix: The inverse of a square matrix A, denoted by A^(-1), exists only when |A| ≠ 0 and is the matrix such that:

A A^(-1) = A^(-1) A = I (Identity matrix)

For a 2x2 matrix A:

A^(-1) = (1/|A|) * | d  -b|
                   |-c   a|
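
A sketch building the 2x2 inverse from the formula above and checking it against np.linalg.inv (values are arbitrary, chosen so that ad - bc ≠ 0):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
a, b = A[0]
c, d = A[1]

det = a * d - b * c                          # 10, nonzero, so A is invertible
A_inv = (1.0 / det) * np.array([[ d, -b],
                                [-c,  a]])

assert np.allclose(A @ A_inv, np.eye(2))     # A A^(-1) = I
assert np.allclose(A_inv, np.linalg.inv(A))  # matches the library inverse
```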

  6. Trace of a square matrix: The trace of a square matrix A, denoted by tr(A), is the sum of its diagonal elements:

tr(A) = Σ Aii for i = 1 to n
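
A one-line check with an arbitrary 3x3 matrix:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Diagonal sum: 1 + 5 + 9 = 15
assert np.trace(A) == sum(A[i, i] for i in range(3)) == 15
```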

  7. Rank of a matrix: The rank of a matrix A, denoted by rank(A), is the maximum number of linearly independent rows or columns in the matrix.
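
A small sketch where one row is deliberately the sum of the other two, so the rank drops below 3:

```python
import numpy as np

# The third row equals row 0 + row 1, so only two rows are independent
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])

print(np.linalg.matrix_rank(A))   # 2
```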

  8. Eigenvalues and eigenvectors: If A is a square matrix, a scalar λ is an eigenvalue of A with corresponding eigenvector x (which must be nonzero) if:

A x = λ x

The eigenvalues can be found by solving the characteristic equation:

|A - λ * I| = 0
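
A sketch verifying A x = λ x for each pair returned by np.linalg.eig (the symmetric test matrix is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # eigenvectors are the columns

for lam, x in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ x, lam * x)    # A x = lambda x

print(eigvals)   # 3.0 and 1.0 for this matrix
```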

  9. Orthogonal matrices: A square matrix A is orthogonal if its transpose is equal to its inverse:

A^T = A^(-1)
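
A 2-D rotation matrix is a standard example of an orthogonal matrix; a sketch of the check (the angle is arbitrary):

```python
import numpy as np

theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))        # Q^T Q = I
assert np.allclose(Q.T, np.linalg.inv(Q))     # Q^T = Q^(-1)
```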

  10. Linear independence: A set of vectors {v1, v2, ..., vn} is linearly independent if no vector can be expressed as a linear combination of the others.

  11. Basis and dimension: A basis of a vector space is a set of linearly independent vectors that span the space. The dimension of the vector space is the number of vectors in any basis (every basis has the same size).

  12. Gram-Schmidt orthogonalization: A method to convert a set of linearly independent vectors into an orthogonal or orthonormal basis.
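
A minimal sketch of classical Gram-Schmidt; the helper name gram_schmidt is illustrative, and the input rows are assumed linearly independent:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return an orthonormal basis for the
    span of the given linearly independent vectors (one per row)."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        # Remove the component of v along each basis vector found so far
        for q in basis:
            w = w - np.dot(q, v) * q
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

V = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(V)
assert np.allclose(Q @ Q.T, np.eye(3))   # rows of Q are orthonormal
```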

  13. Singular Value Decomposition (SVD): A factorization of a matrix A into the product of three matrices U, Σ, and V^T, where U and V are orthogonal matrices and Σ is a diagonal matrix containing the singular values of A:

A = U Σ V^T
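
A sketch factoring an arbitrary matrix with np.linalg.svd and rebuilding it (NumPy returns the singular values as a vector s, expanded here with np.diag):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# s is the vector of singular values; rebuild A from the three factors
assert np.allclose(U @ np.diag(s) @ Vt, A)
print(s)
```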

  14. Linear transformation: A function T: V → W between two vector spaces V and W that preserves the operations of vector addition and scalar multiplication:

T(u + v) = T(u) + T(v)
T(c u) = c T(u)
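
Any matrix defines a linear transformation via T(v) = M v; a sketch checking both properties numerically (M, u, v, and c are arbitrary):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])

def T(v):
    # The map v -> M v is linear
    return M @ v

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
c = 2.5

assert np.allclose(T(u + v), T(u) + T(v))
assert np.allclose(T(c * u), c * T(u))
```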
