Matrix Factorization
There are four very popular factorizations for matrices. In the literature, the word decomposition is often used instead of factorization. Note that we denote special types of matrices with special characters. This convention is as follows:
- $S$ denotes a symmetric matrix.
- $Q$ denotes an orthogonal or orthonormal matrix.
- $\Lambda$ denotes the diagonal matrix with eigenvalues of the matrix being factored.
- $\Sigma$ denotes a matrix that is closely related to the rank of the matrix being factored.
LU decomposition
This is the lower-upper (LU) decomposition, applicable to square matrices $A$, and the factorization can be viewed as the matrix form of Gaussian elimination. We write
$$A = LU,$$
where $L$ and $U$ are lower and upper triangular matrices, respectively.
If $A$ is symmetric and positive definite, then we can choose $U = L^\top$ and have
$$A = LL^\top.$$
This decomposition is used in algorithms to
- solve a square system of linear equations,
- compute the inverse of a matrix, and
- compute the determinant of a matrix.
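As a sketch of the first use case, assuming NumPy and SciPy are available, the example below factors a small square matrix once and then solves $Ax = b$ with the stored factors:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# A small square system A x = b (values chosen for illustration).
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

# Factor once (packed LU plus pivot indices), then solve;
# the factorization can be reused for many right-hand sides.
lu, piv = lu_factor(A)
x = lu_solve((lu, piv), b)

print(x)                      # → [1. 2.]
print(np.allclose(A @ x, b))  # True: residual is numerically zero
```

Reusing the factors is the point: factoring costs $O(n^3)$, while each subsequent solve is only $O(n^2)$.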
QR decomposition
Here we factor an $m \times n$ matrix $A$ into an $m \times m$ orthogonal matrix $Q$ and an $m \times n$ upper-triangular matrix $R$. A popular method to compute this factorization is the Gram–Schmidt process. If $A$ is square and invertible, the factorization is unique once the diagonal entries of $R$ are required to be positive. We write
$$A = QR,$$
where $Q$ is orthogonal and $R$ is upper-triangular.
The QR decomposition makes it easy to solve a system of equations $Ax = b$ without inverting the matrix $A$. Since $Q$ is orthogonal, we have $Q^{-1} = Q^\top$, so $Ax = b$ is equivalent to $Rx = Q^\top b$, which is easier to solve since $R$ is triangular.
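A minimal sketch of this solve, assuming NumPy and SciPy: form $Q^\top b$ and back-substitute on the triangular system.

```python
import numpy as np
from scipy.linalg import solve_triangular

# Solve A x = b via QR (values chosen for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

Q, R = np.linalg.qr(A)       # A = Q R, Q orthogonal, R upper-triangular
y = Q.T @ b                  # Q^{-1} = Q^T, so the system becomes R x = Q^T b
x = solve_triangular(R, y)   # back-substitution on the triangular system

print(np.allclose(A @ x, b))  # True
```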
Spectral decomposition
Also called eigendecomposition, this is a factorization of a square matrix $A$ into eigenvalues and eigenvectors, $A = X \Lambda X^{-1}$, where $\Lambda$ is a diagonal matrix with the eigenvalues of $A$ and the columns of $X$ are the corresponding eigenvectors.
Although this factorization is possible for any square matrix with linearly independent eigenvectors, it is usually used for symmetric matrices $S$. Since the eigenvectors of a symmetric matrix can be made orthonormal, the factorization is written as
$$S = Q \Lambda Q^\top,$$
where $Q$ is an orthogonal matrix and $\Lambda$ is a diagonal matrix with the eigenvalues of $S$ on the diagonal.
So, $Q = \begin{bmatrix} q_1 & q_2 & \cdots & q_n \end{bmatrix}$, with $q_i$ being the column normalized eigenvectors of $S$, and
$$\Lambda = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n),$$
with $\lambda_i$ being the eigenvalues of $S$.
Now consider $S = Q \Lambda Q^\top$, and view it as
$$S = (Q\Lambda)\,Q^\top = \sum_{i=1}^{n} (Q\Lambda)_{:,i}\,(Q^\top)_{i,:},$$
which is a sum of a bunch of rank-$1$ matrices. The $i$-th column of $Q\Lambda$ is $\lambda_i q_i$ and the $i$-th row of $Q^\top$ is $q_i^\top$. Hence, we can rewrite the above sum as
$$S = \sum_{i=1}^{n} \lambda_i\, q_i q_i^\top.$$
From this expression, it is easy to see that $S q_j = \lambda_j q_j$, and thus $(\lambda_i, q_i)$ are the eigenvalue–eigenvector pairs, since the vectors $q_i$ are orthonormal.
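The rank-1 view above can be checked numerically. A sketch assuming NumPy, using `eigh` (the eigensolver specialized for symmetric matrices, which returns orthonormal eigenvectors):

```python
import numpy as np

# A small symmetric matrix (values chosen for illustration).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns ascending eigenvalues and orthonormal eigenvector columns.
lam, Q = np.linalg.eigh(S)

# Reassemble S as Q Λ Q^T ...
S_rebuilt = Q @ np.diag(lam) @ Q.T

# ... and equivalently as the sum of rank-1 matrices λ_i q_i q_i^T.
S_rank1 = sum(lam[i] * np.outer(Q[:, i], Q[:, i]) for i in range(len(lam)))

print(np.allclose(S, S_rebuilt))  # True
print(np.allclose(S, S_rank1))    # True
```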
Singular value decomposition
We have a separate dedicated section about SVD, but in essence we can factorize any $m \times n$ matrix $A$ as
$$A = U \Sigma V^\top,$$
where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a non-negative diagonal matrix. The values on the diagonal of $\Sigma$ are called singular values. Like the spectral decomposition above, the idea of an SVD is to find basis directions along which matrix multiplication is equivalent to scalar multiplication, but this works in general for any matrix instead of only a square matrix.
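A quick sketch of that last point, assuming NumPy: for a rectangular matrix, multiplying by $A$ maps each right singular direction $v_i$ onto the left singular direction $u_i$, scaled by $\sigma_i$.

```python
import numpy as np

# SVD applies to any matrix, including rectangular ones.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0]])

# full_matrices=False gives the compact SVD: U is m x r, Vt is r x n.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Along singular directions, A acts like scalar multiplication:
# A v_i = σ_i u_i, where v_i is the i-th row of Vt.
print(np.allclose(A @ Vt[0], s[0] * U[:, 0]))  # True
print(np.allclose(A, U @ np.diag(s) @ Vt))     # True
```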