Eigenvalues and Eigenvectors
Naturally, eigenvalues are defined only for square matrices: for a non-square matrix, the transformation changes the dimension of the resulting vector, so $Ax = \lambda x$ cannot hold for any pair $(\lambda, x)$. Normally, for an $n \times n$ matrix $A$, there are $n$ independent eigenvectors $x_i$ with $Ax_i = \lambda_i x_i$.
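As a quick numerical check of the defining relation, here is a sketch using NumPy with a hypothetical $2 \times 2$ matrix (the matrix itself is only an illustration):

```python
import numpy as np

# A hypothetical 2x2 matrix, used only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining relation A x_i = lambda_i x_i for every pair.
for i in range(len(eigenvalues)):
    x = eigenvectors[:, i]
    assert np.allclose(A @ x, eigenvalues[i] * x)
```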
Geometric interpretation
One intuitive way to understand eigenvectors is to look at a matrix $A$ as a transformation being applied to a vector $x$. A transformation can rotate and/or scale a vector. There's a very nice video about this by Grant Sanderson where he explains this idea with beautiful visuals. In essence, eigenvectors of a matrix (transformation) are those vectors that are not rotated, but only scaled without changing direction. The magnitude of that scaling is what we call the eigenvalue.
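The "scaled but not rotated" picture can be checked numerically. A sketch, assuming an illustrative shear-like matrix whose eigenvector along the $x$-axis is easy to see:

```python
import numpy as np

# An illustrative matrix: most vectors change direction under A,
# but its eigenvectors are only scaled.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

def is_parallel(u, v):
    """True if u and v lie on the same line (2D cross product ~ 0)."""
    return np.isclose(u[0] * v[1] - u[1] * v[0], 0.0)

x = np.array([1.0, 0.0])    # an eigenvector of A (eigenvalue 3)
y = np.array([1.0, 1.0])    # not an eigenvector

assert is_parallel(A @ x, x)        # direction preserved, only scaled
assert not is_parallel(A @ y, y)    # direction changes: "rotated"
```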
Properties
The key property of eigenvectors can be seen by looking at $A^k$ for some positive integer $k$. Let's say $x$ is an eigenvector of $A$ with eigenvalue $\lambda$. Then we have

$$A^k x = A^{k-1}(Ax) = \lambda A^{k-1} x = \cdots = \lambda^k x$$
Thus, if $x$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $x$ is also an eigenvector of $A^k$ with eigenvalue $\lambda^k$. This also tells us another key fact about matrices:
The $n$ independent eigenvectors of $A$ form a basis for $\mathbb{R}^n$, so any vector $v$ can be expressed as a linear combination of the eigenvectors $x_i$:

$$v = c_1 x_1 + c_2 x_2 + \cdots + c_n x_n$$
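Both facts are easy to verify numerically. A sketch with NumPy, assuming an illustrative diagonalizable matrix (eigenvalues 5 and 2):

```python
import numpy as np

# An illustrative diagonalizable matrix (eigenvalues 5 and 2).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, X = np.linalg.eig(A)   # columns of X are eigenvectors
x = X[:, 0]
lam = eigenvalues[0]

# A^3 applied to an eigenvector just scales it by lambda^3.
assert np.allclose(np.linalg.matrix_power(A, 3) @ x, lam**3 * x)

# The eigenvectors form a basis, so any v solves v = X c for some c.
v = np.array([1.0, -2.0])
c = np.linalg.solve(X, v)
assert np.allclose(X @ c, v)
```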
Similarity
A matrix $B$ is said to be similar to $A$ if $B = M^{-1} A M$ for some invertible matrix $M$. Similar matrices have the same eigenvalues: if $Ax = \lambda x$, then $B(M^{-1}x) = M^{-1}AM(M^{-1}x) = M^{-1}Ax = \lambda(M^{-1}x)$.
This simple result is very useful in computation. For example, software like MATLAB will use a sequence of invertible matrices $M_1, M_2, \ldots$ and reduce $A$ to similar matrices $A_k$, where $A_1 = M_1^{-1} A M_1$ and $A_k = M_k^{-1} A_{k-1} M_k$ for $k \geq 2$. This sequence can be chosen carefully so as to make $A_k$ approach a diagonal matrix, hence the eigenvalues show up on the diagonal. This helps us calculate the eigenvalues of a matrix much faster.
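The invariance of eigenvalues under similarity can be demonstrated directly. A minimal sketch, assuming an arbitrary matrix $A$ and an arbitrary invertible $M$ chosen for illustration:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # any invertible matrix works

B = np.linalg.inv(M) @ A @ M    # B is similar to A

# Similar matrices have identical eigenvalues (up to ordering).
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```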
For a matrix $A$, the eigenspace associated with a set of eigenvalues is defined to be the subspace spanned by the eigenvectors associated with those eigenvalues.
Some obvious but important facts are as follows:
- The sum of the eigenvalues of $A$ is $\operatorname{tr}(A)$, which is the sum of the elements on the diagonal.
- The product of the eigenvalues of $A$ is $\det(A)$.
- In general, the eigenvalues of $A + B$ or $AB$ cannot be inferred directly from the eigenvalues of $A$ and $B$.
The first two facts can be proved easily using the characteristic polynomial

$$\det(A - \lambda I) = 0$$

We know that the sum of the roots of such a polynomial is determined by the coefficient of $\lambda^{n-1}$, $n$ being the degree of the polynomial. Also, the constant term in the polynomial determines the product of the roots.
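The trace and determinant identities are easy to confirm numerically. A sketch, assuming an illustrative $3 \times 3$ matrix:

```python
import numpy as np

# An illustrative 3x3 matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues equals the trace; product equals the determinant.
assert np.isclose(eigenvalues.sum(), np.trace(A))
assert np.isclose(eigenvalues.prod(), np.linalg.det(A))
```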
Symmetric matrices
If $A$ is a symmetric matrix ($A = A^T$), then

- The eigenvalues of $A$ are real if $A$ is real.
- Eigenvectors of $A$ corresponding to distinct eigenvalues are orthogonal (and a full orthogonal set can always be chosen).
- We may have a full set of eigenvectors even if some eigenvalues are repeated. For example, consider the identity matrix: it has only one eigenvalue, $\lambda = 1$, but every vector is an eigenvector.
Let $A$ be an $n \times n$ symmetric matrix with eigenvalues $\lambda_1, \ldots, \lambda_n$. If we consider the matrix

$$\Lambda = \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix}$$

we see that $A$ and $\Lambda$ are similar matrices. This means that there must be a $Q$ such that $\Lambda = Q^{-1} A Q$. It is not hard to see that $Q$ is the eigenvector matrix (the columns of $Q$ are eigenvectors of $A$). Since the eigenvectors of a symmetric matrix are orthogonal, $Q$ can be taken orthogonal ($Q^{-1} = Q^T$), and we have the spectral decomposition $A = Q \Lambda Q^T$.
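The spectral decomposition can be verified with NumPy's symmetric eigensolver. A sketch, assuming an illustrative symmetric $2 \times 2$ matrix:

```python
import numpy as np

# An illustrative real symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eigh is the solver for symmetric/Hermitian matrices;
# it returns real eigenvalues and orthonormal eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)
Lam = np.diag(eigenvalues)

# Q is orthogonal: Q^T Q = I, so Q^{-1} = Q^T.
assert np.allclose(Q.T @ Q, np.eye(2))

# Spectral decomposition A = Q Lam Q^T.
assert np.allclose(Q @ Lam @ Q.T, A)
```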
General matrices
For a general square matrix $A$ (not necessarily symmetric) with $n$ independent eigenvectors, we can factorize it as

$$A = X \Lambda X^{-1}$$

where $X$ is the eigenvector matrix and $\Lambda$ is the diagonal eigenvalue matrix. This factorization is another way to see that the eigenvectors of powers of $A$ are the same as those of $A$, with correspondingly exponentiated eigenvalues: $A^k = X \Lambda^k X^{-1}$. It is only when $A$ is symmetric that we can use $A = Q \Lambda Q^T$, since the eigenvectors are orthogonal in that case.
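Both the factorization and the cheap computation of powers it enables can be sketched numerically, assuming an illustrative non-symmetric (but diagonalizable) matrix:

```python
import numpy as np

# An illustrative non-symmetric, diagonalizable matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, X = np.linalg.eig(A)
Lam = np.diag(eigenvalues)

# Diagonalization A = X Lam X^{-1}.
assert np.allclose(X @ Lam @ np.linalg.inv(X), A)

# Powers: A^k = X Lam^k X^{-1}, so high powers only require
# exponentiating the diagonal entries.
k = 5
Ak = X @ np.diag(eigenvalues**k) @ np.linalg.inv(X)
assert np.allclose(Ak, np.linalg.matrix_power(A, k))
```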