Orthogonal Matrices
Let a matrix $Q$ have orthonormal columns. The most important property of such a matrix is that

$$Q^\top Q = I.$$
But what about $Q Q^\top$? A little thought reveals that $Q Q^\top = I$ only if $Q$ is a square matrix, but not in general. In the square case, $Q$ is called orthogonal (or orthonormal). It’s confusing to say orthogonal when we actually mean orthonormal, but most results rely on the fact that a bunch of orthogonal vectors can be made orthonormal by scaling them to have unit norm. Hence, in a lot of places we see the word orthogonal being used when the authors actually mean orthonormal.
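This asymmetry is easy to check numerically. A small sketch (assuming NumPy, and using the QR factorization just as a convenient way to get a tall matrix with orthonormal columns):

```python
import numpy as np

# Build a 4x2 matrix with orthonormal columns: the Q factor of a QR
# factorization of a random matrix has exactly this property.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))

# Q^T Q is the 2x2 identity...
assert np.allclose(Q.T @ Q, np.eye(2))
# ...but Q Q^T is a 4x4 projection matrix, not the identity,
# because Q is tall rather than square.
assert not np.allclose(Q @ Q.T, np.eye(4))
```

When $Q$ is square, $Q^\top$ is a two-sided inverse and both products give the identity.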
Orthogonality
A simple example of a $2 \times 2$ orthogonal matrix is the rotation matrix

$$R_\theta = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},$$
or the reflection matrix

$$F_\theta = \begin{pmatrix} \cos 2\theta & \sin 2\theta \\ \sin 2\theta & -\cos 2\theta \end{pmatrix}.$$
The rotation matrix simply rotates a vector by $\theta$ in the anti-clockwise direction, while the reflection matrix reflects a vector about a mirror placed on a line of slope $\tan\theta$ passing through the origin. These reflections are called Householder reflections.
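Both families are easy to construct and sanity-check; a minimal sketch in NumPy (the helper names `rotation` and `reflection` are my own, not from the text):

```python
import numpy as np

def rotation(theta):
    """Rotate by theta radians in the anti-clockwise direction."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def reflection(theta):
    """Reflect about the line through the origin at angle theta."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[c, s], [s, -c]])

# Rotating (1, 0) by 90 degrees gives (0, 1).
assert np.allclose(rotation(np.pi / 2) @ [1, 0], [0, 1])
# Reflecting about the 45-degree line swaps the coordinates.
assert np.allclose(reflection(np.pi / 4) @ [2, 3], [3, 2])
# Both matrices are orthogonal: M^T M = I.
assert np.allclose(rotation(0.7).T @ rotation(0.7), np.eye(2))
assert np.allclose(reflection(0.7).T @ reflection(0.7), np.eye(2))
```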
Householder reflections
Start with a unit vector $v$, so $v^\top v = 1$. The Householder reflections are defined to be a family of symmetric orthogonal matrices as follows:

$$H_v = I - 2vv^\top.$$
It is easy to see that $H_v$ is orthogonal by checking that $H_v^\top H_v = (I - 2vv^\top)^2 = I - 4vv^\top + 4v(v^\top v)v^\top = I$, using $v^\top v = 1$.
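The definition translates directly into code; a short sketch (the function name `householder` is an assumption for illustration):

```python
import numpy as np

def householder(v):
    """Householder reflection H_v = I - 2 v v^T for a unit vector v."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)   # normalize so that v^T v = 1
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

H = householder([1.0, 2.0, 2.0])
# H_v is symmetric...
assert np.allclose(H, H.T)
# ...and orthogonal: H_v^T H_v = H_v^2 = I.
assert np.allclose(H @ H, np.eye(3))
```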
Hadamard matrices
Define the $2$-dimensional Hadamard matrix $H_2$ by

$$H_2 = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.$$
Then define

$$H_{2n} = \begin{pmatrix} H_n & H_n \\ H_n & -H_n \end{pmatrix}$$

for all $n$ that are powers of $2$.
This gives a construction of these matrices for any power of 2. More generally, it is conjectured that Hadamard matrices exist for every order that is a multiple of 4. The columns of a Hadamard matrix $H_n$ are mutually orthogonal ($H_n^\top H_n = nI$), so the scaled matrix $H_n/\sqrt{n}$ is orthogonal.
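The recursive (Sylvester) construction above can be sketched in a few lines; `hadamard` is a hypothetical helper name:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of the Hadamard matrix of order n (a power of 2)."""
    if n == 1:
        return np.array([[1]])
    H = hadamard(n // 2)
    # Block pattern [[H, H], [H, -H]] doubles the order at each step.
    return np.block([[H, H], [H, -H]])

H8 = hadamard(8)
# Columns are orthogonal: H^T H = n I.
assert np.allclose(H8.T @ H8, 8 * np.eye(8))
# Scaling by 1/sqrt(n) gives an orthogonal matrix.
Q = H8 / np.sqrt(8)
assert np.allclose(Q.T @ Q, np.eye(8))
```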
Utility
Orthogonal matrices are good because they preserve the Euclidean norm of vectors ($\|Qx\| = \|x\|$), so a transformation by an orthogonal matrix will not blow up magnitudes or cause overflow. Therefore, these matrices are important in computational linear algebra. We also know that the eigenvectors of a symmetric matrix can be chosen orthonormal (by the spectral theorem), so symmetric matrices are a natural source of orthogonal matrices, and hence a lot of numerical methods revolve around eigenvectors.
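Both claims are easy to verify numerically; a small sketch, assuming NumPy's `eigh` routine for symmetric eigendecompositions:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
S = A + A.T                      # a symmetric matrix

# The eigenvectors of a symmetric matrix form an orthogonal matrix Q.
_, Q = np.linalg.eigh(S)
assert np.allclose(Q.T @ Q, np.eye(5))

# Orthogonal matrices preserve Euclidean norms: ||Q x|| = ||x||.
x = rng.standard_normal(5)
assert np.allclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```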