Orthogonal Matrices

Let a matrix $Q$ have orthonormal columns. The most important property of such a matrix is that

$$Q^{\rm T}Q = I.$$

But what about $QQ^{\rm T}$? A little thought reveals that $QQ^{\rm T} = I$ holds only when $Q$ is square, not in general. In the square case, $Q$ is called orthogonal (or orthonormal). It is confusing to say orthogonal when we actually mean orthonormal, but most results rely on the fact that a set of orthogonal vectors $v_i$ can be made orthonormal by scaling each to have $\lVert v_i\rVert = 1$. Hence, in many places we see the word orthogonal being used when the authors actually mean orthonormal.
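A quick numerical check illustrates the asymmetry. In this sketch, $Q$ is built by taking the reduced QR factorization of a random tall matrix (an arbitrary way to manufacture orthonormal columns; the seed is an arbitrary choice):

```python
import numpy as np

# A 4x2 matrix with orthonormal columns, obtained from the reduced QR
# factorization of a random tall matrix.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))  # Q has shape (4, 2)

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: the columns are orthonormal
print(np.allclose(Q @ Q.T, np.eye(4)))  # False: Q is not square
```

Here $QQ^{\rm T}$ is the rank-$2$ projection onto the column space of $Q$, which cannot equal the $4\times 4$ identity.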

A simple example of a $2\times 2$ orthogonal matrix is the rotation matrix

$$Q = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},$$

or the reflection matrix

$$Q = \begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix}.$$

The rotation matrix simply rotates a vector by $\theta$ in the anti-clockwise direction, while the reflection matrix reflects a vector about the line through the origin of slope $\tan(\theta/2)$. Such reflections are a special case of Householder reflections.
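A small sketch with NumPy shows the rotation matrix in action (the angle and test vector are arbitrary choices):

```python
import numpy as np

theta = np.pi / 2  # rotate by 90 degrees anti-clockwise

R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x = np.array([1.0, 0.0])
print(np.allclose(R @ x, [0.0, 1.0]))                        # True: e1 rotates onto e2
print(np.isclose(np.linalg.norm(R @ x), np.linalg.norm(x)))  # True: length is preserved
```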

Start with a unit vector $u$, so $u^{\rm T}u = 1$. The Householder reflections are defined to be a family of symmetric orthogonal matrices as follows:

$$H = I - 2uu^{\rm T}.$$

It is easy to see that $H$ is orthogonal: since $H$ is symmetric, $H^{\rm T}H = H^2 = I - 4uu^{\rm T} + 4u(u^{\rm T}u)u^{\rm T} = I$, using $u^{\rm T}u = 1$.
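The same check can be run numerically. The sketch below (with an arbitrary random unit vector) builds $H$ and verifies that it is symmetric, orthogonal, and sends $u$ to $-u$:

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(5)
u /= np.linalg.norm(u)              # make u a unit vector

H = np.eye(5) - 2 * np.outer(u, u)  # Householder reflector H = I - 2 u u^T

print(np.allclose(H, H.T))               # True: H is symmetric
print(np.allclose(H.T @ H, np.eye(5)))   # True: H is orthogonal
print(np.allclose(H @ u, -u))            # True: H reflects u to -u
```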

Define the $2$-dimensional Hadamard matrix by

$$H_2 = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}.$$

Then define $H_{2^{n+1}} = \begin{pmatrix} H_{2^n} & H_{2^n} \\ H_{2^n} & -H_{2^n} \end{pmatrix}$ for all $n \geq 1$.

This gives a construction of these matrices for every power of 2. More generally, it is conjectured that Hadamard matrices exist for every order that is a multiple of 4. The columns of a Hadamard matrix $H_N$ are mutually orthogonal, with $H_N^{\rm T}H_N = N I$, so $H_N/\sqrt{N}$ is an orthogonal matrix.
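The recursion above can be sketched in a few lines of NumPy. The helper name `hadamard` is my own; the loop is the classical Sylvester construction, starting from $H_1 = (1)$, which reproduces the $H_2$ defined above after one step:

```python
import numpy as np

def hadamard(N):
    """Build the N x N Hadamard matrix; N must be a power of 2."""
    H = np.array([[1]])
    while H.shape[0] < N:
        # H_{2m} = [[H_m, H_m], [H_m, -H_m]]
        H = np.block([[H, H], [H, -H]])
    return H

H8 = hadamard(8)
# The columns are mutually orthogonal: H^T H = N * I,
# so H / sqrt(N) is an orthogonal matrix.
print(np.allclose(H8.T @ H8, 8 * np.eye(8)))  # True
```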

Orthogonal matrices are good because they preserve the magnitude of vectors: $\lVert Qx\rVert = \lVert x\rVert$ for every $x$. A transformation by an orthogonal matrix therefore cannot blow up intermediate quantities, which keeps computations numerically well behaved. Hence, these matrices are important in computational linear algebra. We also know that the eigenvectors of a symmetric matrix are a natural source of orthogonal matrices, and hence a lot of algorithms revolve around eigenvectors.