
Section 8.4 Orthogonality and Symmetry

Subsection 8.4.1 Preserving Lengths and Angles

Definition 8.4.1.

An \(n \times n\) matrix \(A\) is called orthogonal if it preserves lengths. That is, for all \(v \in \RR^n\text{,}\) \(|v| = |Av|\text{.}\)
Preservation of lengths is not the only way to define orthogonal matrices. The following proposition shows the rich structure of this group of matrices. Before stating the proposition, here is a useful definition.

Definition 8.4.2.

Let \(i,j\) be two indices which range from 1 to \(n\text{.}\) The Kronecker delta is a twice-indexed set of numbers \(\delta_{ij}\text{,}\) which evaluates to \(0\) whenever \(i \neq j\) and \(1\) whenever \(i=j\text{.}\)
\begin{equation*} \delta_{ij} = \begin{cases} 0 \amp i \neq j \\ 1 \amp i = j \end{cases} \end{equation*}
The Kronecker delta is a convenient short-hand to indicate that matching indices are important. The expressions \(\delta_{5,6}\text{,}\) \(\delta_{4,10}\) and \(\delta_{9,3}\) all evaluate to \(0\) since the indices do not match. The expressions \(\delta_{4,4}\text{,}\) \(\delta_{17,17}\) and \(\delta_{1,1}\) all evaluate to \(1\) since the indices do match.
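The definition translates directly into code. As a small illustration (not part of the text's development), here is a Python sketch of the Kronecker delta:

```python
def kronecker_delta(i, j):
    """Return 1 if the indices match, 0 otherwise."""
    return 1 if i == j else 0

# Mismatched indices evaluate to 0; matched indices evaluate to 1.
print(kronecker_delta(5, 6))    # 0
print(kronecker_delta(17, 17))  # 1
```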
There are several very surprising statements here. First, preserving angles and preserving lengths are nearly equivalent (preserving angles needs the additional determinant condition). This is a little odd; there isn’t a strong intuition that exactly the same transformations should preserve both properties. Second, property d) gives a convenient algebraic method of identifying orthogonal matrices: I can multiply the matrix by its transpose, and if I get the identity, I know the matrix is orthogonal. The equivalence of property d) and property f) is seen by simply carrying out the matrix multiplication: its entries are the dot products of the columns, which must evaluate to \(0\) or \(1\text{,}\) exactly matching the Kronecker delta.
Orthogonal matrices can be thought of as rigid-body transformations. Since they preserve both lengths and angles, they preserve the shape of anything formed of vertices and lines: any polygon, polyhedron, or higher-dimensional analogue. They may not preserve the position (they may be moved around, rotated, reflected, etc.), but they will preserve the shape. This makes orthogonal matrices extremely useful since many applications want to use transformations that don’t destroy polygons or polyhedra. Here is a proposition that gathers some other properties of orthogonal matrices.
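The transpose test from property d) is easy to carry out numerically. As a sketch (using NumPy, which the text does not assume), here is a check that a rotation matrix in \(\RR^2\) — a rigid-body transformation — satisfies \(A^T A = I\text{:}\)

```python
import numpy as np

# A rotation of the plane by pi/3 radians: a rigid-body transformation
# that preserves lengths and angles.
theta = np.pi / 3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Multiply by the transpose; an orthogonal matrix returns the identity.
print(np.allclose(A.T @ A, np.eye(2)))  # True
```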

Proof.

The proofs of these statements will be part of the activity for this week.

Example 8.4.5.

Even in \(\RR^3\text{,}\) orthogonality already gets a bit trickier than just the rotations and reflections of \(\RR^2\text{.}\) Consider the following matrix.
\begin{equation*} \begin{pmatrix} -1 \amp 0 \amp 0 \\ 0 \amp -1 \amp 0 \\ 0 \amp 0 \amp -1 \end{pmatrix} \end{equation*}
This matrix is orthogonal, but it isn’t a rotation or reflection in \(\RR^3\text{.}\) It is some kind of ‘reflection through the origin’ where every point is sent to the opposite point with respect to the origin. (The equivalent in \(\RR^2\) is a rotation by \(\pi\) radians). It isn’t even a physical transformation: it’s not something I can do with a physical object without destroying it. (If the object were, say, an inflatable beach-ball, this transformation would be equivalent to collapsing the ball, turning it inside-out, and reinflating it that way). However, the transformation still satisfies the condition of orthogonality.
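This can be confirmed numerically as well. The sketch below (again using NumPy, an assumption outside the text) verifies that \(-I\) in \(\RR^3\) passes the transpose test, while its determinant is \(-1\text{,}\) so it cannot be a rotation:

```python
import numpy as np

# 'Reflection through the origin' in R^3: every point goes to its
# opposite point with respect to the origin.
A = -np.eye(3)

# The transpose test confirms orthogonality ...
print(np.allclose(A.T @ A, np.eye(3)))  # True

# ... but the determinant is -1, so this is not a rotation.
print(np.linalg.det(A))
```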

Example 8.4.6.

Here is another example of an orthogonal matrix in \(\RR^3\text{.}\)
\begin{equation*} \begin{pmatrix} \dfrac{5}{\sqrt{38}} \amp \dfrac{-3}{\sqrt{67}} \amp \dfrac{-23}{\sqrt{2546}} \\[1em] \dfrac{-3}{\sqrt{38}} \amp \dfrac{-7}{\sqrt{67}} \amp \dfrac{-9}{\sqrt{2546}} \\[1em] \dfrac{-2}{\sqrt{38}} \amp \dfrac{3}{\sqrt{67}} \amp \dfrac{-44}{\sqrt{2546}} \end{pmatrix} \end{equation*}
To check the orthogonality, I would just take the dot products of the columns with each other. I should get \(0\) for each of these dot products to show that the columns are perpendicular to each other. Then I would check that the length of each column is \(1\text{.}\)
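These dot-product and length checks can be carried out numerically. Here is a sketch (using NumPy, which the text does not assume) applying them to the matrix above:

```python
import numpy as np
from math import sqrt

# The example matrix, entered column by column.
A = np.array([
    [ 5/sqrt(38), -3/sqrt(67), -23/sqrt(2546)],
    [-3/sqrt(38), -7/sqrt(67),  -9/sqrt(2546)],
    [-2/sqrt(38),  3/sqrt(67), -44/sqrt(2546)],
])

# Dot products of distinct columns should vanish (perpendicularity) ...
for i in range(3):
    for j in range(i + 1, 3):
        print(abs(A[:, i] @ A[:, j]) < 1e-12)  # True each time

# ... and each column should have length 1.
for i in range(3):
    print(abs(np.linalg.norm(A[:, i]) - 1) < 1e-12)  # True each time
```

Both checks together are exactly the statement that \(A^T A = I\text{,}\) matching the transpose test discussed earlier.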