
Section 8.2 Cofactor Expansion

Subsection 8.2.1 \(2 \times 2\) determinants

The determinant of a (square) matrix is constructed recursively, starting with \(2 \times 2\) matrices. Therefore, I first need to show how to calculate the determinant of a \(2 \times 2\) matrix.
\begin{equation*} \det \left( \begin{matrix} a & b \\ c & d \end{matrix} \right) = \left| \begin{matrix} a & b \\ c & d \end{matrix} \right| = ad-bc \end{equation*}

Example 8.2.1.

For the matrix \(\left( \begin{matrix}1 & -3 \\ 2 & -1 \end{matrix} \right)\text{,}\) the determinant is \((1)(-1) - (2)(-3) = -1 + 6 = 5\text{.}\) Therefore, I know that this matrix multiplies all areas by a factor of \(5\) and preserves orientation.
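The \(2 \times 2\) formula is easy to express in code. Here is a minimal sketch in Python (the function name `det2` and the nested-list representation of a matrix are my own choices, not from the text):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]: ad - bc."""
    (a, b), (c, d) = m
    return a * d - b * c

# The matrix from Example 8.2.1:
print(det2([[1, -3], [2, -1]]))  # (1)(-1) - (2)(-3) = 5
```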

Subsection 8.2.2 The Cofactor Algorithm

Definition 8.2.2.

The algorithm for calculating determinants is called cofactor expansion. It is a recursive algorithm that reduces a determinant calculation to determinants of smaller square matrices, eventually down to \(2 \times 2\) matrices, whose determinants have already been defined.
Cofactor expansion proceeds in this way: I choose any column or row of the matrix and take the coefficients from that column or row. For each coefficient, I multiply it by the determinant of the smaller matrix formed by removing both the row and the column containing that coefficient. Then I add up these products, with a pattern of \(\pm\) signs attached. That pattern of \(\pm\) signs is a checkerboard pattern, as in the following \(5 \times 5\) matrix.
\begin{equation*} \left( \begin{matrix} + & - & + & - & + \\ - & + & - & + & - \\ + & - & + & - & + \\ - & + & - & + & - \\ + & - & + & - & + \end{matrix} \right) \end{equation*}
That’s a hard algorithm to intuit from the formal description. I’ll do an example.

Example 8.2.3.

Here is a \(3 \times 3\) example where I choose the first row for cofactor expansion. For each element in the first row, I multiply that number by the determinant of the matrix with the first row and the matching column removed. I multiply the second term by \((-1)\) to match the checkerboard pattern of signs. After the first step, I can just calculate the three smaller determinants and simplify the arithmetic.
\begin{align*} & \begin{vmatrix} 5 & -2 & 0 \\ -3 & 3 & -2 \\ 1 & -5 & 3 \end{vmatrix} = +5 \begin{vmatrix} 3 & -2 \\ -5 & 3 \end{vmatrix} -(-2) \begin{vmatrix} -3 & -2 \\ 1 & 3 \end{vmatrix} +0 \begin{vmatrix} -3 & 3 \\ 1 & -5 \end{vmatrix} \\ & = 5((3)(3) - (-2)(-5)) + 2((-3)(3) - (-2)(1)) + (0)((-3)(-5)-(3)(1))\\ & = 5(-1) + 2(-7) + 0 = -19 \end{align*}
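The expansion steps above translate directly into a short recursive Python sketch (my own illustration, not from the text; it always expands along the first row, though any row or column works):

```python
def det(m):
    """Determinant by cofactor expansion along the first row.

    The matrix is a list of rows; recursion bottoms out at the
    2x2 case, whose determinant is ad - bc.
    """
    n = len(m)
    if n == 2:
        return m[0][0] * m[1][1] - m[0][1] * m[1][0]
    total = 0
    for j in range(n):
        # Minor: remove row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        # (-1)**j gives the checkerboard signs +, -, +, ... along the first row.
        total += (-1) ** j * m[0][j] * det(minor)
    return total

# The matrix from Example 8.2.3:
print(det([[5, -2, 0], [-3, 3, -2], [1, -5, 3]]))  # -19
```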
If I did co-factor expansion of an arbitrary \(3 \times 3\) matrix, I would get a direct expression for the determinant.
\begin{equation*} \left| \begin{matrix} a & b & c \\ d & e & f \\ g & h & i \end{matrix} \right| = aei - ahf - bdi + bfg + cdh - ceg \end{equation*}
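As a quick sanity check of this direct formula, here is a sketch that applies it to the matrix from Example 8.2.3 (the function name `det3` and the variable names \(a\) through \(i\), matching the matrix above, are my own):

```python
def det3(m):
    """Direct 3x3 determinant formula: aei - ahf - bdi + bfg + cdh - ceg."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a*e*i - a*h*f - b*d*i + b*f*g + c*d*h - c*e*g

# Agrees with the cofactor expansion in Example 8.2.3:
print(det3([[5, -2, 0], [-3, 3, -2], [1, -5, 3]]))  # -19
```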
I could use this formula for \(3 \times 3\) matrices if I wished. I could do the same for larger matrices, but it starts to get quite complicated: the computational complexity of the recursive determinant algorithm grows very quickly. For a \(3 \times 3\) matrix, I had \(3\) recursions to \(2 \times 2\) matrices, each with \(2\) multiplication terms in the determinant, giving \(6\) terms. Each term was the product of three coefficients, giving \(12\) total multiplications. For a \(4 \times 4\text{,}\) the recursion gives \(24\) terms, each with \(4\) coefficients, so \(24 \cdot 3 = 72\) multiplications. For a \(5 \times 5\) matrix, I need to recurse five times to a \(4 \times 4\) matrix, for \(120\) terms, each with \(5\) coefficients, which is \(120 \cdot 4 = 480\) multiplications. The pattern continues: there are \(n!\) terms in the determinant of an \(n \times n\) matrix, each a product of \(n\) coefficients, for \(n!(n-1)\) multiplications. This is computationally terrifying, making determinants of large matrices computationally very difficult.
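The growth of the \(n!(n-1)\) multiplication count is easy to see numerically. A short sketch (the function name is my own):

```python
from math import factorial

def cofactor_multiplications(n):
    """Multiplications in a full cofactor expansion of an n x n determinant:
    n! terms, each a product of n coefficients (n - 1 multiplications each)."""
    return factorial(n) * (n - 1)

for n in range(3, 8):
    print(n, cofactor_multiplications(n))
# 3 -> 12, 4 -> 72, 5 -> 480, 6 -> 3600, 7 -> 30240
```

Even by \(n = 10\) the count is over 32 million, which is why practical determinant computation uses other methods, such as row reduction.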