\(\newcommand{\augmatrix}[2]{\left(\begin{array}{@{}#1 |c@{}} #2 \end{array}\right)} \newcommand{\lt}{ < } \newcommand{\gt}{ > } \newcommand{\amp}{ & } \)

Section 6.3 The Spectral Theorem

Subsection The Assignment

  • Read Chapter 6, Section 4 of Strang.
  • Read the following and complete the exercises below.

Subsection Discussion: The Spectral Theorem for Symmetric Matrices

We are now in a position to discuss a major result about the structure of symmetric (square) matrices: The Spectral Theorem.
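In symbols, the conclusion of the theorem is that every real symmetric \(n \times n\) matrix \(A\) factors as \begin{equation*}A = Q \Lambda Q^{T},\end{equation*} where \(Q\) is an orthogonal matrix whose columns are eigenvectors of \(A\), and \(\Lambda\) is a diagonal matrix whose entries are the (real) eigenvalues of \(A\). Equivalently, \(\mathbb{R}^n\) has an orthonormal basis consisting of eigenvectors of \(A\).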

I think Strang's argument for the truth of this theorem is too terse, and hence confusing for a first-time reader. The main points of the argument are these:

  1. If \(X\) is a symmetric matrix, then all of its eigenvalues are real numbers.
  2. If \(X\) is a symmetric matrix and \(v\) and \(w\) are eigenvectors of \(X\) which correspond to different eigenvalues, then \(v\) and \(w\) are orthogonal vectors.
  3. If \(X\) is a symmetric matrix and \(\lambda\) is an eigenvalue of \(X\), then the subspace of \(\lambda\)-eigenvectors for \(X\) has dimension equal to the multiplicity of \(\lambda\) as a root of the characteristic polynomial of \(X\). (This point is often stated by saying that the geometric multiplicity of \(\lambda\) is equal to the algebraic multiplicity of \(\lambda\).)

Let's clear up that bit about the different types of multiplicity. We can identify eigenvalues by finding them as roots of the characteristic polynomial \(p_A(t) = \mathrm{det}(A-t\cdot I)\) of \(A\). Of course, a particular root can be a root multiple times. For example, \(5\) is a root of the polynomial \(t^2 - 10t + 25 = (t-5)^2\) twice. So we say \(5\) has multiplicity two. In the context of eigenvalues, this multiplicity is called the algebraic multiplicity of an eigenvalue, since it comes out of algebraic considerations.

Another way to measure how many times a number occurs as an eigenvalue is to count the eigenvectors corresponding to that number. But we only want to count truly independent directions, so we should use the dimension of the subspace of eigenvectors. This is the geometric multiplicity of an eigenvalue. It is a fact that the geometric multiplicity is never greater than the algebraic multiplicity. But the two can be different. For example, consider this matrix: \begin{equation*}G = \begin{pmatrix} 5 & 1 \\ 0 & 5 \end{pmatrix}.\end{equation*} Here the characteristic polynomial is \((5-t)^2\), so the eigenvalue \(5\) has algebraic multiplicity two; but \(G - 5I\) has a one-dimensional null space, so the geometric multiplicity of \(5\) is only one. (Notice that \(G\) is not symmetric.) The Sage sketch below checks this.
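Here is a quick check in Sage (a sketch using the matrix \(G\) above):

    G = matrix(QQ, [[5, 1], [0, 5]])
    # Algebraic multiplicity: 5 is a double root of the characteristic polynomial
    print(G.characteristic_polynomial().factor())    # (x - 5)^2
    # Geometric multiplicity: the 5-eigenspace is only 1-dimensional
    print((G - 5 * identity_matrix(2)).right_kernel().dimension())    # 1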

Now, how does one understand the Spectral Theorem? It basically guarantees that we can always find (a) enough eigenvalues (as real numbers), and (b) for each eigenvalue, enough eigenvectors. The hardest parts of the proof come from part (b), where you have to produce enough eigenvectors. But in practice, if you have an example of a symmetric matrix, you can find the decomposition \(A = Q \Lambda Q^{T}\) mentioned in the theorem pretty easily. First, find the eigenvalues. Then for each eigenvalue \(\lambda\), find an orthonormal basis for the eigenspace \begin{equation*}E_{\lambda} = \mathrm{null}(A-\lambda\cdot I).\end{equation*} That second bit can be done in two steps: first find a basis for \(E_{\lambda}\) (special solutions!), and then apply the Gram-Schmidt algorithm to turn it into an orthonormal basis for \(E_{\lambda}\). Collecting all of these bases together will make an orthonormal basis for \(\mathbb{R}^n\).
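In Sage terms, the eigenspace computation looks like this (a minimal sketch; the matrix A and the eigenvalue lam below are placeholders for your own example):

    A = matrix(QQ, [[2, 1], [1, 2]])    # a placeholder symmetric matrix
    lam = 3                             # one of its eigenvalues
    # E_lambda = null(A - lambda*I), found from the special solutions
    E = (A - lam * identity_matrix(A.nrows())).right_kernel()
    print(E.basis())                    # a basis for the eigenspace, here [(1, 1)]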

SubsectionSage and the Spectral Theorem

Sage does not have any built-in commands that deal with the spectral decomposition of a symmetric square matrix. But here are a few commands that you might find useful as you hack your solution together by hand:

The first command you might find useful is .change_ring(). This is helpful for those times when you define a matrix over some convenient ring like QQ, but then want to work with eigenvalues and eigenvectors, and so need a bigger ring in which you can take roots. Using this command doesn't change the matrix so much as tell Sage to think of it as having entries from a different set of numbers.
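For example (a minimal sketch; QQbar, Sage's field of algebraic numbers, is one convenient choice of bigger ring):

    A = matrix(QQ, [[2, 1], [1, 2]])
    # Same matrix, but Sage now regards the entries as algebraic numbers,
    # so eigenvalue computations are free to take roots
    B = A.change_ring(QQbar)
    print(B.eigenvalues())    # the eigenvalues 1 and 3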

The command .jordan_form(transformation=True) will return a pair: a diagonal matrix \(D\) with the eigenvalues as entries, and an invertible matrix \(P\) whose columns form a basis of eigenvectors (so that the original matrix equals \(P D P^{-1}\)), as in the example below. These eigenvectors will NOT generally form an orthonormal basis. You will have to use Gram-Schmidt to fix this into the proper basis promised by the theorem.

Note: the Jordan form is a generalization of diagonalization that works for every square matrix, even those which cannot be diagonalized. For a symmetric matrix the Jordan form is just the diagonal matrix of eigenvalues, so we'll use it here to shortcut some of the work.
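For contrast, here is a quick sketch showing that the non-symmetric matrix \(G\) from earlier does not have a diagonal Jordan form:

    G = matrix(QQ, [[5, 1], [0, 5]])
    J, P = G.jordan_form(transformation=True)
    print(J)    # a single 2x2 Jordan block [5 1; 0 5], not diagonal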

Let's do an example. First, we will make a symmetric matrix. Then we will find the eigenvalues and eigenvectors.
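A sketch of how that might look (the particular matrix \(X\) here is just an illustrative choice, picked to have three distinct integer eigenvalues):

    # An illustrative symmetric matrix
    X = matrix(QQ, [[0, 2, 0],
                    [2, 0, 0],
                    [0, 0, 1]])
    print(X.is_symmetric())    # True
    print(X.eigenvalues())     # 1, 2, and -2, in some order

    # D is the diagonal matrix of eigenvalues; the columns of P are
    # eigenvectors, with X = P * D * P^(-1)
    D, P = X.jordan_form(transformation=True)
    print(D)
    print(P)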

In this case, we have three different \(1\)-dimensional eigenspaces, so things are not too hard! Eigenvectors for different eigenvalues of a symmetric matrix are automatically orthogonal (point 2 above), so if we apply Gram-Schmidt, we will just normalize those vectors.
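Continuing the sketch, we normalize each column of \(P\) and then put the factors back together:

    # Gram-Schmidt on 1-dimensional eigenspaces is just normalization
    Q = matrix([v / v.norm() for v in P.columns()]).transpose()
    print(Q.transpose() * Q)    # the identity matrix, so Q is orthogonal

    # Reassemble the spectral decomposition: X = Q * D * Q^T
    print((Q * D * Q.transpose()).simplify_full())    # prints the original X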

That is about as close as we can get to displaying the original \(X\).

Subsection Exercises

Task 155
From Strang section 6.4, do exercise 3.
Task 156
From Strang section 6.4, do exercise 4.
Task 157
From Strang section 6.4, do exercise 5.
Task 158
From Strang section 6.4, do exercise 6.
Task 159
From Strang section 6.4, do exercise 8.
Task 160
From Strang section 6.4, do exercise 11.
Task 161
From Strang section 6.4, do exercise 12.
Task 162
From Strang section 6.4, do exercise 24.