
Section 6.2 Diagonalizing Matrices

Subsection The assignment

  • Read section 6.2 of Strang (pages 298-307).
  • Read the following.
  • Prepare the items below for presentation.

Subsection Diagonalizing Matrices

The big result here is this:

Theorem. An \(n \times n\) matrix \(A\) can be factored as \(A = S \Lambda S^{-1}\), where \(\Lambda\) is a diagonal matrix and \(S\) is invertible, if and only if \(A\) has \(n\) linearly independent eigenvectors.

The connection between the two conditions is that the matrix \(S\) has as its columns the eigenvectors of \(A\). (In fact, that is really the heart of the proof of this theorem. The rest is just details.)

If a matrix satisfies these two conditions, then we say it is diagonalizable. We should note right away that not all matrices are diagonalizable. We have already seen examples of matrices where the geometric multiplicity of an eigenvalue is less than the algebraic multiplicity, like \(A = \left( \begin{smallmatrix} 5 & 1 \\ 0 & 5 \end{smallmatrix}\right)\). In this case, it becomes impossible to find a basis consisting of eigenvectors.
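To see why, note that \(\lambda = 5\) is the only eigenvalue of this \(A\), with algebraic multiplicity \(2\), but \begin{equation*} A - 5I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \end{equation*} has a one-dimensional nullspace, spanned by \(e_1 = \left( \begin{smallmatrix} 1 \\ 0 \end{smallmatrix}\right)\). So every eigenvector of \(A\) is a multiple of \(e_1\), and no basis of \(\mathbb{R}^2\) can consist of eigenvectors of \(A\).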

In a way, this allows us to see something interesting: maybe a matrix really wants to be a diagonal matrix, but we are looking at the transformation \(A\) using “the wrong basis.” By wrong, I mean that the standard basis is not the most convenient one, and another basis (one made of eigenvectors) makes our lives easier.

Subsection Sage and Diagonalization

Sage has built-in commands for diagonalization. We shall try a few of them out here. We need a matrix to play with, so we take this one:
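The entries below are a stand-in choice; any square matrix defined over AA works the same way. Here we use a symmetric \(3 \times 3\) matrix whose eigenvalues turn out to be irrational.

    # a sample symmetric matrix over the real algebraic numbers
    A = matrix(AA, [[2, 1, 0],
                    [1, 2, 1],
                    [0, 1, 2]])
    A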

We chose to define this matrix over AA because we need to find roots of polynomials when looking for eigenvalues. AA is Sage's field of real algebraic numbers, that is, the collection of all real roots of polynomials with integer coefficients.
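As a quick sanity check on what lives in AA: \(\sqrt{2}\) is a real algebraic number, and Sage can recover the polynomial it is a root of.

    r = AA(2).sqrt()   # sqrt(2) as an exact element of AA
    r.minpoly()        # its minimal polynomial, x^2 - 2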

Sage has a command for finding the eigenvector decomposition \(A = S\Lambda S^{-1}\).
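The command is eigenmatrix_right. It returns the diagonal matrix of eigenvalues together with a matrix whose columns are the corresponding eigenvectors.

    A.eigenmatrix_right()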

As you see, Sage returns a pair of matrices. One of them is diagonal, so that is probably \(\Lambda\). We'll use tuple unpacking to assign the matrices to sensible names.
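The names Lam and S below are our choices, matching the notation \(\Lambda\) and \(S\) above.

    Lam, S = A.eigenmatrix_right()   # Lam is the diagonal matrix, S holds the eigenvectors
    print(Lam)
    print(S)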

Note that \(S\) has the eigenvectors of \(A\) as its columns, and the corresponding eigenvalues are lined up as the diagonal entries of \(\Lambda\).
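We can check this claim one column at a time: each column of S should be an eigenvector of A for the matching diagonal entry of Lam.

    v = S.column(0)           # the first column of S
    A * v == Lam[0, 0] * v    # True: A*v equals lambda*v for the first eigenvalue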

Anyway, now we can check that everything lines up correctly:
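Both of these comparisons should return True; the second form checks the same identity without computing an inverse.

    print(S * Lam * S.inverse() == A)   # the decomposition A = S Lam S^(-1)
    print(A * S == S * Lam)             # equivalently, A S = S Lam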

Subsection Questions for Section 6.2

Task 149

Let \(e_1, e_2, e_3\) be the standard basis of \(\mathbb{R}^3\): \begin{equation*} e_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad e_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad e_3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}. \end{equation*} Construct an example of an invertible \(3 \times 3\) matrix \(S\), written as a matrix of column vectors: \begin{equation*} S = \begin{pmatrix} | & | & | \\ v_1 & v_2 & v_3 \\ | & | & | \end{pmatrix} \end{equation*} How do you know that the set \(\{ Se_1, Se_2, Se_3 \}\) is a basis for \(\mathbb{R}^3\)?

What is the connection between \(Se_1\), \(Se_2\), \(Se_3\), \(S^{-1}v_1\), \(S^{-1}v_2\), \(S^{-1}v_3\) and the original vectors \(e_1, e_2, e_3, v_1, v_2, v_3\)?

Finally, how do we use this to understand the way that the decomposition \(A = S\Lambda S^{-1}\) works?

Task 150
Exercise 1 from section 6.2 of Strang.
Task 151
Exercise 2 from section 6.2 of Strang.
Task 152
Exercise 3 from section 6.2 of Strang.
Task 153
Exercise 13 from section 6.2 of Strang.
Task 154
Exercise 19 from section 6.2 of Strang.