# The Assignment

• Read chapter 6 section 7 of Strang
• Read the following and complete the exercises below.

# Discussion: The Singular Value Decomposition

The Singular Value Decomposition is an adaptation of the ideas behind eigenvectors and eigenvalues for non-square matrices. If our matrix $A$ is $m \times n$, the idea is to choose

• An orthonormal basis $\{ v_1, \ldots, v_n\}$ for $\mathbb{R}^n$,
• and an orthonormal basis $\{ u_1, \ldots, u_m \}$ for $\mathbb{R}^m$

so that \begin{equation*}Av_i = \sigma_i u_i, \qquad \text{for $1 \leq i \leq r$}\end{equation*} where $r = \mathrm{rank}(A)$. If we stack up all of those equations, we get a statement like this one:

\begin{equation*} A \begin{pmatrix} | & | & \dots & | \\ v_1 & v_2 & \dots & v_r \\ | & | & \dots & | \end{pmatrix} = \begin{pmatrix} | & | & \dots & | \\ u_1 & u_2 & \dots & u_r \\ | & | & \dots & | \end{pmatrix} \begin{pmatrix} \sigma_1 & 0 & \dots & 0 \\ 0 & \sigma_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & \sigma_r \end{pmatrix} \end{equation*}

Eventually, after padding $U$ and $V$ out to full orthogonal matrices and using $V^T = V^{-1}$, this will lead us to a matrix decomposition of the form

\begin{equation*} A = U \Sigma V^T .\end{equation*}

## How to do it

So, suppose that $A$ is an $m \times n$ matrix. The key fact we need is that $A^TA$ is a symmetric $n \times n$ matrix. Later, when discussing properties, it is important that $AA^T$ is a symmetric $m \times m$ matrix, and also that $A(A^TA) = (AA^T)A$.
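These facts are easy to check numerically. Here is a quick sanity check in plain Python (the $3 \times 2$ matrix is made up for illustration): $A^TA$ is a symmetric $n \times n$ matrix, $AA^T$ is a symmetric $m \times m$ matrix, and $A(A^TA) = (AA^T)A$ by associativity.

```python
# A sample 3 x 2 matrix, so m = 3 and n = 2.
A = [[1, 2], [3, 4], [5, 6]]

def transpose(X):
    return [list(row) for row in zip(*X)]

def matmul(X, Y):
    # Standard matrix product: X is (p x q), Y is (q x s).
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

AtA = matmul(transpose(A), A)   # 2 x 2 and symmetric
AAt = matmul(A, transpose(A))   # 3 x 3 and symmetric
```

Since the entries are integers, the symmetry checks `AtA == transpose(AtA)` and the associativity check `matmul(A, AtA) == matmul(AAt, A)` hold exactly.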

1. Step One: Compute a spectral decomposition of $A^TA$. Since $A^TA$ is a square symmetric matrix, we can find an orthonormal basis $v_1, v_2, \ldots, v_n$ for $\mathbb{R}^n$ which consists of eigenvectors for $A^TA$. If we bundle these together as the columns of a matrix $V$, we can make the spectral decomposition \begin{equation*} A^TA = V D V^T, \end{equation*} where $D$ is a diagonal matrix having the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ as its diagonal entries. NOTE: We will organize things so that the eigenvalues appear in decreasing order: $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$.

Now, some of the eigenvalues might be zero, but none will be negative. We will have exactly $r = \mathrm{rank}(A)$ non-zero eigenvalues, and the other $n-r$ will be equal to zero. (Those come from the directions in the null space!) Don't sweat it. Everything is going to be fine.

2. Step Two: For each $1 \leq i \leq r$, set \begin{equation*}\sigma_i = \sqrt{\lambda_i} .\end{equation*} These are the “Singular Values” named in the singular value decomposition.

3. Step Three: For each $1 \leq i \leq r$, set \begin{equation*} u_i = \dfrac{1}{\sigma_i} Av_i. \end{equation*} By the way things are set up, we are sure $\sigma_i$ is not zero! The vectors we just made are an orthonormal basis for the column space of $A$.

4. Step Four: If $m > r$, choose an orthonormal basis $u_{r+1}, \ldots, u_m$ for the null space of $A^T$.

5. Step Five: Bundle together the $v_i$'s as columns of an $n \times n$ matrix $V$, and the $u_i$'s as the columns of an $m \times m$ matrix $U$. Both of these are orthogonal matrices. Then place the $\sigma_i$'s on the main diagonal of a rectangular $m \times n$ matrix which is otherwise filled with zeros. Call that new matrix $\Sigma$.
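The steps above can be walked through by hand for a small matrix. Here is a sketch in plain Python for the $2 \times 2$ matrix $A = \begin{pmatrix} 4 & 4 \\ -3 & 3\end{pmatrix}$ (a made-up example, not one from Strang); for this $A$ the spectral decomposition of $A^TA$ is known in closed form, so no eigenvalue solver is needed.

```python
from math import sqrt

A = [[4, 4], [-3, 3]]

def transpose(X):
    return [list(row) for row in zip(*X)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# Step One: A^T A = [[25, 7], [7, 25]].  Its eigenpairs can be written
# down by hand: lambda_1 = 32 with v_1 = (1, 1)/sqrt(2), and
# lambda_2 = 18 with v_2 = (1, -1)/sqrt(2), listed in decreasing order.
AtA = matmul(transpose(A), A)
lams = [32.0, 18.0]
V = [[1 / sqrt(2), 1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]          # columns are v_1, v_2

# Step Two: the singular values sigma_i = sqrt(lambda_i).
sigmas = [sqrt(lam) for lam in lams]

# Step Three: u_i = (1/sigma_i) A v_i for i = 1, ..., r (here r = 2,
# so Step Four is not needed).  Bundle the u_i's as columns of U.
U = transpose([[sum(A[row][k] * V[k][i] for k in range(2)) / sigmas[i]
                for row in range(2)] for i in range(2)])

# Step Five: put the sigma_i's on the diagonal of Sigma, and check
# that A = U Sigma V^T up to rounding.
Sigma = [[sigmas[0], 0], [0, sigmas[1]]]
reconstructed = matmul(matmul(U, Sigma), transpose(V))
```

Here $U$ works out to $\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}$ and the product $U \Sigma V^T$ reproduces $A$ up to floating-point rounding.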

Strang has a good run-down of the neat properties this setup has.

# Sage and the SVD

Sage has a built-in command for the singular value decomposition of a matrix. If you have a matrix A, the command is A.SVD(). But there is a little trick to using it! At present, Sage only has this function implemented for matrices defined over rings of “floating point numbers”. The best way around this is to either define your matrix with entries in the ring RDF, or to use the .change_ring(RDF) method on your matrix before you use the SVD.
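For instance, a Sage session might look like this (the matrix is a made-up example; the point is the RDF base ring and the capitalized .SVD() name). This runs in Sage, not in plain Python:

```python
# Sage session: define the matrix over RDF so .SVD() is available.
A = matrix(RDF, [[4, 4], [-3, 3]])
U, S, V = A.SVD()          # the triple satisfying A = U*S*V^T
U * S * V.transpose()      # should reproduce A up to rounding
```

If your matrix was defined over the rationals or the integers, convert it first with something like A.change_ring(RDF).SVD().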

# Exercises

• Complete exercise 6.7.1 from Strang.
• Complete exercise 6.7.2 from Strang.
• Complete exercise 6.7.3 from Strang.