\(\newcommand{\augmatrix}[2]{\left(\begin{array}{@{}#1 |c@{}} #2 \end{array}\right)} \newcommand{\lt}{ < } \newcommand{\gt}{ > } \newcommand{\amp}{ & } \)

Section 2.7 The \(LU\) Decomposition

Subsection The Assignment

  • Read section 2.6 of Strang.
  • Watch a video from the YouTube series Essence of Linear Algebra by user 3Blue1Brown.
  • Read the following and complete the exercises below.

Subsection Learning Goals

Before class, a student should be able to:

  • Use Gaussian Elimination to find the \(LU\) and \(LDU\) decompositions of a matrix.
  • Describe when the process of Gaussian Elimination will fail to produce an \(LU\) decomposition.

Sometime after class, a student should be able to:

  • Solve a system of equations by using the \(LU\) decomposition and two triangular systems.
  • Explain the connection between matrix elimination and the \(LU\) or \(LDU\) factorization of a matrix.

Subsection Discussion: The \(LU\) Decomposition of a Matrix

We now look at the ideas behind elimination from a more advanced perspective. If we think about the matrix multiplication form of the forward pass, we can recast it as a matrix decomposition theorem:

There are three key observations that make this work:

  • Each of the matrices \(E_{ij}\) that effects a row operation of the form “add a multiple of row \(i\) to row \(j\)” is an invertible matrix, with an easy-to-find inverse.
  • If we make a sequence of row operations in the forward pass using matrices \(E_k\), then we are essentially computing a big product \begin{equation*} E_k \dots E_1 A = U \end{equation*} where each of the \(E_i\)'s is a lower triangular matrix and the matrix \(U\) is upper triangular. This can then be rewritten as \begin{equation*} A = \left(E_1^{-1} \dots E_k^{-1} \right) U . \end{equation*} Note that the inverses have to be applied in reverse order for things to cancel out properly.
  • Finally, the product \(L = E_1^{-1} \dots E_k^{-1}\) is really easy to compute, because its entries are simply the negatives of the multipliers we used to do the operations in the forward pass.
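The three observations above can be checked concretely. Here is a plain-Python sketch using the standard fractions module and a made-up \(3\times 3\) matrix (in this course you would normally let SageMath do the matrix work, so the helper names `identity`, `matmul`, and `elim` are inventions for this illustration):

```python
from fractions import Fraction

def identity(n):
    return [[Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def elim(n, src, dst, mult):
    """Elementary matrix that adds mult * (row src) to (row dst), 0-indexed."""
    E = identity(n)
    E[dst][src] = Fraction(mult)
    return E

# A made-up example matrix for the forward pass.
A = [[Fraction(v) for v in row] for row in ([2, 1, 1], [4, 3, 3], [8, 7, 9])]

# Forward pass: the multipliers are 2, 4, and 3.
E1 = elim(3, 0, 1, -2)   # subtract 2 * row 1 from row 2
E2 = elim(3, 0, 2, -4)   # subtract 4 * row 1 from row 3
E3 = elim(3, 1, 2, -3)   # subtract 3 * row 2 from row 3
U = matmul(E3, matmul(E2, matmul(E1, A)))      # E3 E2 E1 A = U

# L = E1^{-1} E2^{-1} E3^{-1}: just flip the signs of the multipliers.
L = matmul(elim(3, 0, 1, 2), matmul(elim(3, 0, 2, 4), elim(3, 1, 2, 3)))

assert matmul(L, U) == A                        # A = LU
assert L == [[1, 0, 0], [2, 1, 0], [4, 3, 1]]   # entries are exactly the multipliers
```

Notice that no extra arithmetic was needed to build \(L\): its below-diagonal entries are read off from the forward pass.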

Subsubsection A Nice Computational Result

One important output of this comes into play when we want to compute solutions to equations like \(Ax = b\). Since we can write \(A = LU\), then our equation can be split into two (big) steps:

  1. First find the solution to the equation \(Ly = b\).
  2. Then find the solution to the equation \(Ux = y\).

First, note that this is a good thing because both of the systems \(Ly = b\) and \(Ux = y\) are triangular, so each can be solved by substitution: in \(Ly = b\) you work from the top down (forward substitution), and in \(Ux = y\) you work from the bottom up (back substitution).

Second, this works because following this process gives us a vector \(x\) which will satisfy this: \begin{equation*} Ax = (LU)x = L (Ux) = Ly = b. \end{equation*}

Third, this doesn't really save time when you only want to solve one equation \(Ax = b\). But if you have lots of different right-hand sides \(b_i\), and you want to solve all of the equations \(Ax = b_i\), it becomes a lot faster to factor the matrix \(A = LU\) once and then do two substitution solves for each value of \(b_i\).
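The two-step solve can be sketched in plain Python. This is an illustration only: the helper names `forward_sub` and `back_sub` are made up, and the factors \(L\), \(U\), and the vector \(b\) below come from a made-up factorization, not from an example in the text.

```python
from fractions import Fraction

def forward_sub(L, b):
    """Solve Ly = b for lower-triangular L, working top down."""
    y = []
    for i in range(len(L)):
        y.append((b[i] - sum(L[i][j] * y[j] for j in range(i))) / L[i][i])
    return y

def back_sub(U, y):
    """Solve Ux = y for upper-triangular U, working bottom up."""
    n = len(U)
    x = [Fraction(0)] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

# A = LU for the made-up A = [[2,1,1],[4,3,3],[8,7,9]].
L = [[Fraction(v) for v in row] for row in ([1, 0, 0], [2, 1, 0], [4, 3, 1])]
U = [[Fraction(v) for v in row] for row in ([2, 1, 1], [0, 1, 1], [0, 0, 2])]
b = [Fraction(v) for v in (4, 10, 24)]

y = forward_sub(L, b)   # step 1: Ly = b
x = back_sub(U, y)      # step 2: Ux = y
assert x == [1, 1, 1]   # so Ax = L(Ux) = Ly = b
```

Once \(L\) and \(U\) are in hand, each new right-hand side costs only these two cheap loops.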

Subsection SageMath and The \(LU\) Decomposition

A neat feature of linear algebra is that simple facts about solving equations have several different incarnations. This section contains the first big example: Gaussian Elimination leads to a multiplicative decomposition (a factorization) for matrices.

Each step of Gaussian elimination is a simple row operation, and if we do the process in the standard order, then the \(LU\) decomposition can be read out directly, without any extra computation.

First, let us recall how SageMath can help us check bits of the three key observations above.

Consider a matrix which performs an elementary row operation of the form “add a multiple of one row to another”. The matrix \(E\) below performs the operation “add \(-4\) times row 2 to row 3”.

Note that the inverse just came from changing the sign of that one entry. This makes sense for the following reason: the opposite operation to “add \(-4\) times row 2 to row 3” should be “Add \(4\) times row 2 to row 3”. That is the way you undo the operation!
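Here is a quick plain-Python check of that claim (a sketch with the matrices written out by hand, rather than produced by SageMath):

```python
# E adds -4 times row 2 to row 3 (rows numbered from 1, as in the text).
E    = [[1, 0, 0], [0, 1, 0], [0, -4, 1]]
Einv = [[1, 0, 0], [0, 1, 0], [0,  4, 1]]   # flip the sign of that one entry

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert matmul(E, Einv) == I and matmul(Einv, E) == I
```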

Subsubsection Study Break: Try it yourself

Make your own \(3\times 3\) matrix and check the whole procedure.

Subsubsection SageMath Commands to short-cut the process

Here is the basic command for getting SageMath to compute the \(LU\) decomposition directly.

Hold on, the output is three matrices. Not two, but three. One is upper triangular, one is lower triangular, but the first one is a permutation matrix. (It switches rows 1 and 3.) What is going on? If you search the SageMath documentation, you will find the page describing this command. The first bit of the description is something about a “pivoting strategy” and row swaps. But we don't want row swaps.

By reading carefully, we can see the way through. We can specify our pivoting strategy by adding the keyword argument pivot="nonzero" inside the parentheses. Then the algorithm used will match the one Strang describes.

(If you are using SMC, you can access the help in many other ways. But a Google search for SageMath "topic" will hit the documentation pretty reliably.)

Aaah! There we go: now the permutation part is the identity. Note that the command returns a “tuple”. This is a collection of things, kind of like a list. (Technical Python details omitted here.) To grab the information out, we assign the parts of that output to different names so we can use them.
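In plain Python terms, the unpacking step looks like this (a generic sketch with a made-up stand-in function, not the actual Sage session):

```python
def lu_like():
    """A stand-in that returns three values, the way A.LU() does in SageMath."""
    return "P", "L", "U"

triple = lu_like()     # one object: a tuple holding three things
P, L, U = lu_like()    # unpack the three parts into separate names
assert triple == ("P", "L", "U")
assert (P, L, U) == triple
```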

Those parts should be factors of \(A\). We can check:

And we can have SageMath check if they are really equal.

Subsubsection What about the \(LDU\) decomposition?

For now, SageMath has no built-in \(LDU\) decomposition.
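But it is easy to peel \(D\) off of \(U\) by hand: put the pivots in a diagonal matrix \(D\), and divide each row of \(U\) by its pivot. A plain-Python sketch (the helper name `ldu_split` is made up, and the \(U\) below is from a made-up factorization):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def ldu_split(U):
    """Split an upper-triangular U with nonzero pivots as D * U1,
    where D is diagonal (the pivots) and U1 is unit upper triangular."""
    n = len(U)
    D = [[U[i][i] if i == j else Fraction(0) for j in range(n)]
         for i in range(n)]
    U1 = [[U[i][j] / U[i][i] for j in range(n)] for i in range(n)]
    return D, U1

# U from a made-up LU factorization.
U = [[Fraction(v) for v in row] for row in ([2, 1, 1], [0, 1, 1], [0, 0, 2])]
D, U1 = ldu_split(U)
assert matmul(D, U1) == U
assert all(U1[i][i] == 1 for i in range(3))   # U1 has ones on the diagonal
```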

Subsubsection An insurmountable obstacle

Some matrices require permutations of rows. In these cases, some pivoting strategy must be employed. Consider this example.

This has a row-swap permutation matrix, and it must. Since the \((1,1)\) entry of \(B\) is zero, but entries below it are not, we cannot use zero as a pivot. We'll sort out how to handle this in the next section.
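The bookkeeping behind that permutation part can be sketched in plain Python. This is an illustration with a made-up matrix whose \((1,1)\) entry is zero (SageMath's own routine is more careful, and `lu_with_swaps` is an invented helper name):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def lu_with_swaps(A):
    """Return (P, L, U) with A = P L U, swapping rows when a pivot is zero."""
    n = len(A)
    U = [[Fraction(v) for v in row] for row in A]
    L = [[Fraction(1 if i == j else 0) for j in range(n)] for i in range(n)]
    perm = list(range(n))                 # perm[i] = original row now in slot i
    for k in range(n):
        if U[k][k] == 0:                  # zero pivot: swap in a usable row
            s = next(r for r in range(k + 1, n) if U[r][k] != 0)
            U[k], U[s] = U[s], U[k]
            perm[k], perm[s] = perm[s], perm[k]
            for j in range(k):            # carry the finished part of L along
                L[k][j], L[s][j] = L[s][j], L[k][j]
        for r in range(k + 1, n):
            m = U[r][k] / U[k][k]
            L[r][k] = m
            U[r] = [U[r][j] - m * U[k][j] for j in range(n)]
    # P undoes the swaps: it has a 1 in row perm[i], column i.
    P = [[Fraction(1 if perm[i] == row else 0) for i in range(n)]
         for row in range(n)]
    return P, L, U

B = [[0, 2, 1], [1, 1, 1], [2, 4, 5]]     # the (1,1) entry is zero
P, L, U = lu_with_swaps(B)
assert matmul(P, matmul(L, U)) == B       # B = P L U, with P not the identity
```

Here the algorithm is forced to swap the first two rows before it can find a pivot, and \(P\) records exactly that swap.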


Keep this in mind. The computations are simple, but tedious. Perhaps you want to use an appropriate tool.


Consider the following system of 3 linear equations in 3 unknowns. \begin{equation*}\left\{ \begin{array}{rrrrrrr} x & + & y & + & z & = & 5 \\ x & + & 2y & + & 3z & = & 7 \\ x & + & 3y & + & 6z & = & 11 \end{array}\right. \end{equation*} Perform the forward pass of elimination to find an equivalent upper triangular system. Write down this upper triangular system. What three row operations do you need to perform to make this work?

Use the information you just found to write a matrix decomposition \(A = LU\) for the coefficient matrix \(A\) for this system of equations. (Be sure to multiply the matrices \(L\) and \(U\) to check your work.)


Solve the two systems \(Ly = b\) and \(Ux=y\) obtained in the last exercise.

Solve the system \(Ax=b\) directly using Gauss-Jordan elimination (hint: use SageMath) and make sure that the results are the same.

Consider the matrix \(A\) below. Find the matrix \(E\) which transforms \(A\) into an upper triangular matrix \(EA = U\). Find \(L = E^{-1}\). Use this to write down the \(LU\) decomposition \(A= LU\) of \(A\). \begin{equation*} A = \begin{pmatrix} 2 & 1 & 0 \\ 0 & 4 & 2 \\ 6 & 3 & 5 \end{pmatrix} \end{equation*}
The matrix below is symmetric, because if you flip it across its main diagonal you get the same thing. Find the \(LDU\) triple decomposition of this symmetric matrix. \begin{equation*} B = \begin{pmatrix} 2 & 4 \\ 4 & 11 \end{pmatrix} \end{equation*}
The matrix below is symmetric, because if you flip it across its main diagonal you get the same thing. Find the \(LDU\) triple decomposition of this symmetric matrix. \begin{equation*} C = \begin{pmatrix} 1 & 4 & 0 \\ 4 & 12 & 4 \\ 0 & 4 & 0 \end{pmatrix} \end{equation*}
The matrix below is symmetric, because if you flip it across its main diagonal you get the same thing. Find the \(LU\) decomposition of this symmetric matrix. \begin{equation*} D = \begin{pmatrix} a & a & a & a \\ a & b & b & b \\ a & b & c & c \\ a & b & c & d \end{pmatrix} \end{equation*} What conditions on the variables \(a\), \(b\), \(c\), and \(d\) will guarantee that this matrix has four pivots?
Find an example of a \(3\times 3\) matrix \(A\) which has all of its entries non-zero, so that the \(LU\) decomposition has \(U = I\), where \(I\) is the identity matrix, or explain why no such example exists.