The Palindrome

Epsilons, no. 5: The QR decomposition

matrices + the Gram-Schmidt process = magic

Tivadar Danka
May 9, 2023

I start all my posts about matrix decompositions by saying that “matrix factorizations are the pinnacle results of linear algebra“. These few words perfectly express their place in mathematics, science, and engineering: it is hard to overestimate the practical importance of matrix decompositions.


Behind (almost) every high-performance matrix operation, there is a matrix decomposition. This time, we’ll look at the one that enables us to iteratively find the eigenvalues of a matrix: the QR decomposition.

The QR decomposition

In essence, the QR decomposition factors an arbitrary matrix into the product of an orthogonal and an upper triangular matrix.
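In symbols (writing the 3 × 3 case, and assuming A has linearly independent columns so the construction below goes through):

```latex
A = QR, \qquad Q^\top Q = I, \qquad
R = \begin{pmatrix}
r_{11} & r_{12} & r_{13} \\
0 & r_{22} & r_{23} \\
0 & 0 & r_{33}
\end{pmatrix}
```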

How does it work? We’ll illustrate with the 3 x 3 case, but the construction works the same way in general.

First, some notation. Every matrix can be thought of as a sequence of column vectors. Trust me: this simple observation is the foundation of many Eureka moments in mathematics.

A matrix as column vectors

Why is this useful? Because this way, we can look at matrix multiplication as a linear combination of the columns.

Check out how matrix-vector multiplication looks from this angle. (You can easily work this out by hand if you don’t believe me.)

Matrix-vector product

In other words, a matrix times a vector equals a linear combination of the column vectors.
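A quick numerical sanity check of this column view; the matrix and vector below are arbitrary numbers picked for illustration:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
x = np.array([2.0, -1.0, 3.0])

# Column view: A @ x is the linear combination x₁a₁ + x₂a₂ + x₃a₃
# of the columns a₁, a₂, a₃ of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1] + x[2] * A[:, 2]

print(np.allclose(A @ x, combo))  # True
```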

Similarly, the product of two matrices can be written in terms of linear combinations.

Matrix-matrix product
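The same idea, one level up: each column of AB is A times the corresponding column of B, i.e. a linear combination of A’s columns (again with made-up numbers):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# Column j of AB equals A @ (column j of B):
# a linear combination of A's columns with coefficients from B's j-th column.
for j in range(B.shape[1]):
    col = B[0, j] * A[:, 0] + B[1, j] * A[:, 1]
    assert np.allclose((A @ B)[:, j], col)
```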

So, what’s the magic behind the QR decomposition? Simple: the vectorized version of the Gram-Schmidt process.

In a nutshell, the Gram-Schmidt process takes a linearly independent set of vectors and returns an orthonormal set that progressively generates the same subspaces.

The Gram-Schmidt process
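A minimal sketch of the (classical) Gram-Schmidt process in NumPy. The function name and the test matrix are my own; note that in floating point the modified variant is numerically safer, but the classical form matches the description above:

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)."""
    n = A.shape[1]
    Q = np.zeros_like(A, dtype=float)
    for i in range(n):
        # Subtract the projections onto the already-built q₁, …, qᵢ₋₁ …
        v = A[:, i].copy()
        for j in range(i):
            v -= (Q[:, j] @ A[:, i]) * Q[:, j]
        # … and normalize what is left.
        Q[:, i] = v / np.linalg.norm(v)
    return Q

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q = gram_schmidt(A)
print(np.allclose(Q.T @ Q, np.eye(3)))  # columns are orthonormal: True
```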

By construction, each output vector qᵢ is a linear combination of the inputs a₁, …, aᵢ; conversely, each input aᵢ is a linear combination of the outputs q₁, …, qᵢ, and it is this second direction that yields the factorization.

The output of the Gram-Schmidt process
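In the 3 × 3 case, expanding each input vector aᵢ against the orthonormal outputs gives triangular coefficients: only the terms with j ≤ i survive.

```latex
a_1 = r_{11} q_1, \qquad
a_2 = r_{12} q_1 + r_{22} q_2, \qquad
a_3 = r_{13} q_1 + r_{23} q_2 + r_{33} q_3,
\qquad r_{ji} = \langle q_j, a_i \rangle
```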

In other words, using the column vector form of matrix multiplication, we obtain that A in fact factors into the product of two matrices.

The QR decomposition

As you can see, one term is formed from the Gram-Schmidt process’ output vectors (qᵢ), while the other one is upper triangular.

However, the matrix of qᵢ-s is also special: as its columns are orthonormal, its inverse is its transpose. Such matrices are called orthogonal.

Orthogonal matrices
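The inverse-equals-transpose property is easy to check numerically; the rotation matrix below is just one convenient example of an orthogonal matrix:

```python
import numpy as np

theta = 0.7  # an arbitrary angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a 2D rotation is orthogonal

# For orthogonal matrices, Qᵀ plays the role of Q⁻¹.
print(np.allclose(Q.T @ Q, np.eye(2)))     # True
print(np.allclose(Q.T, np.linalg.inv(Q)))  # True
```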

Thus, any matrix can be written as the product of an orthogonal and an upper triangular one, which is the famous QR decomposition.
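In practice you rarely hand-roll this: NumPy’s np.linalg.qr computes the factorization directly (internally via Householder reflections rather than Gram-Schmidt, but the output contract is the same). The matrix here is made up:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 2.0],
              [1.0, 0.0, 0.0]])

Q, R = np.linalg.qr(A)

print(np.allclose(A, Q @ R))            # A = QR: True
print(np.allclose(Q.T @ Q, np.eye(3)))  # Q is orthogonal: True
print(np.allclose(R, np.triu(R)))       # R is upper triangular: True
```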

The QR decomposition

When is this useful? For one, it lets us iteratively find the eigenvalues of matrices. The resulting method is the QR algorithm, one of the most important in numerical linear algebra. (It was even named one of the top 10 algorithms of the 20th century.)
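A bare-bones sketch of the unshifted QR algorithm. Real implementations add shifts and deflation, and convergence of this plain version needs assumptions on the eigenvalues, but for the symmetric test matrix below it already works:

```python
import numpy as np

def qr_algorithm(A, iters=200):
    """Iterate Aₖ₊₁ = RₖQₖ; the diagonal tends to the eigenvalues."""
    Ak = A.copy()
    for _ in range(iters):
        Q, R = np.linalg.qr(Ak)
        Ak = R @ Q  # = Qᵀ Aₖ Q, a similarity: same eigenvalues as A
    return np.diag(Ak)

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])  # symmetric, so real eigenvalues

approx = np.sort(qr_algorithm(A))
exact = np.sort(np.linalg.eigvalsh(A))
print(np.allclose(approx, exact))  # True
```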


This little post is part of my series covering the most important matrix factorization methods:

  1. the LU decomposition,

  2. the QR decomposition (this one),

  3. the spectral theorem (upcoming),

  4. and the Singular Value Decomposition (upcoming).
