
I start all my posts about matrix decompositions by saying that "matrix factorizations are the pinnacle results of linear algebra". These few words perfectly express their place in mathematics, science, and engineering: it is hard to overestimate the practical importance of matrix decompositions.

Behind (almost) every high-performance matrix operation, there is a matrix decomposition. This time, we'll look at the one that enables us to iteratively find the eigenvalues of a matrix: the QR decomposition.

In essence, the QR decomposition factors an arbitrary matrix into the product of an orthogonal and an upper triangular matrix.
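As a quick sanity check, here is the factorization computed with NumPy's built-in routine, on a small matrix chosen purely for illustration:

```python
import numpy as np

# A hypothetical 3 x 3 matrix, just for demonstration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

Q, R = np.linalg.qr(A)  # Q orthogonal, R upper triangular

print(np.allclose(A, Q @ R))       # the factors reproduce A
print(np.allclose(R, np.triu(R)))  # R is indeed upper triangular
```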

How does it work? We'll illustrate everything with the 3 x 3 case, but the argument carries over to the general case without change.

First, some notation. Every matrix can be thought of as a sequence of column vectors. Trust me, this simple observation is the foundation of many Eureka moments in mathematics.

Why is this useful? Because this way, we can look at matrix multiplication as a linear combination of the columns.

Check out how matrix-vector multiplication looks from this angle. (You can easily work this out by hand if you don't believe me.)

In other words, a matrix times a vector equals a linear combination of the column vectors.
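In code, with a small matrix picked just for illustration, the claim looks like this: the product *A***x** agrees with the sum of the columns of *A* scaled by the entries of **x**.

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])
x = np.array([2.0, -1.0, 3.0])

# x₁·a₁ + x₂·a₂ + x₃·a₃, where aᵢ is the i-th column of A
by_columns = sum(x[i] * A[:, i] for i in range(3))

print(np.allclose(A @ x, by_columns))  # True
```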

Similarly, the product of two matrices can be written in terms of linear combinations.
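Concretely, each column of the product *AB* is *A* times the corresponding column of *B*, i.e. a linear combination of the columns of *A* with that column's entries as coefficients. A minimal sketch:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

# Build the product column by column: the j-th column is A @ b_j.
columnwise = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

print(np.allclose(A @ B, columnwise))  # True
```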

So, what's the magic behind the QR decomposition? Simple: the vectorized version of the Gram-Schmidt process.

In a nutshell, the Gram-Schmidt process takes a linearly independent set of vectors and returns an orthonormal set that progressively generates the same subspaces.
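The process itself fits in a few lines. Here is a sketch of the classical variant (fine for illustration, though numerically less stable than the modified one); the input vectors are hypothetical examples:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors.

    The first k outputs span the same subspace as the first k inputs.
    """
    basis = []
    for a in vectors:
        # Subtract the projections onto the already-built orthonormal vectors,
        # then normalize what remains.
        q = a - sum(np.dot(q_prev, a) * q_prev for q_prev in basis)
        basis.append(q / np.linalg.norm(q))
    return basis

qs = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                   np.array([1.0, 0.0, 1.0]),
                   np.array([0.0, 1.0, 1.0])])

Q = np.column_stack(qs)
print(np.allclose(Q.T @ Q, np.eye(3)))  # the outputs are orthonormal
```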

The output vectors (**qᵢ**) can be written as linear combinations of the input vectors (**aᵢ**).

In other words, using the column vector form of matrix multiplication, we obtain that in fact, *A* factors into the product of two matrices.

As you can see, one term is formed from the Gram-Schmidt process's output vectors (**qᵢ**), while the other one is upper triangular.

However, the matrix of the **qᵢ**-s is also special: as its columns are orthonormal, its inverse is its transpose. Such matrices are called *orthogonal*.

Thus, any matrix can be written as the product of an orthogonal and an upper triangular one, which is the famous QR decomposition.
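Putting the pieces together: orthonormalize the columns of *A* to get *Q*, and then *R* = *Q*ᵀ*A* is automatically upper triangular, since each **a**ⱼ lies in the span of **q**₁, …, **q**ⱼ. A sketch on a hypothetical matrix with independent columns:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 2.0]])

# Gram-Schmidt on the columns of A, stored column by column in Q.
Q = np.zeros_like(A)
for k in range(3):
    q = A[:, k] - Q[:, :k] @ (Q[:, :k].T @ A[:, k])  # remove projections
    Q[:, k] = q / np.linalg.norm(q)

# Entry (i, j) of R is qᵢ·aⱼ, which vanishes for i > j.
R = Q.T @ A

print(np.allclose(A, Q @ R))            # A = QR
print(np.allclose(Q.T @ Q, np.eye(3)))  # Q is orthogonal
print(np.allclose(R, np.triu(R)))       # R is upper triangular
```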

Where is this useful? For one, it is used to iteratively find the eigenvalues of matrices. This is called the QR algorithm, one of the most important methods of numerical linear algebra. (It was even named one of the top 10 algorithms of the 20th century.)
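The idea of the (unshifted) QR algorithm fits in a few lines: factor, multiply the factors back in reverse order, and repeat. Each iterate is similar to the original matrix, so the eigenvalues are preserved, and for well-behaved inputs the iterates approach a triangular matrix with the eigenvalues on the diagonal. A bare-bones sketch on a hypothetical symmetric matrix (practical implementations add shifts and a Hessenberg reduction first):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])  # symmetric, so the eigenvalues are real

X = A.copy()
for _ in range(100):
    Q, R = np.linalg.qr(X)
    X = R @ Q  # X stays similar to A: R @ Q = Qᵀ @ (Q @ R) @ Q

print(np.sort(np.diag(X)))             # approximate eigenvalues
print(np.linalg.eigvalsh(A))           # reference values for comparison
```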

If you are interested in matrix factorizations, check out my other posts on the subject. This little post is part of my series covering the most important ones:

- the QR decomposition (this one),
- the spectral theorem (upcoming),
- and the Singular Value Decomposition (upcoming).