Matrix factorizations are among the pinnacle results of linear algebra.

Factorizations enable both theoretical and numerical results, ranging from inverting matrices to dimensionality reduction of feature-rich datasets. Check out my Linear Algebra for Machine Learning book if you don’t believe me.

In this mini-series within the Epsilon series, we’ll take a quick look at four of the most important matrix factorization methods:

the LU decomposition,

the QR decomposition,

the spectral theorem,

and the Singular Value Decomposition.

Let’s start with the first one: the LU decomposition, that is, the factorization of a matrix into the product of a lower and an upper triangular one.

Why is such a decomposition useful? There are two main applications:

computing determinants,

and inverting matrices.
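To make the determinant application concrete, here is a minimal sketch of the LU decomposition (the Doolittle variant, without pivoting) in Python. The function name and the example matrix are my own illustrations, not from the post; the sketch assumes no zero pivots are encountered. Since L has ones on its diagonal, det(A) = det(L)·det(U) reduces to the product of U’s diagonal entries.

```python
import numpy as np

def lu_decompose(A):
    """Doolittle LU decomposition without pivoting (illustrative sketch).

    Returns L (unit lower triangular) and U (upper triangular)
    such that A = L @ U. Assumes no zero pivots are encountered.
    """
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # elimination multiplier
            U[i, :] -= L[i, k] * U[k, :]  # zero out the entry below the pivot
    return L, U

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
L, U = lu_decompose(A)

# det(L) = 1, so det(A) is just the product of U's diagonal.
det_A = np.prod(np.diag(U))
```

In practice one uses a pivoted version (e.g. `scipy.linalg.lu`, which returns a PLU factorization) for numerical stability; the unpivoted loop above only shows the structure of the algorithm.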

For instance, check out how the LU decomposition simp…
