A matrix decomposition is a way to express a matrix as a combination of other, ostensibly simpler, matrices. The combination is often a product of two or three matrices, though it can be a sum, as in the case of a rank one decomposition ([provisional cross-reference: rank one decompositions]). The constituent matrices are simpler because they have many zero entries, or many strategically placed entries equal to one, or nonzero entries that lie on the diagonal (or close by), or …. The constituent matrices may also be simpler because they have desirable properties that make them easier to work with, such as being nonsingular or triangular or Hermitian or …. We will see examples of all of these behaviors in this chapter.
There is a “Big Five” of matrix decompositions, which you will come to know as the LU, QR, SVD, Schur and Cholesky. Every student of advanced linear algebra should become intimately familiar with these basic decompositions. There are many other ways to decompose a matrix, and we will see these at other junctures. Encyclopedic texts like Horn & Johnson [provisional cross-reference: horn-johnson] or Watkins [provisional cross-reference: watkins] are good places to begin exploring further.
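Before we study each decomposition in detail, it may help to see all five in action computationally. The sketch below, using NumPy and SciPy (the matrix `A` is a made-up symmetric positive definite example, chosen so that every one of the five decompositions applies), verifies that each decomposition really does reassemble the original matrix.

```python
import numpy as np
from scipy import linalg

# An example symmetric positive definite matrix (chosen for illustration)
A = np.array([[4.0, 2.0, 1.0],
              [2.0, 5.0, 3.0],
              [1.0, 3.0, 6.0]])

# LU: A = P L U, with P a permutation, L unit lower triangular, U upper triangular
P, L, U = linalg.lu(A)
assert np.allclose(P @ L @ U, A)

# QR: A = Q R, with Q orthogonal and R upper triangular
Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)

# SVD: A = U_s diag(s) V^T, with U_s, V orthogonal and s the singular values
U_s, s, Vt = np.linalg.svd(A)
assert np.allclose(U_s @ np.diag(s) @ Vt, A)

# Schur: A = Z T Z^T, with Z orthogonal and T (quasi-)upper triangular
T, Z = linalg.schur(A)
assert np.allclose(Z @ T @ Z.T, A)

# Cholesky: A = C C^T, with C lower triangular (requires positive definiteness)
C = np.linalg.cholesky(A)
assert np.allclose(C @ C.T, A)

print("all five decompositions reassemble A")
```

The pattern is the same in every case: the factors individually have structure (triangular, orthogonal, diagonal, a permutation) that the original matrix lacks, and multiplying them back together recovers the original matrix.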