\square Summary: Square matrix of size 5. Singular, nullity 2. 2 distinct eigenvalues, each of “high” multiplicity.
\square A square matrix (Definition SQM). Notice how the following analysis parallels the analysis of the coefficient matrices of the linear systems in previous archetypes, yet there is no system discussed explicitly for this archetype.\begin{bmatrix} -2 & -1 & -2 & -4 & 4 \\ -6 & -5 & -4 & -4 & 6 \\ 10 & 7 & 7 & 10 & -13 \\ -7 & -5 & -6 & -9 & 10 \\ -4 & -3 & -4 & -6 & 6 \\ \end{bmatrix}
\square Row-equivalent matrix in reduced row-echelon form (Definition RREF).\begin{bmatrix} \leading{1} & 0 & 0 & 1 & -2 \\ 0 & \leading{1} & 0 & -2 & 2 \\ 0 & 0 & \leading{1} & 2 & -1 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix}
\square Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis of the coefficient matrix of the original system, and of the associated homogeneous system.\begin{align*}r&=3&D&=\set{1,\,2,\,3}&F&=\set{4,\,5}\end{align*}
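As an optional cross-check, not part of the archetype itself, the reduction can be reproduced in Python with the sympy library; the name \verb|A| for the matrix is our own choice. Note that sympy numbers columns from 0, so the pivot indices must be shifted by one to recover $D$ as written above.

\begin{verbatim}
from sympy import Matrix

# The archetype matrix
A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

R, pivots = A.rref()          # reduced row-echelon form, pivot columns
print(R)                      # matches the matrix displayed above
D = [p + 1 for p in pivots]   # shift from sympy's 0-based indexing
F = [j for j in range(1, 6) if j not in D]
print(len(pivots), D, F)      # r = 3, D = [1, 2, 3], F = [4, 5]
\end{verbatim}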
\square Is the matrix nonsingular or singular? (Consider Theorem NMRRI. At the same time, examine the sizes of the sets D and F from the analysis of the reduced row-echelon version of the matrix.)
Singular.
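A quick machine check in the spirit of Theorem NMRRI, a sketch under the same conventions as above: row-reduce and compare with the identity matrix.

\begin{verbatim}
from sympy import Matrix, eye

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# Theorem NMRRI: nonsingular if and only if the RREF is the identity
print(A.rref()[0] == eye(5))   # False, so the matrix is singular
\end{verbatim}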
\square The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in D versus entries for indices in F.\begin{align*}\spn{\set{\colvector{-1\\2\\-2\\1\\0},\,\colvector{2\\-2\\1\\0\\1}} }\end{align*}
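The same basis arises from sympy's nullspace method; a sketch under the same conventions as above. The vectors come one per free column, so their order may differ from the display.

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# One basis vector per free column (Theorem BNS)
for v in A.nullspace():
    print(v.T)   # (-1, 2, -2, 1, 0) and (2, -2, 1, 0, 1)
\end{verbatim}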
\square The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set D above (Theorem BCS).\begin{align*}\spn{\set{\colvector{-2\\-6\\10\\-7\\-4},\,\colvector{-1\\-5\\7\\-5\\-3},\,\colvector{-2\\-4\\7\\-6\\-4}} }\end{align*}
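sympy's columnspace method makes the same choice of pivot columns; again a sketch, with $A$ entered as before.

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# The columns of A with indices in D (Theorem BCS)
for v in A.columnspace():
    print(v.T)   # columns 1, 2 and 3 of A
\end{verbatim}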
\square The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix L is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors. By Theorem FS the column space equals the null space of L, and the spanning set is a basis of that null space, constructed according to Theorem BNS. When r=m, the matrix L has no rows and the column space is all of \complex{m}.\begin{align*}L&=\begin{bmatrix}1&0&-2&-6&5\\0&1&4&10&-9\end{bmatrix}\end{align*}\begin{align*}\spn{\set{ \colvector{-5\\9\\0\\0\\1},\, \colvector{6\\-10\\0\\1\\0},\, \colvector{2\\-4\\1\\0\\0} } }\end{align*}
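The whole construction can be traced by machine: row-reduce $[A\mid I_5]$, slice out $L$ from beside the zero rows, and compute a null space basis. A sketch under the same conventions as above.

\begin{verbatim}
from sympy import Matrix, eye

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# Extended echelon form: row-reduce [A | I] (Definition EEF)
R = A.row_join(eye(5)).rref()[0]
L = R[3:, 5:]            # right half, beside the two zero rows of the left half
print(L)                 # the 2 x 5 matrix displayed above
for v in L.nullspace():  # column space of A equals null space of L (Theorem FS)
    print(v.T)
\end{verbatim}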
\square The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that spans the column space.\begin{align*}\spn{\set{\colvector{ 1\\0\\0\\\frac{9}{4}\\\frac{5}{2}},\,\colvector{0\\1\\0\\\frac{5}{4}\\\frac{3}{2}},\, \colvector{0\\0\\1\\\frac{1}{2}\\1}} }\end{align*}
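In sympy this is a one-line variation on the earlier sketches: row-reduce the transpose and read off the nonzero rows.

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# Nonzero rows of the RREF of the transpose, written as column vectors
R, pivots = A.T.rref()
for i in range(len(pivots)):
    print(R.row(i).T)   # the three vectors above, fractions and all
\end{verbatim}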
\square Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS)\begin{align*}\spn{\set{\colvector{1\\0\\0\\1\\-2},\,\colvector{0\\1\\0\\-2\\2},\,\colvector{0\\0\\1\\2\\-1}} }\end{align*}
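Since the reduced row-echelon form was computed earlier, the basis is just its nonzero rows; in sympy, under the same conventions:

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# Nonzero rows of the RREF are a basis of the row space (Theorem BRS)
R, pivots = A.rref()
for i in range(len(pivots)):
    print(R.row(i))
\end{verbatim}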
\square Inverse of the matrix, if it exists (Definition MI). By Theorem NI an inverse exists only if the matrix is nonsingular. Since this matrix is singular, it has no inverse.
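A guarded inversion makes the point concrete; sympy refuses to invert a singular matrix, so this sketch tests the determinant first.

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

# Theorem NI: an inverse exists only for a nonsingular matrix
if A.det() != 0:
    print(A.inv())
else:
    print("Singular, so no inverse exists.")
\end{verbatim}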
\square Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC.\begin{align*}\text{Rank: }3&&\text{Nullity: }2&&\text{Matrix columns: }5&\end{align*}
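The three numbers, and the check of Theorem RPNC, come out of one short sketch:

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

r, n = A.rank(), len(A.nullspace())
print(r, n, A.cols)      # 3, 2, 5
print(r + n == A.cols)   # True: rank plus nullity equals the column count
\end{verbatim}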
\square Determinant of the matrix. The matrix is nonsingular if and only if the determinant is nonzero (Theorem SMZD).\begin{align*}\text{Determinant: }0\end{align*}
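And the determinant by machine, under the same conventions:

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

print(A.det())   # 0, consistent with singularity (Theorem SMZD)
\end{verbatim}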
\square Eigenvalues, and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check.\begin{align*}\eigensystem{L}{-1}{\colvector{-5\\9\\0\\0\\1},\,\colvector{6\\-10\\0\\1\\0},\,\colvector{2\\-4\\1\\0\\0}}\\ \eigensystem{L}{0}{\colvector{2\\-2\\1\\0\\1},\,\colvector{-1\\2\\-2\\1\\0}} \end{align*}
\square Geometric and algebraic multiplicities (Definition GME, Definition AME).\begin{align*}\geomult{L}{-1}&=3&\algmult{L}{-1}&=3\\ \geomult{L}{0}&=2&\algmult{L}{0}&=2 \end{align*}
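sympy's eigenvects method reports each eigenvalue with its algebraic multiplicity and a basis of its eigenspace, so both multiplicities and the suggested matrix-vector check come out of one call; the basis vectors may be scaled differently than those displayed above.

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

for lam, alg, vecs in A.eigenvects():
    # algebraic multiplicity, then geometric (the size of the basis)
    print(lam, alg, len(vecs))                  # -1: 3, 3 and 0: 2, 2
    assert all(A * v == lam * v for v in vecs)  # the matrix-vector check
\end{verbatim}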
\square Diagonalizable (Definition DZM)? Yes, full eigenspaces, Theorem DMFE.
\square The diagonalization (Theorem DC).\begin{align*} \begin{bmatrix}4&3&4&6&-6\\7&5&6&9&-10\\ -10&-7&-7&-10&13\\-4&-3&-4&-6&7\\-7&-5&-6&-8&10 \end{bmatrix}\begin{bmatrix} -2 & -1 & -2 & -4 & 4 \\ -6 & -5 & -4 & -4 & 6 \\ 10 & 7 & 7 & 10 & -13 \\ -7 & -5 & -6 & -9 & 10 \\ -4 & -3 & -4 & -6 & 6 \\ \end{bmatrix} \begin{bmatrix}-5&6&2&2&-1\\9&-10&-4&-2&2\\ 0&0&1&1&-2\\0&1&0&0&1\\1&0&0&1&0 \end{bmatrix} &= \begin{bmatrix}-1&0&0&0&0\\0&-1&0&0&0\\ 0&0&-1&0&0\\0&0&0&0&0\\0&0&0&0&0 \end{bmatrix} \end{align*}
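Finally, the diagonalization itself; sympy's diagonalize returns matrices $S$ and $D$ with $A = SDS^{-1}$, though it may order the eigenvalues and scale the eigenvectors differently than the display above.

\begin{verbatim}
from sympy import Matrix

A = Matrix([[-2, -1, -2, -4,   4],
            [-6, -5, -4, -4,   6],
            [10,  7,  7, 10, -13],
            [-7, -5, -6, -9,  10],
            [-4, -3, -4, -6,   6]])

S, D = A.diagonalize()        # A = S * D * S**(-1) (Theorem DC)
print(D)                      # diagonal of eigenvalues (order may vary)
print(S.inv() * A * S == D)   # True
\end{verbatim}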