
Section 1.8 Positive Semi-Definite Matrices

Positive semi-definite matrices (and their cousins, positive definite matrices) are square matrices which in many ways behave like non-negative (respectively, positive) real numbers. These results will be useful as we study various matrix decompositions in Chapter 2.

Definition 1.8.1. Positive Semi-Definite Matrix.

A square matrix \(A\) of size \(n\) is positive semi-definite if \(A\) is Hermitian and for all \(\vect{x}\in\complex{n}\text{,}\) \(\innerproduct{\vect{x}}{A\vect{x}}\geq 0\text{.}\)

For a definition of positive definite replace the inequality in the definition with a strict inequality, and exclude the zero vector from the vectors \(\vect{x}\) required to meet the condition. Similar variations allow definitions of negative definite and negative semi-definite.
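As a quick numerical sketch of the definition (using NumPy; the specific matrix \(A\text{,}\) the random seed, and the tolerances below are illustrative choices, not part of the text), we can verify the Hermitian condition and sample random vectors to confirm \(\innerproduct{\vect{x}}{A\vect{x}}\geq 0\text{.}\)

```python
# Illustrative sketch: check Definition 1.8.1 numerically with NumPy.
# The matrix A is a hypothetical example (its eigenvalues are 1 and 3).
import numpy as np

A = np.array([[2, 1j],
              [-1j, 2]])

# Hermitian: A equals its conjugate transpose.
assert np.allclose(A, A.conj().T)

# Sample random complex vectors x and confirm <x, A x> >= 0.
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    ip = np.vdot(x, A @ x)       # np.vdot conjugates its first argument
    assert abs(ip.imag) < 1e-12  # <x, A x> is real when A is Hermitian
    assert ip.real >= -1e-12     # and non-negative when A is positive semi-definite
```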

Our first theorem in this section gives us an easy way to build positive semi-definite matrices.

Theorem 1.8.2. Creating Positive Semi-Definite Matrices.

Suppose that \(A\) is any matrix. Then \(\adjoint{A}A\) and \(A\adjoint{A}\) are positive semi-definite matrices.

Proof. We will give the proof for the first matrix; the proof for the second is entirely similar. First we check that \(\adjoint{A}A\) is Hermitian, with an appeal to Definition HM,

\begin{equation*} \adjoint{\left(\adjoint{A}A\right)} =\adjoint{A}\adjoint{\left(\adjoint{A}\right)}=\adjoint{A}A \end{equation*}

Second, for any \(\vect{x}\in\complex{n}\text{,}\) Theorem AIP and Theorem PIP give,

\begin{equation*} \innerproduct{\vect{x}}{\adjoint{A}A\vect{x}} =\innerproduct{\adjoint{\left(\adjoint{A}\right)}\vect{x}}{A\vect{x}}=\innerproduct{A\vect{x}}{A\vect{x}}\geq 0 \end{equation*}

which is the second condition for a positive semi-definite matrix.
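This theorem is easy to test computationally. The sketch below (NumPy again; the rectangular random matrix and the seed are arbitrary illustrative choices) forms both \(\adjoint{A}A\) and \(A\adjoint{A}\) and checks the two conditions of Definition 1.8.1 on a random vector.

```python
# Illustrative sketch: adjoint(A)*A and A*adjoint(A) are positive
# semi-definite for any matrix A, square or not.
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

P = A.conj().T @ A   # adjoint(A) * A, size 3
Q = A @ A.conj().T   # A * adjoint(A), size 4

for M in (P, Q):
    # First condition: M is Hermitian.
    assert np.allclose(M, M.conj().T)
    # Second condition: <x, M x> is real and non-negative.
    n = M.shape[0]
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    ip = np.vdot(x, M @ x)
    assert abs(ip.imag) < 1e-12 and ip.real >= -1e-12
```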

A statement very similar to the converse of this theorem is also true. Any positive semi-definite matrix can be realized as the product of a square matrix, \(B\text{,}\) with its adjoint, \(\adjoint{B}\text{.}\) (See Exercise [provisional cross-reference: unwritten exercise about positive semi-definite and adjoints] after studying this entire section.) The matrices \(\adjoint{A}A\) and \(A\adjoint{A}\) will be important later when we define singular values in Section [provisional cross-reference: section-SVD].
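Since the exercise referenced above is unwritten, here is one standard construction as a sketch of this near-converse (our own illustration, not necessarily the intended exercise solution): if \(P=UD\adjoint{U}\) is a spectral decomposition with non-negative diagonal \(D\text{,}\) then \(B=\sqrt{D}\adjoint{U}\) satisfies \(\adjoint{B}B=U\sqrt{D}\sqrt{D}\adjoint{U}=P\text{.}\)

```python
# Illustrative sketch: realize a positive semi-definite P as B†B
# via a spectral decomposition.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
P = A.conj().T @ A                      # a positive semi-definite matrix

lam, U = np.linalg.eigh(P)              # spectral decomposition of Hermitian P
lam = np.clip(lam, 0, None)             # guard against tiny negative round-off
B = np.diag(np.sqrt(lam)) @ U.conj().T  # B = sqrt(D) U†

assert np.allclose(B.conj().T @ B, P)   # B†B recovers P
```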

Positive semi-definite matrices can also be characterized by their eigenvalues, without any mention of inner products. This next result further reinforces the notion that positive semi-definite matrices behave like non-negative real numbers.

Theorem 1.8.3. Eigenvalues of Positive Semi-Definite Matrices.

Suppose that \(A\) is a Hermitian matrix. Then \(A\) is positive semi-definite if and only if every eigenvalue \(\lambda\) of \(A\) satisfies \(\lambda\geq 0\text{.}\)

Proof. First notice that since we are considering only Hermitian matrices in this theorem, it is always possible to compare eigenvalues with the real number zero, since eigenvalues of Hermitian matrices are all real numbers (Theorem HMRE).

(⇒) Let \(\lambda\) be any eigenvalue of \(A\) and let \(\vect{x}\neq 0\) be an associated eigenvector. Since \(A\) is positive semi-definite,

\begin{equation*} \lambda\innerproduct{\vect{x}}{\vect{x}}=\innerproduct{\vect{x}}{\lambda\vect{x}} =\innerproduct{\vect{x}}{A\vect{x}}\geq 0 \end{equation*}

By Theorem PIP we know \(\innerproduct{\vect{x}}{\vect{x}}\gt 0\text{,}\) so we conclude that \(\lambda\geq 0\text{.}\)

(⇐) Let \(n\) denote the size of \(A\text{.}\) Suppose that \(\scalarlist{\lambda}{n}\) are the eigenvalues of the Hermitian matrix \(A\text{,}\) each of which is non-negative. Let \(B=\set{\vectorlist{\vect{x}}{n}}\) be a set of associated eigenvectors for these eigenvalues. Since a Hermitian matrix is normal (Definition 1.7.1), Theorem OBNM allows us to choose \(B\) to also be an orthonormal basis of \(\complex{n}\text{.}\) Choose any \(\vect{x}\in\complex{n}\) and let \(\scalarlist{a}{n}\) be the scalars guaranteed by the spanning property of the basis \(B\text{,}\) so \(\vect{x}=\sum_{i=1}^{n}a_i\vect{x}_i\text{.}\) Since we have presumed \(A\) is Hermitian, we need only check the second condition of the definition. The use of an orthonormal basis provides the simplification for the last equality.

\begin{align*} \innerproduct{\vect{x}}{A\vect{x}} &=\innerproduct{\sum_{i=1}^{n}a_i\vect{x}_i}{A\sum_{j=1}^{n}a_j\vect{x}_j}\\ &=\innerproduct{\sum_{i=1}^{n}a_i\vect{x}_i}{\sum_{j=1}^{n}a_jA\vect{x}_j}\\ &=\innerproduct{\sum_{i=1}^{n}a_i\vect{x}_i}{\sum_{j=1}^{n}a_j\lambda_j\vect{x}_j}\\ &=\sum_{i=1}^{n}\sum_{j=1}^{n}\innerproduct{a_i\vect{x}_i}{a_j\lambda_j\vect{x}_j}\\ &=\sum_{i=1}^{n}\sum_{j=1}^{n}\conjugate{a_i}a_j\lambda_j\innerproduct{\vect{x}_i}{\vect{x}_j}\\ &=\sum_{i=1}^{n}\conjugate{a_i}a_i\lambda_i\innerproduct{\vect{x}_i}{\vect{x}_i}+\sum_{i=1}^{n}\sum_{\substack{j=1\\j\neq i}}^{n}\conjugate{a_i}a_j\lambda_j\innerproduct{\vect{x}_i}{\vect{x}_j}\\ &=\sum_{i=1}^{n}\conjugate{a_i}a_i\lambda_i \end{align*}

The expression \(\conjugate{a_i}a_i\) is the square of the modulus of \(a_i\text{,}\) hence is always non-negative. Since the eigenvalues are assumed to be non-negative, this final sum is non-negative as well, as desired.
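In practice this characterization is how positive semi-definiteness is usually tested: compute the (real) eigenvalues of the Hermitian matrix and inspect their signs. A short sketch (NumPy; the two matrices are illustrative examples):

```python
# Illustrative sketch: test positive semi-definiteness via eigenvalues.
import numpy as np

H_psd = np.array([[2, 1j], [-1j, 2]])  # eigenvalues 1 and 3
H_not = np.array([[0, 1], [1, 0]])     # eigenvalues -1 and 1

for H in (H_psd, H_not):
    lam = np.linalg.eigvalsh(H)        # real eigenvalues of a Hermitian matrix
    print(lam, "positive semi-definite" if np.all(lam >= -1e-12) else "not")
```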

As positive semi-definite matrices are defined to be Hermitian, they are normal and thus subject to orthonormal diagonalization (Theorem OD). Now consider the interpretation of orthonormal diagonalization as a rotation to principal axes, a stretch by a diagonal matrix, and a rotation back (Subsection OD.OD). For a positive semi-definite matrix, the diagonal matrix has diagonal entries that are the non-negative eigenvalues of the original positive semi-definite matrix. So the “stretching” along each axis is never a reflection.
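The rotate-stretch-rotate picture can also be seen numerically. In the sketch below (NumPy, with a real symmetric example for simplicity), the orthogonal factor carries the principal axes and the diagonal factor holds the non-negative stretches.

```python
# Illustrative sketch: orthonormal diagonalization of a positive
# semi-definite matrix has non-negative "stretch" factors.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((3, 3))
P = A.T @ A                         # real positive semi-definite example

lam, U = np.linalg.eigh(P)          # P = U D U^T with U orthogonal
D = np.diag(lam)

assert np.all(lam >= -1e-12)        # non-negative stretches: no reflections
assert np.allclose(U @ D @ U.T, P)  # rotate, stretch, rotate back
```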