Lecture 18

[[lecture-data]]

2024-10-09

Readings

- Chapter 4

Interlacing II / Inclusion Principle

Suppose $A \in M_n$ is hermitian and $B \in M_r$ is an $r \times r$ principal submatrix. Then for all $k \in \{1, \dots, r\}$,

$$\lambda_{k}(A) \leq \lambda_{k}(B) \leq \lambda_{k+n-r}(A)$$

(eigenvalues ordered increasingly, $\lambda_{1} \leq \lambda_{2} \leq \cdots \leq \lambda_{n}$).

Proof (via Courant-Fischer). Say $B$ comes from $A$ by deleting rows and columns $i_{1}, \dots, i_{n-r}$.

$$\begin{aligned} \lambda_{k}(A) &= \max_{y_{1}, y_{2}, \dots, y_{k-1} \in \mathbb{C}^n} \min_{0 \neq x \in \mathbb{C}^n,\; x \perp y_{i} \,\forall i} \frac{x^*Ax}{x^*x} \\ &\leq \max_{y_{1}, \dots, y_{k-1}} \min_{x \perp y_{i} \,\forall i,\; x \perp e_{i_{1}}, \dots, e_{i_{n-r}}} \frac{x^*Ax}{x^*x} \\ &\overset{(*)}{=} \max_{v_{1}, v_{2}, \dots, v_{k-1} \in \mathbb{C}^r} \min_{0 \neq z \in \mathbb{C}^r,\; z \perp v_{1}, \dots, v_{k-1}} \frac{z^*Bz}{z^*z} = \lambda_{k}(B) \quad \text{by Courant-Fischer} \end{aligned}$$

- Note that the $e_{i_{k}}$ are the standard basis vectors with the $1$ in the index of each $i_{k}$.
- The first step is $\leq$ because adding the constraints $x \perp e_{i_{1}}, \dots, e_{i_{n-r}}$ shrinks the set the min is taken over.
- At $(*)$: the $z$s are the $x$s without the $i_{k}$ components (since they are orthogonal to those standard basis vectors).
- We can then "perform surgery" on the $y$s also to get rid of those coordinates to get $v_{1},\dots,v_{k-1} \in \mathbb{C}^r$.
- For the upper bound: the second line is a max over a particular choice of $(k-1)+(n-r)$ constraint vectors, so it is at most $\max_{w_{1}, \dots, w_{k+n-r-1}} \min_{0 \neq x \perp w_{j} \,\forall j} \frac{x^*Ax}{x^*x} = \lambda_{k+n-r}(A)$, again by Courant-Fischer. Hence $\lambda_{k}(A) \leq \lambda_{k}(B) \leq \lambda_{k+n-r}(A)$.

(see inclusion principle)
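As a numerical sanity check (my own illustration, not from the lecture), both inequalities can be verified on a random hermitian matrix with numpy; indices below are 0-based, and `eigvalsh` returns eigenvalues in the same ascending order the lecture uses:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 6, 3

# Random hermitian A
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

# Principal submatrix B: keep the first r rows/columns
B = A[:r, :r]

eig_A = np.linalg.eigvalsh(A)  # ascending order
eig_B = np.linalg.eigvalsh(B)

# 0-indexed form of lambda_k(A) <= lambda_k(B) <= lambda_{k+n-r}(A)
for k in range(r):
    assert eig_A[k] <= eig_B[k] + 1e-12
    assert eig_B[k] <= eig_A[k + n - r] + 1e-12
```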

Corollary

Suppose $A \in M_n$ is hermitian and $B \in M_{n-1}$ is a principal submatrix. Then

$$\lambda_{1}(A) \leq \lambda_{1}(B) \leq \lambda_{2}(A) \leq \lambda_{2}(B) \leq \cdots \leq \lambda_{n-1}(B) \leq \lambda_{n}(A)$$

(see the eigenvalues of a hermitian matrix and a principal submatrix one smaller alternate in magnitude)
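A minimal numerical sketch of this alternating chain for the $(n-1) \times (n-1)$ case (again my own illustration, same ascending-order convention):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2
B = A[:-1, :-1]  # delete the last row and column

a = np.linalg.eigvalsh(A)
b = np.linalg.eigvalsh(B)

# lambda_1(A) <= lambda_1(B) <= lambda_2(A) <= ... <= lambda_{n-1}(B) <= lambda_n(A)
for k in range(n - 1):
    assert a[k] <= b[k] <= a[k + 1]
```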

Corollary

For any hermitian $A \in M_n$, we have $\lambda_{1}(A) \leq a_{ii} \leq \lambda_{n}(A)$ for every diagonal entry $a_{ii}$.

This follows immediately from the inclusion principle, since each diagonal entry is a $1 \times 1$ principal submatrix.

(see the diagonal entries of a hermitian matrix are bounded by the eigenvalues)
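Just as an illustration (not part of the lecture), the bound is easy to observe numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

eigs = np.linalg.eigvalsh(A)
d = np.real(np.diag(A))  # the diagonal of a hermitian matrix is real

# lambda_1(A) <= a_ii <= lambda_n(A) for every i
assert np.all(eigs[0] <= d + 1e-12) and np.all(d <= eigs[-1] + 1e-12)
```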

Majorization

Let $\alpha, \beta \in \mathbb{R}^n$. Say we can order the components of $\alpha$ so that $\alpha_{1} \leq \alpha_{2} \leq \cdots \leq \alpha_{n}$, and the same for the components of $\beta$, so we have $\beta_{1} \leq \beta_{2} \leq \cdots \leq \beta_{n}$. We say $\alpha$ majorizes $\beta$ precisely when

- For all $k \in \{1, \dots, n\}$, $\sum_{i=1}^{k} \beta_{i} \leq \sum_{i=1}^{k} \alpha_{i}$, and
- equality holds when $k = n$.

(see majorization)
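The definition translates directly into a partial-sum check. Here is a small hypothetical helper, `majorizes` (my own name, not from the lecture), implementing exactly the two bullet points above:

```python
import numpy as np

def majorizes(alpha, beta, tol=1e-12):
    """True iff alpha majorizes beta in the sense above: with both vectors
    sorted ascending, every partial sum of beta is <= the matching partial
    sum of alpha, with equality for the full sum."""
    ca = np.cumsum(np.sort(np.asarray(alpha, dtype=float)))
    cb = np.cumsum(np.sort(np.asarray(beta, dtype=float)))
    return bool(np.all(cb <= ca + tol) and abs(ca[-1] - cb[-1]) <= tol)
```

For example, `majorizes([2, 2, 2], [1, 2, 3])` is `True` under this convention, since the partial sums $(1, 3, 6)$ are dominated by $(2, 4, 6)$ with equal totals.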

Theorem

Let $A \in M_n$ be hermitian. Then the vector of diagonal elements of $A$, call it $d = (a_{11}, \dots, a_{nn})$, majorizes the vector of ordered eigenvalues of $A$, call it $\lambda = (\lambda_{1}, \dots, \lambda_{n})$. That is, with both sorted ascending, $\sum_{i=1}^{k} \lambda_{i} \leq \sum_{i=1}^{k} a_{ii}$ for all $k$, with equality at $k = n$.

Proof via induction on $n$. Any case where $n = 1$ is trivially true. Assume the theorem holds for all matrices of size up to some fixed $n - 1$. Consider the case when $A \in M_n$.

Let $B \in M_{n-1}$ be a submatrix of $A$ obtained by deleting one row and corresponding column (the last, say), where the diagonals of $A$ are ordered $a_{11} \leq a_{22} \leq \cdots \leq a_{nn}$.

For all $k \leq n-1$, we have $\lambda_{k}(A) \leq \lambda_{k}(B)$ by interlacing 2. Then by the induction hypothesis (applied to $B$, whose diagonal is the $n-1$ smallest diagonal entries of $A$), we have that $$\sum_{i=1}^{k} \lambda_{i}(A) \leq \sum_{i=1}^{k} \lambda_{i}(B) \leq \sum_{i=1}^{k} a_{ii} \quad \text{for } k \leq n-1$$ But for the case when $k = n$, we have that $$\sum_{i=1}^{n} \lambda_{i}(A) = \operatorname{tr}(A) = \sum_{i=1}^{n} a_{ii}$$ Thus the theorem holds.

(see diagonal elements of a hermitian matrix majorize its eigenvalues)
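Checking the theorem numerically with the `majorizes` helper sketched earlier (illustration only):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

d = np.real(np.diag(A))
lam = np.linalg.eigvalsh(A)

assert majorizes(d, lam)  # diag(A) majorizes the eigenvalues
```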

Corollary

Let $A \in M_n$ be hermitian and $U \in M_{n \times m}$ have orthonormal columns ($U^*U = I_m$, $m \leq n$). Then

$$\sum_{k=1}^{m} \lambda_{k}(A) \leq \operatorname{tr}(U^*AU)$$

and also

$$\sum_{k=1}^{m} \lambda_{k}(A) = \min_{U^*U = I_m} \operatorname{tr}(U^*AU)$$

And claim that this implies the previous result (just take the columns of $U$ from the identity matrix, so that $\operatorname{tr}(U^*AU)$ is a sum of $m$ diagonal entries of $A$).

Proof / Intuition. Given $U$ with orthonormal columns, extend via Gram-Schmidt to get a unitary $V \in M_n$ whose first $m$ columns are the columns of $U$. Then when we do the multiplication, $B = U^*AU$ is the $m \times m$ leading principal submatrix of $V^*AV$, and $V^*AV$ has the same eigenvalues as $A$. So by interlacing 2, we get that $\lambda_{k}(A) = \lambda_{k}(V^*AV) \leq \lambda_{k}(B)$ for $k \leq m$. So sum over $k$ to get $$\sum_{k=1}^{m} \lambda_{k}(A) \leq \sum_{k=1}^{m} \lambda_{k}(B) = \operatorname{tr}(B) = \operatorname{tr}(U^*AU)$$ So we have the desired lower bound, and need to show we can achieve equality.

Given any $A$, let $u_{1}, \dots, u_{m}$ be the orthonormalized eigenvectors associated with $\lambda_{1}(A), \dots, \lambda_{m}(A)$, and take $U = [u_{1} \; \cdots \; u_{m}]$. Then $\operatorname{tr}(U^*AU) = \sum_{k=1}^{m} \lambda_{k}(A)$ (the sum of the first $m$ eigenvalues), so the bound is achieved.

(see the sum of the first least eigenvalues is the minimum of the trace of orthonormal multiplications)
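A sketch of both claims, under my own numpy conventions (ascending eigenvalues from `eigvalsh`/`eigh`, a random $U$ with orthonormal columns obtained via QR):

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 6, 3
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (M + M.conj().T) / 2

lam = np.linalg.eigvalsh(A)
bound = lam[:m].sum()  # sum of the m smallest eigenvalues

# Lower bound holds for an arbitrary U with orthonormal columns
G = rng.standard_normal((n, m)) + 1j * rng.standard_normal((n, m))
U, _ = np.linalg.qr(G)
assert bound <= np.real(np.trace(U.conj().T @ A @ U)) + 1e-12

# Equality when U holds eigenvectors for the m smallest eigenvalues
_, V = np.linalg.eigh(A)
U_eq = V[:, :m]
assert np.isclose(np.real(np.trace(U_eq.conj().T @ A @ U_eq)), bound)
```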