[[lecture-data]] 2024-09-16
Readings
- a
2. Chapter 2
Note
Suppose I have two block matrices
$$\begin{bmatrix} 0 & * \\ 0 & T \end{bmatrix} \begin{bmatrix} D_{1} & * \\ 0 & D_{2} \end{bmatrix} = \begin{bmatrix} 0 & * \\ 0 & TD_{2} \end{bmatrix} = \begin{bmatrix} 0 & * \\ 0 & T_{*} \end{bmatrix}$$
where $T$ is upper triangular and each $D_{i}$ is diagonal. Then the resulting matrix has the same block form, with $T_{*} = TD_{2}$ also upper triangular and the first block column still zero.
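A quick numeric sanity check of this note (the matrices below are our own toy example, not from the lecture): multiplying a block upper triangular matrix whose first block column is zero by another block upper triangular matrix keeps the first block column zero.

```python
import numpy as np

# Left factor: first block column zero, lower-right block T upper triangular.
A = np.block([[np.zeros((1, 1)), np.ones((1, 2))],
              [np.zeros((2, 1)), np.triu(np.ones((2, 2)))]])
# Right factor: block upper triangular with diagonal blocks D1, D2.
B = np.block([[2 * np.eye(1), np.ones((1, 2))],
              [np.zeros((2, 1)), 3 * np.eye(2)]])
P = A @ B
print(P[:, 0])  # first column stays zero
```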
Cayley-Hamilton
Let $A \in M_{n}$. Then $p_{A}(A) = 0$.
Note
This is a homework problem for diagonalizable matrices, but you are going to do it using the tools that you already have.
(see Cayley-Hamilton)
Let $A = UTU^{*}$, where $U$ is unitary and $T$ is upper triangular per Schur's theorem. Say that we have characteristic polynomial $p_{A}(t) = (t-\lambda_{1})(t-\lambda_{2})\cdots(t-\lambda_{n})$. Then
$$\begin{aligned} p_{A}(A) &= Up_{A}(T) U^* \\ &= U[(T-\lambda_{1}I)(T-\lambda_{2}I)\cdots(T-\lambda_{n}I)]U^* \\ &= U \begin{bmatrix} 0 & * & * \\ 0 & * & * \\ 0 & 0 & \ddots \end{bmatrix} \begin{bmatrix} * & * & * \\ 0 & 0 & * \\ 0 & 0 & \ddots \end{bmatrix} \cdots \begin{bmatrix} * & * & * \\ 0 & \ddots & * \\ 0 & 0 & 0 \end{bmatrix} U^* \\ &= U [0] U^* \\ &= 0 \end{aligned}$$
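The theorem can be checked numerically: evaluate $p_{A}(A)$ for a random matrix and confirm it vanishes up to floating-point error (a sketch; the matrix and variable names are our own).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

coeffs = np.poly(A)  # characteristic polynomial coefficients, leading 1
# Evaluate p_A(A) with Horner's rule: P = (((I)A + c_{n-1}I)A + ...)
P = np.zeros_like(A)
for c in coeffs:
    P = P @ A + c * np.eye(4)
print(np.linalg.norm(P))  # ~0 up to floating-point error
```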
Corollary
Suppose $A \in M_{n}$ is invertible and $p_{A}(t) = t^{n} + a_{n-1}t^{n-1} + \dots + a_{1}t + a_{0}$. Then $A^{-1} = \frac{1}{-a_{0}}\left[A^{n-1} + a_{n-1}A^{n-2} + \dots + a_{1}I\right]$.
Suppose $A$ is invertible, so $a_{0} = p_{A}(0) = (-1)^{n}\det A \neq 0$ and dividing by $a_{0}$ is legal. By Cayley-Hamilton, we have $p_{A}(A) = 0$, so
$$\begin{aligned} A[A^{n-1}+a_{n-1}A^{n-2}+a_{n-2}A^{n-3}+\dots+a_{1}I] &= -a_{0}I \\ A\left[ \frac{1}{-a_{0}} [A^{n-1}+a_{n-1}A^{n-2}+a_{n-2}A^{n-3}+\dots+a_{1}I] \right] &= I \\ \implies \left[ \frac{1}{-a_{0}} [A^{n-1}+a_{n-1}A^{n-2}+a_{n-2}A^{n-3}+\dots+a_{1}I] \right] &= A^{-1} \end{aligned}$$
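The corollary gives a concrete (if numerically inadvisable) way to invert a matrix. A sketch for a random $3 \times 3$ (our own example):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
n = A.shape[0]

c = np.poly(A)   # [1, a_2, a_1, a_0] for n = 3
a0 = c[-1]
# A^{-1} = -(1/a_0) (A^2 + a_2 A + a_1 I)
inv = -(np.linalg.matrix_power(A, 2) + c[1] * A + c[2] * np.eye(n)) / a0
print(np.linalg.norm(inv - np.linalg.inv(A)))  # ~0
```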
Theorem
Suppose $A \in M_{n}$. Then for all $\varepsilon > 0$, there exist $S$ invertible and $D$ diagonal, and $E$ with $|e_{ij}| < \varepsilon$, such that $A = S(D+E)S^{-1}$
“$A$ is almost similar to a diagonal matrix”
Let $A = UTU^{*}$, where $U$ is unitary and $T$ is upper triangular per Schur. For all $\delta \in (0,1)$, we can define
$$Q = \begin{bmatrix} \delta^{-1} & 0 & \dots & 0 \\ 0 & \delta^{-2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{-n} \end{bmatrix} T \begin{bmatrix} \delta^{1} & 0 & \dots & 0 \\ 0 & \delta^{2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{n} \end{bmatrix}$$
and $Q$ will have $ij$th entry equal to $t_{ij}\,\delta^{j-i}$, so every entry above the diagonal carries at least one factor of $\delta$. Then
$$\begin{aligned} A &= UTU^* \\ &= UITIU^* \\ &= U\begin{bmatrix} \delta^{1} & 0 & \dots & 0 \\ 0 & \delta^{2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{n} \end{bmatrix}\begin{bmatrix} \delta^{-1} & 0 & \dots & 0 \\ 0 & \delta^{-2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{-n} \end{bmatrix}T\begin{bmatrix} \delta^{1} & 0 & \dots & 0 \\ 0 & \delta^{2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{n} \end{bmatrix}\begin{bmatrix} \delta^{-1} & 0 & \dots & 0 \\ 0 & \delta^{-2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{-n} \end{bmatrix}U^* \\ &= U \begin{bmatrix} \delta^{1} & 0 & \dots & 0 \\ 0 & \delta^{2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{n} \end{bmatrix} Q \begin{bmatrix} \delta^{-1} & 0 & \dots & 0 \\ 0 & \delta^{-2} & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \dots & 0 & \delta^{-n} \end{bmatrix} U^* \\ &= S Q S^{-1} \end{aligned}$$
where $S = U \operatorname{diag}(\delta^{1},\dots,\delta^{n})$ is invertible. Note that $Q = D + E$, where $D$ is the diagonal of $T$ (untouched by the scaling) and the off-diagonal entries $e_{ij} = t_{ij}\,\delta^{j-i}$ can all be made smaller than $\varepsilon$ in absolute value by taking $\delta$ small enough. Which is what was to be shown.
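The key scaling step can be watched directly: conjugating an upper triangular $T$ by $\operatorname{diag}(\delta,\dots,\delta^{n})$ shrinks the off-diagonal entries while leaving the diagonal alone (a sketch with our own toy $T$).

```python
import numpy as np

T = np.triu(np.ones((4, 4)) + np.diag([1.0, 2.0, 3.0, 4.0]))
for delta in (0.5, 0.1, 0.01):
    d = delta ** np.arange(1, 5)              # delta^1, ..., delta^4
    Q = np.diag(1 / d) @ T @ np.diag(d)       # q_ij = t_ij * delta**(j - i)
    off = Q - np.diag(np.diag(Q))
    print(delta, np.abs(off).max())           # off-diagonals shrink with delta
```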
Theorem
Let $A \in M_{n}$. Then for all $\varepsilon > 0$, there exists $E \in M_{n}$ such that $\lVert E \rVert < \varepsilon$ and $A + E$ is diagonalizable
“every matrix is almost diagonalizable”
Caution
This is similar to the last result, but NOT THE SAME.
Let $A = UTU^{*}$ by Schur. Then there exists a diagonal matrix $D$ such that $\lVert D \rVert < \varepsilon$ and $T + D$ has distinct diagonal entries. Then
$$\begin{aligned} A + UDU^* &= UTU^* + UDU^* \\ &= U(T+D)U^* \end{aligned}$$
and this has all eigenvalues distinct by construction (they are the diagonal entries of the upper triangular matrix $T+D$), and thus is diagonalizable! So let $E = UDU^{*}$. Then $\lVert E \rVert = \lVert D \rVert < \varepsilon$ since $U$ is unitary.
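A classic non-diagonalizable example illustrates the theorem: a Jordan block becomes diagonalizable after a tiny diagonal perturbation with distinct entries (a sketch; the specific block and perturbation are our own).

```python
import numpy as np

# 3x3 Jordan block at eigenvalue 2: not diagonalizable.
J = np.diag([2.0, 2.0, 2.0]) + np.diag([1.0, 1.0], k=1)
eps = 1e-6
E = np.diag([0.0, eps, 2 * eps])   # distinct diagonal shifts, ||E|| tiny
vals = np.linalg.eigvals(J + E)
# J + E is upper triangular with distinct diagonal entries,
# hence it has 3 distinct eigenvalues and is diagonalizable.
print(np.unique(np.round(vals, 9)).size)
```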
Last main topic of Chapter 2: Normal Matrices
Normal Matrices
$A \in M_{n}$ is normal precisely when $A^{*}A = AA^{*}$
Example
- diagonal matrices
- hermitian matrices (duh)
- unitary matrices
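Each example class can be checked against the definition directly (the particular test matrices below are our own):

```python
import numpy as np

H = np.array([[2.0, 1 - 1j], [1 + 1j, 3.0]])   # hermitian: H* = H
U = np.array([[0, 1], [1, 0]], dtype=complex)  # unitary (a permutation)
D = np.diag([1.0 + 2j, 3.0])                   # diagonal

for A in (H, U, D):
    print(np.allclose(A.conj().T @ A, A @ A.conj().T))  # True for all three
```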
(see normal matrix)
Lemma
Let $T \in M_{n}$ be upper triangular. Then $T$ is normal if and only if $T$ is diagonal.
Proof (informal): Suppose $T$ is normal and upper triangular. Then $T^{*}T = TT^{*}$ since it is normal. Writing $T$ in terms of its columns $c_{1},\dots,c_{n}$,
$$\begin{aligned} TT^{*} &= \begin{bmatrix} | & | & \dots & | \\ c_{1} & c_{2} & \dots & c_{n} \\ | & | & \dots & | \end{bmatrix}\begin{bmatrix} -\; c_{1}^{*}\; - \\ -\; c_{2}^{*}\; - \\ \vdots \\ -\; c_{n}^{*}\; - \end{bmatrix} \end{aligned}$$
where the $c_{i}$ are increasing in “nonzero length”: since $T$ is upper triangular, column $c_{i}$ can be nonzero only in its first $i$ entries.
Then the $i$th diagonal entry of $T^{*}T$, call it $(T^{*}T)_{ii} = c_{i}^{*}c_{i} = \lVert c_{i} \rVert^{2}$, where $c_{i}$ is the $i$th COLUMN of $T$.
And the $i$th diagonal entry of $TT^{*}$, call it $(TT^{*})_{ii} = r_{i}r_{i}^{*} = \lVert r_{i} \rVert^{2}$, where $r_{i}$ is the $i$th ROW of $T$ (this is why we can do the multiplication like this to get a scalar).
So looking at $T^{*}T = TT^{*}$, we know that the $i$th column and the $i$th row must have the same length.
- So for the first column: since $T$ is upper triangular, $\lVert c_{1} \rVert = |t_{11}|$, the first element. For the first row to have the same length, the other elements of the first row must be zero.
- If we continue in the same manner through the rest of the matrix, we realize that the rest of the non-diagonal elements must be 0 also.
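The lemma is easy to see numerically: an upper triangular matrix with any nonzero off-diagonal entry fails the normality test, while a diagonal one passes (a sketch; `is_normal` is our own helper, not standard).

```python
import numpy as np

def is_normal(M, tol=1e-12):
    """Check the defining condition M* M = M M* up to tolerance."""
    return np.allclose(M.conj().T @ M, M @ M.conj().T, atol=tol)

T = np.array([[1.0, 2.0], [0.0, 3.0]])  # upper triangular, not diagonal
D = np.diag([1.0, 3.0])                 # diagonal

print(is_normal(T), is_normal(D))  # False True
```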