Lecture 30

[[lecture-data]]

2024-11-11

 

Nearing the end of:

5. Chapter 5

Last time, we saw how the condition number controls how much noise in a linear system gets amplified in the solution. We also saw a similar result when inverting a matrix with some noise thrown in!

Absolute Vector Norm

The vector norm $\lvert \lvert \cdot \rvert \rvert$ on $\mathbb{C}^{n}$ is called absolute precisely when for all $x \in \mathbb{C}^{n}$ it holds that $\lvert \lvert \, |x| \, \rvert \rvert = \lvert \lvert x \rvert \rvert$, where $|x|$ is taken component-wise.

All $p$-norms are absolute, since $\lvert \lvert \, |x| \, \rvert \rvert_{p} = \left( \sum_{i} |x_{i}|^{p} \right)^{1/p} = \lvert \lvert x \rvert \rvert_{p}$.

(see absolute vector norm)
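
A quick numerical sanity check (my own sketch, assuming numpy is available; not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(6) + 1j * rng.standard_normal(6)

for p in [1, 2, np.inf]:
    lhs = np.linalg.norm(np.abs(x), p)  # norm of the component-wise absolute value |x|
    rhs = np.linalg.norm(x, p)          # norm of x itself
    assert np.isclose(lhs, rhs)
print("the p-norms are absolute on this example")
```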

Monotone Vector Norm

We call a vector norm $\lvert \lvert \cdot \rvert \rvert$ monotone precisely when for all $x, y \in \mathbb{C}^{n}$ with $|x| \leq |y|$ we have $\lvert \lvert x \rvert \rvert \leq \lvert \lvert y \rvert \rvert$; again, $|\cdot|$ and $\leq$ are taken component-wise.

All $p$-norms are also monotone!

(see monotone vector norm)
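
And a similar sanity check for monotonicity (again my own sketch, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.standard_normal(6) + 1j * rng.standard_normal(6)
x = rng.uniform(0, 1, 6) * y        # shrink each component, so |x| <= |y| component-wise

assert np.all(np.abs(x) <= np.abs(y))
for p in [1, 2, np.inf]:
    assert np.linalg.norm(x, p) <= np.linalg.norm(y, p)
print("the p-norms are monotone on this example")
```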

Theorem

Suppose $\lvert \lvert \cdot \rvert \rvert$ is a norm on $\mathbb{C}^{n}$. The following are equivalent:

  1. $\lvert \lvert \cdot \rvert \rvert$ is monotone

  2. $\lvert \lvert \cdot \rvert \rvert$ is absolute

  3. The matrix norm $\lvert \lvert \cdot \rvert \rvert'$ induced by $\lvert \lvert \cdot \rvert \rvert$ satisfies the following: $\lvert \lvert D \rvert \rvert' = \max_{i}|d_{ii}|$ for all diagonal $D \in \mathbb{C}^{n \times n}$; this is called the “funky diagonal property” 😊

Example

Proof

($1 \implies 2$) Suppose $\lvert \lvert \cdot \rvert \rvert$ is monotone. For all $x$, we have that $\lvert \lvert \, |x| \, \rvert \rvert = \lvert \lvert x \rvert \rvert$.

Read the book :)

($1 \implies 3$) Suppose $\lvert \lvert \cdot \rvert \rvert$ is monotone. Let $D \in \mathbb{C}^{n \times n}$ be diagonal. For any $x$, $|Dx| = \left( |d_{ii}| \cdot |x_{i}| \right)_{i}$. So $|Dx| \leq \max_{i}|d_{ii}| \cdot |x|$ component-wise. So by monotonicity, we have

$$\begin{aligned} \lvert \lvert Dx \rvert \rvert &\leq \lvert \lvert \max_{i}|d_{i i}| \cdot x \rvert \rvert \\
&= \max_{i}|d_{i i}|\cdot \lvert \lvert x \rvert \rvert \\
\implies \frac{\lvert \lvert Dx \rvert \rvert }{\lvert \lvert x \rvert \rvert } &\leq \max_{i} |d_{ii}| \end{aligned}$$

Equality is attained with $x = e_{k}$ the standard basis vector with $1$ in the $k$th position, and where $k$ is the index of the largest $|d_{i,i}|$. Thus we get $\lvert \lvert D \rvert \rvert' := \max_{x \neq 0}\frac{\lvert \lvert Dx \rvert \rvert}{\lvert \lvert x \rvert \rvert} = \max_{i} | d_{i i}|$

($3 \implies 1$) Suppose $\lvert \lvert \cdot \rvert \rvert$ induces $\lvert \lvert \cdot \rvert \rvert'$ with the funky diagonal property. Suppose $x, y \in \mathbb{C}^{n}$ are such that $|x| \leq |y|$. We define the following diagonal matrix $D$:

- for each $i$, let $d_{ii} = x_{i}/y_{i}$ if $y_{i} \neq 0$, and $d_{ii} = 0$ otherwise.

So $Dy = x$ and $|d_{ii}| \leq 1$ for all $i$, hence $\lvert \lvert D \rvert \rvert' = \max_{i}|d_{ii}| \leq 1$. So

$$\begin{aligned} \lvert \lvert x \rvert \rvert =\lvert \lvert Dy \rvert \rvert \overset{(*)}{\leq} \lvert \lvert D \rvert \rvert' \lvert \lvert y \rvert \rvert \overset{(**)}{\leq} \lvert \lvert y \rvert \rvert \end{aligned}$$

Where $(*)$ is by definition of the [[Concept Wiki/matrix norm\|induced matrix norm]] and $(**)$ is because $\lvert \lvert D \rvert \rvert' \leq 1$. And the result follows!

Thus the result follows :)

(see funky diagonal property)
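
Here is a small numerical check of the funky diagonal property (my own sketch, assuming numpy): for the induced $1$-, $2$-, and $\infty$-norms, the norm of a diagonal matrix is just the largest $|d_{ii}|$.

```python
import numpy as np

rng = np.random.default_rng(2)
d = rng.standard_normal(5) + 1j * rng.standard_normal(5)
D = np.diag(d)

for p in [1, 2, np.inf]:
    induced = np.linalg.norm(D, p)  # induced matrix p-norm (max column sum / spectral / max row sum)
    assert np.isclose(induced, np.max(np.abs(d)))
print("||D|| = max_i |d_ii| for the induced 1-, 2-, and inf-norms")
```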

Bauer-Fike Theorem

Let $\lvert \lvert \cdot \rvert \rvert$ be a matrix norm on $\mathbb{C}^{n \times n}$ induced by a monotone vector norm. Let $A \in \mathbb{C}^{n \times n}$ be such that $A$ is diagonalizable as $A = SDS^{-1}$, where $D = \operatorname{diag}(\tau_{1},\dots,\tau_{n})$ holds the eigenvalues of $A$.

Then for all eigenvalues $\lambda$ of $A + \Delta A$, there exists an eigenvalue $\tau$ of $A$ such that

$$|\lambda - \tau| \leq \kappa_{\lvert \lvert \cdot \rvert \rvert}(S) \cdot \lvert \lvert \Delta A \rvert \rvert$$

If $A$ is also normal, then

$$|\lambda - \tau| \leq \lvert \lvert \Delta A \rvert \rvert_{2,2}$$
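
Before the proof, a numerical illustration in the $2$-norm, which is induced by the (monotone) Euclidean norm (my own sketch, assuming numpy; the names `A`, `dA`, `S` are just illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
A = rng.standard_normal((n, n))
dA = 1e-3 * rng.standard_normal((n, n))   # the perturbation Delta A

tau, S = np.linalg.eig(A)                 # A = S diag(tau) S^{-1}
lam = np.linalg.eigvals(A + dA)           # eigenvalues of the perturbed matrix

bound = np.linalg.cond(S, 2) * np.linalg.norm(dA, 2)   # kappa_2(S) * ||dA||_2
for l in lam:
    assert np.min(np.abs(l - tau)) <= bound            # nearest eigenvalue of A is within the bound
print("Bauer-Fike bound:", bound)
```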

Proof

Assume $\lambda \neq \tau_{i}$ for all $i$ (otherwise, the result is trivial). $\lambda I - [A + \Delta A]$ is singular (since $\lambda$ is an eigenvalue of $A + \Delta A$). So $S^{-1}[\lambda I - [A+\Delta A]]S$ is also singular. But note that $\lambda I - D$ is invertible (since $\lambda \neq \tau_{i}$) and this inverse has diagonal entries $\frac{1}{\lambda - \tau_{i}}$.

$$\begin{aligned} S^{-1}[\lambda I - [A+\Delta A]]S &= \lambda I - D - S^{-1}\Delta AS\\
\implies (\lambda I-D)^{-1}[\lambda I-D-S^{-1}\Delta AS] &= I - [\lambda I-D]^{-1}S^{-1}\Delta AS \text{ is singular} \\
\implies 1 &\leq \lvert \lvert (\lambda I - D)^{-1}S^{-1}\Delta AS \rvert \rvert \;\;(*) \\
\implies 1 &\leq \lvert \lvert (\lambda I - D)^{-1} \rvert \rvert \cdot \lvert \lvert S^{-1}\Delta AS \rvert \rvert\\
&= \frac{1}{|\lambda-\tau|} \lvert \lvert S^{-1}\Delta AS \rvert \rvert \;\;\;\; (* *)\\ \end{aligned}$$

- $(*)$ follows since [[Concept Wiki/matrices with norm less than 1 define an infinite series inverse]], but we know that the matrix $I - [\lambda I-D]^{-1}S^{-1}\Delta AS$ is singular.
- $(* *)$ follows by the [[Concept Wiki/funky diagonal property]]! We take the largest element in our definition of $[\lambda I - D]^{-1}$, so $\lvert \lvert (\lambda I - D)^{-1} \rvert \rvert = \max_{i} \frac{1}{|\lambda - \tau_{i}|} = \frac{1}{|\lambda - \tau|}$, where $\tau$ is the eigenvalue of $A$ closest to $\lambda$.

Rearranging gives $|\lambda - \tau| \leq \lvert \lvert S^{-1}\Delta AS \rvert \rvert \leq \lvert \lvert S^{-1} \rvert \rvert \cdot \lvert \lvert \Delta A \rvert \rvert \cdot \lvert \lvert S \rvert \rvert = \kappa_{\lvert \lvert \cdot \rvert \rvert}(S)\cdot \lvert \lvert \Delta A \rvert \rvert$, which is the claimed bound.

If $A$ is normal, then $S$ can be chosen to be [[Concept Wiki/unitary matrices\|unitary]]! So

$$\lvert \lvert S \rvert \rvert^2_{2,2} = \rho(S^*S) = \rho(I) = 1$$

So ${\kappa_{\lvert \lvert \cdot \rvert \rvert}}_{2,2}(S) = 1\cdot1=1$
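
To see the normal case concretely (my own sketch, assuming numpy): build a normal matrix from a unitary $Q$ and check that $\kappa_{2,2}(Q) = 1$.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5
# build a normal matrix A = Q diag(mu) Q^* with Q unitary (from a QR factorization)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
mu = rng.standard_normal(n) + 1j * rng.standard_normal(n)
A = Q @ np.diag(mu) @ Q.conj().T

assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A is normal
assert np.isclose(np.linalg.cond(Q, 2), 1.0)        # ||Q||_2 * ||Q^{-1}||_2 = 1
print("for a normal matrix, S can be taken unitary and kappa_2(S) = 1")
```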

(see Bauer-Fike Theorem)

Note

If $A, \Delta A \in \mathbb{C}^{n \times n}$ are hermitian, Weyl's Theorem says (with eigenvalues ordered $\lambda_{1} \leq \dots \leq \lambda_{n}$):

$$\begin{aligned} \lambda_{1}(\Delta A) + \lambda_{k}(A) &\leq \lambda_{k}(A+\Delta A) \leq \lambda_{n}(\Delta A) + \lambda_{k}(A) \\
\implies \lambda_{1}(\Delta A) &\leq \lambda_{k}(A + \Delta A) - \lambda_{k}(A) \leq \lambda_{n}(\Delta A) \\
\implies \lvert \lambda_{k}(A+\Delta A) - \lambda_{k}(A) \rvert &\leq \rho(\Delta A) \leq \lvert \lvert \Delta A \rvert \rvert _{2,2} \end{aligned}$$

i.e., we can say *which* eigenvalue of $A$ each eigenvalue of the perturbed matrix is "close to".
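
A quick numerical check of this (my own sketch, assuming numpy):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 6
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = (B + B.conj().T) / 2                   # Hermitian A
E = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
dA = 1e-2 * (E + E.conj().T) / 2           # Hermitian perturbation Delta A

ev_A = np.linalg.eigvalsh(A)               # eigenvalues sorted ascending
ev_pert = np.linalg.eigvalsh(A + dA)

# the k-th eigenvalue moves by at most ||Delta A||_2 (small tolerance for round-off)
assert np.all(np.abs(ev_pert - ev_A) <= np.linalg.norm(dA, 2) + 1e-12)
print("|lambda_k(A + dA) - lambda_k(A)| <= ||dA||_2 for every k")
```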