Random Matrix Lecture 01
[[lecture-data]]
:LiArrowBigLeftDash: No Previous Lecture | Random Matrix Lecture 02 :LiArrowBigRightDash:
Class notes pgs 2-7 (1.1 - 1.3)
- Random vector theory
- Properties of Gaussian random vectors
- Concentration inequalities
1. Random Vector Theory
1.1 Natural Random Models and Orthogonal Invariance
Interpretations of vectors
- List of numbers
- Magnitude and direction (with respect to a basis of the vector space they belong to)
Corresponding interpretations for random vectors:
- Entries (each of the numbers) are as independent as possible, ie the entries are iid draws from some fixed distribution on $\mathbb{R}$.
- Magnitude and direction are as independent as possible: take the magnitude $r = \|x\|$ and the direction $u = x/\|x\|$ to be independent. The magnitude is then any random non-negative scalar, and the direction is $u \sim \mathrm{Unif}(\mathbb{S}^{n-1})$, ie a random vector drawn uniformly from the unit sphere. Take $x = r u$.
See random vector
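The two models can be sketched numerically. A minimal sketch in NumPy; the exponential magnitude law is an arbitrary illustrative choice, and the sphere-sampling trick relies on a fact established in section 1.2:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Model 1: iid entries, each drawn from the same distribution (here standard normal).
x_iid = rng.normal(size=n)

# Model 2: independent magnitude and direction.
# A uniform direction on the unit sphere can be obtained by normalizing an
# iid gaussian vector (see section 1.2).
g = rng.normal(size=n)
u = g / np.linalg.norm(g)   # direction, uniform on the unit sphere
r = rng.exponential()       # magnitude: any non-negative scalar law works here
x_md = r * u                # the random vector x = r * u

print(np.linalg.norm(x_md), r)  # the norm of x_md recovers the magnitude r
```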
Let $x$ be a random vector taking values in $\mathbb{R}^n$, with law $\mathcal{L}(x)$.
see also invariant
The law is the distribution of the random vector.
We sometimes use "model" instead of "law" or "distribution" to remind us that our choice includes assumptions and judgements.
- We are modelling a situation we might encounter in applications
There exists a unique probability measure supported on the unit sphere $\mathbb{S}^{n-1}$ that is invariant under every orthogonal transformation $Q \in O(n)$: the uniform distribution $\mathrm{Unif}(\mathbb{S}^{n-1})$.
see orthogonally invariant distribution on the unit sphere
Suppose that $x$ is an orthogonally invariant random vector with $x \ne 0$ almost surely. Then the direction $x/\|x\|$ is distributed as $\mathrm{Unif}(\mathbb{S}^{n-1})$ and is independent of the magnitude $\|x\|$.
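A quick empirical check of the independence claim, using a standard gaussian vector as the example of an orthogonally invariant vector (see section 1.2); the sample correlations below are estimates and should be near zero:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 3, 100_000

# A standard gaussian vector is orthogonally invariant (see section 1.2).
x = rng.normal(size=(N, n))
r = np.linalg.norm(x, axis=1)   # magnitudes
u = x / r[:, None]              # directions on the unit sphere

# If r and u are independent, r is uncorrelated with any function of u.
print(np.corrcoef(r, u[:, 0])[0, 1])       # near 0
print(np.corrcoef(r, u[:, 0] ** 2)[0, 1])  # near 0
```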
- [p] Entry-wise interpretation of random vectors is described in these distribution definitions
- [p] magnitude/direction interpretation is also described
- [c] It is hard to swap between the two interpretations
- What do the entries look like for a vector $u \sim \mathrm{Unif}(\mathbb{S}^{n-1})$?
- What are the magnitude and direction of an iid random vector?
1.2 Gaussian Random Vectors
The most natural of the iid random vectors is the multivariate Gaussian
The multivariate normal or Gaussian distribution $\mathcal{N}(\mu, \Sigma)$, with mean $\mu \in \mathbb{R}^n$ and symmetric positive semidefinite covariance $\Sigma \in \mathbb{R}^{n \times n}$, has density
$$p(x) = \frac{1}{\sqrt{(2\pi)^{\operatorname{rank}(\Sigma)} \operatorname{pdet}(\Sigma)}} \exp\left( -\frac{1}{2} (x - \mu)^{\top} \Sigma^{+} (x - \mu) \right)$$
with respect to the Lebesgue measure on the row space of $\Sigma$ (shifted by $\mu$). Here $\Sigma^{+}$ is the Moore-Penrose inverse and $\operatorname{pdet}(\Sigma)$, the pseudo-determinant, is the product of all non-zero eigenvalues of $\Sigma$.
- When $\Sigma$ is invertible, $\Sigma^{+} = \Sigma^{-1}$, $\operatorname{pdet}(\Sigma) = \det(\Sigma)$ is the ordinary determinant, and the Lebesgue measure is on all of $\mathbb{R}^n$.
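The degenerate case can be made concrete. A sketch (function name and tolerance are illustrative) that evaluates the density using `numpy.linalg.pinv` for the Moore-Penrose inverse and the product of non-zero eigenvalues for the pseudo-determinant, checked against the familiar full-rank formula:

```python
import numpy as np

def gaussian_density(x, mu, Sigma, tol=1e-10):
    """Density of N(mu, Sigma) w.r.t. Lebesgue measure on mu + row(Sigma),
    using the Moore-Penrose inverse and the pseudo-determinant."""
    Sigma = np.asarray(Sigma, dtype=float)
    eigvals = np.linalg.eigvalsh(Sigma)
    nonzero = eigvals[eigvals > tol]
    rank = len(nonzero)
    pdet = np.prod(nonzero)             # product of non-zero eigenvalues
    Sigma_plus = np.linalg.pinv(Sigma)  # Moore-Penrose inverse
    d = x - mu
    quad = d @ Sigma_plus @ d
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** rank * pdet)

# Sanity check against the full-rank formula in 2D:
mu = np.zeros(2)
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])
x = np.array([0.5, -0.2])
full_rank = np.exp(-0.5 * x @ np.linalg.inv(Sigma) @ x) / np.sqrt(
    (2 * np.pi) ** 2 * np.linalg.det(Sigma))
print(gaussian_density(x, mu, Sigma), full_rank)
```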
If a random vector $x$ is Gaussian with mean $\mu$ and covariance $\Sigma$, then its law is exactly $\mathcal{N}(\mu, \Sigma)$; ie, the law of a gaussian random vector is determined by its mean and covariance (or its linear and quadratic moments).
Let $x \sim \mathcal{N}(\mu, \Sigma)$ and let $A \in \mathbb{R}^{m \times n}$. Then $Ax \sim \mathcal{N}(A\mu, A \Sigma A^{\top})$; ie, gaussian random vectors are closed under linear transformations.
see gaussian random vectors are closed under linear transformations
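Closure under linear transformations can be sanity-checked by simulation; sample sizes and tolerances below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, N = 3, 2, 200_000

mu = np.array([1.0, -1.0, 0.5])
L = rng.normal(size=(n, n))
Sigma = L @ L.T              # a positive semidefinite covariance
A = rng.normal(size=(m, n))  # an arbitrary linear map

# Sample x ~ N(mu, Sigma) via x = mu + L g with g ~ N(0, I_n).
x = mu + rng.normal(size=(N, n)) @ L.T
y = x @ A.T                  # rows are A x, claimed ~ N(A mu, A Sigma A^T)

print(np.abs(y.mean(axis=0) - A @ mu).max())        # small sampling error
print(np.abs(np.cov(y.T) - A @ Sigma @ A.T).max())  # small sampling error
```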
The two facts above mean that a standard gaussian random vector is both an iid random vector and an orthogonally invariant random vector!
Suppose $x \sim \mathcal{N}(\mu, \Sigma)$ and let $Q$ be an orthogonal matrix. By closure under linear transformations, $Qx \sim \mathcal{N}(Q\mu, Q \Sigma Q^{\top})$.
Now, consider a special case where $\mu = 0$ and $\Sigma = I_n$: then $Qx \sim \mathcal{N}(0, Q Q^{\top}) = \mathcal{N}(0, I_n)$. In particular, this does not depend on $Q$, so a standard gaussian random vector has the same law in every orthonormal basis.
see standard gaussian random vectors are orthogonally invariant
If $z \sim \mathcal{N}(0, I_n)$, then the magnitude $\|z\|$ and the direction $z/\|z\|$ are independent, and $z/\|z\| \sim \mathrm{Unif}(\mathbb{S}^{n-1})$.
The result follows immediately from standard gaussian random vectors are orthogonally invariant and the independence proposition for orthogonally invariant distribution on the unit sphere.
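A numerical illustration: normalizing standard gaussian samples yields directions whose first and second moments match those of the uniform distribution on the sphere ($\mathbb{E}[u] = 0$ and $\mathbb{E}[u u^{\top}] = I_n/n$ by symmetry); sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, N = 4, 100_000

z = rng.normal(size=(N, n))
u = z / np.linalg.norm(z, axis=1, keepdims=True)  # directions z / ||z||

# For u ~ Unif(S^{n-1}): E[u] = 0 and E[u u^T] = I/n by symmetry.
print(np.abs(u.mean(axis=0)).max())               # near 0
print(np.abs(u.T @ u / N - np.eye(n) / n).max())  # near 0
```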
1.3 Concentration of Gaussian Vector Norms
The direction of a standard gaussian random vector is uniformly distributed on the unit sphere.
- [?] So we've addressed the direction. What about the magnitudes?
If $z \sim \mathcal{N}(0, I_n)$, then $\|z\|^2 = \sum_{i=1}^{n} z_i^2$ with $\mathbb{E}[\|z\|^2] = n$, so we expect $\|z\| \approx \sqrt{n}$.
Since $\|z\|^2$ is a sum of $n$ iid random variables, the law of large numbers suggests $\|z\|^2 / n \to 1$; we would like a quantitative, non-asymptotic version of this statement.
By using Markov's inequality and Chebyshev's inequality, we can get something close to this.
By Markov, we have, for $t > 0$,
$$\mathbb{P}\big( \|z\|^2 \ge (1+t) n \big) \le \frac{\mathbb{E}[\|z\|^2]}{(1+t)n} = \frac{1}{1+t}.$$
And by Chebyshev we get
$$\mathbb{P}\big( \big| \|z\|^2 - n \big| \ge t n \big) \le \frac{\operatorname{Var}(\|z\|^2)}{t^2 n^2} = \frac{2}{t^2 n},$$
where $\operatorname{Var}(\|z\|^2) = n \operatorname{Var}(z_1^2) = 2n$.
This isn't quite what we want, since both results above depend a lot on the value of $t$: they decay only polynomially in the deviation, rather than exponentially.
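As a sanity check (the bound forms $1/(1+t)$ for the upper tail and $2/(t^2 n)$ for the two-sided tail are the ones used here; parameters are illustrative), the empirical tails sit well below both bounds:

```python
import numpy as np

rng = np.random.default_rng(4)
n, N, t = 50, 100_000, 0.5

z2 = (rng.normal(size=(N, n)) ** 2).sum(axis=1)  # samples of ||z||^2

markov = 1 / (1 + t)          # Markov: P(||z||^2 >= (1+t)n) <= 1/(1+t)
chebyshev = 2 / (t ** 2 * n)  # Chebyshev, using Var(||z||^2) = 2n
upper_tail = np.mean(z2 >= (1 + t) * n)
two_sided = np.mean(np.abs(z2 - n) >= t * n)

print(upper_tail, "<=", markov)
print(two_sided, "<=", chebyshev)
```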
Let $t > 0$. We deal with only one side of the distribution, bounding $\mathbb{P}(\|z\|^2 \ge (1+t)n)$; the desired two-sided inequality is achieved by simply multiplying by 2 to account for the other tail.
Note that, for any $\lambda > 0$,
$$\mathbb{P}\big( \|z\|^2 \ge (1+t)n \big) = \mathbb{P}\big( e^{\lambda \|z\|^2} \ge e^{\lambda(1+t)n} \big) \le e^{-\lambda(1+t)n}\, \mathbb{E}\big[ e^{\lambda \|z\|^2} \big] = e^{-\lambda(1+t)n} \prod_{i=1}^{n} \mathbb{E}\big[ e^{\lambda z_i^2} \big],$$
where the inequality is from Markov's Inequality, and the final factorization is because the $z_i$ are independent.
Now, define
$$\psi(\lambda) := \log \mathbb{E}\big[ e^{\lambda z_1^2} \big] = -\tfrac{1}{2} \log(1 - 2\lambda) \quad \text{for } \lambda < \tfrac{1}{2}$$
as the cumulant generating function (wikipedia) of $z_1^2$; the closed form is a gaussian integral, $\mathbb{E}[e^{\lambda z_1^2}] = (1 - 2\lambda)^{-1/2}$. Where we get, substituting into the bound above,
$$\mathbb{P}\big( \|z\|^2 \ge (1+t)n \big) \le \exp\big( -n \big( \lambda(1+t) - \psi(\lambda) \big) \big).$$
- Can plot $\psi(\lambda)$ and the bound $\lambda + 2\lambda^2$ (see notes page 6; figure 1.1)
- Verify via algebra that $\psi(\lambda) \le \lambda + 2\lambda^2$ for all $\lambda \in [0, \tfrac{1}{4}]$
Assuming $\lambda \in [0, \tfrac{1}{4}]$, the bound $\psi(\lambda) \le \lambda + 2\lambda^2$ gives exponent $\lambda(1+t) - \psi(\lambda) \ge \lambda t - 2\lambda^2$.
Now, we choose $\lambda$ to make $\lambda t - 2\lambda^2$ large; the unconstrained maximizer is $\lambda = t/4$.
- If $t \le 1$, then $t/4 \le 1/4$ and we can safely set $\lambda = t/4$, giving exponent $\lambda t - 2\lambda^2 = t^2/8$.
- If $t > 1$, then $t/4 > 1/4$, so we instead set $\lambda = 1/4$, giving exponent $t/4 - 1/8 \ge t/8$.
Combining the two cases,
$$\mathbb{P}\big( \|z\|^2 \ge (1+t)n \big) \le \exp\Big( -\frac{n}{8} \min(t, t^2) \Big),$$
which is the desired result (after multiplying by 2 to cover the lower tail).
see concentration inequality for magnitude of standard gaussian random vector
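A simulation comparing the empirical two-sided tail to a bound of the form $2\exp(-\frac{n}{8}\min(t, t^2))$ (the constants as reconstructed in this section; parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n, N, t = 50, 100_000, 0.5

z2 = (rng.normal(size=(N, n)) ** 2).sum(axis=1)  # ||z||^2 for N gaussian vectors

empirical = np.mean(np.abs(z2 - n) >= t * n)     # empirical two-sided tail
bound = 2 * np.exp(-n * min(t, t ** 2) / 8)      # concentration bound

print(empirical, "<=", bound)
```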
The important part of the proof is finding some good bound on the exponential moments, ie controlling the cumulant generating function $\psi(\lambda)$ by a quadratic near $0$, and then optimizing over $\lambda$.
This result is characteristic of sums of iid random variables with sub-exponential tails: gaussian-type decay $e^{-cnt^2}$ for small deviations and exponential decay $e^{-cnt}$ for large deviations.
- Bernstein's inequality is a general tool for expressing this type of behavior (see notes pg. 7 for more + reference)
Review
What does the concentration result say geometrically about where a standard gaussian random vector lives?
-?-
The surface of the sphere of radius $\sqrt{n}$: with high probability, $\|z\| \approx \sqrt{n}$.
The {1||direction} of a gaussian random vector is {1||uniformly distributed on the unit sphere}.
With high probability, the {1||magnitude} of a gaussian random vector is {1||close to $\sqrt{n}$}.
TODO
Created 2025-09-04 ֍ Last Modified 2025-09-11