Lecture 24

[[lecture-data]]

2024-10-28

5. Chapter 5

We are talking about norms.

Theorem

Suppose that $V$ is a finite-dimensional vector space over a field $K$. Then all norms on $V$ are equivalent. ie, for any two norms, say $\lVert \cdot \rVert_a$ and $\lVert \cdot \rVert_b$, on $V$, there exist $m, M \in \mathbb{R}$ with $0 < m \leq M$ such that for all $x \in V$ we have

$$m \lVert x \rVert_a \leq \lVert x \rVert_b \leq M \lVert x \rVert_a$$

note also that we then have

$$\frac{1}{M} \lVert x \rVert_b \leq \lVert x \rVert_a \leq \frac{1}{m} \lVert x \rVert_b$$

And norm equivalence is an equivalence relation!
(see norm equivalence)

Why is this important? For convergence purposes, it means we can use ANY norm on the space to prove convergence.
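As a concrete numerical sketch (not from the lecture): on $\mathbb{R}^n$ the 1-norm and $\infty$-norm satisfy $\lVert x \rVert_\infty \leq \lVert x \rVert_1 \leq n \lVert x \rVert_\infty$, ie the constants $m = 1$ and $M = n$ work for this pair. A quick check in Python:

```python
# Numerically check the equivalence ||x||_inf <= ||x||_1 <= n * ||x||_inf on R^n,
# i.e. the theorem's constants m = 1 and M = n for this particular pair of norms.
import random

def norm_1(x):
    return sum(abs(t) for t in x)

def norm_inf(x):
    return max(abs(t) for t in x)

n = 5
random.seed(0)
for _ in range(1000):
    x = [random.uniform(-10, 10) for _ in range(n)]
    assert norm_inf(x) <= norm_1(x) <= n * norm_inf(x)
print("equivalence constants m=1, M=n hold for all samples")
```

Of course, random sampling is only a sanity check; the inequalities themselves follow directly from the triangle inequality and the definition of the max.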

Example

Suppose $\{x_k\}_{k=1}^{\infty} \subseteq V$, $x \in V$, and $\lim_{k \to \infty} x_k = x$ in the norm $\lVert \cdot \rVert_b$. This means that

$$\lVert x_k - x \rVert_b \to 0$$

and since norms are equivalent, we can see that

$$\lVert x_k - x \rVert_a \leq \frac{1}{m} \lVert x_k - x \rVert_b \to 0$$

so showing convergence in any one norm gives convergence in every other norm.

ALSO, the underlying topology generated by one norm will be the same as that generated by any other norm (same open sets, closed sets, bounded sets, etc. - since a normed space is a metric space, once we talk about convergence, each of these follows).
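A small numerical illustration of the example above (my own, with an assumed sequence $x_k = (1/k,\, 1/k^2)$ in $\mathbb{R}^2$): convergence in the 1-norm forces convergence in the $\infty$-norm, since $\lVert v \rVert_\infty \leq \lVert v \rVert_1$.

```python
# For x_k = (1/k, 1/k^2) -> (0, 0), convergence in the 1-norm forces
# convergence in the inf-norm, since ||v||_inf <= ||v||_1 on R^2.
def norm_1(v):
    return sum(abs(t) for t in v)

def norm_inf(v):
    return max(abs(t) for t in v)

for k in [1, 10, 100, 1000]:
    xk = (1.0 / k, 1.0 / k**2)
    # the inf-norm is sandwiched below the 1-norm, so it goes to 0 too
    assert norm_inf(xk) <= norm_1(xk)
    print(k, norm_1(xk), norm_inf(xk))
```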

Example

Suppose we have $K^n$ over $K$. Suppose $\{x_k\}$ is a sequence in $K^n$ and $x \in K^n$. Then $x_k \to x$ if and only if for all $j = 1, 2, \dots, n$ we have $(x_k)_j \to x_j$; ie, convergence happens precisely when we have coordinate-wise convergence.

Proof

Let $\lVert \cdot \rVert$ be a norm on $V = K^n$. Then $\lVert x_k - x \rVert \to 0 \iff \lVert x_k - x \rVert_{\infty} \to 0 \iff$ for all $j$ we have $\lvert (x_k)_j - x_j \rvert \to 0$, where the first equivalence uses norm equivalence and the second holds since $\lVert x_k - x \rVert_{\infty} = \max_j \lvert (x_k)_j - x_j \rvert$.

(see convergence only happens if we have coordinate-wise convergence (for vector spaces))
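To make this concrete, here is a small check (my own example sequence, not from the lecture) that the worst coordinate error, which is exactly $\lVert x_k - x \rVert_\infty$, shrinks as $k$ grows:

```python
# x_k = (1 + 1/k, 2 - 1/k^2) in R^2 converges to x = (1, 2):
# the inf-norm of the difference is exactly the worst coordinate error.
def coord_errors(xk, x):
    return [abs(a - b) for a, b in zip(xk, x)]

x = (1.0, 2.0)
prev = float("inf")
for k in [1, 10, 100, 1000]:
    xk = (1 + 1.0 / k, 2 - 1.0 / k**2)
    err = max(coord_errors(xk, x))  # = ||x_k - x||_inf
    assert err < prev               # the coordinate errors shrink together
    prev = err
```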

Linear Transformation

Let $V, W$ be vector spaces over $K$ and let $T: V \to W$. $T$ is a linear function means that for all $\alpha, \beta \in K$ and $x, y \in V$ we have $$T(\alpha x+\beta y) = \alpha T(x) + \beta T(y)$$

Example

  • $A \in M_{m,n}(\mathbb{C})$ acting as $A: \mathbb{C}^n \to \mathbb{C}^m$
  • $\frac{d}{dx}: C^1[a,b] \to C^0[a,b]$

(see linear function)
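A direct check of the defining identity $T(\alpha x + \beta y) = \alpha T(x) + \beta T(y)$ for a matrix map (the matrix and vectors below are my own illustrative choices):

```python
# Check T(ax + by) = a T(x) + b T(y) for T given by a 2x3 matrix acting on R^3.
A = [[1.0, 2.0, 0.0],
     [0.0, -1.0, 3.0]]

def T(x):
    # matrix-vector product: (Ax)_i = sum_j A[i][j] * x[j]
    return [sum(row[j] * x[j] for j in range(len(x))) for row in A]

x = [1.0, 0.5, -2.0]
y = [3.0, -1.0, 0.0]
a, b = 2.0, -0.5
lhs = T([a * xi + b * yi for xi, yi in zip(x, y)])   # T(ax + by)
rhs = [a * u + b * v for u, v in zip(T(x), T(y))]    # a T(x) + b T(y)
assert all(abs(l - r) < 1e-12 for l, r in zip(lhs, rhs))
```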

Isomorphic / Isomorphism

Say $V$ is a vector space over $K$ with finite dimension and with basis $B = \{b_1, b_2, \dots, b_n\}$. Then $V$ is isomorphic to $K^n$ over $K$ (denoted $V \cong K^n$) if for all $x \in V$ there exist unique $\alpha_1, \alpha_2, \dots, \alpha_n \in K$ such that $x = \sum_{i=1}^{n} \alpha_i b_i$. We can call these coefficients "$x_B$".

ie there is a one-to-one correspondence between $V$ and the coefficient vectors in $K^n$.

(see isomorphism)
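A worked instance of the correspondence $x \leftrightarrow x_B$ (the basis $B = \{(1,1), (1,-1)\}$ of $\mathbb{R}^2$ is my own illustrative choice):

```python
# Coordinates x_B of x in the basis B = {(1,1), (1,-1)} of R^2:
# solve a1*b1 + a2*b2 = x by 2x2 Cramer's rule.
b1, b2 = (1.0, 1.0), (1.0, -1.0)
x = (3.0, 1.0)
det = b1[0] * b2[1] - b2[0] * b1[1]        # determinant of the matrix [b1 b2]
a1 = (x[0] * b2[1] - b2[0] * x[1]) / det   # Cramer's rule for a1
a2 = (b1[0] * x[1] - x[0] * b1[1]) / det   # Cramer's rule for a2
# reconstruct x from its coordinates: the correspondence x <-> x_B = (a1, a2)
recon = (a1 * b1[0] + a2 * b2[0], a1 * b1[1] + a2 * b2[1])
assert recon == x
print("x_B =", (a1, a2))
```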

Theorem

Say $V, W$ are finite-dimensional vector spaces with respective bases $B_V = \{b_1, \dots, b_n\}$ and $B_W = \{b_1', \dots, b_m'\}$. Then $T: V \to W$ is a linear function if and only if there exists an $A \in M_{m,n}(K)$ such that for all $x \in V$ and $y \in W$, we have

$$T(x) = y \iff A\, x_{B_V} = y_{B_W}$$

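For standard bases this theorem says the matrix of $T$ has columns $T(e_j)$. A small check (the map $T(x_1, x_2) = (x_1 + x_2,\, x_1 - x_2,\, 2x_1)$ is my own example):

```python
# With standard bases, the matrix of T has columns T(e_j); then A x_B = T(x)_B.
# T : R^2 -> R^3, T(x1, x2) = (x1 + x2, x1 - x2, 2*x1).
def T(x):
    return [x[0] + x[1], x[0] - x[1], 2 * x[0]]

e = [(1.0, 0.0), (0.0, 1.0)]
cols = [T(ej) for ej in e]                               # columns of A are T(e_j)
A = [[cols[j][i] for j in range(2)] for i in range(3)]   # assemble A row by row

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

x = (4.0, -1.5)
assert matvec(A, x) == T(x)   # A x_B = y_B, since coordinates ARE the entries here
```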
Linear Functional

Suppose $V$ is a vector space over $K$. A linear function $T: V \to K$ is called a linear functional (ie the range is only 1-dimensional)

Note

note linear functionals on $K^n$ over $K$ are precisely of the following form:
$T: K^n \to K$ where $T(x) = [\text{a fixed } 1 \times n \text{ matrix}]\, x$.

(see linear functional)

Vector's Linear Functional

Let $w \in K^n$ and define $\hat{w}$, the linear functional $\hat{w}: K^n \to K$, where for all $x \in K^n$ we have $\hat{w}(x) := w \cdot x$.

(see vector linear functional)
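A quick check that $\hat{w}$ really is a linear functional (the vector $w$ and test vectors below are my own illustrative choices):

```python
# The functional w_hat(x) := w . x on R^3, and a check of its linearity.
w = (2.0, -1.0, 0.5)

def w_hat(x):
    return sum(wi * xi for wi, xi in zip(w, x))   # dot product with the fixed w

x = (1.0, 2.0, 4.0)
y = (0.0, -3.0, 2.0)
a, b = 3.0, -2.0
combo = tuple(a * xi + b * yi for xi, yi in zip(x, y))
# w_hat(ax + by) = a w_hat(x) + b w_hat(y)
assert abs(w_hat(combo) - (a * w_hat(x) + b * w_hat(y))) < 1e-12
```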

Theorem

Let $V, W$ be NLS with respective norms $\lVert \cdot \rVert_V, \lVert \cdot \rVert_W$, and let $T: V \to W$ be linear. Then $T$ is continuous if and only if

$$\sup_{x \in V,\, x \neq 0} \frac{\lVert Tx \rVert_W}{\lVert x \rVert_V} < \infty$$
Note

(1) $T(0) = 0$ since $T(0 \cdot x) = 0 \cdot T(x) = 0$
(2) $$\begin{aligned}
\sup_{x \in V,\, x \neq 0} \frac{\lvert \lvert Tx \rvert \rvert_{W} }{\lvert \lvert x \rvert \rvert_{V}} &= \sup_{x \in V,\, x \neq 0} \left\lvert \left\lvert T\left( \frac{1}{\lvert \lvert x \rvert \rvert_{V} } x \right) \right\rvert \right\rvert_{W} \\
&= \sup_{z \in V : \lvert \lvert z \rvert \rvert_{V}=1 }\lvert \lvert Tz \rvert \rvert_{W} \;\;\;\;\;(*)
\end{aligned}$$
Where $(*)$ follows by letting $z = \frac{1}{\lvert \lvert x \rvert \rvert_{V}} x$ and observing that $$\lvert \lvert z \rvert \rvert_{V} = \left\lvert \left\lvert \frac{1}{\lvert \lvert x \rvert \rvert_{V}} x\right\rvert \right\rvert_{V} = \frac{1}{\lvert \lvert x \rvert \rvert _{V}} \lvert \lvert x \rvert \rvert _{V} = 1 $$
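A numerical sanity check of the rescaling step in note (2) (the matrix below is my own example): for each nonzero $x$, the ratio $\lVert Tx \rVert / \lVert x \rVert$ equals $\lVert Tz \rVert$ for the unit vector $z = x / \lVert x \rVert$.

```python
# Sanity check: ||Tx|| / ||x|| = ||T(x / ||x||)|| for x != 0, so the sup over
# all nonzero x equals the sup over the unit sphere.
import random, math

A = [[1.0, 2.0], [0.0, 3.0]]

def T(x):
    return [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

def norm2(x):
    return math.sqrt(sum(t * t for t in x))

random.seed(1)
for _ in range(100):
    x = [random.uniform(-5, 5) for _ in range(2)]
    if norm2(x) == 0:
        continue
    z = [t / norm2(x) for t in x]          # z lies on the unit sphere
    ratio = norm2(T(x)) / norm2(x)
    assert abs(ratio - norm2(T(z))) < 1e-9
```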

Proof

($\Leftarrow$) Suppose for all $x \in V \setminus \{0\}$ we have $\frac{\lVert Tx \rVert_{W}}{\lVert x \rVert_{V}} \leq M$ for some $M < \infty$. In particular, for any $y \neq z$ we have $$\frac{\lvert \lvert T(y-z) \rvert \rvert }{\lvert \lvert y-z \rvert \rvert } \leq M \implies \lvert \lvert Ty - Tz \rvert \rvert \leq M \lvert \lvert y-z \rvert \rvert $$

ie, T is continuous. In fact, it is Lipschitz continuous!
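The Lipschitz bound can be observed numerically (the matrix, the crude bound $M = \sum_{i,j} |a_{ij}|$, and the test vectors are my own assumptions; this $M$ is valid since it dominates the Frobenius norm, which dominates the operator 2-norm):

```python
# Check the Lipschitz bound ||Ty - Tz|| <= M ||y - z|| for the map given by A,
# using M = sum of |entries| (a crude but valid upper bound on the 2-norm ratio).
import math

A = [[1.0, 2.0], [0.0, 3.0]]
M = sum(abs(a) for row in A for a in row)   # = 6.0

def T(x):
    return [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]

def norm2(x):
    return math.sqrt(sum(t * t for t in x))

y, z = [1.0, -2.0], [0.5, 4.0]
diff = [a - b for a, b in zip(y, z)]
Tdiff = [a - b for a, b in zip(T(y), T(z))]   # = T(y - z) by linearity
assert norm2(Tdiff) <= M * norm2(diff)
```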

($\Rightarrow$) Suppose $T$ is continuous. In particular, there exists $\delta > 0$ such that for all $y \in V$ with $\lVert y - 0 \rVert \leq \delta$ we have $\lVert Ty - T0 \rVert \leq 1$. (if continuous, then continuous at the origin. Set $\epsilon = 1$). Thus, for all $x \in V \setminus \{0\}$ we have

$$\frac{\lVert Tx \rVert}{\lVert x \rVert} = \frac{1}{\delta} \left\lVert T\left( \delta \frac{1}{\lVert x \rVert} x \right) \right\rVert \leq \frac{1}{\delta} \cdot 1 < \infty$$

(see continuity for linear functions)