Lecture 01

[[lecture-data]]

2024-08-26

Readings

0. Chapter 0

Field

A field is a set on which addition, subtraction (additive inverses), multiplication, and division (multiplicative inverses for nonzero elements) exist and behave "nicely"

Example

$\mathbb{R}$ or $\mathbb{C}$

(see field)
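A quick sketch of a less familiar example (my illustration, not from the readings), using the standard fact that $\mathbb{Z}/p\mathbb{Z}$ is a field for prime $p$: every nonzero element has a multiplicative inverse.

```python
# The integers mod a prime p form a field: every nonzero element
# has a multiplicative inverse. (Z/5Z here, as an illustration.)
p = 5
for a in range(1, p):
    inv = pow(a, p - 2, p)   # Fermat's little theorem: a^(p-2) = a^(-1) mod p
    assert (a * inv) % p == 1
    print(f"{a}^(-1) mod {p} = {inv}")
```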

Vector Space

We can add vectors and scale vectors by scalars and they are still in the space. And we have a zero vector :)

Example

$\mathbb{R}^n$, $\mathbb{C}^n$, $C[a,b]$
For an arbitrary field, we denote it $\mathbb{F}$

(see vector space)
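A minimal sketch of the closure properties (assuming numpy, with $\mathbb{R}^3$ as the example space):

```python
import numpy as np

# Closure in R^3: sums and scalar multiples of vectors stay in R^3,
# and the zero vector behaves as the additive identity.
u = np.array([1.0, 2.0, 3.0])
v = np.array([-1.0, 0.5, 4.0])

w = 2.5 * u + v          # still a vector in R^3
zero = np.zeros(3)       # the zero vector
assert np.allclose(u + zero, u)
print(w)                 # -> 1.5, 5.5, 11.5
```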

Matrix notation

In this class (following the textbook), $M_{m,n}(\mathbb{R})$ is a real $m \times n$ matrix. Omitting the parenthesized field indicates complex entries, and a single subscript indicates a square matrix (so $M_n = M_{n,n}(\mathbb{C})$).

Linear Combination

Let $V$ be a vector space over a field $\mathbb{F}$. Any sum of the vectors $v_1, v_2, \dots, v_k$ scaled by $\alpha_1, \alpha_2, \dots, \alpha_k \in \mathbb{F}$ is a linear combination of the vectors $v_i$:

$$\alpha_1 v_1 + \alpha_2 v_2 + \dots + \alpha_k v_k$$

(see linear combination)
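A small computational sketch (made-up vectors, via numpy): the explicit sum agrees with stacking the $v_i$ as columns and multiplying by the coefficient vector.

```python
import numpy as np

# A linear combination alpha_1 v_1 + alpha_2 v_2 + alpha_3 v_3,
# computed directly and as a matrix-vector product.
v1, v2, v3 = np.array([1., 0.]), np.array([0., 1.]), np.array([1., 1.])
alphas = np.array([2., -1., 3.])

explicit = 2. * v1 - 1. * v2 + 3. * v3
via_matrix = np.column_stack([v1, v2, v3]) @ alphas
assert np.allclose(explicit, via_matrix)
print(explicit)   # [5. 2.]
```

The matrix-vector view is the same duality that comes up at the end of these notes.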

Span

The span: $\operatorname{span}\{v_1, v_2, \dots, v_k\} := \{\beta_1 v_1 + \beta_2 v_2 + \dots + \beta_k v_k : \beta_i \in \mathbb{F}\}$

That is, all linear combinations of the vectors $v_i$

(see span)
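One way to test span membership numerically (a sketch assuming numpy; the vectors are made up): $w \in \operatorname{span}\{v_1, v_2\}$ exactly when the system $[v_1 \; v_2]\beta = w$ has a solution.

```python
import numpy as np

# w is in span{v1, v2} iff [v1 v2] beta = w is solvable, i.e. the
# least-squares solution reproduces w exactly.
v1 = np.array([1., 0., 1.])
v2 = np.array([0., 1., 1.])
A = np.column_stack([v1, v2])

w_in  = 2. * v1 - 1. * v2        # in the span by construction
w_out = np.array([1., 0., 0.])   # solving leaves a nonzero residual

for w in (w_in, w_out):
    beta, *_ = np.linalg.lstsq(A, w, rcond=None)
    print(w, "in span?", np.allclose(A @ beta, w))
```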

Note

Note that the zero vector is always in the span of any set of vectors (take every $\beta_i = 0$)

Linearly Dependent

$v_1, v_2, \dots, v_k$ are linearly dependent means that there exist $\gamma_1, \gamma_2, \dots, \gamma_k \in \mathbb{F}$ (not all zero) such that $\gamma_1 v_1 + \gamma_2 v_2 + \dots + \gamma_k v_k = 0$

(They are linearly independent if no such $\gamma_1, \gamma_2, \dots, \gamma_k$ exist; i.e., $\gamma_1 v_1 + \gamma_2 v_2 + \dots + \gamma_k v_k = 0 \implies \gamma_1 = \dots = \gamma_k = 0$)

(see also linear dependence)
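Numerically (a sketch with made-up vectors): put the $v_i$ in the columns of a matrix; they are dependent exactly when its rank is less than $k$.

```python
import numpy as np

# v1, ..., vk are linearly dependent iff the matrix with the v_i
# as columns has rank < k.
v1 = np.array([1., 2., 3.])
v2 = np.array([4., 5., 6.])
v3 = 2. * v1 - v2               # dependent by construction

A = np.column_stack([v1, v2, v3])
print("rank", np.linalg.matrix_rank(A), "with k =", A.shape[1])
# rank 2 < 3, so the vectors are linearly dependent
```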

Theorem

$v_1, v_2, \dots, v_k \in V$ are linearly dependent if and only if one of the vectors $v_i$ is a linear combination of the others.

(see linear dependence)

Basis

$v_1, v_2, \dots, v_k$ are a basis for $V$ means two things:

  1. $v_1, v_2, \dots, v_k$ are linearly independent
  2. $\operatorname{Span}\{v_1, v_2, \dots, v_k\} = V$

(see basis)

Note

If $v_1, \dots, v_k$ is a basis for $V$, then for every vector $w \in V$ there exist UNIQUE scalars $\delta_1, \delta_2, \dots, \delta_k \in \mathbb{F}$ such that $$\delta_{1}v_{1} + \dots + \delta_{k}v_{k} = w$$

(see bases create unique definitions for vectors in their space)

Intuition (uniqueness)
Suppose we have $\epsilon_1 v_1 + \epsilon_2 v_2 + \dots + \epsilon_k v_k = w$ and also $\delta_1 v_1 + \dots + \delta_k v_k = w$. Then $0 = w - w = (\epsilon_1 - \delta_1)v_1 + \dots + (\epsilon_k - \delta_k)v_k$.
But since $v_1, \dots, v_k$ is a basis, they are linearly independent, so we must have $\epsilon_i - \delta_i = 0$ for all $i$.
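Concretely (a sketch assuming numpy and a made-up basis of $\mathbb{R}^2$): with the basis vectors as columns of an invertible matrix $B$, the unique coordinates of $w$ are the solution of $B\delta = w$.

```python
import numpy as np

# The coordinates delta of w in the basis {v1, v2} are the unique
# solution of B delta = w, where B has the basis as its columns.
v1 = np.array([1., 1.])
v2 = np.array([1., -1.])         # v1, v2 form a basis of R^2
B = np.column_stack([v1, v2])

w = np.array([3., 1.])
delta = np.linalg.solve(B, w)    # unique because B is invertible
assert np.allclose(delta[0] * v1 + delta[1] * v2, w)
print(delta)                     # [2. 1.]
```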

Note that the number of vectors in a basis depends only on the vector space $V$ itself (this is called the "dimension"; let's call it $k$). This means that $V$ is isomorphic to $\mathbb{F}^k$: there exists a linear bijection between $V$ and $\mathbb{F}^k$. (For an arbitrary vector space, the existence of a basis at all requires the axiom of choice.)

There is a dual nature to matrices Mm,n(F)

  1. Algebraic structure ("vector-like")
  2. Analytic structure ("function-like")
    We associate to any matrix $A \in M_{m,n}(\mathbb{F})$ a function $A: \mathbb{F}^n \to \mathbb{F}^m$
    by defining, for all $x \in \mathbb{F}^n$, $A(x) := Ax$

Such a function is linear.
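A sketch of both views at once (numpy, with a made-up $A$): the same array is a table of numbers and a function $\mathbb{R}^3 \to \mathbb{R}^2$, and the function satisfies the linearity property defined next.

```python
import numpy as np

# A in M_{2,3}(R) acts as a function R^3 -> R^2 via A(x) := Ax.
A = np.array([[1., 2., 0.],
              [0., 1., 3.]])

x = np.array([1., 0., 2.])
y = np.array([0., 1., 1.])
alpha, beta = 2., -3.

# Linearity: A(alpha x + beta y) = alpha A(x) + beta A(y)
lhs = A @ (alpha * x + beta * y)
rhs = alpha * (A @ x) + beta * (A @ y)
assert np.allclose(lhs, rhs)
print(lhs)
```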

Linear Function

A function $T: V \to W$ is linear means that $\forall x, y \in V$ and $\forall \alpha, \beta \in \mathbb{F}$, we have

$$T(\alpha x + \beta y) = \alpha T(x) + \beta T(y)$$

(see linear function)

Example

Suppose our vector space is $C^1[a,b]$ (differentiable functions on the interval $[a,b]$). What is an example of a linear operator here?

The derivative!

$$[\alpha f(t) + \beta g(t)]' = \alpha f'(t) + \beta g'(t)$$
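A numerical sanity check (a sketch: np.gradient as an approximate derivative on a grid over $[a,b]$; the particular $f$, $g$ are arbitrary):

```python
import numpy as np

# Differentiation is linear: the (numerical) derivative of a linear
# combination equals the linear combination of the derivatives.
t = np.linspace(0., 1., 1001)
f = np.sin(2 * np.pi * t)
g = t ** 2
alpha, beta = 3., -2.

d_combo = np.gradient(alpha * f + beta * g, t)
combo_d = alpha * np.gradient(f, t) + beta * np.gradient(g, t)
assert np.allclose(d_combo, combo_d)
print("np.gradient respects linear combinations")
```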

Suppose I have two vector spaces $V$ and $W$ with finite dimensions $n$ and $m$ respectively, and a linear function $T: V \to W$. After fixing bases, we can think of $V$ as column vectors with $n$ coordinates and $W$ as column vectors with $m$ coordinates.

We can come up with a matrix that describes the function by applying $T$ to the standard basis vectors of $V$ and recording their images in $W$. We then use those images as the columns of the matrix, as in the sketch below.
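A sketch of that recipe (numpy; $T$ here is a hypothetical map $\mathbb{R}^3 \to \mathbb{R}^2$ chosen for illustration):

```python
import numpy as np

# Build the matrix of a linear map T by applying T to each standard
# basis vector e_i and using the images as columns.
def T(x):
    # Hypothetical linear map: T(x1, x2, x3) = (x1 + 2 x2, 3 x3)
    return np.array([x[0] + 2 * x[1], 3 * x[2]])

n = 3
A = np.column_stack([T(e) for e in np.eye(n)])   # images of e_1, e_2, e_3
print(A)

x = np.array([1., -1., 2.])
assert np.allclose(A @ x, T(x))   # A represents T on all of R^3
```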