A Brief Summary of Linear Algebra

Chapter 1

1.1 Vector

Origin of the trigonometric formula (via the dot product; $\theta$ is the angle between the two vectors):

$$\cos\theta=\cos(\beta-\alpha)=\cos\alpha\cos\beta+\sin\alpha\sin\beta$$

![[Pasted image 20240924101929.png|200]]
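A minimal derivation sketch, assuming $u=(\cos\alpha,\sin\alpha)$ and $w=(\cos\beta,\sin\beta)$ are unit vectors (the names $u, w$ are chosen here for illustration):

$$u\cdot w=\cos\alpha\cos\beta+\sin\alpha\sin\beta=\|u\|\,\|w\|\cos\theta=\cos\theta=\cos(\beta-\alpha)$$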

Right multiplication vs. left multiplication: column operations vs. row operations. ![[Pasted image 20241209191936.png|500]] ![[Pasted image 20241209192003.png|500]]
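A small worked example (the specific matrices here are illustrative, not from the notes): multiplying $A$ on the right combines its columns; multiplying on the left combines its rows.

$$\begin{bmatrix} a & b \\ c & d \end{bmatrix}\begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}=\begin{bmatrix} a+b & b \\ c+d & d \end{bmatrix}\qquad\begin{bmatrix} 1 & 0 \\ 1 & 1 \end{bmatrix}\begin{bmatrix} a & b \\ c & d \end{bmatrix}=\begin{bmatrix} a & b \\ a+c & b+d \end{bmatrix}$$

On the right, the new first column is (old column 1) + (old column 2); on the left, the new second row is (old row 1) + (old row 2).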

1.2 Dot Product

Inequality formulas:

$$\begin{aligned}&|v\cdot w|\leq\|v\|\,\|w\|\\&\|v+w\|\leq\|v\|+\|w\|\end{aligned}$$

For $v=(a,b)$, $w=(b,a)$: $2ab \le a^2+b^2$. With $x=a^{2}$, $y=b^{2}$: $\sqrt{xy}\leq\frac{x+y}{2}$ (geometric mean $\le$ arithmetic mean).
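Filling in the intermediate step (a short sketch): apply the Cauchy-Schwarz inequality to these particular vectors.

$$|v\cdot w|=|2ab|\le\|v\|\,\|w\|=\sqrt{a^2+b^2}\sqrt{b^2+a^2}=a^2+b^2,\qquad a=\sqrt{x},\;b=\sqrt{y}\;\Rightarrow\;2\sqrt{xy}\le x+y$$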

1.3 Matrices

For $x(t)=t^2$: forward difference $x(t+1)-x(t)=2t+1$; backward difference $x(t)-x(t-1)=2t-1$; centered difference $\frac{x(t+1)-x(t-1)}{2}=2t$.
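A quick numerical check of the three difference formulas (a sketch in Python; the function name `x` and the sample point `t = 5` are arbitrary choices):

```python
# Check the forward, backward, and centered differences of x(t) = t^2.
def x(t):
    return t ** 2

t = 5  # arbitrary sample point

forward = x(t + 1) - x(t)             # expected 2t + 1 = 11
backward = x(t) - x(t - 1)            # expected 2t - 1 = 9
centered = (x(t + 1) - x(t - 1)) / 2  # expected 2t = 10

print(forward, backward, centered)    # 11 9 10.0
```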

Chapter 2

Row picture: hyperplanes meeting at a single point. Column picture: a combination of the column vectors produces the target vector $b$.
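A small 2-by-2 example of both pictures (the numbers are made up for illustration):

$$\text{Row picture: }\begin{cases}x+2y=3\\2x+y=3\end{cases}\text{ meet at }(1,1)\qquad\text{Column picture: }1\begin{bmatrix}1\\2\end{bmatrix}+1\begin{bmatrix}2\\1\end{bmatrix}=\begin{bmatrix}3\\3\end{bmatrix}=b$$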

Elimination, then back substitution. When $A$ is lower triangular, $U$ just keeps the diagonal of $A$.

$$\begin{bmatrix} 3 & 0 & 0 \\ 6 & 2 & 0 \\ 9 & -2 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 3 \\ 8 \\ 9 \end{bmatrix} \quad\to\quad \begin{bmatrix} 3 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = \begin{bmatrix} 3 \\ 2 \\ 2 \end{bmatrix}$$
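Reading the solution off the reduced diagonal system (a quick check):

$$3x=3,\quad 2y=2,\quad z=2\;\Rightarrow\;(x,y,z)=(1,1,2)$$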

$A$ is factored into $LU$: $A=LU$, where $U$ is upper triangular and $L$ is lower triangular. Notice how the second pivot appears during elimination: $2-(6/3)\cdot 0=2$.

[!Note: this is elimination without row exchanges] First we can conclude that $EA=U$, hence $A=LU$ with $L=E^{-1}$.
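A minimal 2-by-2 illustration of $EA=U$ and $L=E^{-1}$ (the matrix $A$ below is chosen here, not taken from the notes):

$$E=\begin{bmatrix}1&0\\-3&1\end{bmatrix},\quad EA=\begin{bmatrix}1&0\\-3&1\end{bmatrix}\begin{bmatrix}2&1\\6&8\end{bmatrix}=\begin{bmatrix}2&1\\0&5\end{bmatrix}=U,\quad L=E^{-1}=\begin{bmatrix}1&0\\3&1\end{bmatrix},\quad A=LU$$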

$A=PLU$: $P$ is a permutation matrix (row exchanges to get nonzero pivots). $A=LDU$: $D$ is the diagonal matrix of pivots; when $A$ is symmetric, $U=L^{T}$ and $A=LDL^{T}$.
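A quick numeric check of $A=PLU$ (a sketch using SciPy's `scipy.linalg.lu`, which returns $P, L, U$ with $A=PLU$; the matrix below is made up and chosen so a row exchange is forced):

```python
# Factor A = P L U and verify the product reproduces A.
import numpy as np
from scipy.linalg import lu

A = np.array([[0., 2.],
              [3., 4.]])   # first pivot position is 0, so a row exchange is needed

P, L, U = lu(A)            # P permutation, L lower triangular, U upper triangular
print(np.allclose(A, P @ L @ U))  # True
```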

Gauss-Jordan

Multiply $\begin{bmatrix} A & I \end{bmatrix}$ by $A^{-1}$ to get $\begin{bmatrix} I & A^{-1} \end{bmatrix}$.
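A minimal Gauss-Jordan sketch in Python: form $[\,A\;\;I\,]$ and reduce it to $[\,I\;\;A^{-1}\,]$. It assumes the pivots are nonzero in order (no row exchanges), and the matrix is an illustrative choice:

```python
# Gauss-Jordan: reduce the augmented block [A | I] to [I | A^{-1}].
import numpy as np

A = np.array([[2., 1.],
              [6., 8.]])
n = A.shape[0]
aug = np.hstack([A, np.eye(n)])            # the block matrix [A  I]

for i in range(n):
    aug[i] = aug[i] / aug[i, i]            # scale so the pivot is 1
    for j in range(n):
        if j != i:
            aug[j] = aug[j] - aug[j, i] * aug[i]  # clear the rest of column i

A_inv = aug[:, n:]
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```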

Diagonally dominant matrices are invertible: each $|a_{ii}|$ dominates its row, i.e. $|a_{ii}|>\sum_{j\neq i}|a_{ij}|$.
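A small example of row diagonal dominance (the matrix is made up here): each $|a_{ii}|$ exceeds the sum of the other absolute entries in its row, so the matrix is invertible.

$$\begin{bmatrix}3&1&1\\1&4&2\\0&1&2\end{bmatrix}:\quad 3>1+1,\quad 4>1+2,\quad 2>0+1$$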