MTH 102A - Part 1: Linear Algebra

2019-20-II Semester

Arbind Kumar Lal

January 20, 2020

Indian Institute of Technology Kanpur

• A set V over a field F is a vector space if x + y is defined in V for all x, y ∈ V and αx is defined in V for all x ∈ V, α ∈ F s.t.

1. + is commutative, associative.
2. Each x ∈ V has an inverse w.r.t. +.
3. Identity element 0 exists in V for +.
4. 1x = x holds ∀x ∈ V, where 1 is the multiplicative identity of F.
5. (αβ)x = α(βx), (α + β)x = αx + βx hold ∀α, β ∈ F, x ∈ V.
6. α(x + y) = αx + αy holds ∀α ∈ F, ∀x, y ∈ V.
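The axioms above can be spot-checked numerically for R3 over R. This is only an illustrative sketch (a finite check on random vectors, not a proof); the variable names are my own.

```python
# Spot-check the vector-space axioms for R^3 over R on random data.
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)
alpha, beta = 2.0, -3.0

assert np.allclose(x + y, y + x)                              # + is commutative
assert np.allclose((alpha * beta) * x, alpha * (beta * x))    # (αβ)x = α(βx)
assert np.allclose((alpha + beta) * x, alpha * x + beta * x)  # (α+β)x = αx + βx
assert np.allclose(alpha * (x + y), alpha * x + alpha * y)    # α(x+y) = αx + αy
assert np.allclose(1.0 * x, x)                                # 1x = x
assert np.allclose(x + (-x), np.zeros(3))                     # additive inverse
print("all axiom checks passed")
```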

• R3 is a vector space (VS) over R. Here (x1, x2, x3) + α(y1, y2, y3) := (x1 + αy1, x2 + αy2, x3 + αy3).

• Mm,n := {Am×n : aij ∈ C} is a VS over C. Here [A + αB]ij := aij + αbij.

• RS := {functions from S to R} is a VS over R. Here (f + αg)(s) := f(s) + αg(s).

• R[x] = {p(x) | p(x) is a real polynomial in x} is a VS over R.

• R[x; 5] = {p(x) ∈ R[x] | degree of p(x) ≤ 5} is a VS over R.

• The set {0} is the smallest VS over any field.

• The set of all real sequences (being RN) is a VS over R.

• The set of all bounded real sequences is a VS over R.

• The set of all convergent real sequences is a VS over R.

• {(an) | an ∈ R, an → 0} is a VS over R. What if we change 0 to 1? No: the sum of two sequences with limit 1 has limit 2.

• C((a,b),R) := {f : (a, b)→ R | f is continuous} is a VS over R.

• C2((a,b),R) := {f : (a, b)→ R | f ′′ is continuous} is a VS over R.

• C∞((a,b),R) := {f : (a, b)→ R | f is infinitely differentiable} is a VS over R.

• {f : (a, b)→ R | f ′′ + 3f ′ + 5f = 0} is a VS over R.

• {An×n | aij ∈ R, a11 = 0} is a VS over R.

• {An×n | aij ∈ R, A upper triangular} is a VS over R.

• {A ∈ Mn×n(R) | A is symmetric} is a VS over R. What about the skew-symmetric ones? What about AT = 2A???

• Let W and V be VSs over F. We call W a subspace of V if W ⊆ V and they have the same operations.

Thm: Let V be a VS over F and W ⊆ V, W ≠ ∅. Then W is a subspace of V if and only if

‘for all x, y ∈ W we have αx + y ∈ W for all α ∈ F’.

• {0} and V are subspaces of any VS V, called the trivial subspaces.

• The VS R over R does not have a nontrivial subspace. Take a subspace W ≠ {0}. Then ∃a ∈ W s.t. a ≠ 0. So αa ∈ W for each α ∈ R. So W = R.

• The VS R over Q has many nontrivial subspaces. Two of them are Q[√2] = {a + b√2 : a, b ∈ Q} and Q[√3] = {a + b√3 : a, b ∈ Q}.

• Fix a real Am×n. The set Col(A) = {Ax : x ∈ Rn} is a subspace of Rm, called the range/column space of A. It is basically the set of all vectors obtained by adding and subtracting real multiples of the column vectors of A.
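A membership test for Col(A) can be sketched by solving a least-squares problem and checking the residual. The function name `in_column_space` and the tolerance are my own choices, not part of the notes.

```python
# Test whether b lies in Col(A) = {Ax : x in R^n}: solve min ||Ax - b||
# and check whether the best fit is (numerically) exact.
import numpy as np

def in_column_space(A, b, tol=1e-10):
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.linalg.norm(A @ x - b) < tol

# Columns (1,2,1) and (1,-2,3), the x and y of a later example.
A = np.array([[1., 1.], [2., -2.], [1., 3.]])
print(in_column_space(A, A @ np.array([2., 5.])))  # a combination of the columns -> True
print(in_column_space(A, np.array([1., 0., 0.])))  # not in the span -> False
```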

• The set Row(A) = {xT A : x ∈ Rm} is a subspace of Rn, called the row space of A.

• The set Null Sp(A) = {x ∈ Rn : Ax = 0} is a subspace of Rn, called the null space of A.

• Col(A), Row(A), Null Sp(A), Null Sp(AT) are the four fundamental subspaces related to a real matrix A.

• Let a ∈ R2, a ≠ 0. Then {x | aT x = 0} is a nontrivial subspace of R2. Geometrically, it represents a straight line passing through the origin. In fact, these are the only nontrivial subspaces of R2.

• {x ∈ R2 | x1x2 = 0} is NOT a subspace of R2.

• Important: the union of the two coordinate axes in R2 is not closed under addition. So, a union of subspaces NEED NOT be a subspace.

• Let xi ∈ V, αi ∈ F, i = 1, · · · , k. We call α1x1 + · · · + αkxk a linear combination (lin.comb) of x1, · · · , xk. The set {α1x1 + · · · + αkxk | αi ∈ F} is a subspace of V.

• Let S ⊆ V. Then LS(S) = {α1x1 + · · · + αmxm | m ∈ N, αi ∈ F, xi ∈ S} is a subspace of V, called the span of S. Convention: LS(∅) = {0}.
Let ei = (0, . . . , 0, 1, 0, . . . , 0)T ∈ Rn with 1 at the i-th coordinate and 0 at other places. Then LS(e1, e2, . . . , en) is Rn.

• Let xT = [1 2 1] and yT = [1 −2 3]. Then,
LS(x, y) = {a(1, 2, 1)T + b(1, −2, 3)T : a, b ∈ R}
= {(a + b, 2a − 2b, a + 3b)T : a, b ∈ R}
= {(x, y, z)T : 4x − y − 2z = 0},
as 4(a + b) − (2a − 2b) − 2(a + 3b) = 0. Note that we have obtained conditions on x, y and z such that the system a + b = x, 2a − 2b = y and a + 3b = z has a solution in a and b.
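The plane condition derived above can be spot-checked numerically; a sketch (the random sampling is only illustrative, and the seed is arbitrary):

```python
# Every a*(1,2,1) + b*(1,-2,3) should satisfy 4x - y - 2z = 0.
import numpy as np

rng = np.random.default_rng(1)
u = np.array([1., 2., 1.])   # the x of the notes
v = np.array([1., -2., 3.])  # the y of the notes
for _ in range(100):
    a, b = rng.standard_normal(2)
    p = a * u + b * v
    assert abs(4 * p[0] - p[1] - 2 * p[2]) < 1e-9
print("4x - y - 2z = 0 holds on LS(x, y)")
```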

• If U, W are subspaces of V then U + W := {u + w | u ∈ U, w ∈ W} is a subspace of V, called the sum of U and W. When U ∩ W = {0}, it is called the internal direct sum of U and W. Notation: U ⊕ W.
Internal: they are inside V; external: will come later.
Examples: x-axis + y-axis in R2, R3; x-axis + xy-plane in R3; xy-plane + xz-plane in R3.

• Thm: Let U, W ⊆ V. Then, U + W is the smallest subspace of V which contains both U and W. !!
Important: Is U ⊆ U + W? Is W ⊆ U + W? Let X ⊆ V s.t. U, W ⊆ X. Is U + W ⊆ X? So, ‘smallest’.

• If U ⊆W and W ⊆ V, then U ⊆ V.

• Let {Ut | Ut ⊆ V} be a nonempty class of subspaces. Then ⋂t Ut is a subspace. !!

• Let U, W ⊆ V. Then U ∪ W ⊆ V if and only if either U ⊆ W or W ⊆ U. !!
Ques: If U ∪ W ⊆ V, is it necessary that U ⊆ V?

• Let V, W be VSs over F. Then V × W is a VS over F with (v1, w1) + (v2, w2) := (v1 + v2, w1 + w2) and α(v, w) := (αv, αw).

Notation: V ⊕ W. Called the external direct sum of V and W.
Ex Give a bijection f : R3 → R[x; 2] s.t. f(αu + v) = αf(u) + f(v) for each u, v ∈ R3, α ∈ R. Take 3 subspaces of R3. Observe their images in R[x; 2].

• Ex Give vector spaces such that U1 ⊃ U2 ⊃ U3 ⊃ · · · .

• Ex Let U, W ⊆ V. Then V = U ⊕ W (int/ext?) if and only if for each v ∈ V, ∃ unique u ∈ U and unique w ∈ W s.t. v = u + w.
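As an instance of this unique-decomposition criterion, Mn(R) is the internal direct sum of the symmetric and the skew-symmetric matrices mentioned earlier. A numerical sketch (the split A = (A + AT)/2 + (A − AT)/2 is the standard one; the random matrix is only illustrative):

```python
# M_n(R) = {symmetric} ⊕ {skew-symmetric}: decompose and verify.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # skew-symmetric part

assert np.allclose(S, S.T)    # S is symmetric
assert np.allclose(K, -K.T)   # K is skew-symmetric
assert np.allclose(S + K, A)  # A = S + K
# Uniqueness: a matrix that is both symmetric and skew satisfies B = -B,
# so the intersection of the two subspaces is {0}.
print("M_n(R) = Sym + Skew with Sym ∩ Skew = {0}")
```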

• Put V = M2(R) and, writing a 2×2 matrix row-by-row as [a b; c d],
W = { [x1 x2; x3 0] : xi ∈ R },
U = { [x1 0; 0 x2] : xi ∈ R }.

• Are W,U subspaces of V? Yes.

• Is V = W + U? Yes.

• Is V = W⊕ U? No.

• What is the subspace W ∩ U? It is { [x 0; 0 0] : x ∈ R }.

Thm: Let V be a VS and S ⊆ V. Let L = {U | U ⊆ V, S ⊆ U}. Then LS(S) = ⋂_{U ∈ L} U. !!

• Ex: Let A be row-equivalent to B. Then Row(A) = Row(B). Converse?

• Let x1, . . . , xk ∈ V. If some lin.comb of the xi’s with coefficients not all zero equals 0, then we say x1, . . . , xk are linearly dependent (lin.dep).

• We say x1, . . . , xk are linearly independent, if they are NOT lin.dep.

• In R2, the vectors x = (1, 2)T, e1 = (1, 0)T and e2 = (0, 1)T are lin.dep, as x − e1 − 2e2 = 0.

• v1 = (3, 0, −3)T, v2 = (−1, 1, 2)T, v3 = (4, 2, −2)T, v4 = (2, 1, 1)T are lin.dep in R3, as 2v1 + 2v2 − v3 + 0v4 = 0.

• v1 = (1, 1, 1)T, v2 = (−1, 1, 1)T are lin.ind in R3. How?

• Answering the previous question: the vectors will be linearly dependent (lin.dep) if we can find (α, β)T ≠ 0 s.t. α(1, 1, 1)T + β(−1, 1, 1)T = 0. That is, if
[1 −1; 1 1; 1 1] (α, β)T = (0, 0, 0)T.

• Apply GJE. If there are no free variables, then the only solution is (α, β)T = (0, 0)T. In that case we conclude lin.ind.

• If there are free variables, then we have nonzero solutions for (α, β)T. In this case we conclude lin.dep.

• v1 = (1, 1, 1)T, v2 = (−1, 1, 1)T, v3 = (0, 0, 0)T are lin.dep. Why? As the variable for v3 will be free. Or 0 · v1 + 0 · v2 + αv3 = 0 for all α ∈ R.

• Thm: Vectors {v, w, z} in R2 are linearly dependent.
Proof: Solving αv + βw + γz = 0 means solving
[v1 w1 z1; v2 w2 z2] (α, β, γ)T = (0, 0)T.
That is, we have to find the RREF of [v1 w1 z1 0; v2 w2 z2 0]. Applying GJE, as there are only two rows, the number of pivots is at most 2. As the number of variables is 3, we will have a free variable. Hence we will have a nonzero solution.

• We call a set S ⊆ V linearly dependent if it contains a finite lin.dep set.

• Thus ∅ is lin.ind and any set containing 0 is lin.dep.

• So S ⊆ V is lin.ind if and only if each finite subset of S is lin.ind.

• Take Am×n, m > n. Then R = RREF(A) has last row 0. That is, the rows of A are lin.dep. How? GJE: R = FA with F invertible. Then fm1A[1, :] + · · · + fmmA[m, :] = 0T, and the fmi cannot all be zero, as F is invertible.
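A quick numerical check of this fact: for a tall real matrix, rank(A) ≤ n < m, so the m rows cannot be lin.ind. The random matrix is only illustrative.

```python
# For m > n, rank(A) <= n < m, so the rows of A are lin.dep.
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 3))  # m = 5 > n = 3
r = np.linalg.matrix_rank(A)
assert r <= 3 < 5
print("rank(A) =", r, "< number of rows, so the rows are lin.dep")
```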

• Thm: Let S = {v1, · · · , vn} ⊆ V and T = {w1, · · · , wm} ⊆ LS(S) s.t. m = |T| > |S| = n. Then T is linearly dependent.
Proof: Write wi = ai1v1 + ai2v2 + · · · + ainvn. With A = (aij) of size m × n, this says wi = A[i, :] (v1, . . . , vn)T, i.e.
(w1, . . . , wm)T = (a11v1 + · · · + a1nvn, . . . , am1v1 + · · · + amnvn)T = A (v1, . . . , vn)T.

• As m > n, we see that the rows of A = (aij) are lin.dep. That is, there exist αi, not all zero, s.t. α1A[1, :] + · · · + αmA[m, :] = 0T. Thus
α1w1 + · · · + αmwm = ∑i αi A[i, :] (v1, . . . , vn)T = (∑i αi A[i, :]) (v1, . . . , vn)T = 0T (v1, . . . , vn)T = 0.

• Cor Any n + 1 vectors in Rn are lin.dep.
Proof Follows as Rn = LS(e1, . . . , en).

• Thm: An×n is invertible if and only if the columns of A are linearly independent.
Proof: It is a restatement of ‘A is invertible if and only if Ax = 0 has a unique solution’.

• Thm: An×n is invertible if and only if the rows of A are linearly independent.
Proof: Follows as ‘A is invertible if and only if AT is invertible’.
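The two theorems above can be checked numerically: A is invertible iff rank(A) = n. The example matrices are my own.

```python
# Invertibility vs linear independence of columns/rows.
import numpy as np

A = np.array([[1., -1.], [1., 1.]])  # columns lin.ind
B = np.array([[1., 2.], [2., 4.]])   # second column = 2 * first, lin.dep

assert np.linalg.matrix_rank(A) == 2      # full rank
Ainv = np.linalg.inv(A)                   # succeeds, as A is invertible
assert np.linalg.matrix_rank(B) == 1      # rank-deficient
try:
    np.linalg.inv(B)
except np.linalg.LinAlgError:
    print("B is singular, as its columns are lin.dep")
```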

• Let u1, · · · , um ∈ V be lin.ind. Choose α1, . . . , αm ∈ F. Put u = α1u1 + · · · + αmum. Can you find another set of βi ∈ F s.t. u = β1u1 + · · · + βmum?

• No. If so, we have ∑(αi − βi)ui = 0 with some αi ≠ βi. Then the ui’s are lin.dep, a contradiction.

• Let S be lin. ind. Then any vector in LS(S) is a UNIQUE LINEARCOMBINATION of elements of S.

• Ex u1, · · · , uk ∈ Rn are lin.ind if and only if Au1, · · · , Auk are lin.ind for any invertible An.
Ex Let u, v ∈ V. Then u, v are lin.ind if and only if u + v, u − v are lin.ind.

• Thm: Let S ⊆ V be lin.ind, x ∈ V. Then S ∪ {x} is lin.dep if and only if x ∈ LS(S).
Proof: Let S ∪ {x} be lin.dep. Then α1s1 + · · · + αmsm + αm+1x = 0 for some si ∈ S, with the αi not all 0. Note: αm+1 cannot be 0, otherwise S is lin.dep. So x = ∑_{i=1}^{m} (−αi/αm+1) si. So x ∈ LS(S).
Conversely, let x ∈ LS(S). So x = α1s1 + · · · + αksk for some αi ∈ F, si ∈ S. So x − α1s1 − · · · − αksk = 0, a relation in which the coefficient of x is 1 ≠ 0. So S ∪ {x} is lin.dep.

• Cor Let S ⊆ V be lin.ind. Then LS(S) = V if and only if each proper superset of S is lin.dep. !!

• Thm: Let A = ER, where R = RREF(A) and E is invertible. Then the rows of A corresp. to the pivotal rows of R are lin.ind. Also the columns of A corresp. to the pivotal columns of R are lin.ind.
Proof: Pivotal rows of R are lin.ind due to the pivotal 1s. Let the pivotal rows of R be R[1, :], . . . , R[k, :] and R1 be the submatrix formed by these rows. Let A1 be the submatrix of A which gives R1. So R1 = RREF(A1). That is, A1 = E1R1, for some invertible E1. Suppose there exists x0 ≠ 0 s.t. x0T A1 = 0. So x0T E1R1 = 0. Note that x0T E1 = y0T ≠ 0. So y0T R1 = 0, a contradiction.

• Other part: note that the pivotal columns in R are lin.ind, due to the pivotal 1s. Let R[:, j1], . . . , R[:, jk] be the pivotal columns of R. Since E is invertible, A[:, j1] = ER[:, j1], . . . , A[:, jk] = ER[:, jk] are lin.ind.

• Thm: Let x1, · · · , xn be lin.dep, x1 ≠ 0. Then, there exists k > 1 s.t. xk is a lin.comb of x1, · · · , xk−1.
Proof: Consider {x1}, {x1, x2}, · · · , {x1, x2, · · · , xn} one by one. Take the smallest k > 1 s.t. Sk = {x1, · · · , xk} is lin.dep. So Sk−1 is lin.ind. In that case xk ∈ LS(Sk−1).

• Let S = {v1, . . . , vk} ⊆ Rn be lin.dep, v1 ≠ 0. One may take the help of GJE to find the first vi which is a lin.comb of the earlier ones, and to find a lin.ind subset T ⊆ S s.t. LS(T) = LS(S).

• v1 = (1, 1, 1)T, v2 = (1, 0, 1)T, v3 = (1, −1, 1)T, v4 = (0, 1, 2)T.
A =
[1 1 1 0
 1 0 −1 1
 1 1 1 2].

• RREF(A) =
[1 0 −1 0
 0 1 2 0
 0 0 0 1].
The pivotal columns are 1, 2 and 4, so T = {v1, v2, v4}.