Part I: Questions
Min Ru, University of Houston. In these notes, T is always a linear operator on V, V is always a vector space, A is always an n × n matrix, and so on.
(1) What are the definitions of eigenvalues and eigenvectors? How do you compute them? What is the definition of the eigenspace E_λ?
(2) How do you determine whether you can put a matrix (or linear transformation) in diagonal form? If so, how do you do it?
(3) Let f(t) = a_n t^n + a_{n−1} t^{n−1} + ··· + a0 be the characteristic polynomial of A. Determine a_n, a_{n−1}, and a0.
(4) What is the Cayley-Hamilton theorem? Sketch its proof.
(5) What is the definition of a T-invariant subspace W? If we pick a basis v1, ..., vk for W and extend it to a basis β = {v1, ..., vk, ..., vn} of V, what does the matrix [T]_β look like? What is the purpose of introducing the concept of a T-invariant subspace?
(6) What is the T-cyclic subspace generated by v? How do you determine it? What does its basis look like?
(7) What is the Gram-Schmidt process? How do you do it?
(8) If {e1, ..., em} are orthonormal and x = a1 e1 + ··· + am em, what is a_i for 1 ≤ i ≤ m?
(9) What is proj_W(x), and what is its geometric meaning?
(10) What is S⊥?
(11) What is T*, the conjugate of T? Does T* always exist? What does [T*]_β look like?
Eigenvalues and Eigenvectors: A nonzero vector v ∈ V with Tv = λv (or (T − λI)v = 0) is an eigenvector, and λ is an eigenvalue. The way to find eigenvalues is to solve f(t) = 0, where f(t) = det(A − tI) is the characteristic polynomial. After you get λ, solve (A − λI)v = 0 to get v. To find the eigenvalues of an operator T, let A = [T]_β. (A numerical sketch of this recipe appears after this subsection.)

Diagonalizability: Let T be a linear operator on the finite-dimensional space V. T is diagonalizable if there is a basis for V consisting of eigenvectors of T. Let v1, ..., vn be nonzero eigenvectors of distinct eigenvalues λ1, ..., λn; then v1, ..., vn are linearly independent. Alternative statement: If T (where T is a linear operator on V with dim V = n) has n distinct eigenvalues λ1, ..., λn, then T is diagonalizable. (Proof is in the exercises.) Let λ1, ..., λk be the eigenvalues of T. Then T is diagonalizable if and only if its characteristic polynomial splits and dim E_{λi} = m_i for 1 ≤ i ≤ k, where m_i is the multiplicity of λ_i as a root of the characteristic polynomial.

Invariant Subspaces: Let T : V → V be a LO (linear operator), and let W ⊆ V be a subspace. We say that W is a T-invariant subspace of V if T(W) ⊆ W, i.e., w ∈ W ⇒ T(w) ∈ W.

The restriction T_W: For a T-invariant subspace W of V, we can consider the LO T_W : W → W defined by T_W(w) = T(w) for w ∈ W.

The matrix of T_W and T: Let W be a T-invariant subspace of V, γ a basis for W, and β an extension of γ to a basis of V. Then

[T]_β = [ [T_W]_γ  B1 ]
        [    0     B2 ]
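As promised above, here is a minimal NumPy sketch of the eigenvalue/diagonalization recipe. The 2 × 2 matrix A is my own example (not one from the notes), chosen so that the eigenvalues, 2 and 5, are distinct.

```python
# A minimal NumPy sketch of the recipe above: find the eigenvalues as
# roots of the characteristic polynomial, then diagonalize A = Q D Q^{-1}.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.poly(A) returns the coefficients of det(tI - A), which has the
# same roots as the notes' f(t) = det(A - tI).
coeffs = np.poly(A)
print(np.sort(np.roots(coeffs)))          # [2. 5.]

# In practice one calls np.linalg.eig; the columns of Q are eigenvectors,
# each solving (A - lambda I)v = 0 for the matching eigenvalue lambda.
vals, Q = np.linalg.eig(A)
D = np.diag(vals)

# A has 2 distinct eigenvalues, so its eigenvectors are linearly
# independent, Q is invertible, and A = Q D Q^{-1}.
print(np.allclose(A, Q @ D @ np.linalg.inv(Q)))   # True
```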
T-cyclic subspaces: (1) The T-cyclic subspace W of V generated by v is the smallest T-invariant subspace of V that contains v, or equivalently W = Span{T^l(v) | l ≥ 0}.
(2) There is a unique k ≤ n for which γ = {T^l(v) | 0 ≤ l < k} is a basis of W (note: k indeed is the largest positive integer for which {v, T(v), ..., T^{k−1}(v)} is linearly independent). (3) If γ from part (2) is taken as the basis of W, then
[T_W]_γ = [ 0  0  ...  0  −a0       ]
          [ 1  0  ...  0  −a1       ]
          [ 0  1  ...  0  −a2       ]
          [ :  :       :   :        ]
          [ 0  0  ...  1  −a_{k−1}  ]

where a0, ..., a_{k−1} are determined by T^k(v) = −a0 v − a1 T(v) − ··· − a_{k−1} T^{k−1}(v).
(4) From the above matrix, the characteristic polynomial of T_W is g(t) = (−1)^k (a0 + a1 t + ··· + a_{k−1} t^{k−1} + t^k). We have g(T)(v) = 0, since a0 v + a1 T(v) + ··· + a_{k−1} T^{k−1}(v) + T^k(v) = 0 by the definition of the a_i.

Polynomials Applied to Operators: For any polynomial g(t), we can apply it to T to get g(T).

Cayley-Hamilton Theorem: Let f(t) = ch_T(t) be the characteristic polynomial of T. Then f(T) = 0.

Outline of proof: To prove the theorem, we must show that f(T)(v) = 0 for all v ∈ V. This is immediate if v = 0, so assume v ≠ 0 and let W = Span{T^l(v) | l ≥ 0}, the T-cyclic subspace of V generated by v. From above, with γ a basis for W and β an extension of γ to a basis of V, we have

[T]_β = [ [T_W]_γ  B1 ]
        [    0     B2 ]
Hence ch_T = ch_{T_W} · ch_{B2}. So ch_{T_W} | ch_T. Since we know, from (4), that ch_{T_W}(T)(v) = 0, we have ch_T(T)(v) = 0, which proves the theorem.

Cayley-Hamilton Theorem for matrices: Let f(t) = det(A − tI) be the characteristic polynomial of A. Then f(A) = 0.
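The matrix form of the theorem is easy to sanity-check numerically. Below is a small sketch (with a 3 × 3 sample matrix of my own choosing) that builds the coefficients of the characteristic polynomial and verifies that f(A) is the zero matrix.

```python
# A numerical sanity check of the Cayley-Hamilton theorem (a sketch, not
# a proof), using an arbitrary sample matrix.
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [4.0, 0.0, 2.0]])
n = A.shape[0]

# Coefficients of det(tI - A); this differs from the notes' det(A - tI)
# only by the factor (-1)^n, so both vanish at A together.
c = np.poly(A)

# Evaluate f(A) = c[0] A^n + c[1] A^{n-1} + ... + c[n] I by Horner's rule.
f_of_A = np.zeros((n, n))
for coeff in c:
    f_of_A = f_of_A @ A + coeff * np.eye(n)

print(np.allclose(f_of_A, 0))   # True: f(A) is the zero matrix
```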
Inner Products: The three important properties for complex inner products are: 1) <x, x> > 0 unless x = 0. 2) <x, y> = the complex conjugate of <y, x>. 3) For each y ∈ V, the map x ↦ <x, y> is linear.

Norm: ‖x‖² = <x, x>.
Cauchy-Schwarz Inequality: |<x, y>| ≤ ‖x‖ ‖y‖. Outline of proof: Use 0 ≤ <x − cy, x − cy> and choose c = <x, y>/‖y‖².

Triangle Inequality: ‖x + y‖ ≤ ‖x‖ + ‖y‖.

Orthonormal Bases: Important Lemma: Let e1, ..., en be orthonormal. Then e1, ..., en are linearly independent, and any element x ∈ span{e1, ..., en} has the expansion x = <x, e1> e1 + ··· + <x, en> en.
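A quick numerical illustration of the lemma, with an orthonormal pair of my own choosing: any x in their span is recovered from the coefficients <x, e_i>.

```python
# For orthonormal e1, e2, any x in span{e1, e2} satisfies
# x = <x, e1> e1 + <x, e2> e2.
import numpy as np

e1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
e2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)   # orthonormal with e1

x = 3.0 * e1 - 2.0 * e2                        # an x in span{e1, e2}

expansion = np.dot(x, e1) * e1 + np.dot(x, e2) * e2
print(np.allclose(x, expansion))               # True
print(np.dot(x, e1), np.dot(x, e2))            # 3.0 and -2.0, up to rounding
```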
Orthogonal Projection: The orthogonal projection of a vector x onto a nonzero vector y is defined by proj_y(x) = (<x, y>/‖y‖²) y.
Let W ⊆ V be a finite-dimensional subspace of an inner product space V, and let {e1, ..., em} be an orthonormal basis for W. Then the orthogonal projection of a vector x onto W is given by proj_W(x) = <x, e1> e1 + ··· + <x, em> em.
Gram-Schmidt procedure: Given a linearly independent set x1, ..., xm in an inner product space V, it is possible to construct an orthonormal collection e1, ..., em such that span{x1, ..., xm} = span{e1, ..., em}:

e1 = x1/‖x1‖;
z2 = x2 − proj_{x1}(x2) = x2 − proj_{e1}(x2) = x2 − <x2, e1> e1,  e2 = z2/‖z2‖;
and in general
z_{k+1} = x_{k+1} − <x_{k+1}, e1> e1 − ··· − <x_{k+1}, ek> ek,  e_{k+1} = z_{k+1}/‖z_{k+1}‖.

Orthogonal Complements: Let S ⊆ V be a subset. The orthogonal complement of S in V is the subspace S⊥ = {x ∈ V : <x, z> = 0 for all z ∈ S}.

Theorem: Let V be an inner product space and let W ⊆ V be a finite-dimensional subspace. Then (1) V = W + W⊥, i.e., x = y + z where y = proj_W(x) ∈ W and z = x − proj_W(x) ∈ W⊥; (2) W ∩ W⊥ = {0}. Outline of proof: By the Gram-Schmidt process, we can obtain an orthonormal basis {e1, ..., em} for W. Then proj_W(x) = <x, e1> e1 + ··· + <x, em> em.

Theorem: a) W ⊆ (W⊥)⊥. b) If W is a subspace of a finite-dimensional space V, then W = (W⊥)⊥. Proof: Do it by yourself!

Theorem: ‖x − v‖ > ‖x − proj_W(x)‖ for any v ∈ W with v ≠ proj_W(x); that is, proj_W(x) is the point of W closest to x.

Practice Problem: Let P be the plane spanned by the vectors v1 = (1, 1, 0) and v2 = (0, 1, 1). (i) Find the orthogonal projection of the vector x = (4, 0, −1) onto the plane P. (ii) Find the distance from x to P. (A worked sketch appears at the end of these notes.)

Theorem: Let A be an n × n matrix. Then N(A) = R(A^t)⊥ and N(A^t) = R(A)⊥.

T*, the conjugate of T: See the book.
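As referenced in the practice problem above, here is a short Python sketch that implements the Gram-Schmidt procedure and then uses the resulting orthonormal basis of P to compute proj_P(x) and the distance from x to P. Reading the garbled "(4, 0, ?1)" in the problem as (4, 0, −1) is my assumption.

```python
# Gram-Schmidt as in the notes, applied to the practice problem.
# Assumption: the third coordinate of x is -1 (garbled "?1" in the source).
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors."""
    basis = []
    for x in vectors:
        # Subtract the projections onto the e's found so far, then normalize.
        z = x - sum(np.dot(x, e) * e for e in basis)
        basis.append(z / np.linalg.norm(z))
    return basis

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
e1, e2 = gram_schmidt([v1, v2])

x = np.array([4.0, 0.0, -1.0])
proj = np.dot(x, e1) * e1 + np.dot(x, e2) * e2   # proj_P(x)
dist = np.linalg.norm(x - proj)                  # distance from x to P

print(proj)   # [ 3.  1. -2.]
print(dist)   # sqrt(3), about 1.732
```

With these numbers the sketch finds proj_P(x) = (3, 1, −2) and distance √3; as a check, x − proj_P(x) = (1, −1, 1) is orthogonal to both v1 and v2.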