
Mathematics and Statistics II - Linear Algebra

Week 6: Eigenvalues and Eigenvectors

Andrés Perea

Maastricht University

Period 2



Eigenvalues and Eigenvectors

Consider a linear transformation T : R^n → R^n.

Then, there is an n × n matrix A such that T(v̂) = A v̂ for all v̂ ∈ R^n.

A vector v̂ ≠ 0̂ is an eigenvector of T (or A) if there is a number λ ∈ R such that

A v̂ = λ v̂.

In this case, the number λ is the eigenvalue that corresponds to v̂.
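As a quick numerical illustration of the defining relation A v̂ = λ v̂, here is a minimal sketch using numpy (a tool assumed here, not part of the handout), applied to the 2 × 2 matrix used in the examples below.

```python
import numpy as np

# Matrix from the 2x2 example later in the handout
A = np.array([[2.0, 1.0],
              [0.0, -1.0]])

v = np.array([1.0, 0.0])   # candidate eigenvector
lam = 2.0                  # candidate eigenvalue

# v is an eigenvector with eigenvalue lam exactly when A v = lam v
print(np.allclose(A @ v, lam * v))   # True
```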



Computing Eigenvalues and Eigenvectors

Suppose that v̂ ≠ 0̂ is an eigenvector of A with eigenvalue λ.

Then, A v̂ = λ v̂ = λ I v̂, and hence (A − λ I) v̂ = 0̂.

Hence, v̂ ≠ 0̂ is in the nullspace of A − λ I.

This can only be if A − λ I is not invertible.

Hence, det(A − λ I) = 0.

Theorem
The eigenvalues of a square matrix A are exactly the numbers λ for which
det(A − λ I) = 0.

Here, det(A − λ I) is a polynomial of degree n in λ, and it is called the characteristic polynomial of A.
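The characteristic polynomial can also be computed symbolically. A minimal sketch, assuming sympy is available (it is not part of the handout), for the matrix of the next example:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 1],
               [0, -1]])

# Characteristic polynomial det(A - lambda*I)
p = (A - lam * sp.eye(2)).det()
print(sp.factor(p))      # (lambda - 2)*(lambda + 1)
print(sp.solve(p, lam))  # [-1, 2]: the eigenvalues
```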




Example: Take
$$A = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix}.$$
Compute the eigenvalues for A.

We have
$$A - \lambda I = \begin{pmatrix} 2-\lambda & 1 \\ 0 & -1-\lambda \end{pmatrix},$$
and hence det(A − λ I) = (2 − λ)(−1 − λ).

Here, det(A − λ I) is called the characteristic polynomial of A.

By setting det(A − λ I) = 0, we find the eigenvalues λ1 = 2 and λ2 = −1.



Example continued: For
$$A = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix},$$
the eigenvalues are λ1 = 2 and λ2 = −1.

How to compute the eigenvectors that belong to these eigenvalues?

Vector v̂ ≠ 0̂ is an eigenvector with eigenvalue λ = 2 precisely when A v̂ = 2 v̂.

Hence,
$$(A - 2I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 0 & 1 \\ 0 & -3 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \iff y = 0.$$

Thus, eigenvectors with eigenvalue 2 are all vectors
$$\begin{pmatrix} x \\ 0 \end{pmatrix} = x \begin{pmatrix} 1 \\ 0 \end{pmatrix}.$$
This is a subspace with basis
$$\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}.$$
This is the eigenspace for λ = 2.
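Computing an eigenspace amounts to computing the nullspace of A − λI. A small sketch along those lines, again assuming sympy (not part of the handout):

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [0, -1]])

# Eigenspace for lambda = 2 is the nullspace of A - 2I
basis = (A - 2 * sp.eye(2)).nullspace()
print(basis)   # [Matrix([[1], [0]])], matching the basis found above
```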
Example continued: For
$$A = \begin{pmatrix} 2 & 1 \\ 0 & -1 \end{pmatrix},$$
the eigenvalues are λ1 = 2 and λ2 = −1.

Compute the eigenspace for λ = −1.

We must solve
$$(A - (-1)I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 3 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \iff 3x + y = 0.$$

By choosing x as the free variable, the eigenspace for λ = −1 contains all vectors
$$\begin{pmatrix} x \\ -3x \end{pmatrix} = x \begin{pmatrix} 1 \\ -3 \end{pmatrix}.$$

Hence, the eigenspace for λ = −1 has basis
$$\left\{ \begin{pmatrix} 1 \\ -3 \end{pmatrix} \right\}.$$



Example: Take
$$A = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}.$$
Then,
$$\det(A - \lambda I) = \det\begin{pmatrix} 2-\lambda & 0 \\ 0 & 2-\lambda \end{pmatrix} = (2-\lambda)^2.$$

The only eigenvalue is λ = 2. Eigenspace for λ = 2:
$$(A - 2I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$

The eigenspace for λ = 2 has basis
$$\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \end{pmatrix} \right\}.$$



Example: Take
$$B = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}.$$
Then,
$$\det(B - \lambda I) = \det\begin{pmatrix} 2-\lambda & 1 \\ 0 & 2-\lambda \end{pmatrix} = (2-\lambda)^2.$$

The only eigenvalue is λ = 2. Eigenspace for λ = 2:
$$(B - 2I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \iff y = 0.$$

The eigenspace has basis
$$\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}.$$

Two matrices with the same characteristic polynomial may have different eigenspaces.
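To illustrate the remark above, a short sketch (assuming sympy, not part of the handout) that compares the eigenspace dimensions of A and B, which share the characteristic polynomial (2 − λ)²:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[2, 0], [0, 2]])
B = sp.Matrix([[2, 1], [0, 2]])

charA = (A - lam * sp.eye(2)).det()
charB = (B - lam * sp.eye(2)).det()
print(sp.expand(charA - charB) == 0)          # True: same characteristic polynomial

# ... but the eigenspaces for lambda = 2 have different dimensions
print(len((A - 2 * sp.eye(2)).nullspace()))   # 2: eigenspace is all of R^2
print(len((B - 2 * sp.eye(2)).nullspace()))   # 1: eigenspace is a line
```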



Theorem
Let A be a square matrix, and let v̂1, ..., v̂n be eigenvectors with eigenvalues λ1, ..., λn.
If all these eigenvalues are different, then {v̂1, ..., v̂n} is linearly independent.

Example: Take
$$A = \begin{pmatrix} 3 & 4 & 7 \\ 0 & 1 & 8 \\ 0 & 0 & 2 \end{pmatrix}.$$
Then,
$$A - \lambda I = \begin{pmatrix} 3-\lambda & 4 & 7 \\ 0 & 1-\lambda & 8 \\ 0 & 0 & 2-\lambda \end{pmatrix}.$$

Eigenvalues: det(A − λ I) = (3 − λ)(1 − λ)(2 − λ) = 0. Hence, λ1 = 3, λ2 = 1 and λ3 = 2.



Example continued: Eigenspace for λ1 = 3:
$$(A - 3I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 0 & 4 & 7 \\ 0 & -2 & 8 \\ 0 & 0 & -1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \iff y = 0 \text{ and } z = 0.$$

The eigenspace for λ1 = 3 has basis
$$\left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \right\}.$$
Example continued: Eigenspace for λ2 = 1:
$$(A - I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 2 & 4 & 7 \\ 0 & 0 & 8 \\ 0 & 0 & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \iff z = 0 \text{ and } 2x + 4y = 0.$$

The eigenspace for λ2 = 1 has basis
$$\left\{ \begin{pmatrix} -2 \\ 1 \\ 0 \end{pmatrix} \right\}.$$
Example continued: Eigenspace for λ3 = 2:
$$(A - 2I)\,\hat{v} = \hat{0} \iff \begin{pmatrix} 1 & 4 & 7 \\ 0 & -1 & 8 \\ 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \iff y = 8z \text{ and } x = -39z.$$

The eigenspace for λ3 = 2 has basis
$$\left\{ \begin{pmatrix} -39 \\ 8 \\ 1 \end{pmatrix} \right\}.$$
Example: Take
$$A = \begin{pmatrix} 3 & 4 & 7 \\ 0 & 1 & 8 \\ 0 & 0 & 2 \end{pmatrix}.$$
The eigenvalues are λ1 = 3, λ2 = 1 and λ3 = 2.

The bases for the corresponding eigenspaces are
$$\left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \right\}, \quad \left\{ \begin{pmatrix} -2 \\ 1 \\ 0 \end{pmatrix} \right\} \quad \text{and} \quad \left\{ \begin{pmatrix} -39 \\ 8 \\ 1 \end{pmatrix} \right\}.$$

Hence, three eigenvectors for λ1 = 3, λ2 = 1 and λ3 = 2 will be linearly independent.
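One way to confirm the independence claim for this example is to place the three eigenvectors as columns of a matrix and check that its determinant is nonzero; a sketch assuming numpy (not part of the handout):

```python
import numpy as np

# Eigenvectors of A for lambda = 3, 1, 2, as columns
V = np.array([[1.0, -2.0, -39.0],
              [0.0,  1.0,   8.0],
              [0.0,  0.0,   1.0]])

# Nonzero determinant <=> the columns are linearly independent
print(np.linalg.det(V))   # approximately 1 (nonzero), so the eigenvectors are independent
```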
Diagonalizing a Matrix
Example: Take
$$A = \begin{pmatrix} 3 & 4 & 7 \\ 0 & 1 & 8 \\ 0 & 0 & 2 \end{pmatrix}$$
with characteristic polynomial (3 − λ)(1 − λ)(2 − λ).

The eigenvalues are λ1 = 3, λ2 = 1 and λ3 = 2.

The bases for the corresponding eigenspaces are
$$\left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \right\}, \quad \left\{ \begin{pmatrix} -2 \\ 1 \\ 0 \end{pmatrix} \right\} \quad \text{and} \quad \left\{ \begin{pmatrix} -39 \\ 8 \\ 1 \end{pmatrix} \right\}.$$

In fact,
$$B = \left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} -2 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} -39 \\ 8 \\ 1 \end{pmatrix} \right\}$$
is a basis of eigenvectors.

Hence,
$$[T]_B^B = \begin{pmatrix} 3 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{pmatrix},$$
a diagonal matrix.
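A minimal numerical check of this diagonalization, with P built from the eigenvector basis B (numpy assumed, not part of the handout):

```python
import numpy as np

A = np.array([[3.0, 4.0, 7.0],
              [0.0, 1.0, 8.0],
              [0.0, 0.0, 2.0]])

# Columns of P are the eigenvectors for lambda = 3, 1, 2
P = np.array([[1.0, -2.0, -39.0],
              [0.0,  1.0,   8.0],
              [0.0,  0.0,   1.0]])

D = np.linalg.inv(P) @ A @ P
print(np.round(D, 6))   # diag(3, 1, 2), up to rounding
```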
In general, let A be an n × n matrix with a basis of eigenvectors B = {v̂1, ..., v̂n}, and corresponding eigenvalues λ1, ..., λn.

If A represents the linear transformation T, then
$$[T]_B^B = \begin{pmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_n \end{pmatrix} = D.$$

Let C be the standard basis for R^n. Then, [T]_B^B = P_B^{-1} [T]_C^C P_B.

Since [T]_C^C = A, we have that P_B^{-1} A P_B = D.

Theorem
Let A be an n × n matrix.
(a) If A has n different eigenvalues, then there is a basis of R^n that consists of eigenvectors of A.
(b) If there is a basis of eigenvectors, then there is an invertible matrix P such that P^{-1} A P = D is a diagonal matrix. The diagonal entries in D are the eigenvalues, and the columns in P constitute the basis of eigenvectors.
Matrix A can be diagonalized if there is an invertible matrix P and a diagonal matrix D such that P^{-1} A P = D.

Hence, matrix A can be diagonalized precisely when there is a basis of eigenvectors.

Example: Consider the matrix
$$A = \begin{pmatrix} 2 & 0 & 0 \\ 3 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.$$
Can A be diagonalized?

Characteristic polynomial:
$$\det(A - \lambda I) = \det\begin{pmatrix} 2-\lambda & 0 & 0 \\ 3 & -\lambda & 1 \\ 0 & 1 & -\lambda \end{pmatrix} = (2-\lambda)(\lambda^2 - 1) = (2-\lambda)(\lambda - 1)(\lambda + 1).$$

Three different eigenvalues: λ1 = 2, λ2 = 1 and λ3 = −1.

There is a basis of eigenvectors, and A can be diagonalized.
Example continued:
$$A = \begin{pmatrix} 2 & 0 & 0 \\ 3 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}$$
has eigenvalues λ1 = 2, λ2 = 1 and λ3 = −1.

Find matrix P such that
$$P^{-1} A P = D = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & -1 \end{pmatrix}.$$

The three eigenspaces have bases
$$\left\{ \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix} \right\}, \quad \left\{ \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix} \right\} \quad \text{and} \quad \left\{ \begin{pmatrix} 0 \\ -1 \\ 1 \end{pmatrix} \right\}.$$
Verify this.

Hence,
$$P = \begin{pmatrix} 1 & 0 & 0 \\ 2 & 1 & -1 \\ 1 & 1 & 1 \end{pmatrix}.$$
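A short verification sketch for this choice of P (numpy assumed, not part of the handout):

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [3.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

# Columns of P are eigenvectors for lambda = 2, 1, -1
P = np.array([[1.0, 0.0,  0.0],
              [2.0, 1.0, -1.0],
              [1.0, 1.0,  1.0]])

print(np.round(np.linalg.inv(P) @ A @ P, 6))   # diag(2, 1, -1), up to rounding
```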
Similar Matrices
Two n × n matrices A and B are called similar if there is an invertible matrix P such that B = P^{-1} A P.

Theorem
Two similar matrices A and B have the same characteristic polynomials, and hence the same eigenvalues.

Proof: We have
$$B - \lambda I = P^{-1} A P - \lambda I = P^{-1} A P - P^{-1} (\lambda I) P = P^{-1} (A - \lambda I) P.$$

Hence, the characteristic polynomial of B is
$$\det(B - \lambda I) = \det(P^{-1} (A - \lambda I) P) = \det(P^{-1}) \det(A - \lambda I) \det(P) = \frac{1}{\det(P)} \det(A - \lambda I) \det(P) = \det(A - \lambda I).$$
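A numerical sanity check of the theorem: conjugate A by an (almost surely invertible) random matrix P and compare characteristic polynomial coefficients. This is a sketch assuming numpy; the particular random P is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[3.0, 4.0, 7.0],
              [0.0, 1.0, 8.0],
              [0.0, 0.0, 2.0]])

P = rng.normal(size=(3, 3))        # a random matrix is invertible with probability 1
B = np.linalg.inv(P) @ A @ P       # B is similar to A

# np.poly returns the coefficients of the characteristic polynomial det(lambda*I - M)
print(np.poly(A))                  # [1, -6, 11, -6]
print(np.poly(B))                  # numerically the same coefficients
```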
Representation of Linear Transformations
Suppose that two n × n matrices A and B are similar.

Then, there is an invertible matrix P such that B = P^{-1} A P.

Let C be the standard basis for R^n and T : R^n → R^n the linear transformation with [T]_C^C = A.

Since P is invertible, the columns in P constitute a basis D = {d̂1, ..., d̂n} for R^n.

Hence, P_D = P. Hence,
$$B = P^{-1} A P = P_D^{-1} [T]_C^C P_D = [T]_D^D.$$

Theorem
Two n × n matrices A and B are similar if and only if there is a linear transformation T, and two different bases C and D for R^n, such that A = [T]_C^C and B = [T]_D^D.
Example: Consider the linear transformation T : R^3 → R^3 with
$$T\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = \begin{pmatrix} -2 \\ 0 \\ -3 \end{pmatrix}, \quad T\begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 3 \\ 1 \\ 4 \end{pmatrix}, \quad T\begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}.$$

Consider the basis
$$B = \left\{ \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \right\}.$$
What is [T]_B^B?

If C is the standard basis for R^3, then [T]_B^B = P_B^{-1} [T]_C^C P_B.

We have
$$P_B = \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix} \quad \text{and} \quad [T]_C^C = \begin{pmatrix} -2 & 3 & 1 \\ 0 & 1 & 1 \\ -3 & 4 & 1 \end{pmatrix}.$$

Hence,
$$[T]_B^B = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & -1 \\ 1 & -1 & 0 \end{pmatrix} \begin{pmatrix} -2 & 3 & 1 \\ 0 & 1 & 1 \\ -3 & 4 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix} = \begin{pmatrix} 2 & 1 & -3 \\ 0 & 0 & 3 \\ 0 & 0 & -2 \end{pmatrix}.$$
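A small numerical sketch of this change-of-basis computation (numpy assumed, not part of the handout; the matrices are those from the example above):

```python
import numpy as np

TCC = np.array([[-2.0, 3.0, 1.0],
                [ 0.0, 1.0, 1.0],
                [-3.0, 4.0, 1.0]])    # [T]_C^C: columns are T(e1), T(e2), T(e3)

PB = np.array([[1.0, 1.0, 1.0],
               [1.0, 1.0, 0.0],
               [1.0, 0.0, 0.0]])      # columns are the basis vectors of B

TBB = np.linalg.inv(PB) @ TCC @ PB    # [T]_B^B = PB^{-1} [T]_C^C PB
print(np.round(TBB, 6))               # [[2, 1, -3], [0, 0, 3], [0, 0, -2]]
```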
