Quick Notes on Eigenvalues

Holden Milne

1 Overview
Normally, I like to give an intuition about what an Eigenvalue actually is, but I’m going to step away from
that for now. Spectral theory is a complicated beast that requires quite a bit of knowledge to really get
what the point of it all is. BUT we’ll just talk about the parts of the vital equation of an eigenvalue.

A⃗v = λ⃗v

Note, for a second, that this equation has the same shape as something we should be familiar with from matrices and
systems of linear equations:
A⃗x = ⃗b

Here, however, ⃗x has been replaced with ⃗v, and we have the very odd property that ⃗v actually appears on both sides.
This is the special feature of eigenvalues: λ is the eigenvalue and ⃗v is the eigenvector. The eigenvalues
and eigenvectors are the value and vector pairs that satisfy this equation. One thing we can see from this
equation is that if A is a transformation matrix, then the solution vector ⃗v does not change direction under
multiplication by A; A only causes ⃗v to get scaled. Let's make this concrete with an example.

First, let's look at the case when ⃗v is not an eigenvector. Let

A = \begin{pmatrix} 1 & 2 \\ 2 & -2 \end{pmatrix}, \qquad ⃗v = \begin{pmatrix} 1 \\ 1 \end{pmatrix}

If we do A⃗v we get A⃗v = \begin{pmatrix} 3 \\ 0 \end{pmatrix}. The observation to make here is that ⃗v was pointing up and to the right
[(1, 1)] while the vector after the transformation points straight to the right along the x-axis. The direction has changed.

Now let's say A is the same but ⃗v = \begin{pmatrix} 2 \\ 1 \end{pmatrix}; 2 units right for every one unit up.

Now the multiplication gives A⃗v = \begin{pmatrix} 4 \\ 2 \end{pmatrix}. All that's happened under this transformation is that we've scaled
⃗v by 2. In other words, A⃗v = 2⃗v. This is because ⃗v (as I've chosen it to be) is an eigenvector, and the
associated eigenvalue is λ = 2. There are a lot of reasons why this is useful, but I won't go into them here. Maybe
you can already start to see why knowing the vectors that do not change direction under a transformation
could be a useful piece of knowledge. At the very least, it's nice in math to be able to assume or know more
rather than less.
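If you want to check both cases without doing the arithmetic by hand, here's a quick numeric sketch (assuming NumPy is available; the variable names are just illustrative):

```python
# Numeric check of the example above: (1, 1) is not an eigenvector of A,
# but (2, 1) is, with eigenvalue 2.
import numpy as np

A = np.array([[1, 2],
              [2, -2]])

v = np.array([1, 1])
print(A @ v)   # [3 0] -- the direction changed, so (1, 1) is not an eigenvector

w = np.array([2, 1])
print(A @ w)   # [4 2] -- exactly 2 * w, so (2, 1) is an eigenvector with lambda = 2
```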

2 Finding Eigenvalues, Eigenvectors and Eigenspaces
2.1 Summary of Steps
 
Assume A is n × n and ⃗v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}

1. Find the Eigenvalue(s)

(a) Find A − λI
(b) Take the determinant of A − λI
(c) Set that determinant equal to zero
(d) Solve for λ to get one or more λ values

2. Find the Eigenvector(s)

(a) For each λ = λi


i. Compute A − λi I
ii. Compute (A − λi I)⃗v to get equations on the variables v1 , v2 , ..., vn .
iii. Those equations will have free variables; find the free variables s1 , s2 , ..., sk (this can also
be done by row reducing A − λi I to row-echelon form).
iv. Split the vector ⃗v = s1 u⃗1 + s2 u⃗2 + ... + sk u⃗k .
v. The eigenvectors of λi are u⃗1 , u⃗2 , ..., u⃗k

3. Find the Eigenspace(s)

(a) For each λ = λi


i. If V is the set of eigenvectors associated with λi we found before, then the Eigenspace
associated with λi is Si = span{V }
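In practice, once you're comfortable with the by-hand steps, a library can run the whole pipeline for you. This is not the algorithm above step for step; NumPy's np.linalg.eig does the job numerically, which is handy for checking answers. The matrix here is a made-up example:

```python
# Run the whole eigenvalue/eigenvector pipeline numerically (NumPy assumed).
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # the lambda_i values (here 5 and 2)
print(eigenvectors)   # column i is an eigenvector for eigenvalues[i]

# Note: NumPy scales each eigenvector to unit length, so expect scalar
# multiples of whatever you compute by hand. Each eigenspace is the span
# of the corresponding column(s).
```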

2.2 Finding Eigenvalues


The first step to finding eigenvectors is usually finding the eigenvalues. Thankfully, it’s a pretty consistent
algorithm you can use. Before we jump right in, we’re going to go back to that vital equation.

A⃗v = λ⃗v

Now, subtract the RHS from the LHS and we get

A⃗v − λ⃗v = ⃗0

(A − λI) ⃗v = ⃗0

(Note: the I in λI comes from the fact that we need it to be a matrix of the same size as A, otherwise
we can't subtract it from A (what does a matrix minus a real number even mean??))
Something to mention here is that A needs to be square. Trust me on that, but try to figure out why
that has to be the case. Importantly, we should also observe, hey, if ⃗v is not the zero vector, how do we go
from (A − λI) ⃗v to the zero vector? Well, it doesn't have to be that A − λI is a matrix of zeroes, but
it does have to be the case that A − λI is not invertible. Or, det(A − λI) = 0.
This will always be the first step. Note, though, that we could equivalently use det(λI − A) = 0.
This is commonly used for, as far as I can tell, historical reasons. You will make fewer mistakes if you use
det(A − λI), because all you have to do is subtract λ from the diagonal entries of A. The other way,
you have to negate all of A first. Ew.
So step 1: we're going to take the determinant of A − λI. This will give us a little equation of a bunch of
constants and a variable λ. Oftentimes, the determinant will be a polynomial in λ, such as λ² − 2λ + 1, and we
set it equal to zero. We need some kind of machinery from pre-calc. Usually the quadratic formula will work. If
it's cubic or quartic (or, dear god, hopefully not quintic or larger), you'll need some heavier machinery.
So let’s look at this step in an example:
A = \begin{pmatrix} 1 & -2 \\ -1 & 0 \end{pmatrix}

Find the eigenvalue(s) of A.


Step 1: find A − λI. Going this way means we can just subtract λ from the diagonal entries of A.

A − λI = \begin{pmatrix} 1-λ & -2 \\ -1 & -λ \end{pmatrix}
Now we take the determinant.

det(A − λI) = (1 − λ)(−λ) − (−2)(−1) = λ² − λ − 2

This is called the “characteristic equation”, though this name is used for a lot of different things...
Now we set that characteristic equation equal to zero and solve for λ using synthetic division,
polynomial division, or the quadratic formula. I'll use the quadratic formula.

λ² − λ − 2 = 0

The quadratic formula will be:

λ = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}

So we get:

λ = \frac{-(-1) \pm \sqrt{(-1)^2 - 4(1)(-2)}}{2(1)}

= \frac{1 \pm \sqrt{1 + 8}}{2} = \frac{1 \pm \sqrt{9}}{2} = \frac{1 \pm 3}{2}
This gives us 2 possible values for λ. We'll subscript them to keep track of them separately.

λ1 = −2/2 = −1, λ2 = 4/2 = 2

There we go, our two eigenvalues. Note, it is possible to end up with only "one" eigenvalue here. Since we have a
quadratic (i.e., the highest power on λ is 2), we always count 2 eigenvalues, so in that case, if we only get 1
eigenvalue, we say it is "repeated" or that it has multiplicity 2. As an example, you could
get the equation (λ − 3)² = 0 and your two eigenvalues would be λ1 = 3 and λ2 = 3. The process to find
eigenvectors will be the same and produce the same result for both, so you only need to do it once. But
you should still write out λ1 = λ2 = 3 in this case to get full marks. Eigenvalues can also be complex D:
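If you want to double-check both the determinant algebra and the roots, SymPy (assuming it's installed) can do the symbolic work:

```python
# Symbolic check of the characteristic equation and its roots (SymPy assumed).
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[1, -2],
               [-1, 0]])

char_poly = (A - lam * sp.eye(2)).det()
print(sp.expand(char_poly))      # lambda**2 - lambda - 2
print(sp.solve(char_poly, lam))  # [-1, 2]
```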

2.3 Finding Eigenvectors


Once we know our eigenvalues we need to find our eigenvectors. So our next step is, for each eigenvalue,
to go through and plug that λi in to

(A − λi I)⃗v = ⃗0

and find ⃗v .
This process will always result in at least one free variable, meaning we might get something that looks like
⃗v = s1 u⃗1 + s2 u⃗2 + ... + sk u⃗k depending on how many free variables we've got. Those ⃗u vectors are going to
be our eigenvectors.

So, continuing with our above example, since A is 2 × 2, ⃗v must be in R². So let ⃗v = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}
Recall:

A − λI = \begin{pmatrix} 1-λ & -2 \\ -1 & -λ \end{pmatrix}
We’re going to do the same process for each of the eigenvalues. First:

When λ = λ1 = −1:

A − λI = \begin{pmatrix} 1-(-1) & -2 \\ -1 & -(-1) \end{pmatrix} = \begin{pmatrix} 2 & -2 \\ -1 & 1 \end{pmatrix}

Quick observation: the rows are linearly dependent! This is because we assumed that det(A − λI) = 0,
meaning that it's not invertible! This will always happen, and it means that when we multiply by ⃗v we're going
to get a free variable.

So let's consider (A − λI)⃗v = ⃗0. This will give us a system of linear equations, recalling that ⃗v = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}.

So for us we'll get the equations 2v1 − 2v2 = 0 and −1v1 + 1v2 = 0, but these are the same equation (the
second is −1/2 times the first). So if we take the first equation, v2 will be a free variable.
So let's set v2 = s where s ∈ R. We now have two equations:

v_1 = \frac{2v_2}{2} = v_2 = s

v_2 = s
or ⃗v = (s, s). If we had 2 free variables we might see something like ⃗v = (5s + t, s − 2t) or whatever.
If we had more free variables we’d have more variables in our equations, but the main thing is to replace
our components v1 , v2 , ..., vn with the corresponding equations.
Now, we need to split this up into a sum of vectors so that only one free variable is contained in
each vector. For us, that's already done: (s, s) only has one free variable, but if we had something like
⃗v = (5s + t, s − 2t) we would need to get

⃗v = \begin{pmatrix} 5s \\ s \end{pmatrix} + \begin{pmatrix} t \\ -2t \end{pmatrix}

Then we factor out the free variables, where we get

⃗v = s \begin{pmatrix} 1 \\ 1 \end{pmatrix}

For each free variable we have a constant vector attached to that free variable. These constant vectors
form the set of eigenvectors associated with that eigenvalue. So in our case

V_{λ_1} = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}
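As a check: the eigenvectors for a given λ form a basis of the null space of A − λI, and SymPy (assuming it's installed) can compute that basis directly:

```python
# The eigenvectors for lambda = -1 span the null space of A - lambda*I.
import sympy as sp

A = sp.Matrix([[1, -2],
               [-1, 0]])

print((A - (-1) * sp.eye(2)).nullspace())   # [Matrix([[1], [1]])]
```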

When λ = λ2 = 2: take a moment to try this for yourself. I'll wait.


[blank space left here in the original for your own working]
Good job, I’m proud of you.

I’ll run through this quickly.
A − λ2 I = \begin{pmatrix} -1 & -2 \\ -1 & -2 \end{pmatrix}

REF:

\begin{pmatrix} -1 & -2 \\ 0 & 0 \end{pmatrix}
v2 is our free variable. Let v2 = t where t ∈ R. Then

v_1 = -2v_2 = -2t

⃗v = \begin{pmatrix} -2t \\ t \end{pmatrix} = t \begin{pmatrix} -2 \\ 1 \end{pmatrix}
Therefore the set of eigenvectors associated with λ2 is

V_{λ_2} = \left\{ \begin{pmatrix} -2 \\ 1 \end{pmatrix} \right\}

So we have two sets of eigenvectors:

V_{λ_1} = \left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\} \quad\text{and}\quad V_{λ_2} = \left\{ \begin{pmatrix} -2 \\ 1 \end{pmatrix} \right\}
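A quick numeric sanity check of both eigenpairs (assuming NumPy is available):

```python
# Check A v = lambda v for both hand-computed eigenpairs.
import numpy as np

A = np.array([[1, -2],
              [-1, 0]])

for lam, v in [(-1, np.array([1, 1])), (2, np.array([-2, 1]))]:
    print(A @ v, lam * v)   # the two printed vectors match in each case
```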

2.4 Finding Eigenspaces


Now that we have our eigenvectors Vλi , finding the eigenspaces Sλi is literally just, for each set of vectors,

Sλi = span {Vλi }

You just need to list them like:

S_{λ_1} = span\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}

S_{λ_2} = span\left\{ \begin{pmatrix} -2 \\ 1 \end{pmatrix} \right\}
Note: sometimes you'll see Eλi for eigenspaces. I just like to use V for eigenvector and S for eigenspace,
instead of E for eigenspace and who-knows-what for eigenvector. I think it's more descriptive, but it's up to you;
it doesn't really matter...

3 Diagonalization
The process of diagonalization is literally just finding eigenvalues and eigenvectors. We want to convert
A into the form P DP⁻¹. D is a diagonal matrix with the eigenvalues on the diagonal (note, this is where
the repetition of eigenvalues may come in):

D = \begin{pmatrix} λ_1 & 0 & 0 & \cdots & 0 \\ 0 & λ_2 & 0 & \cdots & 0 \\ 0 & 0 & λ_3 & \cdots & 0 \\ \vdots & & & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & λ_n \end{pmatrix}

and if all of the eigenvectors are v1 , v2 , v3 , ..., vn then P = \begin{pmatrix} v_1 & v_2 & \cdots & v_n \end{pmatrix}.
Where you place which vectors matters: if v2 corresponds to λ2 and you put λ2 in the first column of D for
some reason (where λ1 should go), then v2 also needs to go into the first column of P.
What matters here is that the total number of linearly independent eigenvectors is equal to the number of eigenvalues
(which are not necessarily unique). If you find that λ1 = λ2 = 3 and there is only one eigenvector associated with it, this
isn't going to work out. However, if λ1 = λ2 = 3 and Vλ1 = {v1 , v2 } (and let's assume that's all of the
eigenvalues and eigenvectors), then

D = \begin{pmatrix} 3 & 0 \\ 0 & 3 \end{pmatrix} \quad\text{and}\quad P = \begin{pmatrix} v_1 & v_2 \end{pmatrix}
Once we have P we find the inverse however we'd like and BAM, A = P DP⁻¹.
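Here's a sketch of the full decomposition for our worked example, letting NumPy (assumed installed) handle the inverse:

```python
# Diagonalize the worked example: columns of P are the eigenvectors,
# D holds the matching eigenvalues in the same order.
import numpy as np

A = np.array([[1.0, -2.0],
              [-1.0, 0.0]])

P = np.array([[1.0, -2.0],    # column 1: eigenvector for lambda_1 = -1
              [1.0,  1.0]])   # column 2: eigenvector for lambda_2 = 2
D = np.diag([-1.0, 2.0])

print(P @ D @ np.linalg.inv(P))   # reconstructs A
```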
