
Introduction to Dynamical Systems

UDGRP 2024

Subhojit Maji

Indian Statistical Institute

November 29, 2024

Table of Contents

1 Some Tools

2 Main Problem

3 Conclusion

Some Tools

Similar Matrices

Definition
Two operators R and T on a vector space V are similar if there exists an
invertible operator S on V such that R = S⁻¹TS.

Properties of Similar Matrices

Rank(R) = Rank(T).
R and T have the same eigenvalues.
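As a quick numerical check of the two properties above (the matrices here are made-up examples, not from the slides), rank and eigenvalue invariance can be verified with NumPy:

```python
import numpy as np

# Hypothetical example: an operator T and an invertible S, with R = S^{-1} T S.
T = np.array([[2.0, 1.0],
              [0.0, 3.0]])
S = np.array([[1.0, 1.0],
              [0.0, 1.0]])
R = np.linalg.inv(S) @ T @ S

# Similar matrices have the same rank ...
assert np.linalg.matrix_rank(R) == np.linalg.matrix_rank(T)

# ... and the same eigenvalues (compared as sorted multisets).
assert np.allclose(np.sort(np.linalg.eigvals(R)),
                   np.sort(np.linalg.eigvals(T)))
```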

Some Tools

What is an Eigenvector?

[Figure: a vector v⃗ in the xy-plane together with its image T(v⃗) = Av⃗ = 2v⃗, which lies along the same direction.]

Eigenvectors: a linear transformation, once a basis is fixed, is essentially
a matrix multiplication,

T : Rⁿ → Rⁿ, T(v) = Av.

Some vectors preserve their direction under T; these are called eigenvectors.
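A minimal NumPy sketch of this direction-preserving property (the matrix A below is an assumed toy example):

```python
import numpy as np

# Toy example: A scales the x-axis by 2 and the y-axis by 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# v points along the x-axis, so A only stretches it: A v = 2 v.
v = np.array([1.0, 0.0])
assert np.allclose(A @ v, 2 * v)

# A generic vector does not keep its direction: A w is not a multiple of w.
w = np.array([1.0, 1.0])
Aw = A @ w
assert w[0] * Aw[1] - w[1] * Aw[0] != 0  # nonzero "2D cross product" => not parallel
```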

Main Problem

Question Discussion

Main Problem
An n×n matrix A is diagonalizable if and only if A has n linearly
independent eigenvectors.

Moreover, in this case, let P be the invertible matrix whose columns are n
linearly independent eigenvectors of A, and let D be the diagonal matrix
whose diagonal entries are the corresponding eigenvalues. Then
P⁻¹AP = D.

Main Problem

Claim 1

Claim
Let v1 , . . . , vk be eigenvectors of A corresponding to distinct eigenvalues
λ1 , . . . , λk . Then v1 , . . . , vk are linearly independent.

Main Problem

Proof of the Claim:
Proof: We apply induction on k. The case k = 1 is clear, since eigenvectors
are nonzero. Suppose k ≥ 2 and

c1 v1 + c2 v2 + · · · + ck vk = 0

for some scalars c1 , c2 , . . . , ck . Applying A to both sides gives

c1 λ1 v1 + c2 λ2 v2 + · · · + ck λk vk = 0.

Subtracting this from λ1 times the first equation,

λ1 (c1 v1 + c2 v2 + · · · + ck vk ) − (λ1 c1 v1 + λ2 c2 v2 + · · · + λk ck vk )

= (λ1 − λ2 )c2 v2 + (λ1 − λ3 )c3 v3 + · · · + (λ1 − λk )ck vk = 0.

By the induction hypothesis, v2 , v3 , . . . , vk are linearly independent, so
(λ1 − λj )cj = 0 for j = 2, 3, . . . , k. Since λ1 ̸= λj for j = 2, 3, . . . , k, we
get cj = 0 for j = 2, 3, . . . , k. The first equation then reduces to c1 v1 = 0,
and since v1 ̸= 0, c1 = 0 as well. Thus v1 , v2 , . . . , vk are
linearly independent.
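The claim can be sanity-checked numerically; the 3×3 matrix below is a made-up example whose eigenvalues (1, 2, 3) are distinct, so its eigenvectors should span R³:

```python
import numpy as np

# Hypothetical upper-triangular example: the eigenvalues are the diagonal 1, 2, 3.
A = np.array([[1.0, 1.0, 1.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])
lams, V = np.linalg.eig(A)   # columns of V are eigenvectors

# The eigenvalues are pairwise distinct ...
assert len(set(np.round(lams, 8))) == 3
# ... so the eigenvectors are linearly independent (full-rank eigenvector matrix).
assert np.linalg.matrix_rank(V) == 3
```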
Main Problem

Proof of the Main Problem:

Proof. Assume that A has n linearly independent eigenvectors v1 , . . . , vn .
Let λ1 , . . . , λn be the corresponding eigenvalues, so that

Avi = λi vi

for all i = 1, . . . , n. Let P be the matrix that has v1 , . . . , vn as its columns.
Then P is invertible because v1 , . . . , vn are linearly independent. Let D be
the diagonal matrix that has λ1 , . . . , λn as its diagonal entries. By the
column method of matrix multiplication, the ith column of AP is Avi .
Also by the column method of matrix multiplication, the ith column of PD
is λi vi . Therefore, the matrices AP and PD have the same columns, i.e.,

AP = PD.

It follows that P⁻¹AP = D.

Conversely, if P⁻¹AP = D for some invertible P and diagonal D, then
AP = PD, so the ith column vi of P satisfies Avi = di vi , where di is the
ith diagonal entry of D. The columns of an invertible matrix are nonzero
and linearly independent, so A has n linearly independent eigenvectors,
as desired.
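As a numerical illustration (the matrix A below is an assumed example with distinct eigenvalues 2 and 5, so by Claim 1 its eigenvector matrix P is invertible):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of P are eigenvectors; entries of lams are the matching eigenvalues.
lams, P = np.linalg.eig(A)
D = np.diag(lams)

# Column by column: the ith column of AP is A v_i = lam_i v_i, the ith column of PD.
assert np.allclose(A @ P, P @ D)

# Hence P^{-1} A P = D, i.e. A is diagonalizable.
assert np.allclose(np.linalg.inv(P) @ A @ P, D)
```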

Conclusion

Summary

What We Learned:
Similar matrices and some of their properties.
A geometric view of eigenvectors.
Diagonalizable matrices.
Distinct eigenvalues correspond to linearly independent eigenvectors.

Conclusion

Thank You!
Questions?
