Lecture 5

The document discusses principal component analysis (PCA) and linear discriminant analysis (LDA), two techniques for dimensionality reduction. It works through an example of applying PCA to compute eigenvalues and eigenvectors of a covariance matrix and transform data into a new coordinate system, and demonstrates how LDA maximizes separation between classes by finding directions with the greatest variance between groups and the least variance within groups.


PCA and LDA

Lecture 5
DATA (X)

 x₁   x₂
 74   293
 72   325
 67   240
 70   280
 60   190
 86   330
 82   272
 83   300
 65   290
 76   270
 68   190
 72   190
 60   150
 76   249
 74   156
 80   200
 74   144
 94   290
 86   240
 84   220

Mean: μ = (75, 241)
Data (X) → Normalized Data (X − μ)

Normalized values: each sample is xᵢ = (xᵢ₁, xᵢ₂), for i = 1 → n
Mean-subtracted data: X − μ (subtract the mean vector μ from every row of X)
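The mean and the mean-subtracted data can be reproduced with a few lines of NumPy (a minimal sketch; the names `X`, `mu`, and `Xc` are ours, not the lecture's):

```python
import numpy as np

# The 20 two-column samples from the DATA slide.
X = np.array([
    [74, 293], [72, 325], [67, 240], [70, 280], [60, 190],
    [86, 330], [82, 272], [83, 300], [65, 290], [76, 270],
    [68, 190], [72, 190], [60, 150], [76, 249], [74, 156],
    [80, 200], [74, 144], [94, 290], [86, 240], [84, 220],
], dtype=float)

mu = X.mean(axis=0)        # column means; rounds to (75, 241) as on the slide
Xc = X - np.round(mu)      # mean-subtracted data (X - mu)
```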
Covariance

• n = 20 samples
• C = (1 / (n − 1)) × (X − μ)ᵀ × (X − μ)
• C =
    81    214
   214   3378
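The covariance matrix can be checked directly; `np.cov` uses the same n − 1 divisor as the formula above (a sketch, with our own variable names — the result matches the slide's matrix to within rounding):

```python
import numpy as np

X = np.array([
    [74, 293], [72, 325], [67, 240], [70, 280], [60, 190],
    [86, 330], [82, 272], [83, 300], [65, 290], [76, 270],
    [68, 190], [72, 190], [60, 150], [76, 249], [74, 156],
    [80, 200], [74, 144], [94, 290], [86, 240], [84, 220],
], dtype=float)

# Sample covariance with the n-1 divisor; rowvar=False means columns are variables.
C = np.cov(X, rowvar=False)
print(C)   # close to [[81, 214], [214, 3378]]
```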
Principal Component Analysis (PCA)

• Eigenvalues
• |C − λI| = 0
• | 81 − λ      214    |
  |  214     3378 − λ  | = 0
• λ² − 3459λ + 227822 = 0
• λ₁ = 3392, λ₂ = 67
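For a 2×2 matrix the characteristic polynomial is λ² − trace(C)·λ + det(C) = 0, so the coefficients and roots above can be verified numerically (a quick check, not part of the lecture):

```python
import numpy as np

# Covariance matrix taken from the slide.
C = np.array([[81.0, 214.0], [214.0, 3378.0]])

# Characteristic polynomial: lambda^2 - trace*lambda + det = 0
trace, det = np.trace(C), np.linalg.det(C)
lams = np.roots([1.0, -trace, det])   # roots of the quadratic
print(np.sort(lams))                  # approximately [67, 3392]
```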
PCA

• Eigenvectors
• C v = λ v
• | 81    214  |
  | 214  3378  | v = λ v
• For λ₁ = 3392 the eigenvector slope is v₂/v₁ = 15.5 → v₁; for λ₂ = 67 it is v₂/v₁ = −0.065 → v₂
• V =
   0.064     1
   1      −0.064

(the columns of V are the eigenvectors for λ₁ and λ₂, scaled so the larger component is 1)
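The eigenvector slopes can be confirmed with a symmetric eigensolver (a sketch; `np.linalg.eigh` returns unit-length eigenvectors whose signs may differ from the slide, but the component ratios are what matter):

```python
import numpy as np

C = np.array([[81.0, 214.0], [214.0, 3378.0]])

# eigh returns eigenvalues in ascending order, eigenvectors as columns.
vals, vecs = np.linalg.eigh(C)

# Slope = second component / first component of each eigenvector direction.
slope_small = vecs[1, 0] / vecs[0, 0]   # for lambda ~ 67: close to -0.065
slope_large = vecs[1, 1] / vecs[0, 1]   # for lambda ~ 3392: close to 15.5
```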
PCA: Transformed Data

• Transformation: Y = (X − μ) × V
• Y =
    51.9    -4.4
    83.8    -8.5
    -1.4    -8.0
    38.7    -7.6
   -51.9   -11.8
    89.7     5.1
    31.4     4.8
    59.5     4.0
    48.4   -13.2
    29.1    -1.0
   -51.4    -3.8
   -51.1     0.1
   -91.9    -9.3
     8.1     0.3
   -85.0     4.2
   -40.6     7.4
   -97.0     5.0
    50.2    15.7
    -0.2    10.9
   -20.3    10.1
• Covariance of the transformed data (diagonalized, up to rounding):
  Vᵀ | 81    214 | V = | 3406   2.3 |
     | 214  3378 |     |  2.3    67 |
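The projection and the near-diagonal covariance of the transformed data can both be reproduced (a sketch using the slide's rounded mean μ = (75, 241) and its eigenvector matrix V; variable names are ours):

```python
import numpy as np

X = np.array([
    [74, 293], [72, 325], [67, 240], [70, 280], [60, 190],
    [86, 330], [82, 272], [83, 300], [65, 290], [76, 270],
    [68, 190], [72, 190], [60, 150], [76, 249], [74, 156],
    [80, 200], [74, 144], [94, 290], [86, 240], [84, 220],
], dtype=float)
mu = np.array([75.0, 241.0])                 # rounded mean from the slide

# Eigenvector matrix exactly as written on the slide (columns v1, v2).
V = np.array([[0.064, 1.0], [1.0, -0.064]])

Y = (X - mu) @ V                             # transformed data
Cy = np.cov(Y, rowvar=False)                 # ~ [[3406, 2.3], [2.3, 67]]
```

The off-diagonal entries of `Cy` are tiny compared with the diagonal, which is the whole point: in the eigenvector basis the coordinates are (almost) uncorrelated.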
Transformed Data

[Figure: scatter plot comparing the mean-subtracted data (X − μ) with the transformed data (Y) in the new principal-component coordinate system.]
Linear Discriminant Analysis (LDA)

• Maximize the separation between the classes:
  J(w) = (wᵀ S_B w) / (wᵀ S_W w)
• S_W = pooled within-class variance = ((n₁ − 1) × C₁ + (n₂ − 1) × C₂) / (n₁ + n₂ − 2)
• S_B = between-group variance = C − S_W (C = total covariance)
LDA

• Pooled within-class variance
  S_W = | 82.4   295  |
        | 295   1957  |
• Between-group variance
  S_B = C − S_W = | 81   214  | − | 82.4   295  | = | −1.4   −81  |
                  | 214  3378 |   | 295   1957  |   | −81   1421  |
• Separation
  M = S_W⁻¹ × S_B = |  0.3    −8  |
                    | −0.08   1.9 |
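Since the lecture gives S_W directly (the per-class data behind it is not shown), the between-group matrix and the separation matrix can be reproduced from the given matrices alone (a sketch; variable names are ours):

```python
import numpy as np

C  = np.array([[81.0, 214.0], [214.0, 3378.0]])   # total covariance
Sw = np.array([[82.4, 295.0], [295.0, 1957.0]])   # pooled within-class (given)
Sb = C - Sw                                        # between-group variance
M  = np.linalg.solve(Sw, Sb)                       # separation = Sw^-1 @ Sb
print(M)   # close to [[0.3, -8], [-0.08, 1.9]] after rounding
```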
LDA

• |M − λI| = 0
• | 0.3 − λ     −8      |
  | −0.08    1.9 − λ    | = 0
• λ₁ = 2.2, λ₂ = −0.05
• M w = λ w
• W = |  0.95   1    |
      | −0.3    0.04 |

(the columns of W are the eigenvectors for λ₁ and λ₂; the first column, for the dominant eigenvalue, is the discriminant direction)
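The eigen-decomposition of M can be checked the same way. M is not symmetric, so the general eigensolver is needed; computing M from the less-rounded S_W and S_B reproduces the slide's eigenvalues, and the dominant eigenvector comes out close to the slide's (0.95, −0.3) up to sign and the rounding in M (a sketch, names ours):

```python
import numpy as np

Sw = np.array([[82.4, 295.0], [295.0, 1957.0]])
Sb = np.array([[-1.4, -81.0], [-81.0, 1421.0]])
M = np.linalg.solve(Sw, Sb)

# General (non-symmetric) eigensolver; both eigenvalues are real here,
# so NumPy returns real arrays.
vals, vecs = np.linalg.eig(M)
order = np.argsort(vals)[::-1]          # largest eigenvalue first
vals, vecs = vals[order], vecs[:, order]

w1 = vecs[:, 0]                         # discriminant direction (up to sign)
```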
