Materi 5 - 2

The document discusses linear classification and linear discriminant analysis. It explains that linear classification involves decision boundaries that are linear in the input feature space. Linear discriminant analysis assumes that conditional class densities are Gaussian and covariance is equal for every class. It uses training data to estimate parameters like prior probabilities, class means, and the covariance matrix. These parameters are then used to calculate discriminant functions for classification. New data points are assigned to the class with the maximum discriminant function value.


Linear Classifier

Team Teaching

Linear Classification

What is meant by linear classification?
The decision boundaries in the feature (input) space are linear.
Should the regions be contiguous?
[Figure: piecewise linear decision boundaries in a 2D input space, with regions R1-R4]
Linear Classification (cont.)

There is a discriminant function $\delta_k(x)$ for each class $k$.

Classification rule:

$$R_k = \{x : k = \arg\max_j \delta_j(x)\}$$

In higher-dimensional space the decision boundaries are piecewise hyperplanar.

Remember that the 0-1 loss function led to the classification rule:

$$R_k = \{x : k = \arg\max_j \Pr(G = j \mid X = x)\}$$

So $\Pr(G = k \mid X = x)$ can serve as $\delta_k(x)$.
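As a small illustration of this rule, here is a Python sketch (the weights and the test point are made up, not from the slides) of assigning a point to the class whose discriminant is largest:

    import numpy as np

    # hypothetical linear discriminants delta_k(x) = w_k . x + b_k for 3 classes
    W = np.array([[1.0, 0.5], [-0.3, 0.8], [0.2, -1.1]])  # one weight row per class
    b = np.array([0.1, -0.2, 0.3])

    def classify(x):
        # evaluate every delta_k(x) and pick the arg max, i.e. x falls in region R_k
        return int(np.argmax(W @ x + b))

    print(classify(np.array([0.4, 1.2])))  # -> 0, the index of the winning class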
Linear Classification (cont.)

All we require here is that the class boundaries $\{x : \delta_k(x) = \delta_l(x)\}$ be linear for every $(k, l)$ pair.

One can achieve this if the $\delta_k(x)$ themselves are linear, or if some monotone transform of $\delta_k(x)$ is linear.

An example:

$$\Pr(G = 1 \mid X = x) = \frac{\exp(\beta_0 + \beta^T x)}{1 + \exp(\beta_0 + \beta^T x)}, \qquad \Pr(G = 2 \mid X = x) = \frac{1}{1 + \exp(\beta_0 + \beta^T x)}$$

so that

$$\log\left[\frac{\Pr(G = 1 \mid X = x)}{\Pr(G = 2 \mid X = x)}\right] = \beta_0 + \beta^T x$$

Linear!
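A quick numerical check of this identity, as a throwaway Python sketch (the beta values are arbitrary):

    import numpy as np

    beta0, beta = -1.0, np.array([2.0, -0.5])
    x = np.array([0.7, 1.3])
    z = beta0 + beta @ x                # the linear function beta0 + beta^T x
    p1 = np.exp(z) / (1.0 + np.exp(z))  # Pr(G = 1 | X = x)
    p2 = 1.0 / (1.0 + np.exp(z))        # Pr(G = 2 | X = x)
    print(np.log(p1 / p2), z)           # identical values: the log-odds is linear in x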
Linear Discriminant Analysis

Application of Bayes rule gives the posterior probability

$$\Pr(G = k \mid X = x) = \frac{f_k(x)\,\pi_k}{\sum_{l=1}^{K} f_l(x)\,\pi_l}$$

where $\pi_k$ is the prior probability for class $k$, and $f_k(x)$ is the class-conditional density (likelihood) of class $k$. This is essentially the minimum-error Bayes classifier.

LDA assumes that the conditional class densities are (multivariate) Gaussian, and assumes equal covariance for every class:

$$f_k(x) = \frac{1}{(2\pi)^{p/2}\,|\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x - \mu_k)^T \Sigma^{-1} (x - \mu_k)\right)$$
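For concreteness, here is a minimal NumPy version of this class-conditional density; the function name and the sanity-check values are mine, not from the slides:

    import numpy as np

    def gaussian_density(x, mu, sigma):
        # multivariate Gaussian f_k(x) with mean mu and shared covariance sigma
        p = len(mu)
        diff = x - mu
        norm = (2 * np.pi) ** (p / 2) * np.linalg.det(sigma) ** 0.5
        return np.exp(-0.5 * diff @ np.linalg.solve(sigma, diff)) / norm

    # at the mean of a standard bivariate Gaussian the density is 1/(2*pi) ~ 0.159
    print(gaussian_density(np.zeros(2), np.zeros(2), np.eye(2)))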
LDA (cont.)

For any two classes $k$ and $l$:

$$\log\frac{\Pr(G = k \mid X = x)}{\Pr(G = l \mid X = x)} = \log\frac{f_k(x)}{f_l(x)} + \log\frac{\pi_k}{\pi_l}$$

$$= \left(\log\pi_k - \frac{1}{2}\mu_k^T \Sigma^{-1}\mu_k + x^T \Sigma^{-1}\mu_k\right) - \left(\log\pi_l - \frac{1}{2}\mu_l^T \Sigma^{-1}\mu_l + x^T \Sigma^{-1}\mu_l\right) = \delta_k(x) - \delta_l(x)$$

Classification rule:

$$\hat{G}(x) = \arg\max_k \delta_k(x)$$

is equivalent to:

$$\hat{G}(x) = \arg\max_k \Pr(G = k \mid X = x)$$

The good old Bayes classifier!
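Since the denominator of the posterior is common to all classes, each class can be scored by the linear discriminant $\delta_k(x) = x^T \Sigma^{-1}\mu_k - \frac{1}{2}\mu_k^T \Sigma^{-1}\mu_k + \log\pi_k$. A minimal Python sketch (function names are mine):

    import numpy as np

    def delta(x, mu_k, sigma_inv, prior_k):
        # delta_k(x) = x^T Sigma^{-1} mu_k - 0.5 mu_k^T Sigma^{-1} mu_k + log pi_k
        return x @ sigma_inv @ mu_k - 0.5 * mu_k @ sigma_inv @ mu_k + np.log(prior_k)

    def lda_classify(x, mus, sigma_inv, priors):
        # assign x to the class with the largest discriminant value
        scores = [delta(x, mu, sigma_inv, p) for mu, p in zip(mus, priors)]
        return int(np.argmax(scores))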
LDA (cont.)

Training data $(x_i, g_i)$, $i = 1{:}N$, is utilized to estimate:

Prior probabilities: $\hat{\pi}_k = N_k / N$

Means: $\hat{\mu}_k = \sum_{g_i = k} x_i / N_k$

Covariance matrix: $\hat{\Sigma} = \sum_{k=1}^{K} \sum_{g_i = k} (x_i - \hat{\mu}_k)(x_i - \hat{\mu}_k)^T / (N - K)$

Here there are $N$ input-output pairs in total, $N_k$ is the number of pairs in class $k$, and $K$ is the total number of classes.

When are we going to use the training data?
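These three estimators translate directly into NumPy. A sketch under stated assumptions (X is an N-by-p array, g an array of class labels; the function name is mine):

    import numpy as np

    def lda_fit(X, g):
        # estimate priors, class means, and the pooled covariance matrix
        classes = np.unique(g)
        N, p = X.shape
        K = len(classes)
        priors = np.array([np.mean(g == k) for k in classes])      # N_k / N
        mus = np.array([X[g == k].mean(axis=0) for k in classes])  # class means
        sigma = np.zeros((p, p))
        for mu, k in zip(mus, classes):
            d = X[g == k] - mu
            sigma += d.T @ d          # within-class scatter for class k
        sigma /= N - K                # pooled estimate
        return priors, mus, sigma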
LDA: Example

LDA was able to avoid masking here.
Study case

Factory ABC produces very expensive, high-quality chip rings whose quality is measured in terms of curvature and diameter. The result of quality control by the experts is given in the table below.

Curvature   Diameter   Quality Control Result
2.95        6.63       Passed
2.53        7.79       Passed
3.57        5.65       Passed
3.16        5.47       Passed
2.58        4.46       Not passed
2.16        6.22       Not passed
3.27        3.52       Not passed

As a consultant to the factory, you get the task of setting up the criteria for automatic quality control. Then the manager of the factory also wants to test your criteria on a new type of chip ring about which even the human experts argue with each other. The new chip ring has curvature 2.81 and diameter 5.46.

Can you solve this problem by employing Discriminant Analysis?
Solutions

When we plot the features, we can see that the data is linearly separable. We can draw a line to separate the two groups. The problem is to find that line and to rotate the features in such a way as to maximize the distance between groups and minimize the distance within each group.

x = features (or independent variables) of all data. Each row (denoted by $x_i$) represents one object; each column stands for one feature.

y = group of the object (or dependent variable) of all data. Each row represents one object, and it has only one column.

$$x = \begin{bmatrix} 2.95 & 6.63 \\ 2.53 & 7.79 \\ 3.57 & 5.65 \\ 3.16 & 5.47 \\ 2.58 & 4.46 \\ 2.16 & 6.22 \\ 3.27 & 3.52 \end{bmatrix}, \qquad y = \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \\ 2 \\ 2 \\ 2 \end{bmatrix}$$

$x_k$ = data of row $k$; for example, $x_3 = [3.57 \;\; 5.65]$.

g = number of groups in y; in our example, g = 2.

$x_i$ = features data for group $i$. Each row represents one object; each column stands for one feature. We separate x into several groups based on the number of categories in y.
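For readers who want to follow along, the same data in NumPy (variable names are mine):

    import numpy as np

    X = np.array([[2.95, 6.63], [2.53, 7.79], [3.57, 5.65], [3.16, 5.47],
                  [2.58, 4.46], [2.16, 6.22], [3.27, 3.52]])
    y = np.array([1, 1, 1, 1, 2, 2, 2])

    X1, X2 = X[y == 1], X[y == 2]   # the two groups; X[2] recovers x_3 = [3.57, 5.65]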



$$x_1 = \begin{bmatrix} 2.95 & 6.63 \\ 2.53 & 7.79 \\ 3.57 & 5.65 \\ 3.16 & 5.47 \end{bmatrix}, \qquad x_2 = \begin{bmatrix} 2.58 & 4.46 \\ 2.16 & 6.22 \\ 3.27 & 3.52 \end{bmatrix}$$

$\mu_i$ = mean of features in group $i$, which is the average of $x_i$:

$$\mu_1 = [3.05 \;\; 6.38], \qquad \mu_2 = [2.67 \;\; 4.73]$$

$\mu$ = global mean vector, that is, the mean of the whole data set. In this example,

$$\mu = [2.88 \;\; 5.676]$$
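Continuing the NumPy thread (same variables as in the sketch above):

    mu1 = X1.mean(axis=0)   # -> [3.0525, 6.385 ], rounded to [3.05, 6.38] above
    mu2 = X2.mean(axis=0)   # -> [2.67  , 4.7333], rounded to [2.67, 4.73]
    mu  = X.mean(axis=0)    # -> [2.8886, 5.6771], rounded to [2.88, 5.676]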
$x_i^0$ = mean-corrected data, that is, the features data for group $i$, $x_i$, minus the global mean vector:

$$x_1^0 = \begin{bmatrix} 0.060 & 0.951 \\ -0.357 & 2.109 \\ 0.679 & -0.025 \\ 0.269 & -0.209 \end{bmatrix}, \qquad x_2^0 = \begin{bmatrix} -0.305 & -1.218 \\ -0.732 & 0.547 \\ 0.386 & -2.155 \end{bmatrix}$$
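In code this is a single broadcast subtraction, reproducing the matrices above up to rounding:

    x1_0 = X1 - mu   # mean-corrected data for group 1
    x2_0 = X2 - mu   # mean-corrected data for group 2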
Covariance matrix of group $i$:

$$c_i = \frac{(x_i^0)^T x_i^0}{n_i}$$

$$C_1 = \begin{bmatrix} 0.166 & -0.192 \\ -0.192 & 1.349 \end{bmatrix}, \qquad C_2 = \begin{bmatrix} 0.259 & -0.286 \\ -0.286 & 2.142 \end{bmatrix}$$
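The same computation in NumPy; note the division by the group size n_i. Third decimals differ slightly from the slides because of rounding of the global mean:

    c1 = x1_0.T @ x1_0 / len(X1)   # -> approx [[0.168, -0.193], [-0.193, 1.354]]
    c2 = x2_0.T @ x2_0 / len(X2)   # -> approx [[0.257, -0.281], [-0.281, 2.143]]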


C = pooled within-group covariance matrix. It is calculated entry by entry:

$$C(r, s) = \frac{1}{n}\sum_{i=1}^{g} n_i \, c_i(r, s)$$

In our example, $4/7 \times 0.166 + 3/7 \times 0.259 = 0.206$, $4/7 \times (-0.192) + 3/7 \times (-0.286) = -0.233$, and $4/7 \times 1.349 + 3/7 \times 2.142 = 1.689$; therefore

$$C = \begin{bmatrix} 0.206 & -0.233 \\ -0.233 & 1.689 \end{bmatrix}$$

The inverse of the covariance matrix is:

$$C^{-1} = \begin{bmatrix} 5.745 & 0.791 \\ 0.791 & 0.701 \end{bmatrix}$$
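And the pooled matrix and its inverse in NumPy (again matching the slides' figures up to rounding):

    n = len(X)
    C = (len(X1) * c1 + len(X2) * c2) / n   # -> approx [[0.206, -0.231], [-0.231, 1.692]]
    C_inv = np.linalg.inv(C)                # -> approx [[5.73, 0.78], [0.78, 0.70]]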
p = prior probability vector (each row represents the prior probability of group $i$). If we do not know the prior probabilities, we simply assume each equals the number of samples in that group divided by the total number of samples, that is,

$$p = \begin{bmatrix} 4/7 \\ 3/7 \end{bmatrix} = \begin{bmatrix} 0.571 \\ 0.429 \end{bmatrix}$$
Discriminant function:

$$f_i = \mu_i C^{-1} x_k^T - \frac{1}{2}\mu_i C^{-1} \mu_i^T + \ln(p_i)$$

We should assign object $k$ to the group $i$ that has the maximum $f_i$.
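Putting it all together for the new chip ring (curvature 2.81, diameter 5.46), a sketch using the quantities computed above:

    x_new = np.array([2.81, 5.46])
    p1, p2 = 4 / 7, 3 / 7

    def f(mu_i, prior_i):
        # f_i = mu_i C^{-1} x^T - 0.5 mu_i C^{-1} mu_i^T + ln(p_i)
        return mu_i @ C_inv @ x_new - 0.5 * mu_i @ C_inv @ mu_i + np.log(prior_i)

    f1, f2 = f(mu1, p1), f(mu2, p2)
    print(f1, f2)   # both ~43.8 and very close; f2 comes out slightly larger,
                    # so this sketch assigns the new ring to group 2 (not passed)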
