Lecture 8
Linear Discriminant Function

    g(x) = w^T x + w_0,    x ∈ R^d,  w ∈ R^d,  w_0 ∈ R.

Decision boundary: g(x) = 0, i.e., w^T x + w_0 = 0.

For any two points x_1, x_2 on the boundary,

    w^T x_1 + w_0 = 0   and   w^T x_2 + w_0 = 0,

so subtracting gives

    w^T (x_1 - x_2) = 0.

→ w is normal to the separating hyperplane.
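As a quick sanity check, a minimal numpy sketch (the values of w, w_0 and the two boundary points are made up for illustration) confirms that the difference of two points on the boundary is orthogonal to w:

```python
import numpy as np

# a hypothetical discriminant g(x) = w^T x + w_0 in R^2
w = np.array([1.0, 2.0])
w0 = -4.0

# two points on the boundary g(x) = 0
x1 = np.array([0.0, 2.0])   # 1*0 + 2*2 - 4 = 0
x2 = np.array([4.0, 0.0])   # 1*4 + 2*0 - 4 = 0

# w is orthogonal to any direction lying in the hyperplane
print(w @ (x1 - x2))   # → 0.0
```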
Distance from a point to the separating hyperplane

Let x_p be the projection of a point x_0 onto the hyperplane:

    x_p = argmin_x ||x - x_0||   s.t.  w^T x + w_0 = 0.

Since w is normal to the hyperplane, write x_0 = x_p + r · w/||w||. Then

    g(x_0) = w^T x_0 + w_0 = (w^T x_p + w_0) + r ||w|| = r ||w||,

so the signed distance is

    r = g(x_0) / ||w||,   and   x_0 - x_p = (g(x_0) / ||w||^2) w.
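The projection formula can be checked numerically; the hyperplane and query point below are made-up values for illustration:

```python
import numpy as np

# hypothetical hyperplane w^T x + w_0 = 0 and query point x0
w = np.array([3.0, 4.0])
w0 = -5.0
x0 = np.array([2.0, 1.0])

g = w @ x0 + w0                       # g(x0) = 3*2 + 4*1 - 5 = 5
r = g / np.linalg.norm(w)             # signed distance = 5/5 = 1
x_p = x0 - (g / (w @ w)) * w          # projection onto the hyperplane

print(w @ x_p + w0)                   # → 0.0 (x_p lies on the hyperplane)
print(np.linalg.norm(x0 - x_p))       # → 1.0 (equals |r|)
```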
[Worked example: the distance from a given point to a specific hyperplane is computed directly from the constraint; the answer is the same as the previous formula.]
[Figure: two scatter plots — left, a data set that is not linearly separable; right, a linearly separable data set.]
Separating Hyperplane Theorem

Let C_1 and C_2 be two closed convex sets such that C_1 ∩ C_2 = ∅. Then there exists a linear function

    g(x) = w^T x + w_0

such that

    g(x) > 0  ∀ x ∈ C_1,   and   g(x) < 0  ∀ x ∈ C_2.

[Figure: a hyperplane separating two disjoint convex sets; for nonconvex sets such a separating hyperplane may not exist.]
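A minimal sketch of the theorem's construction on synthetic data, assuming two well-separated point clouds (the closest pair of sample points stands in for the closest points of the two convex hulls):

```python
import numpy as np

rng = np.random.default_rng(0)
C1 = rng.normal(loc=[3.0, 3.0], scale=0.4, size=(50, 2))
C2 = rng.normal(loc=[-3.0, -3.0], scale=0.4, size=(50, 2))

# closest pair of points between the two clouds
dists = np.linalg.norm(C1[:, None, :] - C2[None, :, :], axis=2)
i, j = np.unravel_index(dists.argmin(), dists.shape)
x_star, y_star = C1[i], C2[j]

# the theorem's construction: w = x* - y*, x0 = (x* + y*)/2
w = x_star - y_star
w0 = -w @ ((x_star + y_star) / 2)
g = lambda X: X @ w + w0

print((g(C1) > 0).all(), (g(C2) < 0).all())   # → True True
```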
Linear Discriminant Analysis

[Figure: two classes of points (x's and o's) separated by a linear boundary.]

Two families of methods:
① Generative methods: build a model of the data, then derive the separating boundary from the model.
② Discriminative methods: determine the separating boundary directly.
Generative Approaches

Model each class i by a Gaussian N(x | μ_i, Σ_i) with prior distribution P(i) = π_i; the decision boundary will take the form g(x) = w^T x + w_0 = 0.

Gaussian random vector X ~ N(μ, Σ):

    p_X(x) = 1 / sqrt((2π)^d |Σ|) · exp( -(1/2)(x - μ)^T Σ^{-1} (x - μ) ).

Mean vector: μ = E[X] = (E[X_1], …, E[X_d])^T.

Covariance matrix:

    Σ = E[(X - μ)(X - μ)^T],

with Var[X_i] on the diagonal and Cov[X_i, X_j] off the diagonal. Σ is always positive semi-definite.

Special case: Σ = σ^2 I. Then X_i, X_j are independent ∀ i ≠ j, and

    p_X(x) = 1 / (2πσ^2)^{d/2} · exp( -(1/(2σ^2)) ||x - μ||^2 ).
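A small numerical check of the special case Σ = σ²I, where the joint density factorizes into a product of 1-D normal densities (the numbers below are arbitrary):

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    """Multivariate normal density, straight from the formula above."""
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff)) / norm

sigma = 1.5
mu = np.array([1.0, -2.0])
x = np.array([0.3, 0.7])

p = gaussian_pdf(x, mu, sigma ** 2 * np.eye(2))

# product of two 1-D normal densities
p_factored = np.prod(
    np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)
)

print(np.isclose(p, p_factored))   # → True
```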
Appendix: Proof of the Separating Hyperplane Theorem

Let x* ∈ C_1 and y* ∈ C_2 be the closest pair of points between the two sets, and set

    w = x* - y*,   x_0 = (x* + y*)/2,   g(x) = w^T (x - x_0).

Then

    g(x) = (x* - y*)^T x - (||x*||^2 - ||y*||^2) / 2.

According to the picture, we want g(x) > 0 for all x ∈ C_1.

Suppose not. Assume there is an x ∈ C_1 with

    g(x) = (x* - y*)^T x - (||x*||^2 - ||y*||^2) / 2 < 0.

See if we can find a contradiction.

© Stanley Chan 2020. All Rights Reserved.
Proof of Separating Hyperplane Theorem

C_1 is convex. Pick any x ∈ C_1, and recall x* ∈ C_1. Let 0 ≤ λ ≤ 1 and construct the point

    x_λ = (1 - λ) x* + λ x.

Convexity means x_λ ∈ C_1. Since x* is the closest point of C_1 to y*, we must have

    ||x_λ - y*|| ≥ ||x* - y*||.
Proof of Separating Hyperplane Theorem

    ||x_λ - y*||^2 = ||(1 - λ)x* + λx - y*||^2
                  = ||x* - y* + λ(x - x*)||^2
                  = ||x* - y*||^2 + 2λ (x* - y*)^T (x - x*) + λ^2 ||x - x*||^2
                  = ||x* - y*||^2 + 2λ w^T (x - x*) + λ^2 ||x - x*||^2.

Remember: w^T (x - x_0) < 0.
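The algebraic expansion above can be verified numerically for random vectors and several values of λ:

```python
import numpy as np

rng = np.random.default_rng(1)
x_star, y_star, x = rng.normal(size=(3, 2))
w = x_star - y_star

# check ||x_lam - y*||^2 = ||x*-y*||^2 + 2λ w^T(x - x*) + λ^2 ||x - x*||^2
ok = []
for lam in (0.1, 0.5, 0.9):
    x_lam = (1 - lam) * x_star + lam * x
    lhs = np.linalg.norm(x_lam - y_star) ** 2
    rhs = (np.linalg.norm(x_star - y_star) ** 2
           + 2 * lam * w @ (x - x_star)
           + lam ** 2 * np.linalg.norm(x - x_star) ** 2)
    ok.append(np.isclose(lhs, rhs))

print(all(ok))   # → True
```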
Proof of Separating Hyperplane Theorem

    ||x_λ - y*||^2 = ||x* - y*||^2 + 2λ w^T (x - x*) + λ^2 ||x - x*||^2
                   < ||x* - y*||^2 + 2λ (w^T x_0 - w^T x*) + λ^2 ||x - x*||^2
                   = ||x* - y*||^2 + 2λ ( (||x*||^2 - ||y*||^2)/2 - w^T x* ) + λ^2 ||x - x*||^2
                   = ||x* - y*||^2 - λ ||x* - y*||^2 + λ^2 ||x - x*||^2
                     (using (||x*||^2 - ||y*||^2)/2 - w^T x* = -||x* - y*||^2 / 2)
                   = ||x* - y*||^2 - λA + λ^2 B,   where A := ||x* - y*||^2,  B := ||x - x*||^2,
                   = ||x* - y*||^2 - λ(A - λB).

Now, pick a λ such that A - λB > 0. Then -λ(A - λB) < 0. This requires

    λ < A/B = ||x* - y*||^2 / ||x - x*||^2.
Proof of Separating Hyperplane Theorem

Therefore, if we choose λ such that A - λB > 0, i.e.,

    λ < A/B = ||x* - y*||^2 / ||x - x*||^2,

then

    ||x_λ - y*||^2 < ||x* - y*||^2 - λ(A - λB) < ||x* - y*||^2,

which contradicts ||x_λ - y*|| ≥ ||x* - y*||.

Conclusion:
- If x ∈ C_1, then g(x) > 0.
- By symmetry, if x ∈ C_2, then g(x) < 0.
- And we have found the separating hyperplane (w, w_0).
Q&A 1: What is a convex set?