Probability Slides
expectation
For a discrete r.v. X with p.m.f. p(•), the expectation of X, a.k.a. expected
value or mean, is

E[X] = Σx x•p(x)

i.e., the average of the possible random values, weighted by their respective
probabilities.

For the equally-likely outcomes case, this is just the plain average of the
possible random values of X.
For unequally-likely outcomes, it is again the average of the possible
random values of X, weighted by their respective probabilities.
Ex 1: Let X = value seen rolling a fair die; p(1) = p(2) = ... = p(6) = 1/6,
so E[X] = (1 + 2 + ... + 6)/6 = 3.5
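A quick numerical check (my own sketch, not from the slides) of the fair-die example, computing the weighted sum in the definition directly:

```python
# Expectation of a fair six-sided die via the definition E[X] = Σx x·p(x).
pmf = {x: 1/6 for x in range(1, 7)}  # equally likely outcomes

ex = sum(x * p for x, p in pmf.items())
print(ex)  # 3.5
```

For equally likely outcomes the weights are all 1/6, so this reduces to the plain average (1+2+...+6)/6.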
properties of expectation
Linearity of expectation, I
For any constants a, b: E[aX + b] = aE[X] + b
Proof: E[aX + b] = Σx (ax + b)p(x) = a Σx x p(x) + b Σx p(x) = aE[X] + b,
since Σx p(x) = 1.
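Linearity can be checked numerically; a small Python sketch (mine, not from the slides), using the fair-die p.m.f. as an assumed example:

```python
# Check E[aX + b] = a·E[X] + b on a fair six-sided die (assumed example).
pmf = {x: 1/6 for x in range(1, 7)}
a, b = 2.0, 3.0

E_X = sum(x * p for x, p in pmf.items())              # E[X] = 3.5
E_aXb = sum((a * x + b) * p for x, p in pmf.items())  # E[aX + b], computed directly
assert abs(E_aXb - (a * E_X + b)) < 1e-12
```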
properties of expectation–example
A & B each bet $1, then flip 2 coins. Let X = A’s net gain: +1, 0, or -1:

HH → A wins $2              P(X = +1) = 1/4
HT → each takes back $1
TH → each takes back $1     P(X = 0) = 1/2
TT → B wins $2              P(X = -1) = 1/4

What is E[X]?
E[X] = 1•1/4 + 0•1/2 + (-1)•1/4 = 0

What is E[X²]?
E[X²] = 1²•1/4 + 0²•1/2 + (-1)²•1/4 = 1/2

What is E[2X+1]?
E[2X + 1] = 2E[X] + 1 = 2•0 + 1 = 1
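All three quantities follow from the p.m.f. alone; a Python sketch (my own, not from the slides) reproducing them:

```python
# p.m.f. of A's net gain X in the two-coin betting example.
pmf = {+1: 1/4, 0: 1/2, -1: 1/4}

E_X   = sum(x * p for x, p in pmf.items())           # E[X]     = 0
E_X2  = sum(x**2 * p for x, p in pmf.items())        # E[X²]    = 1/2
E_2X1 = sum((2*x + 1) * p for x, p in pmf.items())   # E[2X+1]  = 1
```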
properties of expectation
Note:
Linearity is special!
It is not true in general that E[g(X)] = g(E[X]); e.g., in the betting
example above, E[X²] = 1/2, while (E[X])² = 0² = 0.
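The betting example already gives a counterexample for g(x) = x²; a short sketch (mine, not from the slides) showing the two sides disagree:

```python
# With g(x) = x², E[g(X)] and g(E[X]) differ for the betting p.m.f.
pmf = {+1: 1/4, 0: 1/2, -1: 1/4}

E_X  = sum(x * p for x, p in pmf.items())     # E[X] = 0
E_X2 = sum(x**2 * p for x, p in pmf.items())  # E[X²] = 1/2
print(E_X2, E_X**2)  # 0.5 0.0
```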
variance
what does variance tell us?
The variance of a random variable X with mean E[X] = μ is
Var[X] = E[(X−μ)²], often denoted σ².
[Figure: density f(x) of a normal distribution with µ = 0, σ = 1,
plotted for −3 ≤ x ≤ 3]
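Computing the definition directly for a discrete case (a fair die, my assumed example, not from the slides):

```python
# Var[X] = E[(X − μ)²] for the fair-die p.m.f.
pmf = {x: 1/6 for x in range(1, 7)}

mu = sum(x * p for x, p in pmf.items())                # μ = 3.5
var_x = sum((x - mu)**2 * p for x, p in pmf.items())   # 35/12 ≈ 2.917
```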
properties of variance
NOT linear:
Var[aX+b] = a² Var[X]
insensitive to location (b), quadratic in scale (a)

Ex:
E[X] = 0, Var[X] = 1
Y = 1000 X
E[Y] = E[1000 X] = 1000 E[X] = 0
Var[Y] = Var[10³ X] = 10⁶ Var[X] = 10⁶
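The scaling rule can be verified numerically; a sketch (my own, using the fair-die p.m.f. as an assumed example):

```python
# Check Var[aX + b] = a²·Var[X] on a fair-die p.m.f.
pmf = {x: 1/6 for x in range(1, 7)}

def var(g):
    """Variance of g(X) under pmf, via E[(g(X) − E[g(X)])²]."""
    mu = sum(g(x) * p for x, p in pmf.items())
    return sum((g(x) - mu)**2 * p for x, p in pmf.items())

a, b = 1000.0, 7.0
# The shift b drops out; the scale a enters squared.
assert abs(var(lambda x: a*x + b) - a**2 * var(lambda x: x)) < 1e-6
```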
independence and joint distributions
variance of independent r.v.s is additive
(Bienaymé, 1853)
Theorem: If X & Y are independent (any distribution, not just binomial), then
Var[X+Y] = Var[X]+Var[Y]
Alternate Proof: see slide 60
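The theorem can be checked by brute force over a joint distribution; a sketch (my own, not from the slides) for two independent fair dice:

```python
# Brute-force check of Bienaymé: for two independent fair dice,
# Var[X + Y] over the 36 equally likely joint outcomes equals 2·Var[X].
from itertools import product

faces = range(1, 7)

def var(values, probs):
    mu = sum(v * p for v, p in zip(values, probs))
    return sum((v - mu)**2 * p for v, p in zip(values, probs))

vx = var(list(faces), [1/6] * 6)                  # Var[X] = 35/12
sums = [x + y for x, y in product(faces, faces)]  # all 36 outcomes of X + Y
vsum = var(sums, [1/36] * 36)                     # Var[X + Y] = 35/6
assert abs(vsum - 2 * vx) < 1e-9
```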