Lecture 06
Expectation
Chapter 4 sections
4.1 Expectation
4.2 Properties of Expectations
4.3 Variance
4.4 Moments
4.5 The Mean and the Median
4.6 Covariance and Correlation
4.7 Conditional Expectation
SKIP: 4.8 Utility
Summarizing distributions
Median, Moments
Variance, Interquartile Range (IQR), Covariance, Correlation
Def: Expectation
Let X be a random variable with p(d)f f(x). The expected value (mean) of X is
X discrete:
E(X) = \sum_{\text{all } x} x f(x)
X continuous:
E(X) = \int_{-\infty}^{\infty} x f(x) \, dx
If the sum or integral does not exist, we say that the expected value
does not exist.
The mean is often denoted by µ.
Examples
Recall the distribution of Y = the number of heads in 3 tosses
(coin toss example from Lecture 4):

y       0     1     2     3
f_Y(y)  1/8   3/8   3/8   1/8

then
E(Y) = 0 \cdot \frac{1}{8} + 1 \cdot \frac{3}{8} + 2 \cdot \frac{3}{8} + 3 \cdot \frac{1}{8} = \frac{12}{8} = \frac{3}{2} = 1.5
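As a quick sanity check, a minimal Monte Carlo sketch (my own illustration, assuming fair, independent tosses); the estimate should land near the exact value 1.5:

import random

# Estimate E(Y) for Y = number of heads in 3 independent fair tosses
trials = 100_000
total = 0
for _ in range(trials):
    total += sum(random.random() < 0.5 for _ in range(3))  # one draw of Y
print(total / trials)  # close to 1.5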
Expectation of g(X)
Theorem 4.1.1
Let X be a random variable with p(d)f f(x) and g(x) be a real-valued
function. Then
X discrete:
E(g(X)) = \sum_{\text{all } x} g(x) f(x)
X continuous:
E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x) \, dx
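For the discrete case the theorem is just a weighted sum over the support; a minimal sketch using the coin-toss pmf above, with g(x) = x and g(x) = x^2 chosen for illustration:

# pmf of Y = number of heads in 3 fair tosses
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

def expect(g, pmf):
    # Theorem 4.1.1, discrete case: E(g(X)) = sum of g(x) * f(x) over all x
    return sum(g(x) * p for x, p in pmf.items())

print(expect(lambda x: x, pmf))     # E(Y)   = 1.5
print(expect(lambda x: x**2, pmf))  # E(Y^2) = 3.0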
Expectation of g(X, Y)
Theorem 4.1.2
Let X and Y be random variables with joint p(d)f f(x, y) and let g(x, y)
be a real-valued function. Then
X and Y discrete:
E(g(X, Y)) = \sum_{\text{all } x, y} g(x, y) f(x, y)
X and Y continuous:
E(g(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f(x, y) \, dx \, dy
Example: Find E\left(\frac{X + Y}{2}\right), where X and Y are independent and
X ∼ Uniform(a, b) and Y ∼ Uniform(c, d).
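A sketch of one way to evaluate this, using the linearity results of Section 4.2 together with the fact that a Uniform(a, b) random variable has mean (a + b)/2:

E\left(\frac{X+Y}{2}\right) = \frac{1}{2}\bigl(E(X) + E(Y)\bigr) = \frac{1}{2}\left(\frac{a+b}{2} + \frac{c+d}{2}\right) = \frac{a+b+c+d}{4}

Note that only linearity is used here; the independence of X and Y is not needed for the expectation of a sum.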
Properties of Expectation
Theorems 4.2.1, 4.2.4 and 4.2.6:
E(aX + b) = a E(X) + b for constants a and b.
Let X_1, \ldots, X_n be n random variables, all with finite expectations
E(X_i). Then
E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i)
Examples
If X_1, X_2, \ldots, X_n are i.i.d. Bernoulli(p) random variables, then
Y = \sum_{i=1}^{n} X_i \sim Binomial(n, p).
E(X_i) = 0 \times (1 - p) + 1 \times p = p for i = 1, \ldots, n
\Rightarrow E(Y) = E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i) = \sum_{i=1}^{n} p = np
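A minimal simulation sketch of this identity (the values n = 10, p = 0.3 are my own choice):

import random

# Y = sum of n i.i.d. Bernoulli(p) draws; the sample mean of Y should approach np
n, p, reps = 10, 0.3, 100_000
ys = [sum(random.random() < p for _ in range(n)) for _ in range(reps)]
print(sum(ys) / reps)  # empirical mean, close to n*p = 3.0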
Def: Variance
Let X be a random variable (discrete or continuous) with a finite mean
µ = E(X). The variance of X is defined as
Var(X) = E\left((X - \mu)^2\right)
The standard deviation of X is defined as \sqrt{Var(X)}.
For the coin toss example:
Var(Y) = (0 - 1.5)^2 \cdot \frac{1}{8} + (1 - 1.5)^2 \cdot \frac{3}{8} + (2 - 1.5)^2 \cdot \frac{3}{8} + (3 - 1.5)^2 \cdot \frac{1}{8} = 0.75
Var(aX + b) = a^2 Var(X) for constants a and b.
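A sketch checking both facts numerically with the coin-toss pmf (the constants a = 2, b = 5 are arbitrary choices):

pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

# Var(X) = E((X - mu)^2), computed exactly from the pmf
mu = sum(x * p for x, p in pmf.items())
var = sum((x - mu) ** 2 * p for x, p in pmf.items())
print(var)  # 0.75

# Check Var(aX + b) = a^2 Var(X)
a, b = 2, 5
var_ab = sum((a * x + b - (a * mu + b)) ** 2 * p for x, p in pmf.items())
print(var_ab, a ** 2 * var)  # both 3.0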
Examples
If X_1, X_2, \ldots, X_n are i.i.d. Bernoulli(p) random variables, then
Y = \sum_{i=1}^{n} X_i \sim Binomial(n, p).
E(X_i) = p for i = 1, \ldots, n
E(X_i^2) = 0^2 \times (1 - p) + 1^2 \times p = p for i = 1, \ldots, n
\Rightarrow Var(X_i) = E(X_i^2) - [E(X_i)]^2 = p - p^2 = p(1 - p)
\Rightarrow Var(Y) = Var\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} Var(X_i) = np(1 - p),
where the second equality uses the independence of the X_i.
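An exact cross-check of np(1 - p) against the Binomial(n, p) pmf (again with n = 10, p = 0.3 as illustrative values):

from math import comb

n, p = 10, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

mu = sum(k * q for k, q in pmf.items())
var = sum((k - mu) ** 2 * q for k, q in pmf.items())
print(mu, var)                 # 3.0 2.1 (up to rounding)
print(n * p, n * p * (1 - p))  # 3.0 2.1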
Def: Moments
Let X be a random variable and k be a positive integer.
The expectation E(X^k) is called the k-th moment of X.
Let E(X) = µ. The expectation E\left((X - \mu)^k\right) is called the k-th
central moment of X.
Theorem 4.4.2
Let X be a random variable whose m.g.f. \psi(t) = E(e^{tX}) is finite for t in an open
interval around zero. Then the n-th moment of X is finite for
n = 1, 2, \ldots, and
E(X^n) = \left. \frac{d^n}{dt^n} \psi(t) \right|_{t=0}
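A sketch verifying the theorem symbolically (this uses sympy and the m.g.f. ψ(t) = λ/(λ - t) of an Exponential(λ) variable, a distribution chosen here purely for illustration):

import sympy as sp

t, lam = sp.symbols("t lambda", positive=True)
psi = lam / (lam - t)  # m.g.f. of Exponential(lambda), finite for t < lambda

# n-th moment of X = n-th derivative of psi evaluated at t = 0
for n in (1, 2):
    print(n, sp.simplify(sp.diff(psi, t, n).subs(t, 0)))  # 1/lambda, 2/lambda**2

From these two moments, Var(X) = 2/λ² - (1/λ)² = 1/λ².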
Example
Let X have the p.d.f.
f(x) = \frac{1}{(n-1)! \, \beta^n} x^{n-1} e^{-x/\beta} for x > 0
Find the m.g.f. of X and use it to find the mean and the variance of X.
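A sketch of the computation (this density is the Gamma distribution with integer shape n and scale β; the key step is recognizing another Gamma-type integral):

\psi(t) = E(e^{tX}) = \int_0^\infty e^{tx} \, \frac{1}{(n-1)! \, \beta^n} \, x^{n-1} e^{-x/\beta} \, dx = (1 - \beta t)^{-n} for t < 1/\beta

Then \psi'(0) = n\beta and \psi''(0) = n(n+1)\beta^2, so
E(X) = n\beta and Var(X) = n(n+1)\beta^2 - (n\beta)^2 = n\beta^2.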
Properties of m.g.f.
Example
Let X ∼ N(µ, σ²), with p.d.f.
f(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)
Its m.g.f. is
\psi(t) = \exp\left(\mu t + \frac{t^2 \sigma^2}{2}\right)
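Applying Theorem 4.4.2 to this m.g.f. gives the mean and variance directly (a sketch):

\psi'(t) = (\mu + \sigma^2 t)\,\psi(t) \Rightarrow \psi'(0) = \mu = E(X)
\psi''(t) = \sigma^2 \psi(t) + (\mu + \sigma^2 t)^2 \psi(t) \Rightarrow \psi''(0) = \sigma^2 + \mu^2 = E(X^2)

so Var(X) = E(X^2) - [E(X)]^2 = \sigma^2.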