Mathematical Expectation
Contents
9.1 Introduction
9.2 Mathematical Expectation of a Function in a Bivariate Distribution
9.3 Theorems on Expectation
9.4 Covariance
9.5 Variance of a Linear Combination of Random Variables
9.6 Correlation Coefficient
9.7 Independence Versus Uncorrelatedness
9.8 Conditional Expectation
9.9 Raw and Central Moments of a Bivariate Distribution
Key Words : Joint p.m.f., Marginal p.m.f., Conditional p.m.f., Joint c.d.f.
Objectives :
Learn how to compute the expectation of a bivariate random variable and of a function of a bivariate random variable.
Learn to compute the variance, covariance and other related quantities for a bivariate random variable.
9.1 Introduction
In the previous chapter, we discussed the joint, marginal as well as conditional probability distributions related to a two dimensional discrete r.v. (X, Y). However, knowledge of the above probability distributions alone is not sufficient to get an idea about the joint behaviour of (X, Y). As in the case of univariate probability distributions, we might be interested in some quantities such as a measure of association between the two variables, or the variance of a linear combination of the variables. This is facilitated through the development of the mathematical expectation of a real valued function of (X, Y), which is dealt with in this chapter.

9.2 Mathematical Expectation of a Function in a Bivariate Distribution
Definition : Let (X, Y) be a two dimensional discrete r.v. with probability distribution {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, 2, ..., n}. Let g(X, Y) be a real valued function of (X, Y). Then the mathematical expectation of g(X, Y) is given by

E[g(X, Y)] = \sum_{i=1}^{m} \sum_{j=1}^{n} g(x_i, y_j) \, p_{ij}
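As a computational aside, the definition translates directly into code. The following Python sketch (the p.m.f. values are illustrative, not from the text) evaluates E[g(X, Y)] by summing g(x_i, y_j) p_ij over the support.

```python
# Minimal sketch: E[g(X, Y)] for a discrete bivariate r.v. whose joint
# p.m.f. is stored as a dictionary {(x, y): p}. Illustrative values only.
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.2}

def expectation(pmf, g):
    """Return E[g(X, Y)] = sum over i, j of g(x_i, y_j) * p_ij."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(expectation(pmf, lambda x, y: x))          # E(X)  = 0.6
print(expectation(pmf, lambda x, y: x * y))      # E(XY) = 0.2
print(expectation(pmf, lambda x, y: (x + y)**2)) # E((X + Y)^2)
```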
9.3 Theorems on Expectation
Theorem 1 : Let X and Y be two discrete r.v.s. Then,
E(X + Y) = E(X) + E(Y)
Proof : Let {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, ..., n} be the probability distribution of (X, Y). Then

E(X + Y) = \sum_{i=1}^{m} \sum_{j=1}^{n} (x_i + y_j) \, p_{ij} = \sum_{i=1}^{m} x_i P_{i.} + \sum_{j=1}^{n} y_j P_{.j}

where P_{i.} and P_{.j} are the marginal probabilities of (X = x_i) and (Y = y_j) respectively. Therefore, \sum_{i=1}^{m} x_i P_{i.} = E(X) and \sum_{j=1}^{n} y_j P_{.j} = E(Y). This leads to E(X + Y) = E(X) + E(Y).
Thus, the expectation of the sum of two discrete r.v.s. is equal to the sum of their expectations.
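Theorem 1 is easy to verify numerically. A minimal sketch, again with assumed illustrative probabilities; note that X and Y here are dependent, yet the identity still holds:

```python
# Theorem 1 check: E(X + Y) = E(X) + E(Y) even when X and Y are dependent.
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.2}
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())
assert abs(E(lambda x, y: x + y)
           - (E(lambda x, y: x) + E(lambda x, y: y))) < 1e-12
```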
Remarks : 1. The above theorem can be extended to more than two variables: if X_1, X_2, ..., X_k are discrete r.v.s., then
E(X_1 + X_2 + ... + X_k) = E(X_1) + E(X_2) + ... + E(X_k)
2. Theorem 1 is a very powerful tool in Statistics. This can be seen from the following implications of the theorem.
(i) E(aX + bY) = aE(X) + bE(Y); a, b real constants. (Proof left as an exercise.) Using this, we can compute
E(X - Y) = E(X) - E(Y),
E(2X - 3Y) = 2E(X) - 3E(Y), etc.
(ii) Whenever we have to obtain the expectation of an expression containing several terms, we can compute it by separating the expectations. For example,
E(3X² - 4X + 5) = 3E(X²) - 4E(X) + 5,
E(X² + 3X²Y - XY + 6) = E(X²) + 3E(X²Y) - E(XY) + 6, etc.
Can we similarly write E(XY) = E(X)·E(Y) in the above expression ? To know the answer, let us read the next theorem.
Theorem 2 : Let X and Y be two independent discrete r.v.s. Then
E(XY) = E(X)·E(Y)
Proof : Let {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, ..., n} represent the joint probability distribution of (X, Y). Also let P_{i.} = \sum_j p_{ij} and P_{.j} = \sum_i p_{ij} represent the marginal probabilities. Then

E(XY) = \sum_{i=1}^{m} \sum_{j=1}^{n} x_i y_j \, p_{ij}
      = \sum_{i=1}^{m} \sum_{j=1}^{n} x_i y_j \, P_{i.} P_{.j}    ... since X and Y are independent
      = \left( \sum_{i=1}^{m} x_i P_{i.} \right) \left( \sum_{j=1}^{n} y_j P_{.j} \right)
      = E(X) E(Y)

Thus, the expectation of the product of two r.v.s. is equal to the product of their expectations if the two variables are independent.
3. If X_1, X_2, ..., X_k are k independent r.v.s., then
E(X_1 X_2 ... X_k) = E(X_1) E(X_2) ... E(X_k)
4. The converse of Theorem 2 is not in general true. That is, if E(XY) = E(X) E(Y), then X and Y may not be independent. This can be seen from the following example.
Example 9.2 : Let (X, Y) have the following joint p.m.f.

X \ Y     -1      0      1     P_i.
  0        0     1/6     0     1/6
  1       1/6     0     1/6    1/3
  2        0     1/2     0     1/2
 P_.j     1/6    2/3    1/6     1

It is obvious that X and Y are not independent. For instance,
P(X = 0, Y = -1) = 0, whereas P(X = 0)·P(Y = -1) = (1/6)(1/6) = 1/36.
On the other hand, E(XY) = (1)(-1)(1/6) + (1)(1)(1/6) = 0 and E(Y) = (-1)(1/6) + (1)(1/6) = 0, so that E(XY) = E(X)·E(Y) even though X and Y are not independent.
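The computations in Example 9.2 can be checked mechanically. A sketch using exact fractions, assuming the table as reconstructed above:

```python
# Example 9.2 check: E(XY) = E(X) * E(Y) although X and Y are dependent.
from fractions import Fraction as F

pmf = {(0, 0): F(1, 6),
       (1, -1): F(1, 6), (1, 1): F(1, 6),
       (2, 0): F(1, 2)}            # zero cells omitted
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

print(E(lambda x, y: x * y))                   # 0
print(E(lambda x, y: x) * E(lambda x, y: y))   # (4/3) * 0 = 0
# Dependence: P(X = 0, Y = -1) = 0 != P(X = 0) * P(Y = -1) = 1/36.
```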
9.5 Variance of a Linear Combination of Random Variables
Theorem 3 : Suppose X and Y are two discrete r.v.s. Then,
(i) Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
(ii) Var(aX - bY) = a² Var(X) + b² Var(Y) - 2ab Cov(X, Y)
where a, b are real constants.
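Theorem 3(i) can likewise be verified on any joint p.m.f. A sketch with a = 2, b = 3 and assumed table values:

```python
# Theorem 3 check: Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
pmf = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())
var = lambda g: E(lambda x, y: g(x, y) ** 2) - E(g) ** 2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)

a, b = 2, 3
lhs = var(lambda x, y: a * x + b * y)
rhs = a**2 * var(lambda x, y: x) + b**2 * var(lambda x, y: y) + 2*a*b*cov
assert abs(lhs - rhs) < 1e-12      # both equal 3.69 here
```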
Theorem 4 : For discrete r.v.s. X_1, ..., X_n and real constants a_1, ..., a_n,

Var\left( \sum_{i=1}^{n} a_i X_i \right) = \sum_{i=1}^{n} a_i^2 \, Var(X_i) + 2 \sum_{i<j} a_i a_j \, Cov(X_i, X_j)

Remarks : 1. If X_1, ..., X_n are independent, then Cov(X_i, X_j) = 0 for i ≠ j, and the formula reduces to Var(\sum a_i X_i) = \sum a_i^2 Var(X_i).
2. Suppose X_1, X_2, ..., X_n are independent r.v.s. with σ_i² = σ²; i = 1, ..., n. Then

Var(\bar{X}) = Var\left( \frac{1}{n} \sum_{i=1}^{n} X_i \right) = \frac{\sigma^2}{n}
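Remark 2 can also be seen empirically. A Monte Carlo sketch, assuming X_i i.i.d. uniform on {-2, 2} (so σ² = 4) and n = 10:

```python
# Monte Carlo check of Var(X-bar) = sigma^2 / n for i.i.d. discrete r.v.s.
import random

random.seed(0)
n, sigma2, reps = 10, 4.0, 100_000
means = [sum(random.choice([-2, 2]) for _ in range(n)) / n
         for _ in range(reps)]
m = sum(means) / reps
var_hat = sum((v - m) ** 2 for v in means) / reps
print(round(var_hat, 3), sigma2 / n)   # both close to 0.4
```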
9.6 Correlation Coefficient
The correlation coefficient between two r.v.s. U and V is defined as

\rho(U, V) = \frac{Cov(U, V)}{\sqrt{Var(U)}\,\sqrt{Var(V)}}

9.7 Independence Versus Uncorrelatedness
Example 9.3 : Let X take the values -1, 0, 1 each with probability 1/3 and let Y = X². Then E(X) = 0 and E(Y) = 2/3. The joint p.m.f. of (X, Y) is tabulated below.

X \ Y     0      1
 -1       0     1/3
  0      1/3     0
  1       0     1/3

Obviously, X and Y are not independent. But,
Cov(X, Y) = E(XY) - E(X) E(Y) = (-1/3 + 1/3) - (0)(2/3) = 0
∴ X and Y are uncorrelated.
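The same conclusion follows mechanically from the table as reconstructed above:

```python
# Uncorrelated but dependent: X uniform on {-1, 0, 1}, Y = X^2.
pmf = {(-1, 1): 1/3, (0, 0): 1/3, (1, 1): 1/3}
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(cov)   # 0.0, yet P(X = 0, Y = 0) = 1/3 != P(X = 0) * P(Y = 0) = 1/9
```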
Illustrative Examples
Example 9.4 : For two discrete r.v.s. X, Y; Var(X) = Var(Y) = 1, Cov(X, Y) = 1/2.
Find (i) Var(4X - 3Y), (ii) ρ(X, Y). (iii) Also prove that U = X + Y and V = X - Y are uncorrelated.
Solution : (i) Var(4X - 3Y) = 4² Var(X) + 3² Var(Y) - 2·4·3 Cov(X, Y)   ... by Theorem 3
= 16 + 9 - 24(1/2) = 13.
(ii) ρ(X, Y) = Cov(X, Y) / (√Var(X) √Var(Y)) = (1/2)/(1·1) = 1/2.
(iii) In order to prove that U = X + Y and V = X - Y are uncorrelated, it is enough to prove that Cov(U, V) = 0.
Cov(U, V) = E(UV) - E(U) E(V)
= E{(X + Y)(X - Y)} - {E(X) + E(Y)}{E(X) - E(Y)}
= E(X²) - E(Y²) - {E(X)}² + {E(Y)}²
= [E(X²) - {E(X)}²] - [E(Y²) - {E(Y)}²]
= Var(X) - Var(Y)
= 1 - 1 = 0
Example 9.5 : Suppose X_1, X_2, X_3 are three discrete r.v.s. with means 10, 20 and 40 and s.d.s. 2, 4 and 6 respectively. Further, ρ(X_1, X_2) = 1/4 = ρ(X_1, X_3) and ρ(X_2, X_3) = 1/2.
Find (i) E(4X_1 + 2X_2 - 3X_3); (ii) Var(3X_1 - 2X_2 + X_3); (iii) Var(X_1 + X_2 - X_3); (iv) Cov(2X_1 + 3, 4 - 3X_2); (v) ρ(3X_1 - 3, X_2 + 1); (vi) ρ(4 + X_2, 5 - 2X_3).
Solution :
(i) E(4X_1 + 2X_2 - 3X_3) = 4E(X_1) + 2E(X_2) - 3E(X_3) = 40 + 40 - 120 = -40
(ii) Using Theorem 4, we write
Var(3X_1 - 2X_2 + X_3) = 3²σ_1² + 2²σ_2² + σ_3² - 12 Cov(X_1, X_2) + 6 Cov(X_1, X_3) - 4 Cov(X_2, X_3)   ... (1)
Now, Cov(X_1, X_2) = σ_1 σ_2 ρ(X_1, X_2) = 2 × 4 × (1/4) = 2,
Cov(X_1, X_3) = σ_1 σ_3 ρ(X_1, X_3) = 2 × 6 × (1/4) = 3,
Cov(X_2, X_3) = σ_2 σ_3 ρ(X_2, X_3) = 4 × 6 × (1/2) = 12.
Hence (1) becomes
Var(3X_1 - 2X_2 + X_3) = 36 + 64 + 36 - 24 + 18 - 48 = 82
(iii) Var(X_1 + X_2 - X_3) = σ_1² + σ_2² + σ_3² + 2 Cov(X_1, X_2) - 2 Cov(X_1, X_3) - 2 Cov(X_2, X_3)
= 4 + 16 + 36 + 4 - 6 - 24 = 30
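Part (ii) is a quadratic form a′Σa in the covariance matrix. A numpy sketch, assuming the correlations as reconstructed above (ρ(X_1, X_2) = ρ(X_1, X_3) = 1/4, ρ(X_2, X_3) = 1/2):

```python
# Var(a1 X1 + a2 X2 + a3 X3) = a' Sigma a, with Sigma built from the
# s.d.s 2, 4, 6 and the correlations given in Example 9.5.
import numpy as np

sd = np.array([2.0, 4.0, 6.0])
rho = np.array([[1.00, 0.25, 0.25],
                [0.25, 1.00, 0.50],
                [0.25, 0.50, 1.00]])
Sigma = np.outer(sd, sd) * rho     # Cov(Xi, Xj) = sd_i * sd_j * rho_ij
a = np.array([3.0, -2.0, 1.0])
print(a @ Sigma @ a)               # 82.0, matching part (ii)
```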
Thus ρ(X, Y) = -0.4045, i.e., X and Y are negatively correlated.
9.8 Conditional Expectation
Suppose (X, Y) is a two dimensional discrete r.v. with probability distribution {(x_i, y_j, p_ij)}. The conditional expectation of a r.v. (or of a function of it) is nothing but the expectation of the r.v. (or the function) computed using the appropriate conditional distribution.
(a) Conditional expectation of X given Y = y_j : We have seen earlier that the conditional distribution of X given Y = y_j is as follows:

X                        x_1          x_2        ...       x_m
P(X = x_i | Y = y_j)   p_1j/P_.j    p_2j/P_.j    ...    p_mj/P_.j

where P_{.j} = \sum_{i=1}^{m} p_{ij}. Accordingly,

E(X \mid Y = y_j) = \sum_{i=1}^{m} x_i \, \frac{p_{ij}}{P_{.j}}

and E(X | Y = y_j) is the conditional mean of X given Y = y_j.
(d) Conditional variance of Y given X = x_i : The conditional variance of Y given X = x_i is defined on similar lines:

Var(Y | X = x_i) = E(Y² | X = x_i) - {E(Y | X = x_i)}²

where E(Y | X = x_i) is the conditional mean of Y given X = x_i.
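These definitions translate into a small helper. A sketch; the joint p.m.f. used below is the one from problem 21 of Exercise 9 (B), so the printed output is E(X | Y = 3) = 0.4 and Var(X | Y = 3) = 0.24:

```python
# Conditional mean and variance of X given Y = y, from a joint p.m.f.
def conditional_moments_of_x(pmf, y):
    """Return (E(X | Y = y), Var(X | Y = y))."""
    col = {x: p for (x, yy), p in pmf.items() if yy == y}
    p_y = sum(col.values())                          # marginal P(Y = y)
    mean = sum(x * p for x, p in col.items()) / p_y
    ex2 = sum(x * x * p for x, p in col.items()) / p_y
    return mean, ex2 - mean ** 2

pmf = {(0, 1): 0.1, (0, 2): 0.2, (0, 3): 0.3,
       (1, 1): 0.1, (1, 2): 0.1, (1, 3): 0.2}
print(conditional_moments_of_x(pmf, 3))  # (0.4, 0.24) up to float rounding
```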
9.9 Raw and Central Moments of a Bivariate Distribution
Moments for a bivariate distribution are defined on similar lines as those of a univariate distribution. Let {(x_i, y_j, p_ij); i = 1, 2, ..., m; j = 1, 2, ..., n} represent the joint probability distribution of (X, Y).
(a) Raw Moments : The (r, s)th raw moment of (X, Y) is denoted by μ'_rs and is defined as

\mu'_{rs} = E(X^r Y^s) = \sum_{i=1}^{m} \sum_{j=1}^{n} x_i^r y_j^s \, p_{ij}

(b) Central Moments : The (r, s)th central moment of (X, Y) is denoted by μ_rs and is defined as

\mu_{rs} = E[\{X - E(X)\}^r \{Y - E(Y)\}^s]

In particular, μ_11 = Cov(X, Y), μ_20 = Var(X) and μ_02 = Var(Y), so that

\rho(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X)\,Var(Y)}} = \frac{\mu_{11}}{\sqrt{\mu_{20}\,\mu_{02}}}
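A sketch of both definitions in code (illustrative p.m.f., assumed values), which also confirms that μ_11 equals Cov(X, Y):

```python
# (r, s)th raw and central moments of a bivariate p.m.f.; mu_11 = Cov(X, Y).
import math

pmf = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

def raw(r, s):                     # mu'_{rs} = E(X^r Y^s)
    return E(lambda x, y: x**r * y**s)

def central(r, s):                 # mu_{rs} = E[(X - EX)^r (Y - EY)^s]
    mx, my = raw(1, 0), raw(0, 1)
    return E(lambda x, y: (x - mx)**r * (y - my)**s)

rho = central(1, 1) / math.sqrt(central(2, 0) * central(0, 2))
print(central(1, 1), rho)          # Cov(X, Y) and the correlation
```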
Points to Remember
Exercise 9 (A)
Theory Questions :
1. Define the mathematical expectation of a function of a discrete bivariate r.v. (X, Y).
2. Suppose X and Y are two discrete r.v.s. with joint probability distribution
{(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, ..., n}.
Prove that
(i) E(X + Y) = E(X) + E(Y)
(ii) E(XY) = E(X)·E(Y) if X and Y are independent.
3. Give an example of a joint probability distribution of (X, Y) where E(XY) = E(X)·E(Y) holds, but X and Y are not independent.
4. Define covariance between two r.v.s. What does it measure ?
5. Does Cov(X, Y) = 0 imply that X and Y are independent ? Justify.
6. Prove that :
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X + c, Y + d) = Cov(X, Y)
(iii) Cov(X, X) = Var(X),
where a, b, c, d are constants.
7. Suppose X and Y are two discrete r.v.s. Prove that
(i) Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
(ii) Var(aX - bY) = a² Var(X) + b² Var(Y) - 2ab Cov(X, Y)
8. For X, Y two independent discrete r.v.s. prove that
(i) Var(X + Y) = Var(X) + Var(Y)
(ii) Var(X - Y) = Var(X) + Var(Y).
9. Derive the expression for the variance of a linear combination of n discrete r.v.s. X_1, X_2, ..., X_n.
10. If X_1, X_2, ..., X_n are independent discrete r.v.s. with common variance σ², show that
Var(\bar{X}) = σ²/n, where \bar{X} = (1/n) \sum_{i=1}^{n} X_i.
11. If X_1, X_2, ..., X_n are discrete r.v.s. with Var(X_i) = σ² for i = 1, 2, ..., n and correlation coefficient between X_i and X_j equal to ρ (i ≠ j), then show that
Var(\bar{X}) = (σ²/n)[1 + (n - 1)ρ].
Also show, with usual notations, that ρ(X, Y) = μ_11/√(μ_20 μ_02).
Exercise 9 (B)
Numerical Problems:
19. X and Y have the joint p.m.f. tabulated below.

X \ Y     2      3      4
  1      0.06   0.15   0.09
  2      0.14   0.35   0.21

Find (i) E(2X - Y), (ii) E(XY), (iii) Var(3X - 2Y), (iv) ρ(X, Y), (v) ρ(2X, 3Y), (vi) E(X | Y = 2).
20. X and Y are two discrete r.v.s. with joint p.m.f.
p(x, y) = 1/4 for (x, y) = (0, 0), (0, 1), (1, 0), (1, 1)
and p(x, y) = 0 otherwise.
Calculate : (i) Cov(X, Y), (ii) Var(X + Y), (iii) Are X, Y independent ?
21. Obtain (i) the conditional mean and variance of X given Y = 3, (ii) the conditional mean and variance of Y given X = 0, for the following joint probability distribution.

X \ Y     1     2     3
  0      0.1   0.2   0.3
  1      0.1   0.1   0.2
22. The joint p.m.f. of X and Y is

X \ Y     0      1      2
 -1      1/6     0     1/12
  1      1/4    1/3    1/6

Find (i) E(Y | X = 1), (ii) Var(Y | X = 1), (iii) Cov(2X - 1, 1 - Y), (iv) ρ(X, Y), (v) Are X and Y independent ?
23. X_1 and X_2 are two r.v.s. having p.m.f.
P(X_1 = x_1, X_2 = x_2) = k(2x_1 + 5x_2)
where (x_1, x_2) = (1, 1), (1, 2), (2, 1), (2, 2).
(i) Determine k.
(ii) Obtain the conditional mean and variance of X_1 given X_2 = 2.
24. (X, Y) have a joint probability distribution given by

X \ Y    -1     0     1
  0      0.1   0.1   0.1
  2      0.1   0.2   0.1
         0.1   0.1   0.1

Show that X and Y are uncorrelated. Are they independent ?
25. For the following probability distribution calculate (i) E(X²Y), (ii) E(27X² + 9Y² - 156), (iii) E(X | Y = 1), (iv) E(2Y | X = 3).

X \ Y     1      2      3
  1      1/27   5/27   3/27
  2      4/27   3/27   2/27
  3      4/27   2/27   3/27
26. Prove that U = X + Y and V = X - Y are uncorrelated if and only if Var(X) = Var(Y).
27. With usual notations, prove that
Cov(X - ρ (σ_X/σ_Y) Y, Y) = 0.
28. Let X and Y be discrete r.v.s. where each takes only two values. Show that if Cov(X, Y) = 0, then X and Y are independent.
29. Prove that : Var(X - Y) = σ_X² + σ_Y² - 2ρ σ_X σ_Y.
30. Prove that : Cov(X, X + Y) = Var(X) + Cov(X, Y).
31. For (X, Y), a bivariate discrete r.v., σ_X² = 9, σ_Y² = 4, Cov(X, Y) = 4.
Find (i) Var(2X - 3Y), (ii) ρ((3X + 5)/2, (5 - 3Y)/2), (iii) Cov(2X, 3Y), (iv) Cov(X + 5, 2Y - 6).
32. X and Y are two independent discrete r.v.s. with σ_X = 3 and Var(2X + 3Y) = 72.
Compute σ_Y. (April 2012)
33. Suppose X_1, X_2, X_3 are three discrete r.v.s. with means 2, 4 and 5 and s.d.s. 2 each.
Further, ρ(X_1, X_2) = ρ(X_2, X_3) = 0.5 and ρ(X_1, X_3) = ...
Find : (i) E(2X_1 + X_2 - X_3), (ii) s.d. (...), (iii) Cov(3X_1 - 1, 2X_2 + 6).
(a) the marginal distribution of Y.
(b) the marginal distribution of X.
(c) the joint distribution of (X, Y).
(d) the conditional distribution of Y given X = x_i.
10. If p(x, y) = 1/9 for x = 0, 1, 2; y = 0, 1, 2, and p(x, y) = 0 elsewhere, which of the following statements is true ?
(a) X and Y are not independent.
(b) X and Y are uncorrelated.
Answers :
(19) (i) 0.3, (ii) 5.27, (iii) 3.85, (iv) 0, (v) 0, (vi) 1.7. (20) (i) 0, (ii) 0.5, (iii) Yes.
(21) (i) 0.4 and 0.24, (ii) 7/3 and 5/9. (22) (i) 8/9, (ii) 44/81, (iii) -1/6, (iv) 0.3612, (v) No.
(23) (i) k = 1/42, (ii) 20/13 and 42/169. (24) No. (25) (i) 79/9, (ii) 10.3333, (iii) 7/3, (iv) 34/9.
(31) (i) 24, (ii) 2/3, (iii) 24, (iv) 8. (32) 2. (33) (i) 3, (ii) ..., (iii) 12.
(34) (i) C = ..., (ii) 0.0346. (35) A = ...
Answers of Objective Questions
(A) (1) c, (2) b, (3) a, (4) c, (5) c, (6) d, (7) d, (8) a, (9) b, (10) b.
(B) (11) F, (12) F, (13) T, (14) T, (15) T.