
Chapter 9.

Mathematical Expectation (Bivariate)

Karl Pearson (1857 - 1936) was an influential English mathematician who has been credited with establishing the discipline of mathematical statistics. In 1911 he founded the world's first university statistics department at University College, London. In 1901, he founded the journal Biometrika, whose object was the development of statistical theory. Correlation coefficients, the method of moments and the chi-square test are among his important contributions to the field of Statistics.

Contents ...
9.1 Introduction
9.2 Mathematical Expectation of a Function in Bivariate Distribution
9.3 Theorems on Expectation
9.4 Covariance
9.5 Variance of a Linear Combination of Random Variables
9.6 Correlation Coefficient
9.7 Independence Versus Uncorrelatedness
9.8 Conditional Expectation
9.9 Raw and Central Moments of a Bivariate Distribution

Key Words :
Joint p.m.f., Marginal p.m.f., Conditional p.m.f., Joint c.d.f.

Objectives :
Learn how to compute the expectation of a bivariate random variable and of a function of a bivariate random variable.
Learn to compute variance, covariance and other related quantities for a bivariate random variable.
9.1 Introduction
In the previous chapter, we discussed the joint, marginal as well as conditional probability distributions related to a two dimensional discrete r.v. (X, Y). However, knowledge of the above probability distributions alone is not sufficient to get an idea about the joint behaviour of (X, Y). As in the case of univariate probability distributions, we might be interested in some quantities such as a measure of association between the two variables, or the variance of a linear combination of the variables. This is facilitated through the mathematical expectation of a real valued function of (X, Y), which this chapter develops.

9.2 Mathematical Expectation of a Function in Bivariate Distribution
Definition : Let (X, Y) be a two dimensional discrete r.v. with probability distribution {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, 2, ..., n}. Let g(X, Y) be a real valued function of (X, Y). Then the expectation of g(X, Y) is given by

E[g(X, Y)] = Σ_{i=1}^{m} Σ_{j=1}^{n} g(x_i, y_j) p_ij

Note that g(X, Y), being a real valued function of X and Y, is also a random variable on the same underlying sample space Ω. Hence, we can talk about its expectation.
For instance, (i) if g(X, Y) = X + Y, then

E(X + Y) = Σ_i Σ_j (x_i + y_j) p_ij

(ii) if g(X, Y) = X²Y, then

E(X²Y) = Σ_i Σ_j x_i² y_j p_ij
Example 9.1 : For the following probability distribution of (X, Y) compute (i) E(2X + 3Y), (ii) E(X²Y).

X \ Y     1      2      3
 -1      0.15   0.1    0.2
  0      0.05   0.05   0.05
  1      0.15   0.2    0.05

Solution : (i) g(X, Y) = 2X + 3Y

E[g(X, Y)] = Σ_i Σ_j (2x_i + 3y_j) p_ij   ... by definition
           = {2(-1) + 3(1)} 0.15 + {2(-1) + 3(2)} 0.1 + ...
           = 5.75

(ii) g(X, Y) = X²Y

E[g(X, Y)] = Σ_i Σ_j x_i² y_j p_ij
           = (-1)²(1) 0.15 + (-1)²(2) 0.1 + ...
           = 1.65
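Computations of this kind are easy to mechanize. The following sketch (plain Python; the dictionary layout and helper name are our own, not from the text) evaluates E[g(X, Y)] by summing g(x_i, y_j) p_ij over the joint p.m.f., and reproduces both answers of Example 9.1:

```python
# Joint p.m.f. of Example 9.1: keys are (x, y), values are P(X = x, Y = y).
pmf = {(-1, 1): 0.15, (-1, 2): 0.10, (-1, 3): 0.20,
       ( 0, 1): 0.05, ( 0, 2): 0.05, ( 0, 3): 0.05,
       ( 1, 1): 0.15, ( 1, 2): 0.20, ( 1, 3): 0.05}

def expectation(g, pmf):
    """E[g(X, Y)] = sum of g(x_i, y_j) * p_ij over all (x_i, y_j)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(expectation(lambda x, y: 2*x + 3*y, pmf))   # (i)  5.75
print(expectation(lambda x, y: x**2 * y, pmf))    # (ii) 1.65
```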
9.3 Theorems on Expectation

Theorem 1 : Let X and Y be two discrete r.v.s. Then,

E(X + Y) = E(X) + E(Y)

Proof : Let {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, ..., n} be the probability distribution of (X, Y).
Consider,

E(X + Y) = Σ_{i=1}^{m} Σ_{j=1}^{n} (x_i + y_j) p_ij   (by definition)
         = Σ_i Σ_j x_i p_ij + Σ_i Σ_j y_j p_ij
         = Σ_{i=1}^{m} x_i p_i. + Σ_{j=1}^{n} y_j p_.j

where p_i. and p_.j are the marginal probabilities of (X = x_i) and (Y = y_j) respectively. Therefore, Σ_i x_i p_i. = E(X) and Σ_j y_j p_.j = E(Y).
This leads to, E(X + Y) = E(X) + E(Y).
Thus, the expectation of a sum of two discrete r.v.s. is equal to the sum of their expectations.
Remarks : 1. The above theorem can be extended to more than two variables X_1, X_2, ..., X_k, where X_1, X_2, ..., X_k are discrete r.v.s. :

E(X_1 + X_2 + ... + X_k) = E(X_1) + E(X_2) + ... + E(X_k)

2. Theorem 1 is a very powerful tool in Statistics. This can be seen from the following implications of the theorem.
(i) E(aX + bY) = aE(X) + bE(Y); a, b real constants.
(Proof left as an exercise.)
Using this, we can compute,
E(X - Y) = E(X) - E(Y)
E(2X - 3Y) = 2E(X) - 3E(Y) etc.
(ii) Whenever we have to obtain the expectation of an expression containing several terms, we can compute it by separating the expectations.
For example, E(3X² - 4X + 5) = 3E(X²) - 4E(X) + 5
E(X³Y + 3X²Y - XY + 6) = E(X³Y) + 3E(X²Y) - E(XY) + 6 etc.
Can we write E(X²Y) = E(X²) E(Y) or E(XY) = E(X) E(Y) in the above expression ? To know the answer, let us read the next theorem.
Theorem 2 : Let X and Y be two independent discrete r.v.s. Then

E(XY) = E(X) · E(Y)

Proof : Let {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, ..., n} represent the joint probability distribution of (X, Y). Also let p_i. = Σ_j p_ij and p_.j = Σ_i p_ij represent the marginal probabilities of (X = x_i) and (Y = y_j) respectively.
Consider,

E(XY) = Σ_{i=1}^{m} Σ_{j=1}^{n} x_i y_j p_ij
      = Σ_i Σ_j x_i y_j p_i. p_.j   ... X and Y are independent
      = Σ_{i=1}^{m} x_i p_i. Σ_{j=1}^{n} y_j p_.j

E(XY) = E(X) E(Y)

Thus, the expectation of a product of two r.v.s. is equal to the product of their expectations if the two variables are independent.
3. If X_1, X_2, ..., X_k are k independent r.v.s., then
E(X_1 X_2 ... X_k) = E(X_1) E(X_2) ... E(X_k)
4. The converse of Theorem 2 is not in general true. That is, if E(XY) = E(X) E(Y), then X and Y may not be independent. This can be seen from the following example.
Example 9.2 : Let (X, Y) have the following joint p.m.f.

X \ Y    -1     0     1    p_i.
 0        0    1/6    0    1/6
 1       1/6    0    1/6   1/3
 2        0    1/2    0    1/2
p_.j     1/6   2/3   1/6

It is obvious that X and Y are not independent. For,

p_11 = P(X = 0, Y = -1) = 0 ≠ 1/6 × 1/6 = p_1. × p_.1

On the other hand,

E(X) = 0(1/6) + 1(1/3) + 2(1/2) = 4/3 ;  E(Y) = (-1)(1/6) + 0(2/3) + 1(1/6) = 0

and E(XY) = Σ Σ x_i y_j p_ij = (1)(-1)(1/6) + (1)(1)(1/6) = 0

E(XY) = E(X) E(Y) (verify)
But X and Y are not independent.
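Example 9.2 can also be checked mechanically. A sketch (illustrative names, not from the text; Fraction keeps the arithmetic exact) that confirms E(XY) = E(X) E(Y) while the factorization p_ij = p_i. p_.j fails:

```python
from fractions import Fraction as F

# Joint p.m.f. of Example 9.2: keys are (x, y).
pmf = {(0, -1): F(0), (0, 0): F(1, 6), (0, 1): F(0),
       (1, -1): F(1, 6), (1, 0): F(0), (1, 1): F(1, 6),
       (2, -1): F(0), (2, 0): F(1, 2), (2, 1): F(0)}

E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

# E(XY) = E(X) E(Y) holds ...
print(E(lambda x, y: x * y) == E(lambda x, y: x) * E(lambda x, y: y))  # True

# ... yet the factorization p_ij = p_i. * p_.j fails, e.g. at (x, y) = (0, -1):
px = sum(p for (x, y), p in pmf.items() if x == 0)    # P(X = 0)  = 1/6
py = sum(p for (x, y), p in pmf.items() if y == -1)   # P(Y = -1) = 1/6
print(pmf[(0, -1)], px * py)                          # 0 versus 1/36
```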
9.4 Covariance
We now define a measure called 'covariance' for a bivariate probability distribution. Covariance is defined on similar lines as 'variance' for a univariate probability distribution. Covariance measures the joint variation in X and Y. It is used in obtaining the variance of a linear combination of X and Y. Based on covariance, an important measure called 'correlation coefficient' is developed, which we shall discuss later.
Definition : Let (X, Y) be a bivariate discrete r.v. Then the covariance between X and Y, denoted by Cov(X, Y), is defined as follows.

Cov(X, Y) = E[{X - E(X)} {Y - E(Y)}]

Remark : 1. For computational purposes, the above formula is simplified as follows :

Cov(X, Y) = E[{X - E(X)} {Y - E(Y)}]
          = E[XY - X E(Y) - Y E(X) + E(X) E(Y)]
          = E(XY) - E(X) E(Y) - E(Y) E(X) + E(X) E(Y)
          = E(XY) - E(X) E(Y)

Thus, Cov(X, Y) = E(XY) - E(X) E(Y)
2. Cov(X, Y) = Cov(Y, X), and Cov(X, Y) may be negative.
3. If X and Y are independent, then
E(XY) = E(X) · E(Y)   (by Theorem 2)
Hence, Cov(X, Y) = E(XY) - E(X) E(Y) = 0.
4. Cov(X, Y) = 0 does not imply that X and Y are independent (see Ex. 9.2).
5. Cov(aX, bY) = ab Cov(X, Y), where a, b are real constants.
To see this, consider,
Cov(aX, bY) = E(aX · bY) - E(aX) E(bY)
            = ab E(XY) - ab E(X) E(Y)
            = ab Cov(X, Y)
As a consequence of this result, we get,
Cov(X, -Y) = -Cov(X, Y)
Cov(2X, Y/2) = Cov(X, Y) etc.
Thus covariance is not invariant to a change of scale.
6. Cov(X + c, Y + d) = Cov(X, Y), where c, d are constants. This is because,
Cov(X + c, Y + d) = E[(X + c)(Y + d)] - E(X + c) E(Y + d)
                  = E(XY + dX + cY + cd) - {E(X) + c} {E(Y) + d}
                  = E(XY) - E(X) E(Y)
                  = Cov(X, Y)
Thus covariance is invariant to a change of origin.
7. Cov(X, X) = Var(X).
For, Cov(X, X) = E(X · X) - E(X) E(X)
               = E(X²) - [E(X)]²
               = Var(X)
Example 9.3 : Following is a joint probability distribution of (X, Y).

X \ Y     1      2      3     p_i.
 0      2/12   1/12   1/12   4/12
 1      1/12   1/12   2/12   4/12
 2      3/12   1/12    0     4/12
p_.j    6/12   3/12   3/12

Calculate (i) Cov(X, Y); (ii) Cov(X, -Y); (iii) Cov(X - 1, Y + 3).

Solution : (i) Note that E(X) = Σ x_i p_i. = 1 and E(Y) = Σ y_j p_.j = 7/4

E(XY) = Σ Σ x_i y_j p_ij = 19/12

Cov(X, Y) = E(XY) - E(X) E(Y) = 19/12 - 7/4 = -1/6

(ii) Cov(X, -Y) = -Cov(X, Y) = 1/6   (Remark 5)

(iii) Cov(X - 1, Y + 3) = Cov(X, Y) = -1/6   (Remark 6)
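The same pattern verifies Example 9.3 in code; a sketch (names are our own):

```python
from fractions import Fraction as F

# Joint p.m.f. of Example 9.3 (X down the rows, Y across the columns).
pmf = {(0, 1): F(2, 12), (0, 2): F(1, 12), (0, 3): F(1, 12),
       (1, 1): F(1, 12), (1, 2): F(1, 12), (1, 3): F(2, 12),
       (2, 1): F(3, 12), (2, 2): F(1, 12), (2, 3): F(0)}

E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

# Cov(X, Y) = E(XY) - E(X) E(Y)
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(cov)    # -1/6, as in part (i)
```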
9.5 Variance of a Linear Combination of Random Variables
Theorem 3 : Suppose X and Y are two discrete r.v.s. Then,
(i) Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
(ii) Var(aX - bY) = a² Var(X) + b² Var(Y) - 2ab Cov(X, Y)
where a, b are real constants.

Proof : (i) Let U = aX + bY. Then E(U) = aE(X) + bE(Y).
By definition,

Var(U) = E[U - E(U)]²
       = E[aX + bY - aE(X) - bE(Y)]²
       = E[a{X - E(X)} + b{Y - E(Y)}]²
       = E[a²{X - E(X)}² + b²{Y - E(Y)}² + 2ab{X - E(X)}{Y - E(Y)}]

Taking the expectation term by term, we get,

Var(U) = a² E{X - E(X)}² + b² E{Y - E(Y)}² + 2ab E[{X - E(X)}{Y - E(Y)}]
       = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
... by the definitions of variance and covariance.

(ii) Let V = aX - bY. Then E(V) = aE(X) - bE(Y).
By definition,

Var(V) = E[V - E(V)]²
       = E[aX - bY - {aE(X) - bE(Y)}]²
       = E[a{X - E(X)} - b{Y - E(Y)}]²
       = E[a²{X - E(X)}² + b²{Y - E(Y)}² - 2ab{X - E(X)}{Y - E(Y)}]
       = a² Var(X) + b² Var(Y) - 2ab Cov(X, Y)
Remark 1 : In particular, when a = b = 1,
Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
When a = 1, b = -1,
Var(X - Y) = Var(X) + Var(Y) - 2 Cov(X, Y)
2. When X and Y are independent, Cov(X, Y) = 0. Therefore,
Var(aX + bY) = a² Var(X) + b² Var(Y)
Var(aX - bY) = a² Var(X) + b² Var(Y)
Thus, when X and Y are independent,
Var(X + Y) = Var(X) + Var(Y)
Var(X - Y) = Var(X) + Var(Y)
Theorem 3 can be generalized to n r.v.s. as follows.
Theorem 4 : Let X_1, X_2, ..., X_n be n discrete r.v.s. with means E(X_i) = μ_i and variances Var(X_i) = σ_i², i = 1, 2, ..., n. Then,

Var(Σ_{i=1}^{n} a_i X_i) = Σ_{i=1}^{n} a_i² σ_i² + 2 Σ Σ_{i<j} a_i a_j Cov(X_i, X_j)

where a_1, a_2, ..., a_n are constants.

Proof : Let Y = Σ_{i=1}^{n} a_i X_i. Then E(Y) = Σ_{i=1}^{n} a_i μ_i.

Var(Y) = E[Y - E(Y)]²
       = E[Σ_{i=1}^{n} a_i (X_i - μ_i)]²
       = E[Σ_{i=1}^{n} a_i² (X_i - μ_i)² + 2 Σ Σ_{i<j} a_i a_j (X_i - μ_i)(X_j - μ_j)]
       = Σ_{i=1}^{n} a_i² E(X_i - μ_i)² + 2 Σ Σ_{i<j} a_i a_j E[(X_i - μ_i)(X_j - μ_j)]
       = Σ_{i=1}^{n} a_i² σ_i² + 2 Σ Σ_{i<j} a_i a_j Cov(X_i, X_j)
Remark : 1. When X_1, X_2, ..., X_n are independent r.v.s., Cov(X_i, X_j) = 0 for i ≠ j. Hence,

Var(Σ_{i=1}^{n} a_i X_i) = Σ_{i=1}^{n} a_i² σ_i²

2. Suppose X_1, X_2, ..., X_n are independent r.v.s. with σ_i² = σ², i = 1, ..., n. Then, taking a_i = 1/n for i = 1, ..., n,

Var(X̄) = Var((1/n) Σ_{i=1}^{n} X_i) = (1/n²) Σ_{i=1}^{n} σ² = σ²/n

3. When σ_i² = σ² and a_i = 1 for i = 1, 2, ..., n,

Var(Σ_{i=1}^{n} X_i) = nσ² + 2 Σ Σ_{i<j} Cov(X_i, X_j)
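Remark 2 is the familiar 'variance of the sample mean' result. A small Monte Carlo sketch illustrates it (numpy assumed; the distribution and replication count are arbitrary choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
# 200000 replications of the mean of n i.i.d. discrete uniform draws on
# {0, 1, ..., 9}; each draw has variance (10**2 - 1) / 12 = 8.25.
xbar = rng.integers(0, 10, size=(200_000, n)).mean(axis=1)
print(xbar.var())                   # close to 8.25 / 10 = 0.825
```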
9.6 Correlation Coefficient
In bivariate distributions we are generally interested in finding whether there is any relationship between the two variables of interest. Karl Pearson's correlation coefficient (denoted by ρ) provides such a measure. It gives an idea of the extent as well as the direction of the linear relationship between the two variables. It is defined below.
Definition : Let (X, Y) be a discrete bivariate random variable with {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, 2, ..., n} as its joint probability distribution. The correlation coefficient between X and Y, denoted by ρ or ρ(X, Y), is defined as

ρ = ρ(X, Y) = Cov(X, Y) / (σ_x σ_y)

where σ_x and σ_y are the s.d.s. of X and Y respectively.
In other words,

ρ = [E(XY) - E(X) E(Y)] / [√{E(X²) - [E(X)]²} √{E(Y²) - [E(Y)]²}]
Remarks :
1. ρ(X, Y) = ρ(Y, X), since Cov(X, Y) = Cov(Y, X).
2. ρ = 0 if and only if Cov(X, Y) = 0.
3. ρ(X, X) = Cov(X, X) / (σ_x σ_x) = 1.
4. ρ(X, -X) = Cov(X, -X) / (σ_x σ_x) = -1.
5. Cov(X, Y) = ρ σ_x σ_y.
Interpretation of values of ρ :
(i) If ρ = 0, the two variables are said to be uncorrelated. It means that there is no (linear) relationship between the variables.
(ii) If 0 < ρ < 1, then the two variables are said to be positively correlated. In this case, a change in the value of one variable is accompanied by a change in the other variable in the same direction.
(iii) If -1 < ρ < 0, then the two variables are negatively correlated. That is, a change in one variable is accompanied by a change in the other variable in the reverse direction.
(iv) If ρ = 1, then there is perfect positive correlation between the two variables. That is, Y = a + bX with b > 0.
(v) If ρ = -1, then there is perfect negative correlation between the two variables. That is, Y = a - bX with b > 0.
We now discuss some important properties of the correlation coefficient.
Result 1 : The correlation coefficient is invariant to a change of origin and a change of scale; however, it changes its sign if the changes of scale for the two variables are not in the same direction. Specifically,

ρ(aX + b, cY + d) = ρ(X, Y)    if ac > 0,
                  = -ρ(X, Y)   if ac < 0.

Proof : Let U = aX + b and V = cY + d. Then,

Cov(U, V) = ac Cov(X, Y),  σ_u = |a| σ_x  and  σ_v = |c| σ_y

ρ(U, V) = Cov(U, V) / (σ_u σ_v) = ac Cov(X, Y) / (|a| |c| σ_x σ_y) = (ac / |ac|) ρ(X, Y)

When a and c have the same algebraic sign, ac/|ac| = 1, so

ρ(U, V) = ρ(X, Y)

On the other hand, when a and c have opposite signs, ac/|ac| = -1, so

ρ(U, V) = -ρ(X, Y)

Remark : 1. In particular, ρ((X - E(X))/σ_x, (Y - E(Y))/σ_y) = ρ(X, Y). The variables (X - E(X))/σ_x and (Y - E(Y))/σ_y are called the standardized variables of X and Y respectively.

Result 2 : The correlation coefficient lies between -1 and +1,
i.e. -1 ≤ ρ(X, Y) ≤ 1.
Proof : Let U = (X - E(X))/σ_x and V = (Y - E(Y))/σ_y denote the standardized variables of X and Y respectively.
Therefore, ρ(U, V) = ρ(X, Y).
Consider, Var(U + V) = Var(U) + Var(V) + 2 Cov(U, V), by Theorem 3.
Now, Var(U) = Var[(X - E(X))/σ_x] = Var(X)/σ_x² = 1. Similarly, Var(V) = 1.
Also, Cov(U, V) = ρ(U, V) σ_u σ_v = ρ(X, Y).
Therefore,
Var(U + V) = 2 + 2ρ(X, Y) ≥ 0, since variance is always non-negative.
2ρ(X, Y) ≥ -2
ρ(X, Y) ≥ -1, i.e. -1 ≤ ρ(X, Y).
To prove the other part, consider,
Var(U - V) = Var(U) + Var(V) - 2 Cov(U, V)
           = 1 + 1 - 2ρ(X, Y) ≥ 0
ρ(X, Y) ≤ 1
Hence, -1 ≤ ρ(X, Y) ≤ 1 is proved.
9.7 Independence Versus Uncorrelatedness
(i) Independence ⇒ uncorrelatedness.
(ii) Uncorrelatedness ⇏ independence.
Proof : (i) We have already noted that, if X and Y are independent, then E(XY) = E(X) E(Y). Therefore Cov(X, Y) = 0. Since ρ = Cov(X, Y)/(σ_x σ_y), it implies that ρ = 0.
Thus independence of two random variables implies that the two variables are uncorrelated.
(ii) On the other hand, we have seen in Example 9.2 that the converse does not hold true. We give here another situation, where the two r.v.s. are uncorrelated but not independent. Consider the following probability distribution of a r.v. X.

X      -1     0     1
P(x)   1/3   1/3   1/3         E(X) = 0

Define Y = X². Hence, the probability distribution of Y is

Y       0     1
P(y)   1/3   2/3               E(Y) = 2/3

The joint p.m.f. of (X, Y) is tabulated below.

X \ Y    0     1
 -1      0    1/3
  0     1/3    0
  1      0    1/3

Obviously, X and Y are not independent. But,
Cov(X, Y) = E(XY) - E(X) E(Y) = E(X³) - 0 = -1/3 + 1/3 = 0
Therefore X and Y are uncorrelated.
Illustrative Examples
Example 9.4 : For two discrete r.v.s. X, Y; Var(X) = Var(Y) = 1, Cov(X, Y) = 1/2.
Find (i) Var(4X - 3Y), (ii) ρ(X, Y).
(iii) Also prove that U = X + Y and V = X - Y are uncorrelated.
Solution : (i) Var(4X - 3Y) = 4² Var(X) + 3² Var(Y) - 2·4·3 Cov(X, Y)   ... by Theorem 3
= 16 + 9 - 24(1/2) = 13.
(ii) ρ(X, Y) = Cov(X, Y) / (√Var(X) √Var(Y)) = 1/2.
(iii) In order to prove that U = X + Y and V = X - Y are uncorrelated, it is enough to prove that Cov(U, V) = 0.
Cov(U, V) = E(UV) - E(U) E(V)
          = E[(X + Y)(X - Y)] - {E(X) + E(Y)} {E(X) - E(Y)}
          = E(X²) - E(Y²) - [E(X)]² + [E(Y)]²
          = E(X²) - [E(X)]² - {E(Y²) - [E(Y)]²}
          = Var(X) - Var(Y)
          = 1 - 1 = 0

Example 9.5 : Suppose X,, Xz, X, are three discrete r.v.s. with means 10, 20 and 40 and
s.d.s. 2, 4, and 6 respectively. Further, p (X, X,) =i = p X, X) and p (X, X) =
Find (i) E (4X, + 2X, -3X,): (ii) Var (3X, -2X, + X); (ii) Var (X, -X, - X);
(iv) Cov (2X, +3,4 - 3X,); (v) p (3X,-3, X,+ 1): (vi) p (4 + X,, 5 2X,).
Solution:
1) E (4X,+ 2X,- 3X,) = 4E (X,) + 2E (X)- 3E (X,)
= -40
(ii) Using Theorem 4, we write
Var (3X, - 2X,+ X,) = 3oí +22 o + ¡ - 12 Cov (X, X)
+6 Cov (X, X;) 4 cov (X,, X) . .()
Now, Cov (X,, X,) = o, o,p (X, X,)

= 2x4xi =2
Cov (X, X,) = o, o,p (X, X) =3
Cov (X. X) = O, G,p(X,, X,) = 12
Hence (1) becomes,
Var (3X- 2X,+ X) = 36 +64 +36- 24+
18-48 = 82
(iii) Var (X- X- X1)= G +o; +oi +2 Cov (X, X,)
-2 Cov (X,, X)- 2 Cov (X, X)
44 16+ 36 +4-6- 24 =3O
I) Mathematical Expectation (Bivariate)
EY.8.Sc. Statistics (Paper- 9.13

(iv) Cov (2X+3,4-3X,) = 2x(-3) Cov(X,, X)) -12


(V) p(3X,-3,X,+ 1) = p(X, X) =5
(vi) p(4+ X), 5- 2X,)=-p(X, X) =
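In matrix form, Theorem 4 says Var(a_1 X_1 + ... + a_n X_n) = aᵀ Σ a, where Σ holds the variances and covariances. A sketch with numpy (an assumed dependency) rebuilds Σ of Example 9.5 from the given s.d.s and correlations and reproduces parts (ii) and (iii):

```python
import numpy as np

# Standard deviations and correlation matrix given in Example 9.5.
sd = np.array([2.0, 4.0, 6.0])
rho = np.array([[1.00, 0.25, 0.25],
                [0.25, 1.00, 0.50],
                [0.25, 0.50, 1.00]])
Sigma = np.outer(sd, sd) * rho      # Cov(X_i, X_j) = sd_i * sd_j * rho_ij

# Theorem 4 as a quadratic form: Var(a' X) = a' Sigma a.
for a in ([3.0, -2.0, 1.0], [1.0, 1.0, -1.0]):
    a = np.array(a)
    print(a @ Sigma @ a)            # 82.0, then 30.0 -- parts (ii) and (iii)
```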
Example 9.6 : For the following joint probability distribution of (X, Y), compute ρ(X, Y), the correlation coefficient between X and Y.

X \ Y    0     1     2
 0      1/8    0    3/8
 1      1/4   1/8   1/8

Solution : The marginal p.m.f.s of X and Y are

X      0     1
P(x)  1/2   1/2

Y      0     1     2
P(y)  3/8   1/8   1/2

E(X) = Σ x P(x) = 1/2 ;  E(Y) = Σ y P(y) = 9/8
E(X²) = Σ x² P(x) = 1/2 ;  E(Y²) = Σ y² P(y) = 17/8
Var(X) = σ_x² = E(X²) - [E(X)]² = 1/4
Var(Y) = σ_y² = E(Y²) - [E(Y)]² = 55/64
E(XY) = Σ Σ x y P(x, y) = 1/8 + 2/8 = 3/8
Cov(X, Y) = E(XY) - E(X) E(Y) = 3/8 - 9/16 = -3/16

ρ = ρ(X, Y) = Cov(X, Y) / (σ_x σ_y) = (-3/16) / (√(1/4) √(55/64)) = -3/√55 ≈ -0.4045

X and Y are negatively correlated.
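A sketch reproducing Example 9.6 end to end (names are our own; fractions keep the arithmetic exact until the final square root):

```python
from fractions import Fraction as F
from math import sqrt

# Joint p.m.f. of Example 9.6 (X = 0, 1; Y = 0, 1, 2).
pmf = {(0, 0): F(1, 8), (0, 1): F(0), (0, 2): F(3, 8),
       (1, 0): F(1, 4), (1, 1): F(1, 8), (1, 2): F(1, 8)}

E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

mx, my = E(lambda x, y: x), E(lambda x, y: y)     # 1/2, 9/8
var_x = E(lambda x, y: x * x) - mx**2             # 1/4
var_y = E(lambda x, y: y * y) - my**2             # 55/64
cov = E(lambda x, y: x * y) - mx * my             # -3/16
print(cov, cov / sqrt(var_x * var_y))             # -3/16, about -0.4045
```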
9.8 Conditional Expectation
Suppose (X, Y) is a two dimensional discrete r.v. with probability distribution {(x_i, y_j, p_ij)}. The conditional expectation of a r.v. (or of a function of it) is nothing but the expectation of the r.v. (or the function) computed using the appropriate conditional distribution.
(a) Conditional expectation of X given Y = y_j : We have seen earlier that the conditional distribution of X given Y = y_j is as follows :

X                       x_1         x_2        ...    x_m
P(X = x_i | Y = y_j)    p_1j/p_.j   p_2j/p_.j  ...    p_mj/p_.j

where p_.j = Σ_{i=1}^{m} p_ij is the marginal probability P(Y = y_j).
Hence, the conditional expectation of X given Y = y_j, denoted by E(X | Y = y_j), is

E(X | Y = y_j) = Σ_{i=1}^{m} x_i P(X = x_i | Y = y_j) = Σ_{i=1}^{m} x_i (p_ij / p_.j)
Remark :
1. E(X | Y = y_j) is also called the conditional mean of X given Y = y_j.
2. E(X | Y = y_j) is obtained by fixing the variable Y at y_j, a particular value. Hence for fixed y_j, the conditional mean is a constant. However, it varies as j varies over 1, 2, ..., n.
3. If X and Y are independent, then E(X | Y = y_j) = E(X), since P(X = x_i | Y = y_j) = P(X = x_i).
(b) Conditional expectation of Y given X = x_i : The conditional probability distribution of Y given X = x_i is as follows.

Y                       y_1         y_2        ...    y_n
P(Y = y_j | X = x_i)    p_i1/p_i.   p_i2/p_i.  ...    p_in/p_i.

Hence, the conditional expectation of Y given X = x_i, denoted by E(Y | X = x_i), is

E(Y | X = x_i) = Σ_{j=1}^{n} y_j P(Y = y_j | X = x_i)

Remark : The conditional mean E(Y|x) is a function of x. It is called the regression of Y on X. If, further, it is of linear form, i.e. E(Y|x) = a + bx, then b is nothing but the regression coefficient of Y on X. Similarly, if E(X|y) = a' + b'y, then b' is the regression coefficient of X on Y.
(c) Conditional variance of X given Y = y_j : Conditional variance is obtained by using the appropriate conditional distribution, as we do in the case of the conditional mean. Recall that the variance of any r.v. is obtained by using the following formula.

Var(r.v.) = E[(r.v.)²] - [E(r.v.)]²

Accordingly, the conditional variance of X given Y = y_j is defined as

Var(X | Y = y_j) = E(X² | Y = y_j) - [E(X | Y = y_j)]²

where,

E(X² | Y = y_j) = Σ_{i=1}^{m} x_i² P(X = x_i | Y = y_j) = Σ_{i=1}^{m} x_i² (p_ij / p_.j)

and E(X | Y = y_j) is the conditional mean of X given Y = y_j.
(d) Conditional variance of Y given X = x_i : The conditional variance of Y given X = x_i is defined on similar lines.

Var(Y | X = x_i) = E(Y² | X = x_i) - [E(Y | X = x_i)]²

where,

E(Y² | X = x_i) = Σ_{j=1}^{n} y_j² P(Y = y_j | X = x_i)

and E(Y | X = x_i) is the conditional mean of Y given X = x_i.
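In code, conditioning is just 'restrict to one column, renormalize by the marginal, take moments'. A sketch (reusing the table of Example 9.3; names are our own):

```python
from fractions import Fraction as F

# Joint p.m.f. of Example 9.3, reused here to illustrate conditioning.
pmf = {(0, 1): F(2, 12), (0, 2): F(1, 12), (0, 3): F(1, 12),
       (1, 1): F(1, 12), (1, 2): F(1, 12), (1, 3): F(2, 12),
       (2, 1): F(3, 12), (2, 2): F(1, 12), (2, 3): F(0)}

def conditional_mean_var_x(pmf, y0):
    """E(X | Y = y0) and Var(X | Y = y0) via the conditional p.m.f. p_ij / p_.j."""
    p_y = sum(p for (x, y), p in pmf.items() if y == y0)         # marginal P(Y = y0)
    cond = {x: p / p_y for (x, y), p in pmf.items() if y == y0}  # P(X = x | Y = y0)
    mean = sum(x * p for x, p in cond.items())
    var = sum(x * x * p for x, p in cond.items()) - mean**2
    return mean, var

print(conditional_mean_var_x(pmf, 1))   # (7/6, 29/36)
```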
9.9 Raw and Central Moments of a Bivariate Distribution
Moments for a bivariate distribution are defined on similar lines as for a univariate distribution.
Let {(x_i, y_j, p_ij) ; i = 1, 2, ..., m; j = 1, 2, ..., n} represent the joint probability distribution of (X, Y).
(a) Raw Moments : The (r, s)th raw moment of (X, Y) is denoted by μ'_rs and is defined as

μ'_rs = E(X^r Y^s) = Σ_{i=1}^{m} Σ_{j=1}^{n} x_i^r y_j^s p_ij

where r, s are non-negative integers.
In particular, if r = 1, s = 0, then

μ'_10 = Σ Σ x_i p_ij = E(X)

Similarly, if r = 0, s = 1, then μ'_01 = E(Y).
If r = 2, s = 0, then μ'_20 = E(X²).
If r = 0, s = 2, then μ'_02 = E(Y²).
If r = 1, s = 1, then μ'_11 = E(XY), etc.

(b) Central Moments : The (r, s)th central moment of (X, Y) is denoted by μ_rs and is defined as

μ_rs = E[{X - E(X)}^r {Y - E(Y)}^s]
     = Σ_{i=1}^{m} Σ_{j=1}^{n} {x_i - E(X)}^r {y_j - E(Y)}^s p_ij

where r, s are non-negative integers.
In particular, μ_10 = E{X - E(X)} = 0. Similarly, μ_01 = 0.
μ_20 = E[X - E(X)]² = Var(X)
μ_02 = E[Y - E(Y)]² = Var(Y)
μ_11 = E[{X - E(X)} {Y - E(Y)}] = Cov(X, Y)
Thus, we get,

ρ = correlation coefficient between X and Y = Cov(X, Y) / √(Var(X) Var(Y)) = μ_11 / √(μ_20 μ_02)
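These definitions translate directly into code; a sketch (again on the Example 9.3 table; names are our own) that computes μ'_rs and μ_rs and evaluates ρ = μ_11/√(μ_20 μ_02):

```python
from fractions import Fraction as F
from math import sqrt

# Joint p.m.f. of Example 9.3 once more.
pmf = {(0, 1): F(2, 12), (0, 2): F(1, 12), (0, 3): F(1, 12),
       (1, 1): F(1, 12), (1, 2): F(1, 12), (1, 3): F(2, 12),
       (2, 1): F(3, 12), (2, 2): F(1, 12), (2, 3): F(0)}

def raw_moment(r, s):
    """mu'_rs = E(X^r Y^s)."""
    return sum(x**r * y**s * p for (x, y), p in pmf.items())

def central_moment(r, s):
    """mu_rs = E[(X - E(X))^r (Y - E(Y))^s]."""
    mx, my = raw_moment(1, 0), raw_moment(0, 1)
    return sum((x - mx)**r * (y - my)**s * p for (x, y), p in pmf.items())

# mu_20 = Var(X), mu_02 = Var(Y), mu_11 = Cov(X, Y); hence
# rho = mu_11 / sqrt(mu_20 * mu_02) = (-1/6) / sqrt(2/3 * 11/16), about -0.246.
print(central_moment(1, 1) / sqrt(central_moment(2, 0) * central_moment(0, 2)))
```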
Points to Remember
The expectation of a sum of two random variables is equal to the sum of the expectations of the two random variables. We do not need the two variables to be independent for this result.
The expectation of a product of independent random variables is equal to the product of their expectations. The converse is not true.

Exercise 9 (A)
Theory Questions :
1. Define the mathematical expectation of a function of a discrete bivariate r.v. (X, Y).
2. Suppose X and Y are two discrete r.v.s. with joint probability distribution {(x_i, y_j, p_ij) : i = 1, ..., m; j = 1, ..., n}.
Prove that
(i) E(X + Y) = E(X) + E(Y)
(ii) E(XY) = E(X) · E(Y) if X and Y are independent.
3. Give an example of a joint probability distribution of (X, Y) where E(XY) = E(X) · E(Y) holds, but X and Y are not independent.
4. Define the covariance between two r.v.s. What does it measure ?
5. Does Cov(X, Y) = 0 imply that X and Y are independent ? Justify.
6. Prove that :
(i) Cov(aX, bY) = ab Cov(X, Y)
(ii) Cov(X + c, Y + d) = Cov(X, Y)
(iii) Cov(X, X) = Var(X)
where a, b, c, d are constants.
7. Suppose X and Y are two discrete r.v.s. Prove that
(i) Var(aX + bY) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
(ii) Var(aX - bY) = a² Var(X) + b² Var(Y) - 2ab Cov(X, Y)
8. For X, Y two independent discrete r.v.s., prove that
(i) Var(X + Y) = Var(X) + Var(Y)
(ii) Var(X - Y) = Var(X) + Var(Y).
9. Derive the expression for the variance of a linear combination of n discrete r.v.s. X_1, X_2, ..., X_n.
10. If X_1, X_2, ..., X_n are independent discrete r.v.s. with common variance σ², show that
Var(X̄) = σ²/n, where X̄ = (1/n) Σ_{i=1}^{n} X_i.
11. If X_1, X_2, ..., X_n are discrete r.v.s. with Var(X_i) = σ² for i = 1, 2, ..., n and the correlation coefficient between X_i and X_j equal to ρ (i ≠ j), then show that
Var(Σ_{i=1}^{n} X_i) = nσ²[1 + (n - 1)ρ]
12. Define the correlation coefficient ρ between two discrete r.v.s. X and Y. Give the interpretation of the various values of ρ.
13. Prove the following properties of the correlation coefficient ρ.
(i) ρ is invariant to a change of origin and scale. It changes its sign if the changes of scale for the variables are not in the same direction.
(ii) -1 ≤ ρ ≤ 1.
14. Prove, in the case of two discrete r.v.s., the following :
(i) Independence ⇒ uncorrelatedness.
(ii) Uncorrelatedness ⇏ independence.
15. Define the conditional expectation of X given Y = y_j and the conditional expectation of Y given X = x_i. Also give their interpretation.
16. Define the conditional variance of X given Y = y_j and the conditional variance of Y given X = x_i.
17. Define the (r, s)th raw moment μ'_rs of a discrete bivariate distribution of (X, Y). Hence show μ'_10 = E(X), μ'_01 = E(Y), μ'_11 = E(XY).
18. Define the (r, s)th central moment μ_rs of (X, Y). Hence, show that
μ_20 = Var(X) ; μ_02 = Var(Y)
and ρ = μ_11 / √(μ_20 μ_02).
Exercise 9 (B)
Numerical Problems :
19. For the following probability distribution,

X \ Y    2      3      4
 1      0.06   0.15   0.09
 2      0.14   0.35   0.21

Find (i) E(2X - Y), (ii) E(XY), (iii) Var(3X - 2Y), (iv) ρ(X, Y), (v) ρ(2X, 3Y), (vi) E(X | Y = 2).
20. X and Y are two discrete r.v.s. with joint p.m.f.
p(x, y) = 1/4 for (x, y) = (0, 0), (0, 1), (1, 0), (1, 1)
and p(x, y) = 0 otherwise.
Calculate : (i) Cov(X, Y), (ii) Var(X + Y), (iii) Are X, Y independent ?
21. Obtain (i) the conditional mean and variance of X given Y = 3, (ii) the conditional mean and variance of Y given X = 0, for the following joint probability distribution.

X \ Y    1     2     3
 0      0.1   0.2   0.3
 1      0.1   0.1   0.2

22. The joint p.m.f. of X and Y is

X \ Y    0     1     2
-1      1/6    0    1/12
 1      1/4   1/3   1/6

Find (i) E(Y | X = 1), (ii) Var(Y | X = 1), (iii) Cov(2X - 1, 1 - Y), (iv) ρ(X, Y), (v) Are X and Y independent ?
23. X_1 and X_2 are two r.v.s. having joint p.m.f.
P(X_1 = x_1, X_2 = x_2) = k(2x_1 + 5x_2)
where (x_1, x_2) = (1, 1), (1, 2), (2, 1), (2, 2).
(i) Determine k.
(ii) Obtain the conditional mean and variance of X_1 given X_2 = 2.
24. (X, Y) have a joint probability distribution given by :

X \ Y   -1     0     1
 0      0.1   0.1   0.1
 1      0.1   0.2   0.1
 2      0.1   0.1   0.1

Show that X and Y are uncorrelated. Are they independent ?
25. For the following probability distribution calculate (i) E(X²Y), (ii) E(27X² + 9Y² - 156), (iii) E(X | Y = 1), (iv) E(2Y | X = 3).

X \ Y    1      2      3
 1      1/27   5/27   3/27
 2      4/27   3/27   2/27
 3      4/27   2/27   3/27
26. Prove that U = X + Y and V = X - Y are uncorrelated if and only if Var(X) = Var(Y).
27. With usual notations, prove that
Cov(X - (ρσ_x/σ_y) Y, Y) = 0.
28. Let X and Y be discrete r.v.s. where each takes only two values. Show that if Cov(X, Y) = 0, then X and Y are independent.
29. Prove that : Var(X - Y) = σ_x² + σ_y² - 2ρσ_x σ_y.
30. Prove that : Cov(X, X + Y) = Var(X) + Cov(X, Y).
31. For (X, Y), a bivariate discrete r.v., σ_x² = 9, σ_y² = 4, Cov(X, Y) = 4.
Find (i) Var(2X - 3Y), (ii) ρ((3X + 5)/2, (5 - 3Y)/2), (iii) Cov(2X, 3Y), (iv) Cov(X + 5, 2Y - 6).
32. X and Y are two independent discrete r.v.s. with σ_x = 3 and Var(2X + 3Y) = 72. Compute σ_y. (April 2012)
33. Suppose X_1, X_2, X_3 are three discrete r.v.s. with means 2, 4 and 5 and s.d. 2 each. Further, ρ(X_1, X_2) = ρ(X_2, X_3) = 0.5 and ρ(X_1, X_3) = …
Find : (i) E(2X_1 + X_2 - X_3), (ii) s.d.(…), (iii) Cov(3X_1 - 1, 2X_2 + 6), (iv) Cov(X̄, …), (v) ρ(X_1 - 5, X_2 + 2), (vi) ρ(…).
Exercise 9 (C)
34. The probability distribution of (X, Y) is given by :
P(i, j) = C(i + j) ; i = 1, 2, 3 ; j = 1, 2
        = 0 otherwise
Find (i) C, (ii) the correlation coefficient between X and Y.
35. Let X and Y be two discrete r.v.s. with variances σ_x² and σ_y² respectively and correlation coefficient ρ. Find λ such that U = X + λY and V = X + Y are uncorrelated.
Objective Type Questions
(A) Multiple Choice Questions (MCQ) :
Choose the correct alternative.
1. If X and Y are two random variables with means E(X) and E(Y) respectively, then the expression E[{X - E(X)} {Y - E(Y)}] is called
(a) Variance of X
(b) Variance of Y
(c) Covariance between X and Y
(d) Correlation coefficient between X and Y.
2. In the context of two variables X and Y, which of the following statements is false ?
(a) If the covariance between two variables X and Y is equal to zero, then the correlation coefficient between these variables is equal to zero.
(b) If the covariance between two variables X and Y is equal to zero, then they are independently distributed.
(c) For independent variables X and Y, correlation and covariance are both equal to zero.
(d) The correlation coefficient between X and Y is the same as that between Y and X.
3. X and Y are any two random variables with the covariance between them equal to 50. Then the covariance between U and V, where U = 10X + 10 and V = 5Y + 5, is given by .....
(a) 2500
(b) 50
(c) 65
(d) 500
4. If (X, Y) is a discrete bivariate random variable, then E(Y | X = x) is a function of .....
(a) x and y
(b) y
(c) x
(d) xy
5. If X and Y are any two random variables, then the covariance between aX + b and cY + d is
(a) Cov(X, Y)
(b) abcd Cov(X, Y)
(c) ac Cov(X, Y)
(d) ac Cov(X, Y) + bd
6. The correlation coefficient between X and Y is zero. We then conclude that
(a) X and Y have the same distribution.
(b) the variances of X and Y are equal.
(c) there exists no relationship between X and Y.
(d) there exists no linear relationship between X and Y.
7. Let (X, Y) be a bivariate discrete random variable with joint p.m.f.

(X, Y)     (0, 0)   (1, 0)   (0, 1)   (1, 1)
P(X, Y)     1/12     1/4      5/12     1/4

The mean of Y is .....
(a) 1/3
(b) 1
(c) 5/12
(d) 2/3
8. Two discrete random variables X and Y are said to be independent if
(a) p_ij = p_i. p_.j for all i and j
(b) p_ij = p_i. + p_.j for all i and j
(c) p_ij = p_i. / p_.j for all i and j
(d) p_ij = p_i. - p_.j for all i and j.
9. If X and Y are two independent discrete random variables, then the conditional distribution of X given Y = y_j is the same as .....
(a) the marginal distribution of Y.
(b) the marginal distribution of X.
(c) the joint distribution of (X, Y).
(d) the conditional distribution of Y given X = x_i.
10. If p(x, y) = 1/9 ; x = 0, 1, 2 ; y = 0, 1, 2 ; and = 0 elsewhere, which of the following statements is true ?
(a) X and Y are not independent.
(b) X and Y are uncorrelated.
(c) X and Y are uncorrelated but not independent.
(d) X and Y are correlated and independent.
(B) State whether the following statements are true or false.
11. For two discrete random variables X and Y, E(X + Y) = E(X) + E(Y) only if X and Y are independent.
12. Var(X - Y) = Var(X) - Var(Y) - 2 Cov(X, Y).
13. The correlation coefficient between (3 - X) and (5 - 3Y) is the same as that between X and Y.
14. If X and Y are independent, then the conditional mean of X given Y = y_j is the same as the mean of X.
15. If μ_11 = 5, μ_20 = 16 and μ_02 = 4, then the value of ρ is 0.625.

Hints and Answers

(19) (i) 0.3, (ii) 5.27, (iii) 3.85, (iv) 0, (v) 0, (vi) 1.7.
(20) (i) 0, (ii) 0.5, (iii) yes.
(21) (i) 0.4, 0.24 ; (ii) 7/3, 5/9.
(22) (i) 8/9, (ii) 44/81, (iii) …, (iv) 0.3612, (v) No.
(23) (i) k = 1/42 ; (ii) 1.54, ….
(24) No.
(25) (i) 79/9, (ii) 10.3333, (iii) 7/3, (iv) 34/9.
(31) (i) 24, (ii) -2/3, (iii) 24, (iv) 8.
(32) 2.
(33) (i) 3, (ii) 7, (iii) 3, (iv) 1, (v) …, (vi) 1/3.
(34) (i) C = 1/21 ; (ii) -0.0346.
(35) λ = -(σ_x² + ρσ_x σ_y)/(σ_y² + ρσ_x σ_y).

Answers of Objective Questions
(A) (1) c, (2) b, (3) a, (4) c, (5) c, (6) d, (7) d, (8) a, (9) b, (10) b.
(B) (11) F, (12) F, (13) T, (14) T, (15) T.
