Mca4020 SLM Unit 02
Structure:
2.1 Introduction
2.2 One-dimensional Random Variable
Discrete and Continuous random variables
2.3 Distribution Function
Distribution Function of discrete random variables
Distribution Function of continuous random variables
2.4 Two-dimensional random variables
Discrete and Continuous random variables
Joint Density Function
2.5 Marginal and Conditional probability distribution
2.6 Mathematical expectation
2.7 Summary
2.8 Terminal Questions
2.9 Answers
2.1 Introduction
In the previous unit we studied sample spaces, different types of events, and
probability. In this unit we introduce some important mathematical concepts
that have many applications to the probabilistic models we are considering.
Objectives:
At the end of this unit students should be able to:
- explain the concept of random variables, and of discrete and continuous random variables
- explain the concepts of the distribution function, two-dimensional random variables, and the joint density function and its properties
Definition:
Let S be the sample space of a random experiment. Suppose with each
element s of S, a unique real number X (s) is associated according to some
rule. Then, X is called a random variable on S.
In other words, if f is a function (mapping) from S into the set R of all real
numbers and X = f(s), s ∈ S, then X is called a random variable on S.
Roughly speaking, a random variable is a variable whose values depend on
chance.
For example, consider the random experiment of tossing a coin. For this
experiment, the sample space is
S = {H, T}. Let us define a mapping f : S → R by
f(s) = 1 if s = H,  f(s) = 0 if s = T.
Then, X = f(s) is a random variable on S. For the outcome H, the value of
this random variable is 1 and for the outcome T, its value is zero.
the value 0 for the element s8 = TTT. As s varies over the set S, X varies
over the set {0, 1, 2} ⊂ R.
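This mapping view of a random variable is easy to mirror in code. The sketch below is illustrative only (the simulation itself is not part of the text): it encodes X for the coin-toss example and estimates P(X = 1), which for a fair coin should be near 0.5.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def X(s):
    """Random variable for the coin toss: maps outcome H to 1 and T to 0."""
    return 1 if s == "H" else 0

# Simulate 100,000 tosses of a fair coin and estimate P(X = 1).
tosses = [random.choice("HT") for _ in range(100_000)]
values = [X(s) for s in tosses]
p_hat = sum(values) / len(values)
```

Because X only relabels outcomes, every simulated value lies in {0, 1}, and the relative frequency of the value 1 approaches P(H) = 1/2.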
2.2.1 Discrete and Continuous Random Variables
Definition:
Let X be a random variable. If the number of possible values of X (that is,
Rx, the range space) is finite or countably infinite, we call X a discrete
random variable. That is, the possible values of X may be listed as x1, x2,
…, xn, …. In the finite case the list terminates, and in the countably infinite
case it continues indefinitely.
What are the possible values of X? We shall assume that these values
consist of all non-negative integers. That is, the range space of X is
Rx = {0, 1, 2, …, n, …}.
The function p defined above is called the probability function (or point
probability function) of the random variable X. The collection of pairs
(xi, p(xi)), i = 1, 2,.. is sometimes called the probability distribution of X.
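The pairs (xi, p(xi)) are easy to represent directly. A minimal sketch (the two-coin-toss distribution used here is an assumed example, not from the text), checking the two defining requirements of a probability function:

```python
# Probability distribution of X = number of heads in two fair coin tosses,
# stored as pairs (x_i, p(x_i)) in a dict.
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

total = sum(pmf.values())                       # must equal 1
all_nonneg = all(p >= 0 for p in pmf.values())  # each p(x_i) must be >= 0
```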
Definition:
Note: If F(x) is the cdf, then the pdf f(x) can be obtained by using the formula
f(x) = (d/dx) F(x) = F′(x).
Note 3: P(a < X ≤ b) = P(X ≤ b) − P(X ≤ a) = F(b) − F(a)
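The relation f(x) = F′(x) can be checked numerically. In the sketch below (illustrative; the unit-exponential cdf F(x) = 1 − e^(−x) is an assumed stand-in, not an example from the text) a central difference approximates F′ and agrees with the known pdf e^(−x):

```python
import math

def F(x):
    # cdf of a unit exponential distribution (assumed concrete example)
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

def F_prime(x, h=1e-6):
    # central-difference approximation to the derivative of F
    return (F(x + h) - F(x - h)) / (2 * h)

x = 1.3
pdf_exact = math.exp(-x)    # f(x) = e^(-x) for this cdf
pdf_approx = F_prime(x)
```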
2.3.1 Distribution Function of discrete random variables
Definition: Let X be a random variable whose space is {x1, x2, …, xn}. The
associated probabilities pi = P(X = xi) satisfy Σ pi = 1.
Example: Show that
p(x) = e^(−x) for x ≥ 0,  p(x) = 0 for x < 0
is a probability density function.
Find the probability that the variate having this density will fall in the interval
[1.5, 2.5]. Also, evaluate the cumulative distribution function F(2.5).
Solution:
Evidently, the given p(x) ≥ 0. Further,
P(−∞ < x < ∞) = ∫_−∞^∞ p(x) dx = ∫_−∞^0 p(x) dx + ∫_0^∞ p(x) dx
= ∫_0^∞ e^(−x) dx = 1
The two conditions verified above show that p(x) is a probability density
function.
Next, we find that
P(1.5 < x < 2.5) = ∫_1.5^2.5 p(x) dx = ∫_1.5^2.5 e^(−x) dx
= e^(−1.5) − e^(−2.5) = 0.1410
Also, F(2.5) = ∫_0^2.5 e^(−x) dx = 1 − e^(−2.5) ≈ 0.9179
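The integrals in this example can also be confirmed numerically. The following sketch (illustrative Python using a simple midpoint rule) reproduces the total probability, P(1.5 < x < 2.5) and F(2.5):

```python
import math

def p(x):
    # density from the example: e^(-x) for x >= 0, 0 otherwise
    return math.exp(-x) if x >= 0 else 0.0

def integrate(f, a, b, n=100_000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(p, 0.0, 50.0)   # ≈ 1 (the tail beyond 50 is negligible)
prob = integrate(p, 1.5, 2.5)     # ≈ e^(-1.5) - e^(-2.5) ≈ 0.1410
F_25 = integrate(p, 0.0, 2.5)     # ≈ 1 - e^(-2.5) ≈ 0.9179
```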
Example: The cumulative distribution function of a random variable is
F(t) = 0 for t < 0,  F(t) = t² for 0 ≤ t ≤ 1,  F(t) = 1 for t > 1.
Find the density function. Also, evaluate P(0.5 < X ≤ 0.75), P(X ≤ 0.5) and
P(X > 0.75).
Solution:
Recalling that p(t) = F′(t), we find that
p(t) = 0 for t < 0,  p(t) = 2t for 0 ≤ t ≤ 1,  p(t) = 0 for t > 1.
Also,
P(0.5 < X ≤ 0.75) = F(0.75) − F(0.5)  (since F(t) = ∫_−∞^t p(x) dx)
= (0.75)² − (0.5)²
= 0.3125
P(X ≤ 0.5) = F(0.5) = (0.5)² = 0.25
P(X > 0.75) = 1 − P(X ≤ 0.75) = 1 − F(0.75) = 1 − (0.75)²
= 0.4375
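A quick arithmetic check of these values (illustrative Python; note that 1 − (0.75)² = 0.4375):

```python
def F(t):
    # cdf from the example: 0 for t < 0, t^2 on [0, 1], 1 for t > 1
    if t < 0:
        return 0.0
    return t * t if t <= 1 else 1.0

p_mid = F(0.75) - F(0.5)   # P(0.5 < X <= 0.75)
p_le = F(0.5)              # P(X <= 0.5)
p_gt = 1 - F(0.75)         # P(X > 0.75)
```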
Example: The p.d.f. of a random variable X is given by
f(x) = x for 0 ≤ x ≤ 1,  f(x) = 2 − x for 1 < x ≤ 2,  f(x) = 0 elsewhere.
Find (i) the cumulative distribution function, and (ii) P(X ≥ 1.5).
Solution: For t < 0, F(t) = 0.
For 0 ≤ t ≤ 1, we get
F(t) = ∫_−∞^0 f(x) dx + ∫_0^t f(x) dx = ∫_0^t x dx = t²/2
For 1 < t ≤ 2, we get
F(t) = ∫_−∞^0 0 dx + ∫_0^1 x dx + ∫_1^t (2 − x) dx = 1 − (t − 2)²/2
For t > 2, we get
F(t) = ∫_−∞^0 0 dx + ∫_0^1 x dx + ∫_1^2 (2 − x) dx + ∫_2^t 0 dx
= 1/2 + (2 − 3/2) = 1
Thus, the required c.d.f. is
F(t) = 0 for t < 0
F(t) = t²/2 for 0 ≤ t ≤ 1
F(t) = 1 − (t − 2)²/2 for 1 < t ≤ 2
F(t) = 1 for t > 2
Now P(X ≥ 1.5) = ∫_1.5^∞ f(x) dx
= ∫_1.5^2 (2 − x) dx + ∫_2^∞ 0 dx = 1/8
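The c.d.f. and the tail probability P(X ≥ 1.5) = 1/8 can be verified numerically (illustrative Python, midpoint rule):

```python
def f(x):
    # triangular density from the example
    if 0 <= x <= 1:
        return x
    if 1 < x <= 2:
        return 2 - x
    return 0.0

def integrate(g, a, b, n=100_000):
    # midpoint-rule approximation of the integral of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0.0, 2.0)    # should be 1
p_tail = integrate(f, 1.5, 2.0)   # P(X >= 1.5) = 1/8 = 0.125
F_15 = integrate(f, 0.0, 1.5)     # F(1.5) = 1 - (1.5 - 2)^2 / 2 = 0.875
```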
(Here f(x) = x²/18 for −3 ≤ x ≤ 3 and 0 elsewhere.)
(i) P(1 ≤ X ≤ 2) = ∫_1^2 f(x) dx = ∫_1^2 (x²/18) dx = 7/54
(ii) P(X ≤ 2) = ∫_−3^2 f(x) dx = ∫_−3^2 (x²/18) dx = 35/54
(iii) P(X > 1) = ∫_1^3 f(x) dx = ∫_1^3 (x²/18) dx = 13/27
x      | 0 | 1  | 2  | 3  | 4
P(X=x) | c | 3c | 5c | 7c | 9c
f(x) = x e^(−x²/2) for x ≥ 0,  f(x) = 0 otherwise
With each possible outcome (xi, yj) we associate a number p(xi, yj)
representing
P[X = xi, Y = yj] and satisfying the following conditions
Note:
P(c ≤ X ≤ d) = P(c ≤ X ≤ d, −∞ < Y < ∞)
= ∫_c^d ∫_−∞^∞ f(x, y) dy dx
Hence P(c ≤ X ≤ d) = ∫_c^d g(x) dx
Similarly, P(a ≤ Y ≤ b) = ∫_a^b h(y) dy
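Integrating the joint density over one variable to get a marginal, as above, can be sketched numerically. Here the joint density f(x, y) = 4xy on the unit square is an assumed example (not from the text); its exact marginal is g(x) = 2x:

```python
def f(x, y):
    # assumed joint density: 4xy on the unit square, 0 elsewhere
    if 0 <= x <= 1 and 0 <= y <= 1:
        return 4 * x * y
    return 0.0

def g(x, n=20_000):
    # marginal density of X: integrate f(x, y) over y (midpoint rule)
    h = 1.0 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

g_half = g(0.5)   # exact marginal is g(x) = 2x, so g(0.5) = 1
```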
Definition: Let (X, Y) be a discrete two-dimensional random variable with
probability distribution p(xi, yj). Let p(xi) and q(yj) be the marginal pdfs of X
and Y, respectively.
g(x/y) = f(x, y) / h(y),  h(y) > 0
h(y/x) = f(x, y) / g(x),  g(x) > 0
Definition:
(a) Let (X, Y) be a two-dimensional discrete random variable. We say that
X and Y are independent random variables if and only if p(xi, yj) = p(xi)
q(yj) for all i and j. That is, P(X = xi, Y = yj) = P(X = xi) P(Y = yj) for all
i and j.
(b) Let (X, Y) be a two-dimensional continuous random variable. We say
that X and Y are independent random variables if and only if f(x, y) =
g(x) h(y) for all (x, y), where f is the joint pdf, and g and h are the
marginal pdfs of X and Y, respectively.
Note:
(i) Let (X, Y) be a 2-dimensional discrete random variable. Then X and Y
are independent if and only if p(xi/yj) = p(xi) for all i and j and, if and
only if q(yj/xi) = q(yj) for all i and j.
(ii) Let (X, Y) be a two-dimensional continuous random variable. Then X
and Y are independent if and only if g(x/y) = g(x) and, if and only if
h(y/x) = h(y), for all (x, y).
Note:
(1) The concepts discussed for the one-dimensional case also hold good for
2-dimensional random variable.
(2) Let (X, Y) be a two-dimensional random variable with a joint probability
distribution. Let Z = H1(X, Y) and W = H2(X, Y). Then E(Z+W) =
E(Z)+E(W).
(3) Let (X, Y) be a two-dimensional random variable and suppose that X
and Y are independent. Then E(XY) = E(X) E(Y)
E(XY) = ∫∫ xy f(x, y) dx dy = ∫∫ xy g(x) h(y) dx dy
= (∫ x g(x) dx)(∫ y h(y) dy) = E(X) E(Y)
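The product rule E(XY) = E(X) E(Y) for independent variables can be illustrated by simulation (Python sketch; the uniform and exponential distributions used here are assumed examples):

```python
import random

random.seed(42)

n = 200_000
xs = [random.uniform(0, 1) for _ in range(n)]      # X ~ Uniform(0, 1)
ys = [random.expovariate(1.0) for _ in range(n)]   # Y ~ Exp(1), independent of X

e_x = sum(xs) / n                                  # ≈ E(X) = 0.5
e_y = sum(ys) / n                                  # ≈ E(Y) = 1.0
e_xy = sum(x * y for x, y in zip(xs, ys)) / n      # ≈ E(X) E(Y) = 0.5
```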
X/Y |     1      |     2      |     3      | Sum
 0  |    0.02    |    0.08    |    0.1     | p(x1) = 0.2
 1  |    0.08    |    0.32    |    0.4     | p(x2) = 0.8
Sum | q(y1) = 0.1| q(y2) = 0.4| q(y3) = 0.5| 1
Solution:
i) Cov(aX + b, cY + d)
= E[{(aX + b) − E(aX + b)} {(cY + d) − E(cY + d)}]
= E[{aX + b − aE(X) − b} {cY + d − cE(Y) − d}]
= E[ac {X − E(X)} {Y − E(Y)}]
= ac Cov(X, Y)
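The identity Cov(aX + b, cY + d) = ac Cov(X, Y) holds exactly for sample covariances as well, which makes a direct check easy (illustrative Python; the Gaussian data and the constants a, b, c, d are assumed):

```python
import random

random.seed(1)

n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]   # Y correlated with X

def cov(u, v):
    # sample covariance (dividing by n)
    mu = sum(u) / len(u)
    mv = sum(v) / len(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

a, b, c, d = 2.0, 5.0, -3.0, 1.0
lhs = cov([a * x + b for x in xs], [c * y + d for y in ys])
rhs = a * c * cov(xs, ys)                   # the two should agree
```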
∫_0^1 k(x + 1) dx ∫_0^∞ e^(−y) dy = k [(x + 1)²/2]_0^1
= k (2²/2 − 1²/2) = (3/2) k
Example: A two-dimensional random variable (X, Y) has joint density
f(x, y) = e^(−(x + y)) for x ≥ 0, y ≥ 0;  f(x, y) = 0 otherwise.
Verify that f is a joint density function, and find P[X ≤ Y], P[X > Y],
P[X + Y ≤ 1] and P[0 < X < 1 | Y = 2].
Solution: We note that the given f(x, y) ≥ 0 for all x and y, and
∫∫ f(x, y) dx dy = ∫_0^∞ ∫_0^∞ e^(−(x + y)) dx dy
= ∫_0^∞ e^(−x) dx · ∫_0^∞ e^(−y) dy
= (0 + 1)(0 + 1) = 1
so f is a joint density function. Next,
P[X ≤ Y] = ∫_0^∞ (∫_0^y e^(−x) dx) e^(−y) dy
= ∫_0^∞ e^(−y) (1 − e^(−y)) dy = ∫_0^∞ e^(−y) dy − ∫_0^∞ e^(−2y) dy
= 1 − 1/2 = 1/2
Therefore
P[X > Y] = 1 − P[X ≤ Y] = 1 − 1/2 = 1/2
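The answer P[X ≤ Y] = 1/2 is also what symmetry predicts, since X and Y are independent with the same distribution. A Monte Carlo check (illustrative Python):

```python
import random

random.seed(7)

n = 200_000
# X and Y independent Exp(1) variables, matching the joint density e^(-(x+y))
count = sum(1 for _ in range(n)
            if random.expovariate(1.0) <= random.expovariate(1.0))
p_hat = count / n   # should be close to 1/2
```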
(iv) P[X + Y ≤ 1] = ∫_{x=0}^{1} ∫_{y=0}^{1−x} f(x, y) dy dx
(The region of integration is the triangle with vertices (0, 0), P = (1, 0) and
Q = (0, 1), bounded by the lines x = 0, y = 0 and x + y = 1.)
= ∫_0^1 (∫_0^{1−x} e^(−(x + y)) dy) dx
= ∫_0^1 e^(−x) (∫_0^{1−x} e^(−y) dy) dx
= ∫_0^1 e^(−x) (1 − e^(−(1 − x))) dx
= ∫_0^1 (e^(−x) − e^(−1)) dx = 1 − 2/e
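A simulation check of P[X + Y ≤ 1] = 1 − 2/e ≈ 0.2642 for this density (illustrative Python):

```python
import math
import random

random.seed(3)

n = 400_000
# X, Y independent Exp(1), matching the joint density e^(-(x+y))
hits = sum(1 for _ in range(n)
           if random.expovariate(1.0) + random.expovariate(1.0) <= 1.0)
p_hat = hits / n
p_exact = 1 - 2 / math.e
```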
Also, the marginal density of Y at y = 2 is
h(2) = ∫_0^∞ e^(−(x + y)) dx at y = 2 = ∫_0^∞ e^(−(x + 2)) dx = e^(−2)
Therefore
P[0 < X < 1 | Y = 2] = (∫_0^1 e^(−(x + 2)) dx) / e^(−2)
= e^(−2) (1 − e^(−1)) / e^(−2) = 1 − e^(−1)
Example: The lifetime X and brightness Y of a light bulb are modeled as
continuous random variables with joint density function
f(x, y) = αβ e^(−(αx + βy)) for 0 < x < ∞, 0 < y < ∞;  f(x, y) = 0 elsewhere,
where α and β are constants.
Find (i) the marginal density functions of X and Y and (ii) the joint
cumulative distribution function.
Solution: The marginal density function of X is
g(x) = ∫ f(x, y) dy = ∫_0^∞ αβ e^(−(αx + βy)) dy
= α e^(−αx) ∫_0^∞ β e^(−βy) dy
= α e^(−αx),  0 < x < ∞   … (i)
Similarly, the marginal density function of Y is h(y) = β e^(−βy), 0 < y < ∞   … (ii)
The joint cumulative distribution function is
F(x, y) = ∫_0^x ∫_0^y f(u, v) dv du = ∫_0^x ∫_0^y αβ e^(−(αu + βv)) dv du
= ∫_0^x α e^(−αu) du · ∫_0^y β e^(−βv) dv
= (1 − e^(−αx)) (1 − e^(−βy))
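Writing the two constants as α and β, the joint c.d.f. factors as (1 − e^(−αx))(1 − e^(−βy)), and this can be cross-checked against a direct double integral of the density (illustrative Python; the values α = 2, β = 0.5 are assumed):

```python
import math

alpha, beta = 2.0, 0.5   # assumed sample values for the constants

def pdf(x, y):
    # joint density alpha*beta*exp(-(alpha*x + beta*y)) for x, y > 0
    return alpha * beta * math.exp(-(alpha * x + beta * y))

def cdf_numeric(x1, y1, n=400):
    # midpoint-rule double integral of the density over [0, x1] x [0, y1]
    hx, hy = x1 / n, y1 / n
    return sum(pdf((i + 0.5) * hx, (j + 0.5) * hy)
               for i in range(n) for j in range(n)) * hx * hy

def cdf_closed_form(x, y):
    # the factored joint cdf derived in the example
    return (1 - math.exp(-alpha * x)) * (1 - math.exp(-beta * y))

approx = cdf_numeric(1.0, 2.0)
exact = cdf_closed_form(1.0, 2.0)
```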
f(x, y) = (2/5)(2x + 5y) if 0 ≤ x ≤ 1, 0 ≤ y ≤ 1;  f(x, y) = 0 otherwise
2.7 Summary
In this unit we covered topics such as discrete and continuous random
variables, one- and two-dimensional random variables, covariance, and the
correlation coefficient.
2.8 Terminal Questions
x 0 1 2 3 4 5 6 7 8
2.9 Answers
Self Assessment Questions:
1. c = 1/25 and 16/25, 9/25, 3/5
2. Yes
3. From the given table, we note that X takes values 1, 2 and Y takes
values 2, 3, 4. Also p11= 0.06, p12 = 0.15, p13 = 0.09, p21 = 0.14,
p22 = 0.35, p23 = 0.21 where pij = p(xi, yj).
p(xi) = Σ_{j=1}^{3} p(xi, yj)
q(y2) = Σ_{i=1}^{2} p(xi, y2)
= p(x1, y2) + p(x2, y2)
= p12 + p22 = 0.5
q(y3) = Σ_{i=1}^{2} p(xi, y3)
= p(x1, y3) + p(x2, y3)
= p13 + p23 = 0.3

yj | q(yj)
2  | 0.2
3  | 0.5
4  | 0.3
4.
Y    | 0   | 1   | 2   | 3
P(y) | 1/8 | 3/8 | 3/8 | 1/8

p21 = P[X = 1, Y = 0] = 1/8,
p22 = P[X = 1, Y = 1] = 2/8,
p23 = P[X = 1, Y = 2] = 1/8,
p24 = P[X = 1, Y = 3] = 0
and E(XY) = Σ_i Σ_j x_i y_j p_ij = Σ_j y_j p_2j (the terms with x_i = 0 vanish)
= 0·(1/8) + 1·(2/8) + 2·(1/8) + 3·0 = 1/2
5. 5/21, 7/21, 9/21 and 9/21, 12/21
6. k = 1/32
7. For X: (2/5)(2x + 5/2) and for Y: (2/5)(1 + 5y)
Terminal Questions:
1. (i) a = 1/81 (ii) 1/9, 8/9, 8/27 (iii) 1/81, 4/81, …, 1
2. 4y, 0 ≤ y ≤ x; (iii) 24/32, 0 ≤ x ≤ 1, 0 ≤ y ≤ x
3. (i) 3/8 (ii) 5/8 (iii) 5/24