Lecture 7 - Fall 2023


Table of Contents

Bivariate Random Variable


• A discrete bivariate random variable (X , Y ) is an ordered pair of
discrete random variables.

JOINT PROBABILITY MASS FUNCTION:


Given a pair of discrete random variables X and Y, their
joint probability mass function f : R_X × R_Y → R is defined
by

f(x, y) = P(X = x, Y = y).

EXAMPLES:
1. A fair coin is tossed two times; let X denote the number of
heads on the first toss and Y the total number of heads.
Sample space of this random experiment is

Ω = {HH, HT, TH, TT}.
We have
RX = {0, 1}
RY = {0, 1, 2}
and R_X × R_Y = {(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)}. Then

f(0, 0) = P(X = 0, Y = 0) = 1/4

and

f(1, 0) = P(X = 1, Y = 0) = 0.
The joint PMF of X and Y is as given in the following table:

        Y=0    Y=1    Y=2
X=0     1/4    1/4    0
X=1     0      1/4    1/4
2. A fair die is rolled, and a fair coin is tossed independently. Let
X be the face value on the die, and let Y = 0 if a tail turns up and
Y = 1 if a head turns up. Find the joint PMF of X and Y.
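One way to work Example 2 is to enumerate the 12 equally likely (die, coin) outcomes and tabulate the joint PMF directly; since the roll and the toss are independent, every pair gets probability 1/12. A minimal sketch (the variable names here are ours, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Enumerate the product sample space: (die face, coin side); all 12
# outcomes are equally likely since the roll and the toss are independent.
outcomes = list(product(range(1, 7), ["T", "H"]))

joint = {}
for face, side in outcomes:
    x = face                      # X = face value on the die
    y = 0 if side == "T" else 1   # Y = 0 for a tail, 1 for a head
    joint[(x, y)] = joint.get((x, y), Fraction(0)) + Fraction(1, len(outcomes))

# Every pair (x, y) with x in {1,...,6} and y in {0,1} has probability 1/12.
assert all(p == Fraction(1, 12) for p in joint.values())
assert sum(joint.values()) == 1
```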
MARGINAL PROBABILITY MASS FUNCTION:
Given a pair of discrete random variables X and Y and their
joint probability mass function f, the marginal PMF of X,
f_X : R_X → R, is defined by

f_X(x) = Σ_{y ∈ R_Y} f(x, y).

Similarly, the function f_Y : R_Y → R defined by

f_Y(y) = Σ_{x ∈ R_X} f(x, y)

is called the marginal PMF of Y.


EXAMPLE: A fair coin is tossed two times; let X denote the
number of heads on the first toss and Y the total number of heads.
Sample space of this random experiment is

Ω = {HH, HT, TH, TT}.

The joint PMF f of X and Y is as given in the following table:

          Y=0    Y=1    Y=2    P(X=x)
X=0       1/4    1/4    0      2/4
X=1       0      1/4    1/4    2/4
P(Y=y)    1/4    2/4    1/4

Then the marginal PMF of X is given by

f_X(0) = f(0, 0) + f(0, 1) + f(0, 2) = 1/4 + 1/4 + 0 = 2/4

and

f_X(1) = f(1, 0) + f(1, 1) + f(1, 2) = 0 + 1/4 + 1/4 = 2/4.

Similarly, the marginal PMF of Y is given by

f_Y(0) = f(0, 0) + f(1, 0) = 1/4 + 0 = 1/4,

f_Y(1) = f(0, 1) + f(1, 1) = 1/4 + 1/4 = 2/4

and

f_Y(2) = f(0, 2) + f(1, 2) = 0 + 1/4 = 1/4.
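The marginal sums above can be reproduced mechanically: fix a value of one variable and sum the joint PMF over all values of the other. A short sketch with exact fractions (the dictionary layout is ours):

```python
from fractions import Fraction

# Joint PMF from the coin-tossing example: rows X = 0, 1; columns Y = 0, 1, 2.
f = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4), (0, 2): Fraction(0),
    (1, 0): Fraction(0),    (1, 1): Fraction(1, 4), (1, 2): Fraction(1, 4),
}

# Marginal of X: sum the joint PMF over all values of Y, and vice versa.
fX = {x: sum(f[(x, y)] for y in (0, 1, 2)) for x in (0, 1)}
fY = {y: sum(f[(x, y)] for x in (0, 1)) for y in (0, 1, 2)}

assert fX == {0: Fraction(1, 2), 1: Fraction(1, 2)}
assert fY == {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```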
JOINT CUMULATIVE DISTRIBUTION FUNCTION:
Given a pair of discrete random variables X and Y and their
joint probability mass function f, the joint CDF of X and
Y is the function F : R × R → R defined by

F(x, y) = P(X ≤ x, Y ≤ y),

or

F(x, y) = Σ_{s ≤ x} Σ_{t ≤ y} f(s, t).
JOINT PROBABILITY DENSITY FUNCTION:
A bivariate random variable (X, Y) is said to be of continuous
type if there exists a function f : R × R → R such that
• f(x, y) ≥ 0,
• ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1, and
• for any subset A ⊆ R × R,

P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy.

The function f is called the joint PDF of (X, Y).
EXAMPLE: Let X and Y have the joint density function

f(x, y) = (6/5)(x² + 2xy)   if 0 < x < 1, 0 < y < 1,
          0                 otherwise.

What is P(X ≤ Y)?
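The event {X ≤ Y} corresponds to integrating the density over the part of the unit square above the diagonal; the exact double integral ∫₀¹ ∫₀^y (6/5)(x² + 2xy) dx dy works out to 2/5. A minimal midpoint-rule sketch to check that value numerically (grid size n is an arbitrary choice of ours):

```python
# Midpoint-rule approximation of P(X <= Y) for f(x, y) = (6/5)(x^2 + 2xy)
# on the unit square; the exact answer is 2/5.
n = 1000
h = 1.0 / n
p = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x <= y:                      # restrict to the region X <= Y
            p += 1.2 * (x * x + 2 * x * y) * h * h

assert abs(p - 0.4) < 0.01
```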
MARGINAL PROBABILITY DENSITY FUNCTION:
Let (X, Y) be a continuous bivariate random variable with
joint probability density function f. The function f_X : R → R
defined by

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy

is called the marginal probability density function of X. Similarly,
the function f_Y : R → R defined by

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

is called the marginal probability density function of Y.
EXAMPLE: Let (X, Y) be jointly distributed with PDF

f(x, y) = 2   if 0 < x < y < 1,
          0   otherwise.
Then

f_X(x) = ∫_x^1 2 dy = 2 − 2x   if 0 < x < 1, and 0 otherwise,

and

f_Y(y) = ∫_0^y 2 dx = 2y   if 0 < y < 1, and 0 otherwise,

are the two marginal PDFs.
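The marginals above can be sanity-checked by integrating out the other variable numerically. A small midpoint-rule sketch (the helper names are ours):

```python
# Numeric check of the marginal f_X for f(x, y) = 2 on 0 < x < y < 1:
# integrating out y should give f_X(x) = 2 - 2x.
def f(x, y):
    return 2.0 if 0 < x < y < 1 else 0.0

def f_X(x, n=10000):
    # midpoint rule in y over (0, 1)
    h = 1.0 / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

assert abs(f_X(0.25) - 1.5) < 1e-2   # 2 - 2(0.25) = 1.5
assert abs(f_X(0.75) - 0.5) < 1e-2   # 2 - 2(0.75) = 0.5
```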
JOINT CUMULATIVE DISTRIBUTION FUNCTION:
Given a pair of continuous random variables X and Y and
their joint probability density function f, the joint CDF of
X and Y is the function F : R × R → R defined by

F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(s, t) ds dt.

Note that

f(x, y) = ∂²F / ∂x∂y,

wherever this partial derivative exists.
EXAMPLE: Let (X, Y) be jointly distributed with CDF

F(x, y) = (1/5)(2x³y + 3x²y²)   if 0 < x < y < 1,
          0                     elsewhere.

Then what is the joint PDF of X and Y?
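Differentiating this polynomial CDF gives f(x, y) = (1/5)(6x² + 12xy) at interior points, which can be checked with a mixed central finite difference for ∂²F/∂x∂y. A sketch (step size h and the test point are arbitrary choices of ours):

```python
# Mixed central finite difference of the CDF: f(x, y) = d^2 F / (dx dy).
# For F(x, y) = (1/5)(2x^3 y + 3x^2 y^2) this recovers
# f(x, y) = (1/5)(6x^2 + 12xy) at interior points.
def F(x, y):
    return (2 * x**3 * y + 3 * x**2 * y**2) / 5

def pdf(x, y, h=1e-3):
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)

x, y = 0.3, 0.6   # an interior point of the support
assert abs(pdf(x, y) - (6 * x**2 + 12 * x * y) / 5) < 1e-4
```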
CONDITIONAL DISTRIBUTION:
Let X and Y be any two random variables with joint PDF (or
PMF) f and marginals f_X and f_Y. The conditional probability
density function (or PMF) g of X, given (the event) Y = y,
is defined as

g(x|y) = f(x, y) / f_Y(y),

provided f_Y(y) > 0.
Similarly, the conditional probability density function (or PMF)
h of Y, given (the event) X = x, is defined as

h(y|x) = f(x, y) / f_X(x),

provided f_X(x) > 0.
Example: Let X and Y be discrete random variables with joint
probability mass function

f(x, y) = (1/21)(x + y)   if x = 1, 2, 3, y = 1, 2,
          0               otherwise.

What is the conditional probability mass function of X, given
Y = 2?
Example: Let X and Y be continuous random variables with joint
PDF

f(x, y) = 12x   if 0 < y < 2x < 1,
          0     otherwise.

What is the conditional density function of Y given X = x?
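One way to sanity-check an answer here is to compute f_X numerically and form the ratio h(y|x) = f(x, y)/f_X(x); integrating out y gives f_X(x) = 24x² on 0 < x < 1/2, so the ratio is constant in y, i.e. Y | X = x is uniform on (0, 2x). A sketch (helper names and test points are ours):

```python
# f(x, y) = 12x on 0 < y < 2x < 1; recover h(y|x) = f(x, y)/f_X(x) numerically.
def f(x, y):
    return 12.0 * x if 0 < y < 2 * x < 1 else 0.0

def f_X(x, n=10000):
    # midpoint rule in y over (0, 1); exact value is 24 x^2 for 0 < x < 1/2
    h = 1.0 / n
    return sum(f(x, (k + 0.5) * h) for k in range(n)) * h

x = 0.25
assert abs(f_X(x) - 24 * x * x) < 1e-2          # f_X(0.25) = 1.5
# h(y|x) = 12x / 24x^2 = 1/(2x) for 0 < y < 2x: uniform on (0, 2x)
assert abs(f(x, 0.3) / f_X(x) - 1 / (2 * x)) < 0.05
```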
INDEPENDENCE OF RANDOM VARIABLES:

Let X and Y be any two random variables with joint CDF F
and marginals F_X and F_Y. The random variables X and Y
are independent if and only if

F(x, y) = F_X(x)F_Y(y)

for all (x, y) ∈ R².


Theorem:
(a) A necessary and sufficient condition for random variables X
and Y of the discrete type to be independent is that

P(X = x_i, Y = y_j) = P(X = x_i)P(Y = y_j)

for all (x_i, y_j) ∈ R_X × R_Y.
(b) Two random variables X and Y of the continuous type are
independent if and only if

f(x, y) = f_X(x)f_Y(y)

for all (x, y) ∈ R², where f, f_X and f_Y, respectively, are the joint
and marginal PDFs of X and Y.
Example: Let X and Y be continuous random variables with joint
PDF

f(x, y) = e^{−(x+y)}   if 0 < x, y < ∞,
          0            otherwise.

Are X and Y independent?
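By part (b) of the theorem, it suffices to check whether the joint density factors into the product of its marginals. A numerical sketch of that check (the helper `marginal`, the truncation point 40, and the test points are all choices of ours; the marginals come out as e^{−x} and e^{−y}):

```python
import math

def f(x, y):
    # joint PDF from the example: e^{-(x+y)} on the positive quadrant
    return math.exp(-(x + y)) if x > 0 and y > 0 else 0.0

def marginal(g, n=20000, upper=40.0):
    # trapezoidal approximation of the integral of g over (0, upper);
    # the tail beyond 40 is negligible for this density
    h = upper / n
    s = 0.5 * (g(1e-12) + g(upper))
    for k in range(1, n):
        s += g(k * h)
    return s * h

def f_X(x):
    return marginal(lambda y: f(x, y))

def f_Y(y):
    return marginal(lambda x: f(x, y))

# Independence holds iff the joint factors into the marginals at every point.
for (x, y) in [(0.3, 0.7), (1.0, 2.0), (0.1, 3.0)]:
    assert abs(f(x, y) - f_X(x) * f_Y(y)) < 1e-4
```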
Theorem: Let X and Y be independent random variables and
φ, ψ : R → R Borel measurable functions. Then the random
variables φ(X) and ψ(Y) are also independent.
Proof: We have

P(φ(X) ≤ x, ψ(Y) ≤ y) = P(X ∈ φ⁻¹((−∞, x]), Y ∈ ψ⁻¹((−∞, y]))
                      = P(X ∈ φ⁻¹((−∞, x])) P(Y ∈ ψ⁻¹((−∞, y]))
                      = P(φ(X) ≤ x) P(ψ(Y) ≤ y),

where the second equality uses the independence of X and Y.
Hence the proof.


IID RANDOM VARIABLES:
The random variables X and Y are said to be independent
and identically distributed (IID) if and only if they are
independent and have the same distribution.
