8mmpjoint Annotated
Giulia Giantesio
giulia.giantesio@unicatt.it
Introduction
a) p (∅) = 0;
b) p (Ac ) = 1 − p (A) ;
c) if A, B ∈ A, then p (A ∪ B) = p (A) + p (B) − p (A ∩ B) ;
d) if A ⊆ B, then p (A) ⩽ p (B).
Introduction
The conditional probability of A given B is defined by
p (A | B) = p (A ∩ B) / p (B) if p (B) > 0, and p (A | B) = p (A) otherwise.
Definition
Two events A and B are called independent if and only if p(A ∩ B) = p(A)p(B).
Introduction
The joint probability mass function of two discrete rv X and Y is defined by
fXY (x, y ) = p (X = x, Y = y ),
where
▶ fXY (x, y ) ⩾ 0;
▶ ∑x ∑y fXY (x, y ) = 1.
Joint Probability Mass Function
Notation: we often write f (x, y ) for fXY (x, y ).
The joint PMF contains all the information regarding the distributions of X and Y .
Definition
We define marginal probability mass functions of X and Y :
fX (x) = ∑_{yj ∈ RY} f (x, yj ), for any x ∈ RX ,

fY (y ) = ∑_{xi ∈ RX} f (xi , y ), for any y ∈ RY .
Joint probability table
      | y1         y2         ...  yn         |
x1    | f(x1, y1)  f(x1, y2)  ...  f(x1, yn)  | fX(x1)
x2    | f(x2, y1)  f(x2, y2)  ...  f(x2, yn)  | fX(x2)
...   | ...        ...        ...  ...        | ...
xm    | f(xm, y1)  f(xm, y2)  ...  f(xm, yn)  | fX(xm)
      | fY(y1)     fY(y2)     ...  fY(yn)     | 1
Remark
The probability that X = xi (Y = yk ) is obtained by adding all entries in the row (column)
corresponding to xi (yk ).
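The row/column summation in the remark can be sketched in code; the joint table and the function names below are illustrative, not from the source:

```python
# Marginal PMFs via the row/column sums described in the remark.
# The joint PMF is stored as {(x, y): f(x, y)}; the numbers are illustrative.
joint = {
    (0, 0): 0.125, (0, 1): 0.125,
    (1, 0): 0.25,  (1, 1): 0.5,
}

def marginal_x(joint):
    """fX(x): add all entries in the row corresponding to x."""
    fX = {}
    for (x, _), p in joint.items():
        fX[x] = fX.get(x, 0.0) + p
    return fX

def marginal_y(joint):
    """fY(y): add all entries in the column corresponding to y."""
    fY = {}
    for (_, y), p in joint.items():
        fY[y] = fY.get(y, 0.0) + p
    return fY

# marginal_x(joint) -> {0: 0.25, 1: 0.75}
# marginal_y(joint) -> {0: 0.375, 1: 0.625}
```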
Joint cumulative distribution function
Definition
The joint cumulative distribution function of two discrete rv X and Y is defined by
F (x, y ) = p(X ⩽ x, Y ⩽ y ) = ∑_{u⩽x} ∑_{v⩽y} f (u, v ).
Joint cumulative distribution function: properties
We must have
F (∞, ∞) = 1,
F (−∞, y ) = 0, for any y ,
F (x, −∞) = 0, for any x.
Joint cumulative distribution function: properties
Lemma
For two random variables X and Y , and real numbers x1 ⩽ x2 , y1 ⩽ y2 , we have
p(x1 < X ⩽ x2 , y1 < Y ⩽ y2 ) = F (x2 , y2 ) − F (x1 , y2 ) − F (x2 , y1 ) + F (x1 , y1 ).
Example
Consider two discrete rv X and Y with joint PMF given by the table:

        | Y = 0   Y = 1   Y = 2
X = 0   |  1/6     1/4     1/8
X = 1   |  1/8     1/6     1/6
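For the table above, the marginals can be computed exactly, and independence can be checked against the definition p(A ∩ B) = p(A)p(B); a sketch:

```python
from fractions import Fraction as F

# Joint PMF from the table above (rows X = 0, 1; columns Y = 0, 1, 2).
joint = {
    (0, 0): F(1, 6), (0, 1): F(1, 4), (0, 2): F(1, 8),
    (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 6),
}
assert sum(joint.values()) == 1  # the entries form a valid joint PMF

# Marginals: sum each row for fX, each column for fY.
fX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
fY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1, 2)}
# fX = {0: 13/24, 1: 11/24};  fY = {0: 7/24, 1: 5/12, 2: 7/24}

# X and Y are not independent: f(0, 0) = 1/6, but fX(0) * fY(0) = 91/576.
assert joint[(0, 0)] != fX[0] * fY[0]
```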
The multinomial distribution arises from an extension of the binomial experiment to situations
where each trial has k ⩾ 2 possible outcomes.
Suppose that we have an experiment with n independent trials, where each trial produces
exactly one of the events A1 , A2 , . . . , Ak (these events are mutually exclusive).
Moreover, suppose that in each trial the event Ai occurs with probability pi .
We must have p1 + p2 + · · · + pk = 1.
Let Xi be the number of trials in which Ai occurs; then each Xi is a binomial rv, and jointly, for n1 + n2 + · · · + nk = n,

p(X1 = n1 , X2 = n2 , . . . , Xk = nk ) = (n! / (n1 ! n2 ! . . . nk !)) p1^{n1} p2^{n2} . . . pk^{nk} .
Multinomial distribution: example
Suppose that a fair die is rolled 9 times. Compute the probability that 1 appears three times,
2 and 3 twice each, 4 and 5 once each, and 6 not at all.
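The example can be evaluated directly with the multinomial PMF above; a sketch (the function name is illustrative):

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """n! / (n1! ... nk!) * p1**n1 * ... * pk**nk."""
    coef = factorial(sum(counts))
    for ni in counts:
        coef //= factorial(ni)
    p = 1.0
    for ni, pi in zip(counts, probs):
        p *= pi ** ni
    return coef * p

# 9 rolls of a fair die: 1 three times, 2 and 3 twice each, 4 and 5 once, 6 never.
result = multinomial_pmf((3, 2, 2, 1, 1, 0), (1/6,) * 6)
# result = 9!/(3!·2!·2!·1!·1!·0!) · (1/6)^9 = 15120 / 6^9 ≈ 0.0015
```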
Continuous case
Two rv X and Y are jointly continuous if their probabilities are obtained by integrating a joint probability density function fXY (x, y ), where
▶ fXY (x, y ) ⩾ 0;
▶ ∫_{−∞}^{+∞} ∫_{−∞}^{+∞} fXY (x, y ) dx dy = 1.
Continuous case
p(a < X < b, c < Y < d) = ∫_a^b ∫_c^d f (x, y ) dy dx
Continuous case: example
Let X and Y be two jointly continuous random variables with joint PDF
f (x, y ) = x + cy² if 0 ⩽ x ⩽ 1, 0 ⩽ y ⩽ 1, and f (x, y ) = 0 otherwise.
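The usual first task for this example is to find the constant c that makes f a valid PDF (the task statement itself did not survive here, so this is an assumption): ∫₀¹∫₀¹ (x + cy²) dy dx = 1/2 + c/3 = 1, hence c = 3/2. A numerical check with a midpoint rule:

```python
# Verify that c = 3/2 normalizes f(x, y) = x + c*y**2 on [0, 1] x [0, 1].
# Analytically: ∫∫ (x + c y²) dx dy = 1/2 + c/3 = 1  =>  c = 3/2.
c = 3 / 2
n = 400
h = 1 / n
# Midpoint-rule approximation of the double integral over the unit square.
total = sum(((i + 0.5) * h + c * ((j + 0.5) * h) ** 2) * h * h
            for i in range(n) for j in range(n))
# total ≈ 1 up to the midpoint-rule discretization error
```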
The joint PDF can be recovered from the joint CDF by differentiation:
∂²F / (∂x ∂y ) = f (x, y ).
Marginal functions
Marginal distributions:
FX (x) = ∫_{−∞}^{x} ∫_{−∞}^{+∞} f (u, v ) dv du, for any x ∈ RX ,

FY (y ) = ∫_{−∞}^{+∞} ∫_{−∞}^{y} f (u, v ) dv du, for any y ∈ RY .
Marginal densities:
fX (x) = ∫_{−∞}^{+∞} f (x, v ) dv , for any x ∈ RX ,

fY (y ) = ∫_{−∞}^{+∞} f (u, y ) du, for any y ∈ RY .
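Applying the first marginal-density formula to the earlier example f (x, y ) = x + (3/2)y² on [0, 1]² (taking c = 3/2, an assumption carried over from the normalization condition) gives fX (x) = x + 1/2; a numerical sketch:

```python
def f(x, y):
    # Joint PDF of the earlier example, with c = 3/2 (assumed).
    return x + 1.5 * y * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_fX(x, n=2000):
    # fX(x) = ∫ f(x, v) dv, approximated by a midpoint rule on [0, 1].
    h = 1 / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

# marginal_fX(0.3) ≈ 0.3 + 1/2 = 0.8
```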
Extension to the case of n variables: JCDF
f_X⃗ (x⃗ ) = p ( ⋂_{k=1}^{n} [Xk = xk ] )
p[(X1 , X2 , . . . , Xn ) ∈ A] = ∫ · · · ∫_A f (x1 , x2 , . . . , xn ) dx1 . . . dxn .
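For n = 2 the integral above can be estimated directly. Using the earlier example PDF with c = 3/2 (an assumption) and the rectangle A = [0, 1/2]², the exact value is ∫₀^{1/2}∫₀^{1/2} (x + (3/2)y²) dy dx = 3/32; a Monte Carlo sketch:

```python
import random

random.seed(0)
# Monte Carlo estimate of p[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy
# for A = [0, 1/2]^2 and f(x, y) = x + 1.5*y**2 (c = 3/2 assumed).
# Exact value: 3/32 = 0.09375.
N = 200_000
acc = 0.0
for _ in range(N):
    x = 0.5 * random.random()  # uniform point in A
    y = 0.5 * random.random()
    acc += x + 1.5 * y * y
est = acc / N * 0.25  # mean of f over A times the area of A
```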