6 Jointly Continuous Random Variables
Again, we deviate from the order in the book in this chapter, so the subsections here do not correspond to those in the text.
6.1 Joint density functions
Recall that $X$ is continuous if there is a function $f_X(x)$ (the density) such that
$$P(X \le t) = \int_{-\infty}^{t} f_X(x) \, dx$$
We say $X$ and $Y$ are jointly continuous if there is a function $f_{X,Y}(x,y)$ (the joint density) such that
$$P(X \le s, \, Y \le t) = \iint f_{X,Y}(x,y) \, dx \, dy$$
The integral is over $\{(x, y) : x \le s, y \le t\}$. We can also write the integral as
$$P(X \le s, \, Y \le t) = \int_{-\infty}^{s} \int_{-\infty}^{t} f_{X,Y}(x,y) \, dy \, dx = \int_{-\infty}^{t} \int_{-\infty}^{s} f_{X,Y}(x,y) \, dx \, dy$$
A joint density must be non-negative and integrate to one:
$$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x,y) \, dx \, dy = 1$$
Just as with one random variable, the joint density function contains all
the information about the underlying probability measure if we only look at
the random variables X and Y . In particular, we can compute the probability
of any event defined in terms of X and Y just using f (x, y).
Here are some events defined in terms of $X$ and $Y$:
$\{X \le Y\}$, $\{X^2 + Y^2 \le 1\}$, and $\{1 \le X \le 4, \, Y \ge 0\}$. They can all be written
in the form $\{(X, Y) \in A\}$ for some subset $A$ of $\mathbb{R}^2$.
Proposition 1. For $A \subset \mathbb{R}^2$,
$$P((X, Y) \in A) = \iint_A f(x,y) \, dx \, dy$$
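As a quick illustration of Proposition 1 (a sketch, assuming NumPy and SciPy are available), we can approximate such a probability by numerical integration. The density below is the bivariate standard normal density from the example in the next section, and the region is $A = \{x \le y\}$, for which symmetry gives probability $1/2$:

```python
import numpy as np
from scipy.integrate import dblquad

# Joint density of two independent standard normals (see the example
# in the next section).
def f(x, y):
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

# P(X <= Y): integrate f over A = {(x, y) : y >= x}.  The limits +/-8
# stand in for infinity; the density is negligible beyond them.
# dblquad integrates func(y, x), with y running from gfun(x) to hfun(x).
p, err = dblquad(lambda y, x: f(x, y), -8, 8, lambda x: x, lambda x: 8)
print(p)  # approximately 0.5, as symmetry demands
```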
6.2 Marginal densities and independence
Suppose we know the joint density $f_{X,Y}(x,y)$ of $X$ and $Y$. How do we find
their individual densities $f_X(x)$ and $f_Y(y)$? These are called marginal densities.
The cdf of $X$ is
$$F_X(x) = P(X \le x) = P(-\infty < X \le x, \, -\infty < Y < \infty) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(u,y) \, dy \, du$$
Differentiating with respect to $x$ shows
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dy$$
and similarly
$$f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y) \, dx$$
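As a numerical illustration (a sketch assuming SciPy; the density $f(x,y) = x + y$ on the unit square is a made-up example, not one from the text), integrating out $y$ recovers the marginal of $X$, which is exactly $f_X(x) = x + 1/2$:

```python
from scipy.integrate import quad

# Hypothetical joint density f(x,y) = x + y on the unit square; it is
# non-negative and integrates to 1.
def joint(x, y):
    return x + y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def marginal_x(x):
    # integrate out y to get the marginal density of X at x
    val, _ = quad(lambda y: joint(x, y), 0, 1)
    return val

print(marginal_x(0.3))  # approximately 0.8 = 0.3 + 1/2
```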
We will define independence of two continuous random variables differently than the book does. The two definitions are equivalent.
Definition 2. Let $X, Y$ be jointly continuous random variables with joint
density $f_{X,Y}(x,y)$ and marginal densities $f_X(x)$, $f_Y(y)$. We say they are
independent if
$$f_{X,Y}(x,y) = f_X(x) \, f_Y(y)$$
If we know the joint density of X and Y , then we can use the definition
to see if they are independent. But the definition is often used in a different
way. If we know the marginal densities of X and Y and we know that they
are independent, then we can use the definition to find their joint density.
Example: If $X$ and $Y$ are independent random variables and each has the
standard normal distribution, what is their joint density? By the definition,
$$f(x,y) = f_X(x) \, f_Y(y) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-y^2/2} = \frac{1}{2\pi} \exp\!\left(-\frac{1}{2}(x^2 + y^2)\right)$$
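A quick numerical check of this factorization (a sketch, assuming SciPy's `scipy.stats` is available):

```python
import numpy as np
from scipy.stats import norm

# Joint density from the example above.
def joint(x, y):
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

# At any point, the joint density equals the product of the marginals.
x, y = 0.7, -1.2
print(joint(x, y))                # joint density at (x, y)
print(norm.pdf(x) * norm.pdf(y))  # product of marginals: identical value
```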
6.3 Expected value
If we write the marginal $f_X(x)$ in terms of the joint density, then this becomes
$$E[X] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, f_{X,Y}(x,y) \, dx \, dy$$
provided the integral converges absolutely, i.e., $\iint |x| \, f_{X,Y}(x,y) \, dx \, dy < \infty$.
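To illustrate numerically (again a sketch, reusing the made-up density $f(x,y) = x + y$ on the unit square, for which $E[X] = 1/3 + 1/4 = 7/12$ exactly):

```python
from scipy.integrate import dblquad

# E[X] as a double integral of x * f(x,y) over the unit square.
# dblquad integrates func(y, x) dy dx.
ex, _ = dblquad(lambda y, x: x * (x + y), 0, 1, lambda x: 0, lambda x: 1)
print(ex)  # approximately 0.58333 = 7/12
```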
6.4 Functions of two random variables
To find the density of a random variable $Z$ defined in terms of $X$ and $Y$, we typically compute the cdf $F_Z(z)$ first and then differentiate.
Example: Let $X$ and $Y$ be independent random variables, each uniformly distributed on $(0,1)$, and let $Z = XY$. Fix $0 \le z \le 1$. If $x \le z$ then $z/x \ge 1$, so the strip $\{x \le z\}$ lies entirely inside the event $\{XY \le z\}$ and contributes $P(X \le z) = z$; for $x > z$ we need $y \le z/x$. Thus
$$F_Z(z) = z + \int_z^1 \left[ \int_0^{z/x} 1 \, dy \right] dx = z + \int_z^1 \frac{z}{x} \, dx = z + z \ln x \Big|_z^1 = z - z \ln z$$
Differentiating,
$$f_Z(z) = \begin{cases} -\ln z, & \text{if } 0 < z \le 1 \\ 0, & \text{otherwise} \end{cases}$$
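A Monte Carlo sanity check of this density (a sketch, assuming NumPy), comparing an empirical probability with the cdf $F_Z(t) = t - t \ln t$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
z = rng.random(n) * rng.random(n)  # samples of Z = XY for independent uniforms

t = 0.25
print(np.mean(z <= t))    # empirical P(Z <= t), roughly 0.5966
print(t - t * np.log(t))  # exact cdf value: 0.5966...
```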
If $X$ and $Y$ are independent random variables with densities $f_X$ and $f_Y$, a similar computation shows that the density of their sum is
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(x) \, f_Y(z - x) \, dx$$
This is known as a convolution. We can use this formula to find the density of
the sum of two independent random variables. But in some cases it is easier
to do this using generating functions, which we study in the next section.
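As a numerical illustration of the convolution formula (a sketch, assuming NumPy), discretizing the uniform density on $(0,1)$ and convolving it with itself approximates the triangular density of the sum of two independent uniforms:

```python
import numpy as np

dx = 0.001
x = np.arange(0.0, 1.0, dx)
f = np.ones_like(x)  # density of a uniform on (0,1) on the grid

# Discrete convolution times dx approximates the convolution integral;
# the result is the triangular density of X + Y on (0,2), peaking at 1.
g = np.convolve(f, f) * dx
z = np.arange(len(g)) * dx
print(g[np.searchsorted(z, 1.0)])  # approximately 1.0, the peak
```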
Example: Let X and Y be independent random variables each of which has
the standard normal distribution. Find the density of Z = X + Y .
We need to compute the convolution
$$f_Z(z) = \int_{-\infty}^{\infty} \frac{1}{2\pi} \exp\!\left(-\frac{1}{2} x^2 - \frac{1}{2}(z - x)^2\right) dx$$
$$= \frac{1}{2\pi} \int_{-\infty}^{\infty} \exp\!\left(-x^2 - \frac{1}{2} z^2 + xz\right) dx$$
$$= \frac{1}{2\pi} \int_{-\infty}^{\infty} \exp\!\left(-(x - z/2)^2 - \frac{1}{4} z^2\right) dx$$
$$= e^{-z^2/4} \, \frac{1}{2\pi} \int_{-\infty}^{\infty} \exp\!\left(-(x - z/2)^2\right) dx$$
Now the substitution $u = x - z/2$ shows
$$\int_{-\infty}^{\infty} \exp\!\left(-(x - z/2)^2\right) dx = \int_{-\infty}^{\infty} \exp(-u^2) \, du = \sqrt{\pi}$$
So
$$f_Z(z) = \frac{1}{\sqrt{4\pi}} \, e^{-z^2/4}$$
which is the density of a normal random variable with mean $0$ and variance $2$.
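A Monte Carlo check (a sketch, assuming NumPy) that $X + Y$ behaves like a normal with variance $2$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
z = rng.standard_normal(n) + rng.standard_normal(n)  # samples of X + Y

print(z.var())                  # approximately 2
print(np.mean(np.abs(z) <= 1))  # N(0,2) gives about 0.5205
```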
6.5 Moment generating functions
The moment generating function of a continuous random variable $X$ is
$$M(t) = E[e^{tX}] = \int_{-\infty}^{\infty} e^{tx} f_X(x) \, dx$$
Example: Compute it for the exponential distribution with parameter $\lambda$. You should find
$$M(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda$$
Example: In the homework you will compute it for the gamma distribution with parameters $w$ and $\lambda$
and find (hopefully)
$$M(t) = \left( \frac{\lambda}{\lambda - t} \right)^w$$
Differentiating the integral formula for $M(t)$ $k$ times and setting $t = 0$,
$$M^{(k)}(0) = \int_{-\infty}^{\infty} f_X(x) \, x^k e^{tx} \Big|_{t=0} \, dx = \int_{-\infty}^{\infty} f_X(x) \, x^k \, dx = E[X^k]$$
So the moments of $X$ can be read off from the derivatives of $M$ at $0$.
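We can verify this relation symbolically for the exponential example (a sketch, assuming SymPy); the $k$-th moment of an exponential with rate $\lambda$ is $k!/\lambda^k$:

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)  # mgf of the exponential distribution

# The k-th derivative at t = 0 gives E[X^k] = k!/lam^k.
for k in range(1, 4):
    print(k, sp.simplify(sp.diff(M, t, k).subs(t, 0)))
# prints 1/lam, 2/lam**2, 6/lam**3
```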
6.6 The joint cdf
The joint cdf of $X$ and $Y$ is $F_{X,Y}(x,y) = P(X \le x, Y \le y)$. If we know the joint cdf, then we can compute the joint pdf by taking partial derivatives:
$$\frac{\partial^2}{\partial x \, \partial y} F_{X,Y}(x,y) = f(x,y)$$
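A symbolic check of this (a sketch, assuming SymPy), using the joint cdf of two independent exponential(1) random variables, which factors as $F_{X,Y}(x,y) = (1 - e^{-x})(1 - e^{-y})$ by independence:

```python
import sympy as sp

x, y = sp.symbols('x y', positive=True)

# Joint cdf of two independent exponential(1) random variables.
F = (1 - sp.exp(-x)) * (1 - sp.exp(-y))

f = sp.diff(F, x, y)   # mixed partial derivative
print(sp.simplify(f))  # exp(-x - y), the joint density
```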
Properties of the joint cdf
6.7 Independence continued
6.8 Change of variables