Marginally Gaussian But Not Jointly Gaussian PDF
We have seen that the MMSE estimator takes on a particularly simple form when x and θ
are jointly Gaussian, and we went to great lengths to show that this condition is satisfied
for the Bayesian linear model.
The definition of jointly Gaussian is: Two Gaussian RVs X and Y are jointly Gaussian if
their joint PDF is a 2-D Gaussian PDF. (Of course, there is an obvious extension to
random vectors).
Note the main ingredients here: both RVs must individually be Gaussian and they must
have a joint PDF that is Gaussian. This raises the obvious question: Is it possible to have
two RVs that are each individually Gaussian but are NOT jointly Gaussian?
The answer is: Yes… otherwise we wouldn’t make such a big stink about this. So let’s
see if we can find one such case to demonstrate that we DO have to worry about this.
Remember that given a joint PDF $p_{XY}(x,y)$ the individual PDFs are the marginal PDFs
that are found by integrating out “the other variable,” that is:
\[
p_X(x) = \int_{-\infty}^{\infty} p_{XY}(x,y)\,dy, \qquad
p_Y(y) = \int_{-\infty}^{\infty} p_{XY}(x,y)\,dx
\]
So we see what we need for our counterexample: we need a joint PDF that is NOT a 2-D
Gaussian but that integrates to two Gaussian marginal PDFs. So let’s construct one of
these. Let’s start with a 2-D joint Gaussian PDF and modify it. Define the 2-D Gaussian
PDF with zero-mean, uncorrelated RVs, which is then given by:
\[
p_{XY}(x,y) = \frac{1}{2\pi\sigma_X\sigma_Y}
\exp\!\left[-\frac{1}{2}\left(\frac{x^2}{\sigma_X^2} + \frac{y^2}{\sigma_Y^2}\right)\right]
\]
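(A quick side observation, not spelled out in the notes but standard: since the exponent separates, this joint PDF factors into the product of two 1-D Gaussian PDFs,
\[
p_{XY}(x,y) =
\underbrace{\frac{1}{\sqrt{2\pi}\,\sigma_X}\, e^{-x^2/2\sigma_X^2}}_{p_X(x)}
\;\underbrace{\frac{1}{\sqrt{2\pi}\,\sigma_Y}\, e^{-y^2/2\sigma_Y^2}}_{p_Y(y)},
\]
so integrating over either variable leaves the Gaussian PDF of the other.)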
Now, from what we have studied about 2-D Gaussian PDFs, integrating this over x gives
a Gaussian marginal in y; likewise, integrating over y gives a Gaussian marginal in x. But
because of the symmetry of this joint PDF about both the x and y axes, we can write these
integrations as
\[
p_X(x) =
\begin{cases}
2\displaystyle\int_{0}^{\infty} p_{XY}(x,y)\,dy, & x > 0 \\[2mm]
2\displaystyle\int_{-\infty}^{0} p_{XY}(x,y)\,dy, & x \le 0
\end{cases}
\]
\[
p_Y(y) =
\begin{cases}
2\displaystyle\int_{0}^{\infty} p_{XY}(x,y)\,dx, & y > 0 \\[2mm]
2\displaystyle\int_{-\infty}^{0} p_{XY}(x,y)\,dx, & y \le 0
\end{cases}
\]
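These forms hold because, for any fixed x, the integrand is an even function of y (and vice versa), so the two half-range integrals are equal:
\[
\int_{-\infty}^{0} p_{XY}(x,y)\,dy \;=\; \int_{0}^{\infty} p_{XY}(x,y)\,dy \;=\; \tfrac{1}{2}\,p_X(x),
\]
and doubling either half-range integral recovers the full marginal (and likewise for $p_Y(y)$).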
In other words, we only have to integrate over two of the four quadrants, the first and
third quadrants where xy > 0, to get the marginals, as long as we multiply by 2.
This gives us the route to what we need. If we take this original 2-D Gaussian PDF, set it
to zero over the other two quadrants (where xy ≤ 0, the parts we didn’t need to create the
marginals), and multiply the rest by two, we get a new 2-D PDF that is definitely NOT
Gaussian:
\[
p_{\tilde{X}\tilde{Y}}(x,y) =
\begin{cases}
\dfrac{1}{\pi\sigma_X\sigma_Y}
\exp\!\left[-\dfrac{1}{2}\left(\dfrac{x^2}{\sigma_X^2} + \dfrac{y^2}{\sigma_Y^2}\right)\right], & xy > 0 \\[3mm]
0, & xy \le 0
\end{cases}
\]
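As a quick check that the construction worked, for x > 0 the marginal of this new PDF is
\[
p_{\tilde{X}}(x) = \int_{-\infty}^{\infty} p_{\tilde{X}\tilde{Y}}(x,y)\,dy
= 2\int_{0}^{\infty} p_{XY}(x,y)\,dy
= p_X(x)
= \frac{1}{\sqrt{2\pi}\,\sigma_X}\, e^{-x^2/2\sigma_X^2},
\]
and the same argument works for x ≤ 0 and for the marginal in y.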
The new RVs $\tilde{X}$ and $\tilde{Y}$ are definitely NOT jointly Gaussian but they are each Gaussian
because (as we have constructed above) the marginals of their joint PDF are Gaussian!
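Finally, as an optional numerical sanity check (a minimal sketch, not part of the original notes; it assumes NumPy and illustrative values for $\sigma_X$ and $\sigma_Y$), we can tabulate the modified joint PDF on a grid, integrate out each variable with a simple Riemann sum, and confirm that the marginals match the 1-D Gaussian PDFs even though the joint PDF vanishes on half the plane:

import numpy as np

# Illustrative standard deviations (assumed values, just for this check)
sigma_x, sigma_y = 1.0, 2.0

# Grids with an even number of points so that 0 itself is not a grid point
# (the modified PDF is zero on the axes, a measure-zero set that would
# otherwise distort the sampled marginal at exactly x = 0 or y = 0).
x = np.linspace(-8 * sigma_x, 8 * sigma_x, 1600)
y = np.linspace(-8 * sigma_y, 8 * sigma_y, 1600)
dx, dy = x[1] - x[0], y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")

# Modified joint PDF: twice the zero-mean, uncorrelated Gaussian where xy > 0, zero elsewhere
p_joint = np.exp(-0.5 * (X**2 / sigma_x**2 + Y**2 / sigma_y**2)) / (np.pi * sigma_x * sigma_y)
p_joint[X * Y <= 0] = 0.0

# Marginals by numerically integrating out "the other variable"
p_x_marg = p_joint.sum(axis=1) * dy
p_y_marg = p_joint.sum(axis=0) * dx

# 1-D Gaussian PDFs to compare against
g_x = np.exp(-0.5 * x**2 / sigma_x**2) / (np.sqrt(2 * np.pi) * sigma_x)
g_y = np.exp(-0.5 * y**2 / sigma_y**2) / (np.sqrt(2 * np.pi) * sigma_y)

print("total probability   :", p_joint.sum() * dx * dy)        # close to 1
print("max |p_X - Gaussian|:", np.abs(p_x_marg - g_x).max())    # small (grid-spacing error)
print("max |p_Y - Gaussian|:", np.abs(p_y_marg - g_y).max())    # small (grid-spacing error)

Both reported errors shrink as the grid is refined, while the tabulated joint PDF is obviously not a 2-D Gaussian since it is identically zero whenever xy ≤ 0.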