
Example: RVs Marginally Gaussian but not Jointly Gaussian

We have seen that the MMSE estimator takes on a particularly simple form when x and θ
are jointly Gaussian, and we went to great lengths to show that this is satisfied for the
Bayesian linear model.

The definition of jointly Gaussian is: Two Gaussian RVs X and Y are jointly Gaussian if
their joint PDF is a 2-D Gaussian PDF. (Of course, there is an obvious extension to
random vectors).

Note the main ingredients here: both RVs must individually be Gaussian and they must
have a joint PDF that is Gaussian. This raises the obvious question: Is it possible to have
two RVs that are each individually Gaussian but are NOT jointly Gaussian?

The answer is: Yes; otherwise we wouldn't make such a big stink about this. So let's
see if we can find one such case to demonstrate that we DO have to worry about this.

Remember that given a joint PDF p_{XY}(x, y) the individual PDFs are the marginal PDFs
that are found by integrating out "the other variable," that is:

p_X(x) = \int_{-\infty}^{\infty} p_{XY}(x, y)\, dy

p_Y(y) = \int_{-\infty}^{\infty} p_{XY}(x, y)\, dx
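As a quick numerical sanity check of this marginalization (a minimal sketch, not part of the original note; it assumes SciPy is available and uses arbitrary illustrative values σ_X = 1, σ_Y = 2 and test point x0 = 0.7):

```python
# Sketch: recover a marginal PDF by numerically integrating out "the other
# variable" from a joint PDF, and compare against the known closed form.
import numpy as np
from scipy import integrate
from scipy.stats import multivariate_normal, norm

sigma_x, sigma_y = 1.0, 2.0                      # illustrative values (assumed)
joint = multivariate_normal(mean=[0.0, 0.0],
                            cov=[[sigma_x**2, 0.0], [0.0, sigma_y**2]])

x0 = 0.7                                         # evaluate the marginal at one point
p_x_numeric, _ = integrate.quad(lambda y: joint.pdf([x0, y]), -np.inf, np.inf)
print(p_x_numeric, norm.pdf(x0, scale=sigma_x))  # the two values should agree closely
```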

So we see what we need for our counterexample: we need a joint PDF that is NOT a 2-D
Gaussian but that integrates to two Gaussian marginal PDFs. So let’s construct one of
these. Let’s start with a 2-D joint Gaussian PDF and modify it. Define the 2-D Gaussian
PDF with zero-mean, uncorrelated RVs, which is then given by:

p_{XY}(x, y) = \frac{1}{2\pi \sigma_X \sigma_Y} \exp\!\left( -\frac{1}{2}\left( \frac{x^2}{\sigma_X^2} + \frac{y^2}{\sigma_Y^2} \right) \right)

which looks like this in a contour plot (figure omitted: elliptical contours centered at the origin, aligned with the x and y axes).
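A small plotting sketch (assumed setup: NumPy and Matplotlib, with illustrative values σ_X = 1, σ_Y = 2) reproduces the kind of contour plot referred to here:

```python
# Sketch: contour plot of the zero-mean, uncorrelated 2-D Gaussian PDF above.
import numpy as np
import matplotlib.pyplot as plt

sigma_x, sigma_y = 1.0, 2.0                      # illustrative values (assumed)
x = np.linspace(-4 * sigma_x, 4 * sigma_x, 200)
y = np.linspace(-4 * sigma_y, 4 * sigma_y, 200)
X, Y = np.meshgrid(x, y)
P = (1.0 / (2.0 * np.pi * sigma_x * sigma_y)
     * np.exp(-0.5 * (X**2 / sigma_x**2 + Y**2 / sigma_y**2)))

plt.contour(X, Y, P, levels=8)                   # elliptical contours about the origin
plt.xlabel("x")
plt.ylabel("y")
plt.title("Zero-mean, uncorrelated 2-D Gaussian joint PDF")
plt.show()
```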

Now, from what we have studied about 2-D Gaussian PDFs, integrating over x gives a
Gaussian marginal in y; likewise, integrating over y gives a Gaussian marginal in x. But
because this joint PDF is symmetric about both the x and y axes, we can write these
integrations as

 ∞
2 ∫ p XY ( x, y )dy , x > 0
 0
p X ( x) = 
0

2 ∫ p XY ( x, y )dy , x ≤ 0
 − ∞

 ∞
2 ∫ p XY ( x, y )dx, y > 0
 0
pY ( y ) = 
0

 ∫ p XY ( x, y )dx, y ≤ 0
2
 − ∞

In other words, we only have to integrate over the two quadrants where xy > 0 (the first
and third quadrants, hatched in the original figure) to get the marginals, as long as we multiply by 2.
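A quick numerical check of this symmetry claim (a sketch assuming SciPy, again with illustrative values σ_X = 1, σ_Y = 2 and an arbitrary test point x0 = 0.7 > 0): twice the half-range integral over y should equal the full Gaussian marginal.

```python
# Sketch: for x0 > 0, twice the integral of p_XY(x0, y) over y in (0, inf)
# equals the full Gaussian marginal p_X(x0).
import numpy as np
from scipy import integrate
from scipy.stats import norm

sigma_x, sigma_y = 1.0, 2.0                      # illustrative values (assumed)

def p_xy(x, y):
    """Zero-mean, uncorrelated 2-D Gaussian joint PDF."""
    return (1.0 / (2.0 * np.pi * sigma_x * sigma_y)
            * np.exp(-0.5 * (x**2 / sigma_x**2 + y**2 / sigma_y**2)))

x0 = 0.7                                         # any x0 > 0 works
half, _ = integrate.quad(lambda y: p_xy(x0, y), 0.0, np.inf)
print(2.0 * half, norm.pdf(x0, scale=sigma_x))   # the two numbers should match
```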

This gives us the route to what we need. If we take this original 2-D Gaussian PDF,
set it to zero over the quadrants where xy ≤ 0 (the parts we didn't need to create the
marginals), and multiply the rest by two, we get a new 2-D PDF that is definitely NOT
Gaussian:

p_{\tilde{X}\tilde{Y}}(x, y) = \begin{cases} \frac{1}{\pi \sigma_X \sigma_Y} \exp\!\left( -\frac{1}{2}\left( \frac{x^2}{\sigma_X^2} + \frac{y^2}{\sigma_Y^2} \right) \right), & xy > 0 \\ 0, & xy \le 0 \end{cases}

The new RVs \tilde{X} and \tilde{Y} are definitely NOT jointly Gaussian, but they are each Gaussian
because (as we have constructed above) the marginals of their joint PDF are Gaussian!
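To see this concretely, here is a simulation sketch (not part of the original note). One way to draw samples from the modified PDF is to take independent zero-mean Gaussians X and Y and force Y to share the sign of X; that puts all of the probability mass in the quadrants where xy > 0 with twice the original density, matching the PDF above. Each coordinate then behaves like a Gaussian, while the sum does not, and for jointly Gaussian RVs every linear combination would have to be Gaussian.

```python
# Sketch: sample from the modified joint PDF and check that the marginals look
# Gaussian while the pair fails a joint-Gaussianity check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sigma_x, sigma_y = 1.0, 2.0                      # illustrative values (assumed)
n = 100_000

x = rng.normal(0.0, sigma_x, n)
y = rng.normal(0.0, sigma_y, n)
x_tilde = x                                      # X~ is just X
y_tilde = np.sign(x) * np.abs(y)                 # force sign(Y~) = sign(X~): mass only where xy > 0

# Each marginal should show skewness ~ 0 and excess kurtosis ~ 0, like a Gaussian.
print("X~ skew, excess kurtosis:", stats.skew(x_tilde), stats.kurtosis(x_tilde))
print("Y~ skew, excess kurtosis:", stats.skew(y_tilde), stats.kurtosis(y_tilde))

# If (X~, Y~) were jointly Gaussian, X~ + Y~ would also be Gaussian; instead it is
# bimodal (no mass near zero) with clearly negative excess kurtosis.
s = x_tilde + y_tilde
print("X~+Y~ skew, excess kurtosis:", stats.skew(s), stats.kurtosis(s))
```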
