a little Gaussian problem

Here’s a problem I heard last month that I’ve been meaning to write up.

We know that if two random variables X and Y are jointly Gaussian, then X and Y are each Gaussian. The converse is not true, even if X and Y are uncorrelated; there are standard counterexamples (one is sketched below). What if one more condition is added?
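As a quick illustration, here is one such counterexample. The specific construction and the script are my own addition, not something spelled out in the problem statement: take \(X \sim \mathcal{N}(0,1)\), let S be an independent random sign, and set \(Y = SX\).

```python
import numpy as np

# A standard counterexample (my illustration, not from the post):
# X ~ N(0,1), S = ±1 with probability 1/2 each, independent of X, and Y = S*X.
# By symmetry Y ~ N(0,1), and Cov(X, Y) = E[S] E[X^2] = 0, so X and Y are
# uncorrelated -- yet (X, Y) is not jointly Gaussian, since X + Y equals 0
# with probability 1/2, an atom no Gaussian distribution can have.
rng = np.random.default_rng(0)
n = 1_000_000

X = rng.standard_normal(n)
S = rng.choice([-1.0, 1.0], size=n)
Y = S * X

print("Var(Y)       ≈", Y.var())                # close to 1
print("Cov(X, Y)    ≈", np.cov(X, Y)[0, 1])     # close to 0
print("P(X + Y = 0) ≈", np.mean(X + Y == 0.0))  # close to 0.5, so X + Y is not N(0, 2)
```

Note that this particular example fails condition (3) below, precisely because X + Y has an atom at 0, so it does not settle the question being asked here.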

Assume the following:

  1. X and Y are each distributed as \(\mathcal{N}(0, 1)\)
  2. X and Y are uncorrelated
  3. X+Y is distributed as \(\mathcal{N}(0, 2)\)

Are X and Y jointly Gaussian or, equivalently, are X and Y independent? (The two are equivalent here because uncorrelated jointly Gaussian random variables are independent, and independent Gaussians are always jointly Gaussian.)
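For the record, here is why uncorrelatedness implies independence in the jointly Gaussian case, using the standard bivariate normal density with zero means, unit variances, and correlation \(\rho\):

\[
f_{X,Y}(x,y)
= \frac{1}{2\pi\sqrt{1-\rho^2}}
  \exp\!\left(-\frac{x^2 - 2\rho x y + y^2}{2(1-\rho^2)}\right)
\;\xrightarrow{\;\rho = 0\;}\;
\frac{1}{\sqrt{2\pi}} e^{-x^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-y^2/2}
= f_X(x)\, f_Y(y).
\]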

The necessary and sufficient condition for X and Y to be jointly Gaussian is that every non-trivial linear combination of X and Y be Gaussian. Condition (3) asserts this only for the single combination X + Y, so unless (3) somehow forces it for every other combination as well, X and Y need not be jointly Gaussian.
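Spelled out, under assumptions (1) and (2) any Gaussian linear combination \(aX + bY\) would have mean 0 and variance \(a^2 + b^2\), so the characterization reads:

\[
(X, Y)\ \text{jointly Gaussian}
\iff
aX + bY \sim \mathcal{N}\!\left(0,\ a^2 + b^2\right)
\quad \text{for all } (a, b) \neq (0, 0),
\]

while condition (3) supplies only the single case \(a = b = 1\).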