<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Some stuff &#187; density function</title>
	<atom:link href="http://blog.yhuang.org/?feed=rss2&#038;tag=density-function" rel="self" type="application/rss+xml" />
	<link>https://blog.yhuang.org</link>
	<description>here.</description>
	<lastBuildDate>Wed, 27 Aug 2025 08:50:58 +0000</lastBuildDate>
	<language>en</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.1.1</generator>
		<item>
		<title>a little Gaussian problem</title>
		<link>https://blog.yhuang.org/?p=87</link>
		<comments>https://blog.yhuang.org/?p=87#comments</comments>
		<pubDate>Mon, 14 Jan 2008 08:45:02 +0000</pubDate>
		<dc:creator>admin</dc:creator>
				<category><![CDATA[Uncategorized]]></category>
		<category><![CDATA[array c]]></category>
		<category><![CDATA[density function]]></category>
		<category><![CDATA[dirac delta]]></category>
		<category><![CDATA[linear combination]]></category>
		<category><![CDATA[random variables]]></category>

		<guid isPermaLink="false">http://scripts.mit.edu/~zong/wpress/?p=87</guid>
		<description><![CDATA[Here&#8217;s a problem I heard last month that I&#8217;ve been meaning to write up. We know that if two random variables X and Y are jointly Gaussian, then X and Y are each Gaussian. The converse is not true, even if X and Y are uncorrelated &#8212; we have examples of this. What if one [...]]]></description>
			<content:encoded><![CDATA[<p>Here&#8217;s a problem I heard last month that I&#8217;ve been meaning to write up.</p>
<p>We know that if two random variables X and Y are <strong>jointly Gaussian</strong>, then X and Y are each Gaussian. The converse is not true, even if X and Y are uncorrelated &#8212; we have examples of this. What if one more condition is added?</p>
<p>Assume the following:</p>
<ol>
<li>X and Y are each distributed as \(\mathcal{N}(0, 1)\)</li>
<li>X and Y are uncorrelated</li>
<li>X+Y is distributed as \(\mathcal{N}(0, 2)\)</li>
</ol>
<p>Are X and Y jointly Gaussian or, equivalently, are X and Y independent (equivalent because uncorrelatedness implies independence for a pair of jointly Gaussian random variables)?</p>
<p>A necessary and sufficient condition for X and Y to be jointly Gaussian is that <em>every</em> non-trivial linear combination of X and Y be Gaussian, so unless (3) somehow forces that to hold, X and Y need not be jointly Gaussian.<br />
<span id="more-87"></span><br />
Not surprisingly, the answer is no. The difficult part is constructing a counter-example: X and Y that are individually Gaussian and uncorrelated but not independent, with X+Y also Gaussian.</p>
<p>Suppose we have a prototype density function \(g(t)\). Now define random variables U and V by the joint density</p>
<p>\(p_{UV}(u,v) = \frac{1}{2} g(u)\delta_v(v) + \frac{1}{2} g(v)\delta_u(u)\)</p>
<p>where we use the Dirac delta to represent discrete probabilities.</p>
<p>Define X and Y by a unitary transformation of U and V (actually a rotation by 22.5 degrees):</p>
\(<br />
\left[\begin{array}{c}<br />
X \\ Y<br />
\end{array}\right] = \overset{A}{\overbrace{<br />
\left[\begin{array}{cc}<br />
\cos \frac{\pi}{8} &#038; -\sin \frac{\pi}{8} \\<br />
\sin \frac{\pi}{8} &#038; \cos \frac{\pi}{8}<br />
\end{array}\right]}}<br />
\left[\begin{array}{c}<br />
U \\ V<br />
\end{array}\right]<br />
\)
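<p>As a quick numerical sanity check (not part of the original argument), the following Python sketch samples (U, V) from the cross-shaped joint density, applies the 22.5-degree rotation, and estimates E[XY]. A standard normal is used as a stand-in for the prototype density \(g\) — an arbitrary choice; any symmetric \(g\) with finite variance makes X and Y uncorrelated here.</p>

```python
import math
import random

# Monte Carlo check that X and Y are uncorrelated when (U, V) is the
# cross-shaped density rotated by 22.5 degrees.
# A standard normal stands in for the prototype density g (assumption).
random.seed(0)
theta = math.pi / 8
c, s = math.cos(theta), math.sin(theta)

n = 200_000
sum_xy = 0.0
for _ in range(n):
    t = random.gauss(0.0, 1.0)                               # draw from the prototype g
    u, v = (t, 0.0) if random.random() < 0.5 else (0.0, t)   # all mass sits on one axis
    x = c * u - s * v                                        # rotate (U, V) by pi/8
    y = s * u + c * v
    sum_xy += x * y

print(abs(sum_xy / n))   # close to 0: X and Y are uncorrelated
```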
<p>Note that X and Y are clearly not independent (and, being uncorrelated, therefore not jointly Gaussian), but they are indeed uncorrelated, because of quadrantal symmetry &#8212; wherever there is positive density at (x, y), the same density appears at (-y, x), (-x, -y), and (y, -x).</p>
<p>The rotation by exactly 22.5 degrees matters because it creates a kind of marginalization symmetry: in addition to the marginalizations along the 0-degree and 90-degree axes (for X and Y), assumption (3) requires a marginalization along the 45-degree axis, and 22.5 degrees places the two support axes symmetrically with respect to all three directions.</p>
<p>By change of variables, we have</p>
<p>\(<br />
\begin{eqnarray*}<br />
p_{XY}(x,y) &#038;=&#038; p_{UV}\left(A^{T}\left[\begin{array}{c} x \\ y \end{array}\right]\right) \left|\det A^T\right| \\<br />
&#038;=&#038; \frac{1}{2} g(x \cos \frac{\pi}{8} + y \sin \frac{\pi}{8}) \delta_v(-x \sin \frac{\pi}{8} + y \cos \frac{\pi}{8}) \\<br />
&#038;&#038; + \frac{1}{2} g(-x \sin \frac{\pi}{8} + y \cos \frac{\pi}{8}) \delta_u(x \cos \frac{\pi}{8} + y \sin \frac{\pi}{8})<br />
\end{eqnarray*}<br />
\)</p>
<p>Next we find \(p_X\), \(p_Y\), and \(p_{X+Y}\). We can do this formally (or even by inspection):</p>
<p>\(<br />
\begin{eqnarray*}<br />
p_X(x) &#038;=&#038; \frac{1}{2} \int{ g(x \cos \frac{\pi}{8} + y \sin \frac{\pi}{8}) \left|(\cos \frac{\pi}{8})^{-1}\right| \delta_y(-x \tan \frac{\pi}{8} + y) dy} \\<br />
&#038;&#038; + \frac{1}{2} \int{ g(-x \sin \frac{\pi}{8} + y \cos \frac{\pi}{8}) \left|(\sin \frac{\pi}{8})^{-1}\right| \delta_y(x \cot \frac{\pi}{8} + y) dy} \\<br />
&#038;=&#038; \frac{1}{2} g(x / \cos \frac{\pi}{8}) (\cos \frac{\pi}{8})^{-1} + \frac{1}{2} g(-x / \sin \frac{\pi}{8}) (\sin \frac{\pi}{8})^{-1}<br />
\end{eqnarray*}<br />
\)</p>
<p>Similarly,</p>
<p>\(<br />
\begin{eqnarray*}<br />
p_Y(y) &#038;=&#038; \frac{1}{2} \int{ g(x \cos \frac{\pi}{8} + y \sin \frac{\pi}{8}) \left|(\sin \frac{\pi}{8})^{-1}\right| \delta_x(-x + y \cot \frac{\pi}{8}) dx} \\<br />
&#038;&#038; + \frac{1}{2} \int{ g(-x \sin \frac{\pi}{8} + y \cos \frac{\pi}{8}) \left|(\cos \frac{\pi}{8})^{-1}\right| \delta_x(x + y \tan \frac{\pi}{8}) dx} \\<br />
&#038;=&#038; \frac{1}{2} g(y / \sin \frac{\pi}{8}) (\sin \frac{\pi}{8})^{-1} + \frac{1}{2} g(y / \cos \frac{\pi}{8}) (\cos \frac{\pi}{8})^{-1}<br />
\end{eqnarray*}<br />
\)</p>
<p>and since</p>
\(\left[\begin{array}{c}<br />
X+Y \\ -X+Y<br />
\end{array}\right] =<br />
\left[\begin{array}{cc}<br />
1 &#038; 1 \\<br />
-1 &#038; 1<br />
\end{array}\right] A<br />
\left[\begin{array}{c}<br />
U \\ V<br />
\end{array}\right] = \sqrt{2}<br />
\left[\begin{array}{cc}<br />
\cos \left(\frac{\pi}{8} - \frac{\pi}{4}\right) &#038; -\sin \left(\frac{\pi}{8} - \frac{\pi}{4}\right) \\<br />
\sin \left(\frac{\pi}{8} - \frac{\pi}{4}\right) &#038; \cos \left(\frac{\pi}{8} - \frac{\pi}{4}\right)<br />
\end{array}\right]<br />
\left[\begin{array}{c}<br />
U \\ V<br />
\end{array}\right]<br />
\)
<p>which is a (scaled) rotation by -22.5 degrees, we immediately have</p>
<p>\(<br />
p_{X+Y}(z) = \frac{1}{2} g(z / (\sqrt{2} \cos \frac{\pi}{8})) (\sqrt{2} \cos \frac{\pi}{8})^{-1} + \frac{1}{2} g(z / (\sqrt{2} \sin \frac{\pi}{8})) (\sqrt{2} \sin \frac{\pi}{8})^{-1}<br />
\)</p>
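<p>The marginalization above can be checked numerically. The sketch below (not from the original post) tests the formula for \(p_X\) by comparing the marginal CDF it implies against a direct Monte Carlo estimate from the construction. A standard normal is a hypothetical stand-in for \(g\) — with such a \(g\), X is of course not \(\mathcal{N}(0,1)\); the check validates only the change-of-variables formula, not the final construction.</p>

```python
import math
import random

random.seed(1)
theta = math.pi / 8
c, s = math.cos(theta), math.sin(theta)

def phi_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Marginal CDF implied by the derived density, with a standard normal
# stand-in for g (g symmetric, so g(-x/sin) = g(x/sin)):
#   p_X(x) = (1/2) g(x/cos)(cos)^{-1} + (1/2) g(-x/sin)(sin)^{-1}
def F_X(x):
    return 0.5 * phi_cdf(x / c) + 0.5 * phi_cdf(x / s)

# Monte Carlo estimate of P(X <= x0) directly from the construction.
x0 = 0.7
n = 200_000
hits = 0
for _ in range(n):
    t = random.gauss(0.0, 1.0)
    u, v = (t, 0.0) if random.random() < 0.5 else (0.0, t)
    if c * u - s * v <= x0:
        hits += 1

print(abs(hits / n - F_X(x0)))   # agreement within Monte Carlo error
```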
<p>Looking at the marginal densities, we see that they all have the same form, namely \(\alpha g(\alpha t) + \beta g(\pm \beta t)\) (up to scale), with the same ratio \(\beta/\alpha = \cot \frac{\pi}{8}\) in each case. If we impose symmetry on \(g(t)\), then the problem becomes one of finding a \(g(t)\) such that \(\alpha g(\alpha t) + \beta g(\beta t)\) is the <em>right</em> shape (a Gaussian), i.e. proportional to \(h(t) = \exp(-t^2)\).</p>
<p>This is not too hard. Let \(k = |\beta / \alpha|\) (which we assume to be >1, w.l.o.g.). Then define \(g(t)\) arbitrarily on any interval \((\gamma, k\gamma]\). That defines \(g(t)\) for all \(t>0\) recursively as follows:</p>
\(\because g(t) + kg(kt) = h(t/\alpha) / \alpha\)<br />
\(\therefore g(kt) = h(t/\alpha) / (k\alpha) - g(t)/k\) and<br />
\(g(t/k) = h(t/(k\alpha)) / \alpha - kg(t)\)
<p>Finally, define \(g(0) = h(0)/((1+k)\alpha)\) (the value forced by the relation at \(t = 0\)), extend by \(g(-t) = g(t)\), and then scale it appropriately. QED.</p>
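<p>The recursion is easy to carry out numerically. The sketch below (my addition, with \(\alpha = 1\) for simplicity and an arbitrary seed choice on \((\gamma, k\gamma]\)) builds \(g\) and verifies the functional equation \(g(t) + k g(kt) = h(t)\) across the seed boundaries. It checks only the functional equation — not that \(g\) stays nonnegative or integrates to 1, which depends on the seed.</p>

```python
import math

# Recursive construction of g (alpha = 1 for simplicity; the seed
# values on (GAMMA, K*GAMMA] are an arbitrary, illustrative choice).
K = 1.0 + math.sqrt(2.0)      # k = cot(pi/8), the ratio |beta/alpha|
GAMMA = 1.0                   # left end of the seed interval

def h(t):
    return math.exp(-t * t)   # target Gaussian shape

def g(t):
    t = abs(t)                            # impose g(-t) = g(t)
    if t == 0.0:
        return h(0.0) / (1.0 + K)         # value forced by the relation at t = 0
    if GAMMA < t <= K * GAMMA:
        return h(t) / (1.0 + K)           # arbitrary seed (assumption)
    if t > K * GAMMA:
        s = t / K
        return (h(s) - g(s)) / K          # extend upward: g(kt) = h(t)/k - g(t)/k
    return h(t) - K * g(K * t)            # extend downward: g(t/k) = h(t/k) - k g(t)

# Verify g(t) + k g(kt) = h(t) at points above, below, and inside the
# seed interval -- this is what makes the marginals Gaussian-shaped.
residual = max(abs(g(t) + K * g(K * t) - h(t))
               for t in [0.05, 0.3, 0.9, 1.0, 1.7, 2.5, 4.0, 7.0])
print(residual)   # at floating-point precision
```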
<p>Note that \(g(t)\) is by no means a &#8220;nice&#8221; function, but that is no issue for this problem.</p>
<p>It is unclear what generalizations are possible. What if we require more linear combinations of X and Y to be Gaussian? My guess is that X and Y still need not be jointly Gaussian, but counter-examples may be harder to come by. It would be interesting to know whether a finite number of linear-combination constraints can ever force joint Gaussianity, or whether sums of a small number of prototype functions are enough to generate a counter-example.</p>
]]></content:encoded>
			<wfw:commentRss>https://blog.yhuang.org/?feed=rss2&#038;p=87</wfw:commentRss>
		<slash:comments>0</slash:comments>
		</item>
	</channel>
</rss>
