Probability question

If X and Y are independently distributed Gaussian random variables with zero mean and variance sigma1^2 and sigma2^2, respectively, what is the expected value of
sqrt(X^2 + Y^2) ?

I found info on chi and chi-square distributions, but those don’t apply to this case, and I couldn’t find anything relevant.

Do you guys have a link to a site that discusses this case? (or do you know the answer?)

I’ll be vague, in case this is homework. Suppose that sigma1 and sigma2 were equal. What shape would your error contours be? Based on that, and on the expected value of X^2, what would the answer to your question be?

Now, with two different sigmas, what does that do to the shape of the error contours, and what does that mean for the answer to your question?

Don’t worry, I’m too old for homework :slight_smile:

I’m not sure what you mean by “error contours”. Could you explain?

Anyway, what I found was that



if(sigma1==sigma2),  E[r]       = sigma1*sqrt(pi/2)
if(sigma1>>sigma2),  E[r] approx= sigma1*sqrt(2/pi)


If we define s1=max(sigma1,sigma2) and s2=min(sigma1,sigma2), and rho=s2/s1, then rho is in [0,1] and



if(rho==1),  E[r] = s1*sqrt(pi/2)
if(rho==0),  E[r] = s1*sqrt(2/pi)


So, the value of E[r] for a general rho should be of the form


E[r] approx= s1*( sqrt(2/pi)+g(rho)*(sqrt(pi/2)-sqrt(2/pi)))


where g(rho) should be a function s.t. g(0) = 0 and g(1) = 1, and s.t. it provides the best approximation for E[r].

The above turns out to be a pretty good approximation, but I’m still interested to see if there is a closed form expression out there for E[r].

Most likely, the closed-form expression will be much more complex than the above approximation, and I may end up using the approximation anyway, but for completeness it would be good to know.
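As a quick numerical sanity check on the interpolation idea, here is a short Monte Carlo sketch (the sigma values are just example choices, not from the thread) showing that the estimated E[r] does land between the two limiting values:

```python
import numpy as np

# Monte Carlo check with assumed example sigmas: s1 = 2, s2 = 1, so rho = 0.5
rng = np.random.default_rng(0)
s1, s2 = 2.0, 1.0
x = rng.normal(0.0, s1, 1_000_000)
y = rng.normal(0.0, s2, 1_000_000)
mc = np.sqrt(x**2 + y**2).mean()   # estimate of E[r]

lo = s1 * np.sqrt(2 / np.pi)       # the rho -> 0 limit
hi = s1 * np.sqrt(np.pi / 2)       # the rho = 1 limit
print(lo, mc, hi)
```

For intermediate rho the estimate sits strictly between the two limits, which is what motivates looking for an interpolating g(rho).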

In general, the expected value of a function h of an n-dimensional random vector X with density f is the integral over R[sup]n[/sup] of h(X)f(X) with respect to all the components of X.

In this case, h(X) = sqrt(x[sub]1[/sub][sup]2[/sup] + x[sub]2[/sub][sup]2[/sup]), and f(X) = exp(-(x[sub]1[/sub][sup]2[/sup]/sigma[sub]1[/sub][sup]2[/sup] + x[sub]2[/sub][sup]2[/sup]/sigma[sub]2[/sub][sup]2[/sup]))/(2*pi*sigma[sub]1[/sub]*sigma[sub]2[/sub]). The integral of the product of those two functions over R[sup]2[/sup] doesn’t appear to have a closed form, but if you have specific values of sigma[sub]1[/sub] and sigma[sub]2[/sub], you can probably get a numerical answer from your favorite computer algebra system.

On thinking about this some more, I’m not so sure about the different-sigma case, but I am sure about the equal-sigma case. If you cross two equal-width Gaussians, the resulting two-dimensional distribution is circularly symmetric. The error contours are the boundaries of the regions where you can say there is some percentage chance of being within that region. For instance, in this case, I might draw a big circle and say that there’s a 90% chance of being inside it, a smaller circle where there’s a 50% chance of being inside, and a very small circle where there’s a 10% chance. Those three circles would be three of the error contours. One of them would be the 1-sigma error contour, which should intersect the 1-sigma points on the X and Y axes. It’d be a circle, with radius equal to the expected value of sqrt(X^2 + Y^2).

Now, with different widths for X and Y, the error contours (including the 1-sigma contour) would be ellipses, and the answer you seek would be, in some sense, the average radius of the 1-sigma ellipse (the ellipse with axes 2sigma[sub]x[/sub] and 2sigma[sub]y[/sub]). But there are a lot of different ways to define the average radius of an ellipse, and I’m not entirely sure which one would be appropriate here. At a guess, I’d take the line with slope sigma[sub]y[/sub]/sigma[sub]x[/sub], and look at the distance where that intersects the 1-sigma error ellipse. But I’m not certain this is right.

There is a closed-form expression in terms of elliptic functions: specifically, the complete elliptic integral of the second kind, E(k)=E(k,pi/2). Use the defining equation (1) for E(k,pi/2) in the second link above to simplify the integral ultrafilter sketched out (note that he’s missing a factor of 1/2 in the exponent).

OK, the final answer is


E[sqrt(x^2+y^2)] = s1 * sqrt(2/pi) * EllipticE(1-rho^2)


where s1 and rho are defined as in my previous post.

Alternatively, if we put the final answer in the form mentioned in my previous post


E[sqrt(x^2+y^2)] = s1*( sqrt(2/pi) + g(rho)*(sqrt(pi/2)-sqrt(2/pi)))

where g(rho) = (EllipticE(1-rho^2) - 1)/(pi/2 - 1)
[ g(rho) goes from 0 to 1 as rho goes from 0 to 1. ]
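If you want to check this numerically, SciPy makes it easy; note that scipy.special.ellipe takes the parameter m = k^2, the same convention as Mathematica's EllipticE. The sigma values below are assumed example choices:

```python
import numpy as np
from scipy.special import ellipe   # complete elliptic integral E(m), m = k^2

# Example sigmas (assumed, just for the check)
s1, s2 = 3.0, 1.5
rho = s2 / s1

# The closed form from the post above
closed = s1 * np.sqrt(2 / np.pi) * ellipe(1 - rho**2)

# Monte Carlo estimate of the same expectation
rng = np.random.default_rng(1)
x = rng.normal(0.0, s1, 2_000_000)
y = rng.normal(0.0, s2, 2_000_000)
mc = np.sqrt(x**2 + y**2).mean()

# g(rho) as defined above; it should lie in [0, 1]
g = (ellipe(1 - rho**2) - 1) / (np.pi / 2 - 1)
print(closed, mc, g)
```

The closed form and the Monte Carlo estimate agree to within sampling error.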



For anyone interested in how I got it, here are the gory details:

We want E[sqrt(x^2+y^2)], which is defined as


 E[sqrt(x^2+y^2)] = integral[ sqrt(x^2+y^2) * p_x(x) * p_y(y) dx dy, x=-inf..inf, y=-inf..inf]
where 
p_x(x) = (1/(s1*sqrt(2*pi)))*exp(-x^2/(2*s1^2))
p_y(y) = (1/(s2*sqrt(2*pi)))*exp(-y^2/(2*s2^2))


Mathematica and Maple completely barf on this, so we have to re-write it in a form that one of them can digest.

One way to re-write this is as


E[sqrt(x^2+y^2)] =  E[sqrt(s1^2*n1^2 + s2^2*n2^2)]

where n1 and n2 are i.i.d. ~ N(0,1)

Then, using polar coordinates


R     = sqrt(n1^2+n2^2)
theta = atan2(n2, n1)    (two-argument arctangent, so theta covers the full circle)


R will be Rayleigh with parameter sigma = 1, and theta will be uniform [0,2*pi]

Using the above,


n1 = R*cos(theta)
n2 = R*sin(theta)

So,


 E[sqrt(s1^2*n1^2 + s2^2*n2^2)] = integral[ sqrt(s1^2*R^2*cos^2(theta) + s2^2*R^2*sin^2(theta)) * p_R(R) * p_theta(theta) dR dtheta, R=0..inf, theta=0..2*pi]


where p_R() is the pdf of a Rayleigh rv parameter sigma=1
and p_theta() is the pdf of a rv that is uniform over [0,2*pi]

The above integral can be re-written as


 integral[ R*p_R(R) dR, R=0..inf] * integral[ sqrt(s1^2*cos^2(theta) + s2^2*sin^2(theta)) * p_theta(theta) dtheta, theta=0..2*pi]


The first integral is just the mean of a Rayleigh rv with parameter sigma=1, which is sqrt(pi/2).
For the second integral, Mathematica gave me, after some massaging of the answer, (2/pi)*s1*EllipticE(1-rho^2)
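Both pieces of the factorization can be confirmed by numerical quadrature (again with assumed example sigmas):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import ellipe

s1, s2 = 2.0, 1.0                  # example values (assumed), s1 >= s2
rho = s2 / s1

# First integral: mean of a Rayleigh rv with sigma = 1, p_R(r) = r*exp(-r^2/2)
ray_mean, _ = quad(lambda r: r * r * np.exp(-r**2 / 2), 0, np.inf)

# Second integral: angular part, including p_theta(theta) = 1/(2*pi)
ang, _ = quad(lambda t: np.sqrt(s1**2 * np.cos(t)**2
                                + s2**2 * np.sin(t)**2) / (2 * np.pi),
              0, 2 * np.pi)

print(ray_mean, np.sqrt(np.pi / 2))                 # these should agree
print(ang, (2 / np.pi) * s1 * ellipe(1 - rho**2))   # and so should these
```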

So the final answer is


E[sqrt(x^2+y^2)] = sqrt(pi/2) * (2/pi)*s1*EllipticE(1-rho^2)


which simplifies to


E[sqrt(x^2+y^2)] = sqrt(2/pi) * s1 * EllipticE(1-rho^2)

For rho = 0, we have EllipticE(1-rho^2) = 1
For rho = 1, we have EllipticE(1-rho^2) = pi/2

So,
if(rho == 1), E[sqrt(x^2+y^2)] = s1*sqrt(pi/2)
if(rho == 0), E[sqrt(x^2+y^2)] = s1*sqrt(2/pi)

which agrees with what was mentioned in my previous post, so it’s a good sanity check.

That’s what I got too. Note (in case you want to calculate values) that Mathematica uses an unusual convention for the arguments of its elliptic integrals; most common is to use k, but Mathematica uses k[sup]2[/sup]. (So Mathematica’s EllipticE[1-rho^2] is E(sqrt(1-rho^2)) in most tables.)
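For what it’s worth, SciPy follows Mathematica here: scipy.special.ellipe also takes the parameter m = k^2, so the two endpoint values used in the sanity check above are easy to confirm:

```python
from math import pi
from scipy.special import ellipe  # E(m) with m = k^2, same convention as Mathematica

print(ellipe(0.0))  # pi/2, the rho = 1 endpoint
print(ellipe(1.0))  # 1.0, the rho -> 0 endpoint
```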