View Full Version : Simple statistics question?


PancakeBoy
04-12-2006, 10:13 PM
Let x = s + l.

l is normally distributed with mean 0 and variance sigma_l^2, i.e. ~ N(0,sigma_l^2).

s is normally distributed with mean 0 and variance sigma_s^2, i.e. ~ N(0,sigma_s^2).

What is the expected value of s given a value for x? I.e.,

E(x|s)?

HotPants
04-12-2006, 11:20 PM
x?

PokerPadawan
04-12-2006, 11:38 PM
The sum of expectations is equal to the expectation of the sum.

If <l> = 0, and x is given, then <s> = <x-l> = <x> - <l> = x - <l> = x.

I'm using <> to denote the expectation.

Isn't it just that simple?

Thythe
04-13-2006, 12:05 AM
shouldn't it be E(s|x)?

jason1990
04-13-2006, 12:31 AM
[ QUOTE ]
Let x = s + l.

l is normally distributed with mean 0 and variance sigma_l^2, i.e. ~ N(0,sigma_l^2).

s is normally distributed with mean 0 and variance sigma_s^2, i.e. ~ N(0,sigma_s^2).

What is the expected value of s given a value for x? I.e.,

E(x|s)?

[/ QUOTE ]
First, the expected value of s given x is written E(s|x).

Now, to give a meaningful answer to this question, we must assume that s and l are jointly normal, which means that any linear combination of s and l is normal. This will also imply that s and x are jointly normal. From this, it follows that there exists a constant c such that s - cx and x are independent. Hence,

E(s|x) = E(s - cx|x) + E(cx|x) = E(s - cx) + cx = cx.

In order to find the constant c, we observe that

0 = E((s - cx)x) = E(sx) - cE(x^2)
= sigma_s^2 + E(sl) - c(sigma_s^2 + 2E(sl) + sigma_l^2).

Hence,

c = (sigma_s^2 + E(sl))/(sigma_s^2 + 2E(sl) + sigma_l^2).

It's interesting to note that if sigma_s = sigma_l, then c = 1/2, regardless of the correlation of s and l.
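[Editor's note: the formula above is easy to sanity-check numerically. A minimal Monte Carlo sketch, assuming s and l are independent (so E(sl) = 0 and c reduces to sigma_s^2/(sigma_s^2 + sigma_l^2)); the variance values are arbitrary choices for illustration. Since (s, x) are jointly normal, E(s|x) = cx with c = Cov(s,x)/Var(x), which the least-squares slope of s on x estimates:]

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_s, sigma_l = 2.0, 1.0  # illustrative values, not from the thread
n = 1_000_000

# Independent draws, so E(sl) = 0 and c = sigma_s^2 / (sigma_s^2 + sigma_l^2).
s = rng.normal(0.0, sigma_s, n)
l = rng.normal(0.0, sigma_l, n)
x = s + l

# For jointly normal (s, x), E(s|x) = c*x, and the least-squares slope
# of s on x estimates c = Cov(s, x) / Var(x).
c_hat = np.dot(s, x) / np.dot(x, x)
c_theory = sigma_s**2 / (sigma_s**2 + sigma_l**2)

print(c_hat, c_theory)  # both close to 0.8
```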

oneeye13
04-13-2006, 03:39 AM
[ QUOTE ]
Let x = s + l.

l is normally distributed with mean 0 and variance sigma_l^2, i.e. ~ N(0,sigma_l^2).

s is normally distributed with mean 0 and variance sigma_s^2, i.e. ~ N(0,sigma_s^2).

What is the expected value of s given a value for x? I.e.,

E(x|s)?

[/ QUOTE ]

seems like it would be something like (sigma_s/(sigma_l+sigma_s))*x? (i haven't tried it)

oneeye13
04-13-2006, 03:41 AM
[ QUOTE ]
The sum of expectations is equal to the expectation of the sum.

If <l> = 0, and x is given, then <s> = <x-l> = <x> - <l> = x - <l> = x.

I'm using <> to denote the expectation.

Isn't it just that simple?

[/ QUOTE ]

if we fix the value of x, is <l> still going to be zero?
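[Editor's note: it is not — conditioning on x drags the conditional mean of l toward x. A quick simulation sketch, again assuming s and l are independent with illustrative variances; in that case E(l|x) = (sigma_l^2/(sigma_s^2 + sigma_l^2)) * x, which we can check by averaging l over draws where x lands in a thin slice:]

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_s, sigma_l = 2.0, 1.0  # illustrative values, not from the thread
n = 2_000_000

s = rng.normal(0.0, sigma_s, n)
l = rng.normal(0.0, sigma_l, n)
x = s + l

# "Fix the value of x" by keeping only draws in a thin slice around x = 3.
mask = np.abs(x - 3.0) < 0.05

# Theory (independent case): E(l|x) = sigma_l^2/(sigma_s^2 + sigma_l^2) * x
# = (1/5) * 3 = 0.6, clearly not zero.
print(l[mask].mean())
```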

MathEconomist
04-13-2006, 05:28 AM
This looks right to me. Interestingly, it matches what we ought to conclude if we reasoned about this informally. You observe one variable that is the sum of two unobserved components. How you attribute the observed value to those components should depend on their variances and covariances. If the two unobserved components are "equally noisy", you'd place half the weight on each; otherwise you'd generally put more weight on the less noisy component.

I find that for people who don't know much statistics, one way to check your answers is to see whether you can give an informal inference interpretation of the result. Usually, the answers a formal derivation gives correspond to what you ought to have expected.