Two Plus Two Newer Archives  

  #1  
Old 06-21-2007, 11:30 PM
alebron alebron is offline
Senior Member
 
Join Date: Oct 2004
Posts: 120
Default Convergence in probability vs. almost-sure convergence challenge

A probability theory challenge, for those who know their set-theoretic probability foundations.

If a sequence converges almost surely, this implies that the sequence also converges in probability. The converse is not true.

The challenge: define a sequence of RVs that converges in probability but does not converge almost surely.
  #2  
Old 06-22-2007, 12:08 AM
imfatandugly imfatandugly is offline
Senior Member
 
Join Date: Jul 2005
Posts: 267
Default Re: Convergence in probability vs. almost-sure convergence challenge

I have no idea what you're talking about.
  #3  
Old 06-22-2007, 07:24 AM
sixhigh sixhigh is offline
Senior Member
 
Join Date: Oct 2005
Location: Highway 61
Posts: 1,778
Default Re: Convergence in probability vs. almost-sure convergence challenge

Y_1 = 1_[0,1]
Y_2 = 1_[0,1/2], Y_3 = 1_[1/2,1]
Y_4 = 1_[0,1/4], Y_5 = 1_[1/4,2/4], Y_6 = 1_[2/4,3/4], Y_7 = 1_[3/4,1]
...

Obviously Y_k converges in probability but not P-almost surely, as Y_k(\omega) won't converge for any \omega.
  #4  
Old 06-22-2007, 01:40 PM
alebron alebron is offline
Senior Member
 
Join Date: Oct 2004
Posts: 120
Default Re: Convergence in probability vs. almost-sure convergence challenge

[ QUOTE ]
Y_1 = 1_[0,1]
Y_2 = 1_[0,1/2], Y_3 = 1_[1/2,1]
Y_4 = 1_[0,1/4], Y_5 = 1_[1/4,2/4], Y_6 = 1_[2/4,3/4], Y_7 = 1_[3/4,1]
...

Obviously Y_k converges in probability but not P-almost surely, as Y_k(\omega) won't converge for any \omega.

[/ QUOTE ]

Sixhigh,

I'm not sure I understand your notation (UBB has no math mode, heh). What I read for, say, Y_1 is "1 subscript the closed interval [0,1]". Does this mean that Y_1 is a uniform RV with sample space [0,1]? If so, I don't see how the sequence converges in probability. I'm sure I'm misunderstanding; can you please clarify?

Thanks a bunch.
  #5  
Old 06-22-2007, 03:43 PM
sixhigh sixhigh is offline
Senior Member
 
Join Date: Oct 2005
Location: Highway 61
Posts: 1,778
Default Re: Convergence in probability vs. almost-sure convergence challenge

Every Y_k is defined on the probability space [0,1] with the uniform distribution.

1_[x,y] is a function that is 1 on the interval [x,y] and zero everywhere else.

For Y_k those intervals get thinner as k grows, thus P(|Y_k - 0| > \epsilon) --> 0.

We can even define Y_k for all k:
Y_{2^n + m} = 1_[m 2^{-n}, (m+1) 2^{-n}] (with 0 <= m < 2^n)

It's basically a sequence of wandering bars that get thinner every 2^n steps.
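The explicit formula above is easy to check numerically; here is an illustrative Python sketch (the function name Y and the choice of test point 0.3 are mine, not sixhigh's):

```python
def Y(k, omega):
    """Typewriter sequence: Y_{2^n + m} is the indicator of
    [m * 2^-n, (m + 1) * 2^-n], with 0 <= m < 2^n."""
    n = k.bit_length() - 1        # largest n with 2^n <= k
    m = k - 2 ** n
    lo, hi = m * 2.0 ** -n, (m + 1) * 2.0 ** -n
    return 1 if lo <= omega <= hi else 0

# The bar has length 2^-n, so P(|Y_k - 0| > eps) -> 0:
print([2.0 ** -(k.bit_length() - 1) for k in (1, 2, 4, 8, 16)])
# -> [1.0, 0.5, 0.25, 0.125, 0.0625]

# But any fixed omega is hit by a bar in every block [2^n, 2^{n+1}),
# so Y_k(omega) keeps jumping back to 1 and never converges:
print([k for k in range(1, 64) if Y(k, 0.3) == 1])
# -> [1, 2, 5, 10, 20, 41]
```

The index blocks double in length, so the 1s become rarer and rarer but never stop, which is exactly how convergence in probability can hold while almost-sure convergence fails.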
  #6  
Old 06-22-2007, 05:57 PM
alebron alebron is offline
Senior Member
 
Join Date: Oct 2004
Posts: 120
Default Re: Convergence in probability vs. almost-sure convergence challenge

[ QUOTE ]
Every Y_k is defined on the probability space [0,1] with the uniform distribution.

1_[x,y] is a function that is 1 on the interval [x,y] and zero everywhere else.

For Y_k those intervals get thinner as k grows, thus P(|Y_k - 0| > \epsilon) --> 0.

We can even define Y_k for all k:
Y_{2^n + m} = 1_[m 2^{-n}, (m+1) 2^{-n}] (with 0 <= m < 2^n)

It's basically a sequence of wandering bars that get thinner every 2^n steps.

[/ QUOTE ]

I believe I understand, but let me try to paraphrase to make sure:

-Convergence in probability means that the measure of Y_k converges.
-Convergence almost-surely means that the expectation of Y_k converges.

Correct?

Thanks for the help.
  #7  
Old 06-22-2007, 09:24 PM
marv marv is offline
Senior Member
 
Join Date: Aug 2004
Posts: 107
Default Re: Convergence in probability vs. almost-sure convergence challenge

[ QUOTE ]
[ QUOTE ]
Every Y_k is defined on the probability space [0,1] with the uniform distribution.

1_[x,y] is a function that is 1 on the interval [x,y] and zero everywhere else.

For Y_k those intervals get thinner as k grows, thus P(|Y_k - 0| > \epsilon) --> 0.

We can even define Y_k for all k:
Y_{2^n + m} = 1_[m 2^{-n}, (m+1) 2^{-n}] (with 0 <= m < 2^n)

It's basically a sequence of wandering bars that get thinner every 2^n steps.

[/ QUOTE ]

I believe I understand, but let me try to paraphrase to make sure:

-Convergence in probability means that the measure of Y_k converges.
-Convergence almost-surely means that the expectation of Y_k converges.

Correct?

Thanks for the help.

[/ QUOTE ]


Not quite.

Convergence in probability (to a limit Y) means that for every \epsilon > 0, P(|Y_k - Y| > \epsilon) --> 0.

In the Y_k example \Omega is [0,1] and the limit is the constant 0. P(|Y_k - 0| > \epsilon) is just the length of the bar on which Y_k equals 1, and that length shrinks to 0 as the bars get thinner and thinner, so Y_k converges in probability.

Convergence a.s. means for almost all \omega\in\Omega, Y_k(\omega) converges (thought of as a plain old sequence of real numbers).

This isn't true in our example, since for any \omega \in [0,1], Y_k(\omega) will sometimes be 1 (when the thin bar defining Y_k happens to contain \omega) and sometimes 0. As k -> infinity, Y_k takes the value 1 on a thinner and thinner bar, but that bar sweeps across [0,1] again and again, so it hits every \omega for arbitrarily large k as it wanders around.

Thus Y_k doesn't converge a.s.: it keeps hitting 1 every now and then as k gets large.

Marv
  #8  
Old 06-23-2007, 01:16 AM
filsteal filsteal is offline
Senior Member
 
Join Date: May 2007
Location: ^IDK, my BFF Billy?
Posts: 1,100
Default Re: Convergence in probability vs. almost-sure convergence challenge

Doesn't just about every graduate-level mathematical statistics text have an example of this? I know mine did; I just don't remember what it was.
  #9  
Old 06-23-2007, 02:46 AM
pzhon pzhon is offline
Senior Member
 
Join Date: Mar 2004
Posts: 4,515
Default Re: Convergence in probability vs. almost-sure convergence challenge

[ QUOTE ]
Doesn't just about every graduate-level mathematical statistics text have an example of this? I know mine did; I just don't remember what it was.

[/ QUOTE ]
I think the standard example might be independent {0,1} (Bernoulli) random variables X_k where X_k = 1 with probability 1/k. Since \sum 1/k diverges, the second Borel-Cantelli lemma gives infinitely many 1s almost surely (and there are infinitely many 0s as well), so the sequence almost surely diverges. But it converges to the constant 0 in probability, since P(|X_k - 0| > \epsilon) = 1/k --> 0.
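This example is also easy to simulate; a minimal sketch (the helper name sample_path and the horizon N = 100,000 are my choices, not from the post):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def sample_path(N):
    """One realization of X_1, ..., X_N with P(X_k = 1) = 1/k, independent."""
    return [1 if random.random() < 1 / k else 0 for k in range(1, N + 1)]

path = sample_path(100_000)
ones = [k + 1 for k, x in enumerate(path) if x == 1]

# P(|X_k - 0| > eps) = 1/k -> 0, so X_k -> 0 in probability, yet the
# divergence of sum 1/k (second Borel-Cantelli) means the path keeps
# producing late 1s and never settles at 0:
print(len(ones), ones[-3:])
```

The expected number of 1s up to N is the harmonic sum, about ln(N), so in most runs only a dozen or so 1s appear in total, yet some of them typically land well into the tail of the sample.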
  #10  
Old 06-23-2007, 02:46 AM
alebron alebron is offline
Senior Member
 
Join Date: Oct 2004
Posts: 120
Default Re: Convergence in probability vs. almost-sure convergence challenge

I understand perfectly now.

Thanks sixhigh and marv. To filsteal: which text did you have?