Two Plus Two Newer Archives > General Gambling > Probability

#1 - 12-10-2006, 06:57 PM
dtbog (Senior Member; Join Date: Jun 2004; Location: mostly offline; Posts: 2,615)
Monty Hall-esque question

Say you have two unmarked envelopes. You're on a new game show, and the host has put money in the envelopes in the following manner: the value of one envelope is twice the value of the other envelope. You have no idea whatsoever about the expected magnitude of the prizes.

You open one of the envelopes, and you see a check for $100. Now, the host offers you the chance to switch to the other envelope. (Again, no game theoretical assumptions about whether or not it is more likely that the game show would have a $200 prize or a $50 prize.) Do you switch?

Simply stepping through a calculation, it seems like you should: the expected value of the unseen envelope is (50+200)/2 or $125.

Now, what if you are presented with the same conundrum, but this time the host didn't show you what was in the first envelope. Should you still switch?

It seems like you should; the EV of the other envelope should always be higher through conventional calculation. However, this clearly makes no sense!

Is there a way to explain this seemingly paradoxical fact?

#2 - 12-10-2006, 08:08 PM
PairTheBoard (Senior Member; Join Date: Dec 2003; Posts: 3,460)
Re: Monty Hall-esque question

Switching envelopes amounts to betting half the amount in the first envelope, at 2-1 odds, that the second envelope contains the larger amount. That would be a good bet if you could bet any amount you like, at 2-1 odds, that the second envelope will be larger. But you're not allowed to bet any amount you like. You are required to bet an amount determined by the outcome: if the outcome is a win you are required to bet a small amount, and if the outcome is a loss you are required to bet twice as much.

Here is a comparable proposition. Using a standard deck of 52 cards, I will let you draw a random card face down and allow you to bet that it will be a red card. I will pay you 2-1 odds if you win. Sounds like a good deal, right? But there's a catch. I have written dollar amounts on the back of all the cards, and when you draw a card you are required to bet the amount written on its back. It's a one-shot deal. Do you make the bet? If you do, you will be getting the worst of it, because I've written $10 on the back of all the red cards and $50 on the back of all the black cards.

The card game would be fair if I'd written $10 on all the red cards and $20 on all the black cards. With a two-card deck, one red and one black, that is exactly the two-envelope situation.
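
A quick simulation sketch of this card proposition (a Python illustration; the $10/$50 and $10/$20 stakes are the ones described above):

[ CODE ]
import random

random.seed(0)

def avg_profit(red_stake, black_stake, trials=100_000):
    """Average profit per draw when you must bet the amount written on the
    drawn card, paid at 2-1 if the card turns out to be red."""
    total = 0.0
    for _ in range(trials):
        card_is_red = random.random() < 0.5      # half the deck is red
        stake = red_stake if card_is_red else black_stake
        total += 2 * stake if card_is_red else -stake
    return total / trials

print("red $10 / black $50:", round(avg_profit(10, 50), 2))   # about -15 per draw
print("red $10 / black $20:", round(avg_profit(10, 20), 2))   # about 0: a fair game
[/ CODE ]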

This is the best explanation you'll ever see for what's going on here. Credit for it goes to me,

PairTheBoard

#3 - 12-10-2006, 09:08 PM
AaronBrown (Senior Member; Join Date: May 2005; Location: New York; Posts: 2,260)
Re: Monty Hall-esque question

You will find extensive discussion of this if you search this forum, with no consensus. You can also use Google to search for "Necktie paradox", "Wallet Game", or "Envelope paradox".

#4 - 12-11-2006, 12:47 AM
mykey1961 (Senior Member; Join Date: Oct 2005; Posts: 249)
Re: Monty Hall-esque question

S = (50 + 100) / 2 = 75 (the expected value when you hold the smaller envelope)
L = (100 + 200) / 2 = 150 (the expected value when you hold the larger envelope)

EV(1) = (S + L) / 2 = 112.50

EV(2) = (50 + 200) / 2 = 125.00

The "paradox" is that in your first example, you only look at 50% of the possible outcomes.

Where you picked the larger amount when the amounts were $50, and $100, and the smaller amount when the amounts were $100 and $200.

You never consider when you pick $50, or $200 on the first decision.

instead of EV(2) = (50 + 200) / 2 = 125,

A = 50, switching from 100 to 50
B = 100, switching from 50 to 100
C = 200, switching from 100 to 200
D = 100, switching from 200 to 100

EV(2) = (A + B + C + D) / 4 = 112.50

EV(2) = EV(1)
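
A minimal sketch of this enumeration, assuming the host was equally likely to have prepared the ($50, $100) pair or the ($100, $200) pair and that you open either envelope of the pair at random:

[ CODE ]
# Enumerate the four equally likely (pair, first pick) cases described above.
pairs = [(50, 100), (100, 200)]

first_picks, after_switching = [], []
for small, large in pairs:
    for opened, other in [(small, large), (large, small)]:
        first_picks.append(opened)
        after_switching.append(other)

ev1 = sum(first_picks) / len(first_picks)           # EV of keeping the first envelope
ev2 = sum(after_switching) / len(after_switching)   # EV after always switching
print(ev1, ev2)   # 112.5 112.5 -- switching gains nothing over all four cases
[/ CODE ]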


In the second part:

S and L are unknown values.

The EV(1) of your first choice is (S + L) / 2

The EV(2) of switching is (L + S) / 2

EV(2) = EV(1)

#5 - 12-18-2006, 05:38 AM
Xhad (Senior Member; Join Date: Jul 2005; Location: .25/.50 6max - stars; Posts: 5,289)
Re: Monty Hall-esque question

I did some digging and found this. I'm still not sure it makes sense to me though.

http://www.faculty.ucr.edu/~eschwitz...lopeSimple.htm

#6 - 12-18-2006, 07:56 PM
elindauer (Senior Member; Join Date: Jun 2003; Location: analyzing hand ranges; Posts: 2,966)
Re: Monty Hall-esque question

Hi PairTheBoard,

I read your explanation, and I thought it was interesting. But I had to think about it some more. So I thought... and I thought... and then I understood... but wait, no... so some more thinking...

While I was thinking, I read some of the other descriptions resolving the paradox. They were long. They involved lots of rigorous math. I basically had to take the author's word on many points. I wondered to myself, could the answer really be as simple as what PairTheBoard has written?

And after a while, I came to really understand what you were saying, and the beauty of it dawned on me like the first light of a glorious new day. Wow. "This guy is really smart", I thought to myself, as I stood and marveled at your creation.

But then!...

It suddenly occurred to me that this really elegant description of why the game must be fair is in fact only proving something that everyone already knows! Hell, I can prove the game is fair in one word... "symmetry". QED!

The real "problem" is not to find an explanation of why switching offers no gain... the problem is to find an explanation of why the argument that it does gain something IS WRONG. Everyone, I think, agrees that there is no advantage to switching. The problem is trying to explain why the seemingly sound logic for switching is incorrect.


So, kudos for coming up with a really fantastic description of the two envelopes problem and why there is no advantage to switching. However, to really resolve the paradox, you do have to debunk the other logic (hence the paradox), which means you have to do the math about prior distributions: that no uniform probability distribution on the positive reals integrates to 1, that the priors which make switching always look right have infinite expected value, etc. etc.

-eric

PS. Although I don't think your explanation resolves the paradox, I do still admire the elegance of your explanation and that "this guy is really smart" feeling lives on. Well done.

#7 - 12-19-2006, 01:33 AM
pzhon (Senior Member; Join Date: Mar 2004; Posts: 4,515)
Re: Envelope Paradox

[ QUOTE ]
You open one of the envelopes, and you see a check for $100 [x]. Now, the host offers you the chance to switch to the other envelope. (Again, no game theoretical assumptions about whether or not it is more likely that the game show would have a $200 prize or a $50 prize.) Do you switch?

[/ QUOTE ]
This is the well-known envelope paradox. It is understood by many, but hard to explain. (I like PairTheBoard's attempt, but it may not convince people who won't take the trouble to follow it.) There are attempts to explain it in the rec.puzzles newsgroup archives, and there are articles on it in academic journals (see Richard Thaler's column in the American Economic Review). Mike Caro, a poker theorist, got it mostly right, but misanalyzed part of it. Here are some important points for following discussions:

• The conditional probability that the other envelope holds x/2, given that X=x, can't be assumed to be 1/2. Indeed, you can check that this probability is not 1/2 for simple distributions and many particular values of x.

• For any distribution for the smaller amount, the strategies of always switching and of never switching have the same (possibly infinite) expected value.

• There is no uniform distribution on the positive real numbers, but that does not resolve the paradox. There are random variables with positive probability on all intervals of positive length in R+, such as the one defined by (1/z)-1 where z is uniformly distributed on (0,1).

• There are strategies which are better than always switching or never switching. You can choose a threshold t, and switch if you see x<t, and don't switch if you see x>t. If t is between the values in the envelopes, you make the right choice. Otherwise, you break even, on average. If you choose t randomly to have positive probability on all intervals of R+, then you will choose correctly better than 50% of the time regardless of the distribution. For any distribution with a finite expected value, this strategy will have a greater expected value than that of always switching. (A simulation sketch appears at the end of this post.)

• There exist distributions such that no matter what you see, you should switch. For these distributions, the expected value of always staying is infinite, as is the expected value of always switching. (A worked example follows below.)
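
To make the last point concrete, here is one standard example of such a prior, worked out as a sketch (the specific prior is chosen just for illustration): the envelope pair is (2^k, 2^(k+1)) with probability (1/3)(2/3)^k for k = 0, 1, 2, ...

[ CODE ]
from fractions import Fraction

def p(k):
    """Probability that the envelope pair is (2**k, 2**(k+1))."""
    return Fraction(1, 3) * Fraction(2, 3) ** k

# Seeing x = 1 means you hold the smaller envelope of pair 0, so switching
# gains 1 for certain.  For x = 2**n with n >= 1, condition on the two ways
# of seeing x: pair n-1 with the larger envelope, or pair n with the smaller.
for n in range(1, 6):
    x = 2 ** n
    w_holding_larger = p(n - 1) / 2    # other envelope holds x/2
    w_holding_smaller = p(n) / 2       # other envelope holds 2*x
    prob_other_bigger = w_holding_smaller / (w_holding_smaller + w_holding_larger)
    gain = prob_other_bigger * x - (1 - prob_other_bigger) * Fraction(x, 2)
    print(f"see {x}: P(other bigger) = {prob_other_bigger}, E[switch gain] = {gain}")
    # P(other bigger) = 2/5 and the expected gain is x/10 > 0 for every x,
    # yet the unconditional expected value of either envelope is infinite.
[/ CODE ]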

These have all been proven rigorously, but people still argue because the only way these are intuitive is if you have studied this before. Math is hard.
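
For the randomized-threshold point above, a simulation sketch (the ($100, $200) pair and the exponential threshold are arbitrary choices for illustration; any threshold distribution that is positive on all of R+ works):

[ CODE ]
import random

random.seed(1)

small, large = 100.0, 200.0      # the host's (hidden) pair of amounts
trials = 200_000
kept_larger = 0

for _ in range(trials):
    opened, other = random.choice([(small, large), (large, small)])
    t = random.expovariate(1.0 / 150.0)        # random threshold, mean 150
    final = other if opened < t else opened    # switch only below the threshold
    kept_larger += (final == large)

print(f"ended with the larger envelope in {kept_larger / trials:.3f} of trials")
# Always switching or never switching gives 0.500; the threshold rule does
# better because whenever t lands between the two amounts it always ends up
# with the larger one.
[/ CODE ]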

#8 - 12-19-2006, 12:14 PM
AaronBrown (Senior Member; Join Date: May 2005; Location: New York; Posts: 2,260)
Re: Envelope Paradox

[ QUOTE ]
These have all been proven rigorously, but people still argue because the only way these are intuitive is if you have studied this before. Math is hard.

[/ QUOTE ]
I have great respect for pzhon and PairTheBoard, but this one statement is not only dead wrong, but very dangerous.

Skill in math can be good, but it often induces a special kind of blindness. Problems are simplified so they fit in neat mathematical frameworks, then some mathematicians can only see the math, not the original problem. This leads to people insisting they have the only rational solution to a problem, when what they should say is, they have a rational solution to a simplified problem which may or may not give insight into the real problem.

In this case, it's very easy to come up with mathematical formulations that justify always switching, and formulations that argue you shouldn't always switch. It's easy to come up with real-world examples where each type of formulation is a good model. That's why we call this a paradox. Explaining one side or the other, either in rigorous math or exhaustive words, misses the point. The original problem already did that better.

People who try to "resolve" or "explain" paradoxes don't understand them. No one believes both sides are true simultaneously. The point is to understand both arguments, not to decide which one is right but to see the limits of each. They can't both be true all the time in full generality. To learn from this paradox, you have to understand the force of both sides, and think about which one to apply in different situations.

The paradox was invented by Belgian mathematician Maurice Kraitchik. He was not an idiot. He was good at hard math. Yet he still saw the force of both arguments. He introduced it because both arguments are used all the time both by statisticians and in informal reasoning. Both are valuable tools in some situations. But it took this example to get people to admit that neither argument is universal, that you have to be careful using either one.

Sure, you can redefine the problem to bring it into the realm where one or the other argument is stronger. That's easy. The hard part is to define the precise border between the realms.

#9 - 12-20-2006, 08:18 PM
elindauer (Senior Member; Join Date: Jun 2003; Location: analyzing hand ranges; Posts: 2,966)
Re: Envelope Paradox

[ QUOTE ]
I have great respect for pzhon and PairTheBoard, but this one statement is not only dead wrong, but very dangerous.

[/ QUOTE ]

His statement that all of his points have been proven rigorously is dead wrong? I'm surprised to hear you say this, since when I read his post I thought... yup, that sums it up pretty well.

Note that his points still leave open the possibility that switching is always right... and he even defines the border between the strategies (switching is always right when the probability function for selecting the amounts has infinite expected value).

Probably I'm just missing something, so I'm curious, which part of pzhon's post was incorrect?


I am sincerely curious and hope this won't come off as some kind of attempt to poke at you. I bow before your phenomenal mathematical talents. In fact, if it were some random person, I'd probably just blow it off figuring "ok, that guy doesn't know what he's talking about", and it's precisely because it is you, and I can't possibly make that argument, that I post this question.


thanks,
eric

edit: perhaps you mean that "well understood mathematically" should not be mistaken for "knows what to do when presented with this problem in real life" or "not interesting any more". I personally read pzhon to say only the first thing and not the second two, but maybe you read it differently?

#10 - 12-20-2006, 10:47 PM
PairTheBoard (Senior Member; Join Date: Dec 2003; Posts: 3,460)
Re: Monty Hall-esque question

[ QUOTE ]
It suddenly occurred to me that this really elegant description of why the game must be fair is in fact only proving something that everyone already knows! Hell, I can prove the game is fair in one word... "symmetry". QED!

The real "problem" is not to find an explanation of why switching offers no gain... the problem is to find an explanation of why the argument that it does gain something IS WRONG. Everyone, I think, agrees that there is no advantage to switching. The problem is trying to explain why the seemingly sound logic for switching is incorrect.

[/ QUOTE ]

I'm not giving the symmetry argument, and I believe I am giving an explanation for why the EV calculation is incorrect. If you buy the insight that switching is equivalent to betting half your envelope amount at 2-1 odds that yours is the smaller envelope, then you should ask: why isn't that a +EV bet? Looked at that way, the incorrect EV calc works exactly the same. If it's a $10 envelope, you are betting $5 at 2-1 that it's the smaller envelope. Why isn't the EV on that bet 2(5)(.5) - (5)(.5) = +$2.50? It's the same EV you get looking at it as switching envelopes.

The answer I'm giving is, I believe, an important principle. If you could bet any amount of your choosing on these terms, the EV calc would be fine based on the given envelopes. But you are not allowed to bet any amount of your choosing. You are required to bet an amount dictated by the already-determined outcome.

You can see this easily with the deck of red and black cards. You can bet your card is red and get paid 2-1 odds. If you could bet any amount you wanted, you would indeed have +$2.50 EV on every $5 you bet. But in my scenario you are not allowed to bet whatever you want. If it's a red card you must bet $10, and if it's a black card you must bet $50.

Being required to bet an amount that is being dictated by the outcome changes the EV calculation. That's the principle. It has nothing inherently to do with symmetry.
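
A tiny sketch of that principle, using the $10 envelope and the fair $10/$20 pair from the card example above:

[ CODE ]
# Your envelope is equally likely to be the $10 or the $20 of a fixed pair.
cases = [(10, 20), (20, 10)]    # (your envelope, the other envelope)

# Free choice of stake: bet a fixed $5 at 2-1 that yours is the smaller one.
fixed = sum(2 * 5 if mine < other else -5 for mine, other in cases) / 2
print("fixed $5 stake:", fixed)                  # +2.5, the naive calculation

# Forced stake: you must bet half of whatever your envelope holds,
# i.e. $5 in the case you win but $10 in the case you lose.
forced = sum(2 * (mine / 2) if mine < other else -(mine / 2) for mine, other in cases) / 2
print("stake dictated by the outcome:", forced)  # 0.0, switching is fair
[/ CODE ]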


This is, I believe, the simple principle most people are missing when they incorrectly calculate the EV. If you want to go into the more esoteric observations about possible (and impossible) prior distributions for the amounts in the envelopes, then pzhon is of course correct on all his points, the most obvious being that no uniform distribution exists on the positive real numbers. But you don't have to go into prior distributions to see the principle I gave above.

PairTheBoard
Reply With Quote