The Nash Equilibrium and the traveller's dilemma


Galwegian
06-21-2007, 01:14 PM
For anyone who thinks that the Nash equilibrium is the nuts - check out the following excellent Scientific American article:

Traveller's Dilemma (http://www.sciam.com/article.cfm?chanID=sa006&colID=1&articleID=7750A576-E7F2-99DF-3824E0B1C2540D47)

or if you can't access that article you can look at the wikipedia page

wiki - traveller's dilemma (http://en.wikipedia.org/wiki/Traveler's_dilemma)

I'm curious - can anyone think of a poker example that would illustrate the same concept as the traveller's dilemma?

djames
06-21-2007, 01:27 PM
This article doesn't argue that the Nash equilibrium isn't the nuts. In fact, it reiterates the fact that it is the nuts when playing against an opponent also playing optimally.

When opponents play suboptimally, it's widely accepted that playing an optimal strategy isn't necessarily a maximal strategy. Thus, strategies other than the NE choice of $2 in the traveller's dilemma certainly can have higher payoffs than the NE choice. However, such a strategy would be suboptimal and hence exploitable.

So gauging the strategy of your opponent is always paramount when playing a suboptimal (or maximal) strategy.

Galwegian
06-21-2007, 02:24 PM
[ QUOTE ]
This article doesn't argue that the Nash equilibrium isn't the nuts. In fact, it reiterates the fact that it is the nuts when playing against an opponent also playing optimally.

When opponents play suboptimally, it's widely accepted that playing an optimal strategy isn't necessarily a maximal strategy. Thus, strategies other than the NE choice of $2 in the traveller's dilemma certainly can have higher payoffs than the NE choice. However, such a strategy would be suboptimal and hence exploitable.

So gauging the strategy of your opponent is always paramount when playing a suboptimal (or maximal) strategy.

[/ QUOTE ] I think it does argue that NE is not the nuts. It argues that there is a difference between rational and optimal (in the game-theoretic sense).

djames
06-21-2007, 02:35 PM
Optimal has a mathematical meaning. Rational does not.

What may be "rational" to a game player who wants to play an unexploitable strategy would be the NE strategy (or other equilibrium strategies).

What may be rational to a game player who doesn't care if they can be exploited but rather cares to maximally exploit the weaknesses of their opponent may be a strategy that isn't the NE.

Either way, both players are playing "the nuts" when they have their goal in mind. I don't think rigid game theory accepts the term "rational" as it is not mathematical but rather subjective.

tolbiny
06-21-2007, 02:42 PM
[ QUOTE ]
[ QUOTE ]
This article doesn't argue that the Nash equilibrium isn't the nuts. In fact, it reiterates the fact that it is the nuts when playing against an opponent also playing optimally.

When opponents play suboptimally, it's widely accepted that playing an optimal strategy isn't necessarily a maximal strategy. Thus, strategies other than the NE choice of $2 in the traveller's dilemma certainly can have higher payoffs than the NE choice. However, such a strategy would be suboptimal and hence exploitable.

So gauging the strategy of your opponent is always paramount when playing a suboptimal (or maximal) strategy.

[/ QUOTE ] I think it does argue that NE is not the nuts. It argues that there is a difference between rational and optimal (in the game-theoretic sense).

[/ QUOTE ]

Basically it argues that optimal play can only occur with full knowledge of how player #2 will play. Optimal play if you know that your opponent will pick $100 is to pick $99; they, knowing this, should pick $98, and so on all the way down to $2. But when you don't know how your opponent is going to play, this backwards induction doesn't work, since you don't expect your opponent to work their way all the way back to $2.
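
Here's a rough sketch of that undercutting logic in Python (just an illustration using the standard $2 bonus/penalty from the article; none of this code comes from the article itself):

# Traveller's Dilemma payoff rule: both claim an amount from 2 to 100; both are
# paid the lower claim, with a $2 bonus to the low claimer and a $2 penalty to
# the high claimer.
def payoff(mine, theirs, bonus=2):
    if mine == theirs:
        return mine
    low = min(mine, theirs)
    return low + bonus if mine < theirs else low - bonus

def best_response(theirs, low=2, high=100):
    # My best claim if I somehow knew the other traveller's claim in advance.
    return max(range(low, high + 1), key=lambda mine: payoff(mine, theirs))

print(best_response(100))  # 99 - undercut by one
print(best_response(99))   # 98 - and so on, all the way down
print(best_response(2))    # 2 - nothing left to undercut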

Galwegian
06-21-2007, 03:02 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
This article doesn't argue that the Nash equilibrium isn't the nuts. In fact, it reiterates the fact that it is the nuts when playing against an opponent also playing optimally.

When opponents play suboptimally, it's widely accepted that playing an optimal strategy isn't necessarily a maximal strategy. Thus, strategies other than the NE choice of $2 in the traveller's dilemma certainly can have higher payoffs than the NE choice. However, such a strategy would be suboptimal and hence exploitable.

So gauging the strategy of your opponent is always paramount when playing a suboptimal (or maximal) strategy.

[/ QUOTE ] I think it does argue that NE is not the nuts. It argues that there is a difference between rational and optimal (in the game-theoretic sense).

[/ QUOTE ]

Basically it argues that optimal play can only occur with full knowledge of how player #2 will play. Optimal play if you know that your opponent will pick $100 is to pick $99; they, knowing this, should pick $98, and so on all the way down to $2. But when you don't know how your opponent is going to play, this backwards induction doesn't work, since you don't expect your opponent to work their way all the way back to $2.

[/ QUOTE ] I understand the logic of the arguments presented in the article quite well. However, you are missing (I think) the deeper point of the article, which is the dichotomy between optimal and rational - the interesting part of the article is towards the end:

"What is interesting is that this rejection of formal rationality and logic has a kind of meta-rationality attached to it. If both players follow this meta-rational course, both will do well. The idea of behavior generated by rationally rejecting rational behavior is a hard one to formalize. But in it lies the step that will have to be taken in the future to solve the paradoxes of rationality that plague game theory and are codified in Traveler's Dilemma. "

I am just wondering if there are examples of this in poker - that is, situations where two opponents who understand optimal strategy might choose to reject it on the basis of this "meta-rationality" referred to in the article? This has nothing to do with exploitive play against a suboptimal opponent.

Galwegian
06-21-2007, 03:07 PM
[ QUOTE ]
Optimal has a mathematical meaning. Rational does not.

[/ QUOTE ] And this is exactly the point. Indeed the author himself refers to the difficulty of formalizing the notion of rationality.

djames
06-21-2007, 03:29 PM
[ QUOTE ]
[ QUOTE ]
Optimal has a mathematical meaning. Rational does not.

[/ QUOTE ] And this is exactly the point. Indeed the author himself refers to the difficulty of formalizing the notion of rationality.

[/ QUOTE ]

Good, then you can understand the difference between an optimal strategy and a maximal strategy. You can also understand the idea that if we knew the second player's number, the optimal & maximal strategies are known and agree (e.g. if their number is 100, ours is 99). But, when the other player's number is unknown, the optimal & maximal strategies (probably*) disagree. The author uses "rational" when speaking of a maximal strategy, not an optimal one. He lucks out that they are the same in the case where we know the other player's action.

* I say probably because I haven't seen the derivation of the maximal strategy and I don't care to figure one out.

Galwegian
06-21-2007, 03:36 PM
[ QUOTE ]

Good, then you can understand the difference between an optimal strategy and a maximal strategy. You can also understand the idea that if we knew the second player's number, the optimal & maximal strategies are known and agree (e.g. if their number is 100, ours is 99). But, when the other player's number is unknown, the optimal & maximal strategies (probably*) disagree. The author uses "rational" when speaking of a maximal strategy, not an optimal one. He lucks out that they are the same in the case where we know the other player's action.

* I say probably because I haven't seen the derivation of the maximal strategy and I don't care to figure one out.

[/ QUOTE ] I do not know the definition of "maximal strategy". I do know the definition of "optimal strategy". Can you tell me the definition of "maximal"? (I am a mathematician, but not a game theorist)

djames
06-21-2007, 03:45 PM
I suck with Google, but typing in

definition "maximal strategy" "game theory"

gives hits from Cornell and Jstor at the top. Pick your poison.

In my limited experience most people use the term optimal when they are really thinking about a maximal strategy and most people don't actually know what an optimal strategy is. It's usually not the correct strategy to play in games with opponents that are casual gamers.

CallMeIshmael
06-21-2007, 04:33 PM
Off the top of my head, I'd say you might have problems finding a very good poker equivalent to the situation in the TD, since poker is zero sum and the TD isn't, AND the weird results of the TD are specifically because of the fact that it isn't zero sum.

gumpzilla
06-21-2007, 04:37 PM
[ QUOTE ]
Off the top of my head, I'd say you might have problems finding a very good poker equivalent to the situation in the TD, since poker is zero sum and the TD isn't, AND the weird results of the TD are specifically because of the fact that it isn't zero sum.

[/ QUOTE ]

I'm surprised you commented in this thread without bringing up the epic OOT thread on this problem that you started. I thought of it when I saw this Scientific American article.

CallMeIshmael
06-21-2007, 04:42 PM
[ QUOTE ]
[ QUOTE ]
Off the top of my head, I'd say you might have problems finding a very good poker equivalent to the situation in the TD, since poker is zero sum and the TD isn't, AND the weird results of the TD are specifically because of the fact that it isn't zero sum.

[/ QUOTE ]

I'm surprised you commented in this thread without bringing up the epic OOT thread on this problem that you started. I thought of it when I saw this Scientific American article.

[/ QUOTE ]

I looked for it, but couldn't find a link.

Do you have it?

gumpzilla
06-21-2007, 04:52 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Off the top of my head, I'd say you might have problems finding a very good poker equivalent to the situation in the TD, since poker is zero sum and the TD isn't, AND the weird results of the TD are specifically because of the fact that it isn't zero sum.

[/ QUOTE ]

I'm surprised you commented in this thread without bringing up the epic OOT thread on this problem that you started. I thought of it when I saw this Scientific American article.

[/ QUOTE ]

I looked for it, but couldn't find a link.

Do you have it?

[/ QUOTE ]

Took a little bit of searching, but here (http://forumserver.twoplustwo.com/showflat.php?Cat=0&Number=4384737&page=0&fpart=1&vc=1) it is.

PairTheBoard
06-21-2007, 05:02 PM
[ QUOTE ]
I don't think rigid game theory accepts the term "rational" as it is not mathematical but rather subjective.


[/ QUOTE ]

For me, this is the most interesting point here. The idea that what's "rational" is subjective. Such a notion puts the lie to the dominant attitude of many posters on this Forum. It's the arrogant attitude that what's rational can always be determined and that they have the ability to make that determination and dictate it to the rest of us.

PairTheBoard

Piers
06-21-2007, 05:04 PM
The confusion appears to be that the dilemma assumes (while stating it does not) that the travellers are playing against each other, whereas in the story they are not playing a game but are just both trying to get as much money as possible.

If your aim is to beat your opponent, obviously you choose $2; if you just want a load of dosh and don’t care about the other guy, you choose something around $100.

I don’t think it’s a paradox, just a case of the model not really fitting - anyone choosing $2 for real would be nuts.

CallMeIshmael
06-21-2007, 05:21 PM
[ QUOTE ]
The confusion appears to be that the dilemma assumes (while stating it does not) that the travellers are playing against each other, whereas in the story they are not playing a game but are just both trying to get as much money as possible.

[/ QUOTE ]

The solution simply assumes that each player tries to make the most money possible.

Piers
06-21-2007, 05:55 PM
[ QUOTE ]
[ QUOTE ]
The confusion appears to be that the dilemma assumes (while stating it does not) that the travellers are playing against each other, whereas in the story they are not playing a game but are just both trying to get as much money as possible.

[/ QUOTE ]


The solution simply assumes that each player tries to make the most money possible.

[/ QUOTE ]

What I guess I am saying is that the model does not match the story about the travellers.

vhawk01
06-21-2007, 05:56 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Off the top of my head, I'd say you might have problems finding a very good poker equivalent to the situation in the TD, since poker is zero sum and the TD isn't, AND the weird results of the TD are specifically because of the fact that it isn't zero sum.

[/ QUOTE ]

I'm surprised you commented in this thread without bringing up the epic OOT thread on this problem that you started. I thought of it when I saw this Scientific American article.

[/ QUOTE ]

I looked for it, but couldn't find a link.

Do you have it?

[/ QUOTE ]

Took a little bit of searching, but here (http://forumserver.twoplustwo.com/showflat.php?Cat=0&Number=4384737&page=0&fpart=1&vc=1) it is.

[/ QUOTE ]

Man I love that thread. I've linked it in SMP at least one other time. Everyone who posts here should read the entire thing at least once. If nothing else, it is a great insight into human suckitude.

tolbiny
06-21-2007, 08:27 PM
[ QUOTE ]
[ QUOTE ]
The confusion appears to be that the dilemma assumes (while stating it does not) that the travellers are playing against each other, whereas in the story they are not playing a game but are just both trying to get as much money as possible.

[/ QUOTE ]

The solution simply assumes that each player tries to make the most money possible.

[/ QUOTE ]

What a strange game, the only way to win seems to be not to play

http://www.cennydd.co.uk/uploaded_images/Wargames1-797127.gif

borisp
06-22-2007, 12:22 AM
[ QUOTE ]
[ QUOTE ]
The confusion appears to be that the dilemma assumes (while stating it does not) that the travellers are playing against each other, whereas in the story they are not playing a game but are just both trying to get as much money as possible.

[/ QUOTE ]

The solution simply assumes that each player tries to make the most money possible.

[/ QUOTE ]
This is the truth, but not quite the "whole truth." The whole truth would add that each player KNOWS that their opponent is perfectly rational, is seeking the same goal, and that their opponent knows this of them. Then the solution follows.

The marginal utility of a guaranteed $2 pales in comparison to the marginal utility of a possible $100. This is why "normal" people would choose 100, and why they would expect other normal people to do so. A rational agent can "beat" this by playing 99, but has gained only $1 relative to the "stable" strategy of both players playing $100. In real life, we make these decisions as if we plan to play the game over and over, and a player who consistently played 99, in an effort to get this tiny edge, would most likely not get invited to play this positive sum game very often.

Hence we see how it relates to poker: the "reasonable" choice is the one that keeps the game going and is profitable for everybody. The "rational" choice is one that is unexploitable, but it also ensures no one will want to play against you.

CallMeIshmael
06-22-2007, 12:44 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
The confusion appears to be that the dilemma assumes (while stating it does not) that the travellers are playing against each other, whereas in the story they are not playing a game but are just both trying to get as much money as possible.

[/ QUOTE ]

The solution simply assumes that each player tries to make the most money possible.

[/ QUOTE ]
This is the truth, but not quite the "whole truth." The whole truth would add that each player KNOWS that their opponent is perfectly rational, is seeking the same goal, and that their opponent knows this of them. Then the solution follows.

[/ QUOTE ]

yeah, very true


I'm not sure if you read the linked thread, but, in it, a lot of people made the mistake of thinking "the solution assumes they are trying to beat each other", when the solution most certainly does not assume that, and I just wanted to point that out.

borisp
06-22-2007, 12:55 AM
[ QUOTE ]
...I'm not sure if you read the linked thread...

[/ QUOTE ]
I stopped after about 4 pages, to stifle the indigestion :)

iggymcfly
06-22-2007, 04:18 AM
I think this is kind of stupid. Why is it "rational" to assume that your opponent is going to try to use 98th level thinking on you? And to risk a potential $99 to make $2 when he does?

It seems to me like the maximal solution is to assume that a lot of people will play 100 and pick 99, which maximizes your expectation most often in the real world, while the optimal unexploitable solution would be to assume that your opponent is equally likely to pick any number and pick either 96 or 97, since those numbers have the highest sum totals against all possibilities.

Picking 2 is stupid since it only does best against 2 and 3 and is the worst choice you can pick against any other number.
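
Out of curiosity, here's a quick sanity check of that "highest sum against all possibilities" claim, assuming the standard $2 bonus/penalty and an opponent who picks uniformly at random from 2-100 (the code is just an illustration, not from the article):

def payoff(mine, theirs, bonus=2):
    if mine == theirs:
        return mine
    low = min(mine, theirs)
    return low + bonus if mine < theirs else low - bonus

choices = range(2, 101)
# Expected payoff of each claim versus a uniformly random opponent.
ev = {m: sum(payoff(m, t) for t in choices) / len(choices) for m in choices}
print(sorted(choices, key=ev.get, reverse=True)[:4])  # 96 and 97 tie at the top
print(round(ev[96], 2), round(ev[2], 2))              # roughly 49.08 vs 3.98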

borisp
06-22-2007, 04:26 AM
[ QUOTE ]
I think this is kind of stupid. Why is it "rational" to assume that your opponent is going to try to use 98th level thinking on you? And to risk a potential $99 to make $2 when he does?

It seems to me like the maximal solution is to assume that a lot of people will play 100 and pick 99, which maximizes your expectation most often in the real world, while the optimal unexploitable solution would be to assume that your opponent is equally likely to pick any number and pick either 96 or 97, since those numbers have the highest sum totals against all possibilities.

Picking 2 is stupid since it only does best against 2 and 3 and is the worst choice you can pick against any other number.

[/ QUOTE ]
Before you start reasoning, think about whether the opposite side will hear your reasoning.

Galwegian
06-22-2007, 06:00 AM
[ QUOTE ]
I suck with Google, but typing in

definition "maximal strategy" "game theory"

gives hits from Cornell and Jstor at the top. Pick your poison.

In my limited experience most people use the term optimal when they are really thinking about a maximal strategy and most people don't actually know what an optimal strategy is. It's usually not the correct strategy to play in games with opponents that are casual gamers.

[/ QUOTE ]
OK - so according to the University of Alberta game theory pages, a maximal strategy is what poker players normally refer to as an exploitive strategy (I suppose I should say maximally exploitive strategy). I definitely understand the difference between these concepts.

[ QUOTE ]
You can also understand that the idea that if we knew the second player's number, the optimal & maximal strategies are known and agree (e.g if their number is 100, ours is 99).

[/ QUOTE ] This doesn't make any sense to me. The term optimal specifically refers to the situation where you assume that the opponent plays perfectly, i.e. unexploitably. If you know that your opponent's number is 100 (for example), then it makes no sense to talk about an optimal strategy, hence it makes no sense to wonder whether it coincides with the maximal strategy.

Galwegian
06-22-2007, 06:09 AM
[ QUOTE ]
Off the top of my head, I'd say you might have problems finding a very good poker equivalent to the situation in the TD, since poker is zero sum and the TD isn't, AND the weird results of the TD are specifically because of the fact that it isn't zero sum.

[/ QUOTE ]
Two observations. Tournament poker is not zero sum. There are many situations (for example) where there are just two players left in a hand, and where their strategies will result in changes of equity for the players not in the hand. Easy example - say there are 4 players left in a standard STT (3 places paid) and it is folded to the small blind. Now we have a heads up between the blinds that is not a zero sum game. In fact, virtually every tournament situation where some players have folded their hands results in a non zero sum game between the remaining players.

Second, it is not clear to me at all that the non-zero-sum nature of TD has anything to do with the surprising Nash equilibrium.

Piers
06-22-2007, 08:28 AM
[ QUOTE ]
This is the truth, but not quite the "whole truth." The whole truth would add that each player KNOWS that their opponent is perfectly rational, is seeking the same goal, and that their opponent knows this of them. Then the solution follows.


[/ QUOTE ]

So what does perfectly rational mean here? I am fairly sure no human is ‘perfectly rational’, so how can a perfectly rational entity assume a human opponent is perfectly rational?

I believe in any real situation that the proportion of people that would choose $2 is so small that it would be bad judgement to assume that your opponent is one of them. It’s possible one of the individuals is a game theory geek who thinks looking clever to himself is what’s important and is not really thinking about the other person, and the other person knows it (but even so I would choose $99 or $100 just in case I am wrong).

Of course if the units were billions of dollars, not dollars, then I would choose $2 billion, but that’s something else.

Approaching this only as a mathematical toy, the fact that it talks about rational humans when no such entity exists is fine. However, when you start sounding like you are talking about real people in the real world, you cannot be so blasé about the existence of these curiously defined rational humans.

CallMeIshmael
06-22-2007, 02:42 PM
[ QUOTE ]
the optimal unexploitable solution would be to assume that your opponent is equally likely to pick any number and pick either 96 or 97 since those numbers have the highest sum totals against all possibilities.

[/ QUOTE ]

Do you know what "unexploitable" means in the GT sense?

Because neither of those choices is.

wazz
06-22-2007, 02:52 PM
I hate to point this out, but if the objective is to 'make as much money as possible' you should choose $99 or $100. The solution can only be $2 if the objective is to make more money than the other person.

On further thinking, $2 couldn't be the right answer either - assuming the other guy picks $2 as well, which is a 'safe' assumption given all this reasoning, you can make $3 by choosing $1. EDIT: oh, or maybe not.

wazz
06-22-2007, 02:56 PM
It gets much more interesting when you read this far.

'Game theorists have made a number of attempts to explain why a lot of players do not choose the Nash equilibrium in TD experiments. Some analysts have argued that many people are unable to do the necessary deductive reasoning and therefore make irrational choices unwittingly. This explanation must be true in some cases, but it does not account for all the results, such as those obtained in 2002 by Tilman Becker, Michael Carter and Jörg Naeve, all then at the University of Hohenheim in Germany. In their experiment, 51 members of the Game Theory Society, virtually all of whom are professional game theorists, played the original 2-to-100 version of TD. They played against each of their 50 opponents by selecting a strategy and sending it to the researchers. The strategy could be a single number to use in every game or a selection of numbers and how often to use each of them. The game had a real-money reward system: the experimenters would select one player at random to win $20 multiplied by that player's average payoff in the game. As it turned out, the winner, who had an average payoff of $85, earned $1,700.'

wazz
06-22-2007, 03:00 PM
'Many of us do not feel like letting down our fellow traveler to try to earn only an additional dollar, and so we choose 100 even though we fully understand that, rationally, 99 is a better choice for us as individuals.'

But this must be where game theory goes wrong; the aim of choosing 100 or 99 has nothing to do with 'letting down our fellow traveller' but with assuming your fellow traveller is rational and wants to give himself as much of a chance of making money as possible. If you assume the other traveller is rational, it is decidedly +EV to choose 99 or 100, because it makes no sense for the traveller to give up on the actual reward of 98-101 just to get one over on the other guy and collect $4 or even $2.

vhawk01
06-22-2007, 03:01 PM
[ QUOTE ]
I hate to point this out, but if the objective is to 'make as much money as possible' you should choose $99 or $100. The solution can only be $2 if the objective is to make more money than the other person.

On further thinking, $2 couldn't be the right answer either - assuming the other guy picks $2 as well, which is a 'safe' assumption given all this reasoning, you can make $3 by choosing $1.

[/ QUOTE ]

No. Explain how 99 makes you the most money. Pick 99 and pray he picks 100?

CallMeIshmael
06-22-2007, 03:05 PM
[ QUOTE ]
Two observations. Tournament poker is not zero sum. There are many situations (for example) where there are just two players left in a hand, and where their strategies will result in changes of equity for the players not in the hand. Easy example - say there are 4 players left in a standard STT (3 places paid) and it is folded to the small blind. Now we have a heads up between the blinds that is not a zero sum game. In fact, virtually every tournament situation where some players have folded their hands results in a non zero sum game between the remaining players.

[/ QUOTE ]

Well, for starters, the equity doesn't change, so, yes, this is still a constant-sum game, which is a modified form of a zero-sum game. Let's say that 10 players start a 10 dollar SNG (ignore rake for simplicity's sake). No matter what the chip stacks, the total equity for all players must be 100. So, it might start off as {10,10,10,...,10}, and later become {60,40,0,0,...,0}, but no matter what, the total equity doesn't change.

Now, I can't remember the name of the theorem, but there is a theorem that states that you can subtract/add a constant to every payoff and not change the nature of the game, so subtracting 10 from all payoffs in this game transforms the above example payoffs into {0,0,0,...,0} and {50,30,-10,-10,...,-10}, and we have a zero-sum game.

Beyond that, I guess this all depends on how closely you want the situations to match before they are analogous, but the situation you just described has more than 2 players, whereas the TD does not. If you want to make it a 3-player game, where two players can collude to get all the payoff out of a third player, then you can, perhaps, find a good equivalent situation to the concept of the TD. But that is changing things around quite a bit, imo.



[ QUOTE ]
Second, it is not clear to me at all that the non-zero-sum nature of TD has anything to do with the surprising Nash equilibrium.

[/ QUOTE ]

It has everything to do with it.

The whole point of the TD is that two players can cooperate to achieve a much higher payoff than they would get at the Nash equilibrium.

In any two-player zero-sum game, the payoffs are in the form {X,-X}, and there is no way for players to collude such that BOTH payoffs are higher than all NEs, since, by definition, if one is higher, the other must be lower.

Beyond that, this can be extended to all N-player zero-sum games, since if (a1, a2, a3, ..., aN) are the payoffs at the NE and sum to 0, it is impossible for all payoffs to be increased and still sum to 0.
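
To make that concrete, here's a small illustrative check (mine, not from the article) that the TD payoffs are not constant-sum, which is exactly what lets both players beat the Nash payoff of (2,2):

def td_payoffs(a, b, bonus=2):
    # Payoffs to (player A, player B) under the standard $2 bonus/penalty rule.
    if a == b:
        return a, a
    low = min(a, b)
    return (low + bonus, low - bonus) if a < b else (low - bonus, low + bonus)

print(td_payoffs(2, 2))      # (2, 2)     - the Nash equilibrium
print(td_payoffs(100, 100))  # (100, 100) - BOTH players do better than at the NE
# The totals differ (4 vs 200), so the game is not constant-sum; in a zero- or
# constant-sum game no outcome can improve both players' payoffs at once.
print(sum(td_payoffs(2, 2)), sum(td_payoffs(100, 100)))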

wazz
06-22-2007, 03:07 PM
Well I'm pretty sure I'm right, but obviously that 2nd paragraph was wrong.

Aside from 'Mathematics of Poker' I couldn't really say I'm in any way versed or educated in game theory, so I'm not saying I am right in a game-theoretical sense here. I'm saying that it doesn't make sense to choose $2 - the fact that $100 is dominated by $99 is an issue, but to resolve that you just take $99 instead; taking $98, via that logic, is equivalent to choosing $100 in the first place, as that's the maximum payout you can hope to achieve. $99 'dominates' (in quotes because it's possible I'm not getting some of the subtleties of that word) both these choices.

CallMeIshmael
06-22-2007, 03:09 PM
[ QUOTE ]
I hate to point this out, but if the objective is to 'make as much money as possible' you should choose $99 or $100. The solution can only be $2 if the objective is to make more money than the other person.

[/ QUOTE ]

Again, this is just mistaken.


Also, if you think picking 100 will give you the 'most money possible', can you tell me the number that the opponent played that made 100 such a good choice?

CallMeIshmael
06-22-2007, 03:12 PM
[ QUOTE ]
Well I'm pretty sure I'm right, but obviously that 2nd paragraph was wrong.

Aside from 'Mathematics of Poker' I couldn't really say I'm in any way versed or educated in game theory, so I'm not saying I am right in a game-theoretical sense here. I'm saying that it doesn't make sense to choose $2 - the fact that $100 is dominated by $99 is an issue, but to resolve that you just take $99 instead; taking $98, via that logic, is equivalent to choosing $100 in the first place, as that's the maximum payout you can hope to achieve. $99 'dominates' (in quotes because it's possible I'm not getting some of the subtleties of that word) both these choices.

[/ QUOTE ]

99 does not dominate 98, since, IF my opponent chooses 99, 98 is better.

And, since you just said that you would resolve the issue of 100 being dominated by playing 99 instead, 98 looks pretty attractive, no?

CallMeIshmael
06-22-2007, 03:16 PM
All,

instead of the possible numbers being $2-$100, imagine instead that you are playing the game with a magic genie, who is going to let you write down $200 billion - $10000 billion, in increments of $100 billion (ie, same game, but with much larger payoffs), what do you write?

EDIT: I just saw piers' post where he brought up the same thing

vhawk01
06-22-2007, 03:20 PM
[ QUOTE ]
All,

instead of the possible numbers being $2-$100, imagine instead that you are playing the game with a magic genie, who is going to let you write down $200 billion - $10000 billion, in increments of $100 billion (ie, same game, but with much larger payoffs), what do you write?

EDIT: I just saw piers' post where he brought up the same thing

[/ QUOTE ]

No doubt that the paltry $2 payoff emotionally skews people's knee-jerk impressions of the problem. 2 is practically the same as 0 to most adults who have jobs, so it can't possibly be correct...ANYTHING is better than 0!

CallMeIshmael
06-22-2007, 03:26 PM
[ QUOTE ]
[ QUOTE ]
All,

instead of the possible numbers being $2-$100, imagine instead that you are playing the game with a magic genie, who is going to let you write down $200 billion - $10000 billion, in increments of $100 billion (ie, same game, but with much larger payoffs), what do you write?

EDIT: I just saw piers' post where he brought up the same thing

[/ QUOTE ]

No doubt that the paltry $2 payoff emotionally skews people's knee-jerk impressions of the problem. 2 is practically the same as 0 to most adults who have jobs, so it can't possibly be correct...ANYTHING is better than 0!

[/ QUOTE ]

This is EXACTLY why I asked the question.

Inherent in a lot of answers is the assumption that $2 is negligible.

Which, don't get me wrong, for most intents and purposes, it is. But it is a very slight irrationality that is exploited in this problem.

wazz
06-22-2007, 03:27 PM
I say again, 99 is optimal.

The problem here is that game theory is butting its head in where it's not needed. This is a simple maths/logic question.

Every number is dominated by the number 1 lower. However, it dominates every choice 2 numbers lower; this is because the aim is to make money rather than make more money than the other guy. You don't even need to appeal to your sense of marginal utility to get this, I feel.

Maybe this can prove my point. If both agents were fully rational, they need not be able to communicate, but would have the same inner dialogue with each other, i.e. there's no need to screw each other over when we can both take $100 guaranteed. The extra $1 is not worth it when you take into account the possibility the other guy will take this strategy down to the wire.

Consider the paragraph about the game theorists I quoted from the example above; if the reward were directly proportional to the amount of money you earned, the result would have nothing to do with game theory. As it is, one is rewarded for how much more than everyone else you earned rather than absolute amounts.

I agree that game theory is needed when the aim is to beat the other guy; as it is, if you're just trying to make +EV decisions, take $99 or $100.

wazz
06-22-2007, 03:28 PM
[ QUOTE ]

This is EXACTLY why I asked the question.

Inherent in a lot of answers is the assumption that $2 is negligible.

Which, don't get me wrong, for most intents and purposes, it is. But it is a very slight irrationality that is exploited in this problem.

[/ QUOTE ]

Ever heard of marginal utility? Nothing irrational about it.

CallMeIshmael
06-22-2007, 03:31 PM
[ QUOTE ]
[ QUOTE ]

This is EXACTLY why I asked the question.

Inherent in a lot of answers is the assumption that $2 is negligible.

Which, don't get me wrong, for most intents and purposes, it is. But it is a very slight irrationality that is exploited in this problem.

[/ QUOTE ]

Ever heard of marginal utility? Nothing irrational about it.

[/ QUOTE ]

Of course I have. But this isn't it.

Do you agree that the assumption that $2 is the same as $0 is irrational, if only slightly?

wazz
06-22-2007, 03:36 PM
I agree slightly. Put £5 in the changebox of a coke machine and I pick it up. Put 5 pence on the floor of a train station and I don't.

vhawk01
06-22-2007, 03:38 PM
[ QUOTE ]
I agree slightly. Put £5 in the changebox of a coke machine and I pick it up. Put 5 pence on the floor of a train station and I don't.

[/ QUOTE ]

As long as it is even SLIGHTLY irrational, then this problem does exactly what it is trying to do. If it isn't irrational at all, then this is the dumbest problem ever.

vhawk01
06-22-2007, 03:40 PM
[ QUOTE ]
I say again, 99 is optimal.

The problem here is that game theory is butting its head in where it's not needed. This is a simple maths/logic question.

Every number is dominated by the number 1 lower. However, it dominates every choice 2 numbers lower; this is because the aim is to make money rather than make more money than the other guy. You don't even need to appeal to your sense of marginal utility to get this, I feel.

Maybe this can prove my point. If both agents were fully rational, they need not be able to communicate, but would have the same inner dialogue with each other, i.e. there's no need to screw each other over when we can both take $100 guaranteed. The extra $1 is not worth it when you take into account the possibility the other guy will take this strategy down to the wire.

Consider the paragraph about the game theorists I quoted from the example above; if the reward were directly proportional to the amount of money you earned, the result would have nothing to do with game theory. As it is, one is rewarded for how much more than everyone else you earned rather than absolute amounts.

I agree that game theory is needed when the aim is to beat the other guy; as it is, if you're just trying to make +EV decisions, take $99 or $100.

[/ QUOTE ]

A better way to look at it, as opposed to your inner monologue idea (which I do like) is to say that, if you are both perfectly rational, then it is essentially the same as saying "Have the first guy write his answer down on a sheet, and let the other guy look at it and then write whatever he wants. First guy cannot change."

Now, in this scenario, do you think the first guy is writing...what? 99? And so now if you are the second guy, are you seriously contending that 99 gets you the most money? Pretend you are either player and you can see that you will NEVER be writing 99.

wazz
06-22-2007, 03:45 PM
If the first guy can't change his choice, the first guy writes 100, the second guy writes 99. That's clearly not the same as having a pretend discussion with the other guy in your head, though. Whatever the first guy chooses, he can expect to earn $2 less, unless he chooses $2. Thus he maximizes expectation by choosing $100.

vhawk01
06-22-2007, 03:46 PM
[ QUOTE ]
If the first guy can't change his choice, the first guy writes 100, the second guy writes 99. That's clearly not the same as having a pretend discussion with the other guy in your head, though. Whatever the first guy chooses, he can expect to earn $2 less, unless he chooses $2. Thus he maximizes expectation by choosing $100.

[/ QUOTE ]

If you are the first guy, you write 100? God, why? You suck at this game! Didn't you read the rules!?

CallMeIshmael
06-22-2007, 03:49 PM
That is the crux of the issue.

By choosing something like 100, you risk the situation of (100,99) --> (97,101), where you "only" get $97.

The difference between 97 and the payoff of switching to 98 (which would be 100) is so small that people don't think of these as different numbers. But, in the GT world, they are.



Now, when you say things like "I say again, 99 is optimal," that is just flat out wrong. Whether or not it is a maximal strategy depends on the opponent. And, for the record, I'd NEVER pick 2 in real-world situations. But the term "optimal" has a rigorous definition, and 2 is the optimal strategy.


In the theoretic world, it doesn't matter if the increments in the game are $2 or $200 billion; the solution is the same.

It is only when we introduce the slight human irrationalities like "$2 ~= $0" and "$200 billion ~= $10000 billion" that we are able to come to "real world" solutions to the problem.

Now, as you mentioned, the marginal utilities of 0 and 2 dollars are very, very close, as are the marginal utilities of 200 billion and 10000 billion, but they ARE different. 2 is slightly better than 0, and 10000 billion is slightly better than 200 billion.

The fact that the utilities of these values are so close as to be negligible in the real world doesn't make them equal in the theoretic world.

CallMeIshmael
06-22-2007, 03:50 PM
[ QUOTE ]
[ QUOTE ]
If the first guy can't change his choice, the first guy writes 100, the second guy writes 99. That's clearly not the same as having a pretend discussion with the other guy in your head, though. Whatever the first guy chooses, he can expect to earn $2 less, unless he chooses $2. Thus he maximizes expectation by choosing $100.

[/ QUOTE ]

If you are the first guy, you write 100? God, why? You suck at this game! Didn't you read the rules!?

[/ QUOTE ]

100 is correct, methinks

Piers
06-22-2007, 05:00 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
If the first guy can't change his choice, the first guy writes 100, the second guy writes 99. That's clearly not the same as having a pretend discussion with the other guy in your head, though. Whatever the first guy chooses, he can expect to earn $2 less, unless he chooses $2. Thus he maximizes expectation by choosing $100.

[/ QUOTE ]

If you are the first guy, you write 100? God, why? You suck at this game! Didn't you read the rules!?

[/ QUOTE ]

100 is correct, methinks

[/ QUOTE ]

[ QUOTE ]
100 is correct, methinks

[/ QUOTE ]

It’s not a competition; you just want as much money as possible.

The OP’s traveller’s dilemma plays two tricks with your mind.

First it says the aim is to make money, not beat the other player, but it keeps using terms like game and opponent that get you into the pattern of thought that it is a competition. This seems to have fooled vhawk.

Secondly, it uses this idea of a perfectly rational person; as the concept appears fairly well defined, it’s fine for a mathematical problem. It then starts implying that the situation has real-world applications while overlooking the fact that the ‘perfectly rational’ people needed for the dilemma to hold do not actually exist.

CallMeIshmael
06-22-2007, 05:11 PM
Piers,

you quoted my post, but was that in response to the post? Just to clarify, the post you quoted was about a game that is different from the one in the OP.

Also, "First it says the aim is to make money not beat the other player, but it keeps using terms like game and opponent that get you into the patter of thought that it is a competition. "


Again, this is wrong. The solution doesn't assume you are trying to beat the opponent; it assumes you are trying to get as much money as possible.


also, "Secondly it uses this idea of a perfectly rational person, as the concept appears fairly well defined its fine for a mathematical problem. It then starts implying that the situation has real world applications while overlooking the fact that the ‘perfectly rational’ people need for the dilemma to hold do not actually exist. "


I don't think anyone disagrees with this. The idea is that IF you assume these purely rational beings, you can sometimes get odd results. But I don't think many people actually think these beings exist.


EDIT: I think I know why you quoted that post now. Was it just to point out that vhawk was assuming the goal was simply to beat the opponent? (Which appears to be true given that he didn't want to play 100 in the altered game.)

At first I thought it was in reference to something I said, and I most certainly am not assuming the goal is anything other than maximizing payoff.

Piers
06-22-2007, 05:39 PM
[ QUOTE ]
Also, "First it says the aim is to make money not beat the other player, but it keeps using terms like game and opponent that get you into the patter of thought that it is a competition. "

Again, this is wrong. The solution doesn't assume you are trying to beat the opponent; it assumes you are trying to get as much money as possible.


[/ QUOTE ]

As I said

[ QUOTE ]
, "First it says the aim is to make money not beat the other player, but it keeps using terms like game and opponent that get you into the patter of thought that it is a competition. "


[/ QUOTE ]

It says one thing then tries to confuse people by making them think it is saying something else. I guess I was using vhawk’s post to illustrate how successful the problem was at achieving this.

[ QUOTE ]
I don't think anyone disagrees with this. The idea is that IF you assume these purely rational beings, you can sometimes get odd results. But I don't think many people actually think these beings exist.

[/ QUOTE ]

I think you underestimate people’s ability to confuse themselves.

[ QUOTE ]
A quote from the SA article:
Game theorists have made a number of attempts to explain why a lot of players do not choose the Nash equilibrium in TD experiments. Some analysts have argued that many people are unable to do the necessary deductive reasoning and therefore make irrational choices unwittingly


[/ QUOTE ]

Someone here is acting confused; I am not sure if it’s the author of the SA article or the game theorists he talks about.

I start off knowing that the rational people defined in the problem do not exist, then after thinking some other stuff, I tell myself I am a cool dude that posts on SM&P forum so I must be rational. Clearly rational people exist. Hence I must choose $2 as that is what rational people choose, but that does not make sense: I make more by choosing $100 --- help, I’m confused.

CallMeIshmael
06-22-2007, 06:42 PM
[ QUOTE ]
I start off knowing that the rational people defined in the problem do not exist, then after thinking some other stuff, I tell myself I am a cool dude that posts on SM&P forum so I must be rational. Clearly rational people exist. Hence I must choose $2 as that is what rational people choose

[/ QUOTE ]


(semi nit pick) This isn't technically true. Even perfectly rational beings would often pick a number other than 2. For a perfectly rational being to pick 2, they must also be certain that the other player is perfectly rational, that the other player knows that you are perfectly rational, that the other player knows that you know that they know you are perfectly rational....

PairTheBoard
06-22-2007, 07:49 PM
I think there is more going on here than just the nonexistence of "perfectly rational" humans. I think the game calls into question the notion of "perfectly rational" itself. Is it really being less than "perfectly rational" to realize that cooperation is much more beneficial than defection in this game when both players cooperate? If I realize that, why can't I conclude my Partner in the game will also realize the same thing? Why isn't it reasonable and "perfectly rational" to then bet on my opponent doing what we both realize will be to our mutual benefit?

Choosing Full cooperation with 100 or nearly full cooperation with a high number is like choosing Peace over War. Realizing that Peace is more profitable than War and betting your Partner will realize the same thing seems "perfectly rational" to me. Betting that your opponent will choose the route of a costly War over a profitable Peace just looks like Game Theory gone Mad to me.

I wonder how many other positions people promote as being the only possible "perfectly rational" way of thinking which are likewise lunacies of logic.

PairTheBoard

Sephus
06-22-2007, 10:31 PM
[ QUOTE ]
I think there is more going on here than just the nonexistence of "perfectly rational" humans. I think the game calls into question the notion of "perfectly rational" itself. Is it really being less than "perfectly rational" to realize that cooperation is much more beneficial than defection in this game when both players cooperate? If I realize that, why can't I conclude my Partner in the game will also realize the same thing? Why isn't it reasonable and "perfectly rational" to then bet on my opponent doing what we both realize will be to our mutual benefit?

[/ QUOTE ]

it's not possible to "cooperate." it's a one time game.

[ QUOTE ]
Choosing Full cooperation with 100

[/ QUOTE ]

how can you even suggest 100 as a possible choice?

[ QUOTE ]
or nearly full cooperation with a high number is like choosing Peace over War. Realizing that Peace is more profitable than War and betting your Partner will realize the same thing seems "perfectly rational" to me.

[/ QUOTE ]

it seems like you're just assuming your conclusion here.

[ QUOTE ]
Betting that your opponent will choose the route of a costly War over a profitable Peace just looks like Game Theory gone Mad to me.

I wonder how many other positions people promote as being the only possible "perfectly rational" way of thinking which are likewise lunacies of logic.

[/ QUOTE ]

solutions to game theory problems will often not make sense when you refuse to "play by the rules."

CallMeIshmael
06-22-2007, 10:34 PM
[ QUOTE ]
Choosing Full cooperation with 100 or nearly full cooperation with a high number is like choosing Peace over War. Realizing that Peace is more profitable than War and betting your Partner will realize the same thing seems "perfectly rational" to me. Betting that your opponent will choose the route of a costly War over a profitable Peace just looks like Game Theory gone Mad to me.

[/ QUOTE ]


This was an analogy that people used in the other thread, but it suffers a flaw. In a war/peace type of situation, picking war even when the other person picks peace still carries penalties. Specifically, the side that picked peace fights back after being provoked.

In this game, there is no "fight back" time period, since it's a one-shot game.


You can't use logic like "well, I know if I pick 99, then he will pick 98, and we will spiral down to 2, so I will pick 100", since there is no mechanism by which the opponent can know you are picking 99.

Piers
06-22-2007, 10:35 PM
[ QUOTE ]
I think there is more going on here than just the nonexistence of "perfectly rational" humans. I think the game calls into question the notion of "perfectly rational" itself.

[/ QUOTE ]

I thought ‘perfectly rational’ was a single defined term rather than an adverb modifying an adjective - a mathematical definition made specifically for this problem and similar ones. Something like an algorithm that always selects a Nash equilibrium.

[ QUOTE ]
I think the game calls into question the notion of "perfectly rational" itself.

[/ QUOTE ]

I get really confused by the way ‘perfectly’ is used here.

[ QUOTE ]
Peace is more profitable than War and betting your Partner will realize the same thing seems "perfectly rational"

[/ QUOTE ]

OK, that’s fine, but aren’t you shifting definitions?

Perfectly (adverb) (http://mw1.merriam-webster.com/dictionary/perfectly)

1 : in a perfect manner
2 : to a complete or adequate extent : QUITE <was perfectly happy until now>

Perfect (http://mw1.merriam-webster.com/dictionary/perfect)
1 a: being entirely without fault or defect : FLAWLESS <a perfect diamond>


[ QUOTE ]
I wonder how many other positions people promote as being the only possible "perfectly rational" way of thinking which are likewise lunacies of logic.

[/ QUOTE ]

Confused again: do you mean adequately rational or flawlessly rational? Either way the meaning seems OK.

I know I am nitpicking, but these nits seem particularly good at confusing people.

Sephus
06-22-2007, 10:49 PM
maybe this has been explained already, i didn't go through the whole thread.

you can't choose 100, because no matter what your opponent does you will never get a higher payoff with 100 than 99 and 99 has some payoffs higher than 100.

can we all agree on that? neither player can rationally choose 100. it's impossible.

if you agree with that, you have to agree that neither player can possibly choose 99 if he knows that the other player is rational. in order for choosing 99 to be better than 98, there has to be a chance that the other player will choose 100. but we already agreed that no rational player will ever choose 100. you can't go back on that now.

so 99 can never be a better choice than 98 if you're playing against a rational opponent. for the same reason, 97 beats out 98, 96 beats out 97, etc.

basically if you want to argue that any solution other than $2 can be "correct" under assumptions of mutual knowledge of rationality, you're forced to maintain that $99 is not better than $100.

do i have this right? is my explanation ok?

CallMeIshmael
06-22-2007, 11:08 PM
[ QUOTE ]
basically if you want to argue that any solution other than $2 can be "correct" under assumptions of mutual knowledge of rationality, you're forced to maintain that $99 is not better than $100.

do i have this right? is my explanation ok?

[/ QUOTE ]


Yes.

Essentially, once you introduce the tiny irrationality that p(100) == p(99), then you can get a very nice solution of 'play 100'.

Sephus
06-23-2007, 12:44 AM
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).

i'm pretty sure that's all correct, but i might have screwed up somewhere.

a dissenter can say, "i know that he is rational, but i don't know that he knows that i am rational, therefore i can play 98," but if you somehow know that he is rational, he must know that you're rational because the game is supposed to be symmetrical.

so then you might say "ok fine, i do know that he knows that i am rational, but i don't know that he knows that i know he is rational, and therefore i can play 97." but this runs into the same problem. you're not going to get to a point where you know something he doesn't. nothing in the game is hidden.

it's starting to get messy but i hope it's clear why no answer other than $2 can be justified given that both players are rational and know that the other is rational.

the entire discrepancy between the nash equilibrium results and the experimental ones comes from the fact that "both opponents are rational and know that the other is rational" is basically taken out of the problem when you do it in the "real world."
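
for what it's worth, here is a small sketch (mine, not from the article) of that elimination process: each pass removes the claims that are weakly dominated given what still survives, which corresponds to adding one more level of "he knows that i know...", and it grinds all the way down to 2.

def payoff(mine, theirs, bonus=2):
    if mine == theirs:
        return mine
    low = min(mine, theirs)
    return low + bonus if mine < theirs else low - bonus

def weakly_dominated(c, surviving):
    # True if some surviving claim d does at least as well as c against every
    # surviving opponent claim, and strictly better against at least one.
    for d in surviving:
        if d == c:
            continue
        diffs = [payoff(d, t) - payoff(c, t) for t in surviving]
        if all(x >= 0 for x in diffs) and any(x > 0 for x in diffs):
            return True
    return False

surviving = list(range(2, 101))
rounds = 0
while True:
    keep = [c for c in surviving if not weakly_dominated(c, surviving)]
    if keep == surviving:
        break
    surviving, rounds = keep, rounds + 1
print(rounds, "rounds of elimination leave", surviving)  # 98 rounds leave [2]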

GMontag
06-23-2007, 01:00 AM
[ QUOTE ]
maybe this has been explained already, i didn't go through the whole thread.

you can't choose 100, because no matter what your opponent does you will never get a higher payoff with 100 than 99 and 99 has some payoffs higher than 100.

can we all agree on that? neither player can rationally choose 100. it's impossible.

if you agree with that, you have to agree that neither player can possibly choose 99 if he knows that the other player is rational. in order for choosing 99 to be better than 98, there has to be a chance that the other player will choose 100. but we already agreed that no rational player will ever choose 100. you can't go back on that now.

so 99 can never be a better choice than 98 if you're playing against a rational opponent. for the same reason, 97 beats out 98, 96 beats out 97, etc.

basically if you want to argue that any solution other than $2 can be "correct" under assumptions of mutual knowledge of rationality, you're forced to maintain that $99 is not better than $100.

do i have this right? is my explanation ok?

[/ QUOTE ]

$99 isn't better than $100 precisely because of the mutual knowledge of perfect rationality. If both players know that each other is perfectly rational, then both players know that the number the other player chooses will be identical to the one he chooses. Since the solutions are limited to (X,X), the optimal solution is (100,100). All other reasoning implicitly contradicts the assumption of mutual knowledge of perfect rationality.

Sephus
06-23-2007, 01:03 AM
[ QUOTE ]
[ QUOTE ]
maybe this has been explained already, i didn't go through the whole thread.

you can't choose 100, because no matter what your opponent does you will never get a higher payoff with 100 than 99 and 99 has some payoffs higher than 100.

can we all agree on that? neither player can rationally choose 100. it's impossible.

if you agree with that, you have to agree that neither player can possibly choose 99 if he knows that the other player is rational. in order for choosing 99 to be better than 98, there has to be a chance that the other player will choose 100. but we already agreed that no rational player will ever choose 100. you can't go back on that now.

so 99 can never be a better choice than 98 if you're playing against a rational opponent. for the same reason, 97 beats out 98, 96 beats out 97, etc.

basically if you want to argue that any solution other than $2 can be "correct" under assumptions of mutual knowledge of rationality, you're forced to maintain that $99 is not better than $100.

do i have this right? is my explanation ok?

[/ QUOTE ]

$99 isn't better than $100 precisely because of the mutual knowledge of perfect rationality. If both players know that each other is perfectly rational, then both players know that the number the other player chooses will be identical to the one he chooses. Since the solutions are limited to (X,X), the optimal solution is (100,100). All other reasoning implicitly contradicts the assumption of mutual knowledge of perfect rationality.

[/ QUOTE ]

a rational agent doesn't choose a lower payoff over a higher one. your entailments are all out of order.

PairTheBoard
06-23-2007, 01:13 AM
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

GMontag
06-23-2007, 01:20 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
maybe this has been explained already, i didn't go through the whole thread.

you can't choose 100, because no matter what your opponent does you will never get a higher payoff with 100 than 99 and 99 has some payoffs higher than 100.

can we all agree on that? neither player can rationally choose 100. it's impossible.

if you agree with that, you have to agree that neither player can possibly choose 99 if he knows that the other player is rational. in order for choosing 99 to be better than 98, there has to be a chance that the other player will choose 100. but we already agreed that no rational player will ever choose 100. you can't go back on that now.

so 99 can never be a better choice than 98 if you're playing against a rational opponent. for the same reason, 97 beats out 98, 96 beats out 97, etc.

basically if you want to argue that any solution other than $2 can be "correct" under assumptions of mutual knowledge of rationality, you're forced to maintain that $99 is not better than $100.

do i have this right? is my explanation ok?

[/ QUOTE ]

$99 isn't better than $100 precisely because of the mutual knowledge of perfect rationality. If both players know that the other is perfectly rational, then both players know that the number the other player chooses will be identical to the one he chooses. Since the solutions are limited to (X,X), the optimal solution is (100,100). All other reasoning implicitly contradicts the assumption of mutual knowledge of perfect rationality.

[/ QUOTE ]

a rational agent doesn't choose a lower payoff over a higher one. your entailments are all out of order.

[/ QUOTE ]

It's only a lower payoff if you contradict your assumption about mutual knowledge. (100,100) is a larger payoff than (99,99).

Sephus
06-23-2007, 01:21 AM
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

GMontag
06-23-2007, 01:27 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

You're assuming something contradictory to the setup of the problem, namely that you and the other person can come up with different numbers (or at least different ranges).

PairTheBoard
06-23-2007, 01:38 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

Sephus
06-23-2007, 01:45 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

vhawk01
06-23-2007, 01:48 AM
[ QUOTE ]
Piers,

you quoted my post, but was that in response to the post?? Just to clarify, the post you quoted was about a game that is different from the one in the OP.

Also, "First it says the aim is to make money not beat the other player, but it keeps using terms like game and opponent that get you into the patter of thought that it is a competition. "


Again, this is wrong. The solution doesn't assume you are trying to beat the opponent; it assumes you are trying to get as much money as possible.


also, "Secondly it uses this idea of a perfectly rational person, as the concept appears fairly well defined its fine for a mathematical problem. It then starts implying that the situation has real world applications while overlooking the fact that the ‘perfectly rational’ people need for the dilemma to hold do not actually exist. "


I don't think anyone disagrees with this. The idea is that IF you assume these purely rational beings, you can sometimes get odd results. But I don't think many people actually think these beings exist.


EDIT: I think I know why you quoted that post now. Was it just to point out that vhawk was assuming the goal was simply to beat the opponent? (which appears to be true given that he didn't want to play 100 in the altered game)

At first I thought it was in reference to something I said, and I most certainly am not assuming the goal is anything other than maximizing payoff

[/ QUOTE ]

Just to excuse my previous stupid answer, I was talking about the original game found in that old OOT thread, not the altered game. Apologies for the confusion.

CallMeIshmael
06-23-2007, 01:51 AM
[ QUOTE ]
I assume that if I tried to cheat by picking 99, so would my partner.

[/ QUOTE ]

Can you explain the mechanism by which the opponent WITH WHOM YOU HAVE NO CONTACT somehow manages to write down different numbers depending on whether you write down 99 or 100?

PairTheBoard
06-23-2007, 02:07 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

[/ QUOTE ]

It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems? What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational". Maybe your standard logical systems need an upgrade.

PairTheBoard

vhawk01
06-23-2007, 02:10 AM
[ QUOTE ]
[ QUOTE ]
I assume that if I tried to cheat by picking 99, so would my partner.

[/ QUOTE ]

Can you explain the mechanism by which the opponent WITH WHOM YOU HAVE NO CONTACT somehow manages to write down different numbers depending on whether you write down 99 or 100?

[/ QUOTE ]

I really don't understand all the "We'd both realize" and "We'd come to some unspoken agreement" talk. All that does is start the loop over again. There is some absurd holdover notion that, since we would both realize the futility of our actions, we'd both agree 'not to go down that path.' But this necessitates that your actions somehow influence his. It's the lesson drilled into us by people who want us to recycle and to vote. Hey, of course YOUR vote doesn't matter, but if everyone thought that, no one would vote! Right...so? My voting doesn't impact them, no matter how much we want to pretend it does, and my writing down 100 doesn't prevent them or even encourage them to do the same. If at any point in their perfectly rational decision-making process they ever thought that I would write 100, they would most certainly write 99. This is why I like the simplification where Player A writes his answer, turns it in, and you get to look at it before writing your own. Surely, in that case, no one ever writes 100, right?

vhawk01
06-23-2007, 02:13 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

[/ QUOTE ]

It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems? What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational". Maybe your standard logical systems need an upgrade.

PairTheBoard

[/ QUOTE ]

This is a weak attack, and I'm a little surprised to hear it come from you. No one disputes that two people using your paradigm would outcompete two people using the 'perfectly rational' paradigm. There are LOTS of better strategies (in fact there aren't any WORSE) if we are allowed to postulate whatever we want or have people working as a team. But that isn't the case. You find that whatever you decide on, that is also what HE will decide on. Well, that's sort of true, using the perfectly rational operators...but only because you both have only one possible choice. The fact that both actors are perfectly rational DOES mean that they will both pick the same number, but that isn't the only implication. You can't just stop there and declare victory. You have to take the bad with the good.

CallMeIshmael
06-23-2007, 02:18 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

[/ QUOTE ]

It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems? What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational". Maybe your standard logical systems need an upgrade.

PairTheBoard

[/ QUOTE ]


So, what you are saying, is that AFTER HE WRITES A NUMBER DOWN, no matter what, it is going to be the same as whatever you choose?

really? like, seriously?

can you explain the mechanism for that?

PairTheBoard
06-23-2007, 02:31 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

[/ QUOTE ]

It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems? What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational". Maybe your standard logical systems need an upgrade.

PairTheBoard

[/ QUOTE ]


So, what you are saying, is that AFTER HE WRITES A NUMBER DOWN, no matter what, it is going to be the same as whatever you choose?

really? like, seriously?

can you explain the mechanism for that?

[/ QUOTE ]

Identical thinking. At least I'm going on the assumption that we think identically right down to the final decision of what to write on the paper. I cannot outthink him any more than he can outthink me.

That's the mechanism. Now you tell me, why are you having such a problem with it? And why do you think it is more rational for two people thinking in concert to settle on $2 apiece when they could both have $100?

PairTheBoard

GMontag
06-23-2007, 02:32 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

[/ QUOTE ]

It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems? What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational". Maybe your standard logical systems need an upgrade.

PairTheBoard

[/ QUOTE ]


So, what you are saying, is that AFTER HE WRITES A NUMBER DOWN, no matter what, it is going to be the same as whatever you choose?

really? like, seriously?

can you explain the mechanism for that?

[/ QUOTE ]

So in other words, one of the starting assumptions of the problem is nonsensical.

There are only two possibilities here: either the problem is self-contradictory and meaningless, or both players will *always* come up with the same number. That's what mutual knowledge of perfect rationality implies. Since you are implicitly rejecting that, the only thing left is a self-contradictory situation.

Sephus
06-23-2007, 02:43 AM
[ QUOTE ]
It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

[/ QUOTE ]

it's not hard to understand. i agree, based on that assumption, your best choice is 100.

i'm arguing that the assumption "we are both rational therefore he must choose whatever i choose no matter what i choose" does not trump every other assumption one can make based on mutual rationality. for example, if you can't say with certainty what your opponent will do/has done until you make your own bid, then one could argue that mutual rationality has already broken down.

you can claim that the problem has no solution because of a "rationality paradox," but you can't claim that 100/100 "solves" the problem.

[ QUOTE ]
Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems?

[/ QUOTE ]

nice.

[ QUOTE ]
What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational".

[/ QUOTE ]

why should i deny it? it's obvious. two of you would beat two of me in the prisoner's dilemma too! of course, you disagree with the "accepted" solution to that problem as well. why would you expect me to be bothered by the fact that two irrational players can fare better than two rational ones when that concept is so well established in game theory?

[ QUOTE ]
Maybe your standard logical systems need an upgrade.

PairTheBoard

[/ QUOTE ]

along with every person who teaches or writes about game theory.

vhawk01
06-23-2007, 03:20 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
interestingly enough...

A. if you are rational, the highest you can play is 99 (because it weakly dominates 100).

B. if you know that the other player is rational, the highest you can play is 98 (because A. applies to him).

(from now on just assume that both know the other is rational)

C. if you know that he knows you are rational, the highest you can play is 97 (because B. applies to him).

D. if you know that he knows that you know he is rational, the highest you can play is 96 (because C. applies to him).

E. if you know that he knows that you know that he knows that you are rational, the highest you can play is 95 (because D. applies to him).


[/ QUOTE ]

We both see that this is nonsense and we improve on getting $2 apiece by simply both picking 100. We both come to the conclusion that each of us unilaterally declaring peace is far more beneficial to us both than going to war. We realize that we think alike and can benefit from that knowledge without actually having to communicate it. I assume that if I tried to cheat by picking 99, so would my partner. We do worse by both picking 99 than by both picking 100.

If we both assume that we are both rational, then we will both pick the same number. Since we both know that, and since the best number for us both to pick is 100, we do the rational thing and both pick 100.

It would be irrational for us to decide the best number for us both to pick is 2.

PairTheBoard

[/ QUOTE ]

if he picks first (you don't get to see) and you go after him, you still choose 100? it's really the exact same game, you don't get any extra information.

[/ QUOTE ]

It doesn't change the game nor my assumption that he chooses (or will have chosen) the same number I choose. Under that assumption, I do best individually if and only if we do best together and that is accomplished by picking 100.

PairTheBoard

[/ QUOTE ]

so if he writes his bid down on a card it exists as all possible bids and then "collapses" to 100 when you submit your own.

[/ QUOTE ]

It is what it is. I assume it is whatever I end up deciding on for my number. If I decide 99 I assume that's what he wrote. If I decide 100 I assume That's what he wrote. Based on that assumption, my best choice is 100. Why is this so hard for you to understand?

Could it be because you are unable to comprehend two people "Thinking in Concert"? Is such a thing outside the scope of your standard logical systems? What you cannot deny is that two people thinking according to what I claim is "rational" will do 50 times better in this game than two people thinking according to what you claim is "rational". Maybe your standard logical systems need an upgrade.

PairTheBoard

[/ QUOTE ]


So, what you are saying, is that AFTER HE WRITES A NUMBER DOWN, no matter what, it is going to be the same as whatever you choose?

really? like, seriously?

can you explain the mechanism for that?

[/ QUOTE ]

So in other words, one of the starting assumptions of the problem is nonsensical.

There are only two possibilities here: either the problem is self-contradictory and meaningless, or both players will *always* come up with the same number. That's what mutual knowledge of perfect rationality implies. Since you are implicitly rejecting that, the only thing left is a self-contradictory situation.

[/ QUOTE ]

Read my post a few up. Of course the two players will come up with the same number. But they don't get there by one person putting whatever the heck he wants and the other one putting the same via telepathy. There are assumptions that lead to the inevitable outcome that they both pick the same number, and those assumptions have additional implications. One of them happens to be that they both pick $2.

Sephus
06-23-2007, 03:21 AM
my last stab at combating agametheoryism.

"my opponent will choose whatever i do provided the decision is the rational one."
"since i know this, i will choose whatever gives me the highest payoff when we both do the same thing."
"this maximizes my payoff provided that it is the rational choice, therefore it is the rational choice."

GMontag
06-23-2007, 03:28 AM
[ QUOTE ]
[ QUOTE ]
So in other words, one of the starting assumptions of the problem is nonsensical.

There are only two possibilities here: either the problem is self-contradictory and meaningless, or both players will *always* come up with the same number. That's what mutual knowledge of perfect rationality implies. Since you are implicitly rejecting that, the only thing left is a self-contradictory situation.

[/ QUOTE ]

Read my post a few up. Of course the two players will come up with the same number. But they don't get there by one person putting whatever the heck he wants and the other one putting the same via telepathy. There are assumptions that lead to the inevitable outcome that they both pick the same number, and those assumptions have additional implications. One of them happens to be that they both pick $2.

[/ QUOTE ]

The assumption of mutual knowledge directly implies that both pick the same number without any of the additional implications. The argument that leads to (2,2) only works if you ignore the mutual knowledge. In fact the argument for (2,2) has both players assuming that they are "more rational", or can think on at least one additional level, than the other player, in direct contradiction to one of the premises of the problem. When you recognize that fallacy for what it is, the argument breaks down completely.

EDIT: removed some of the quotes for readability.

Freyalise
06-23-2007, 09:20 AM
Didn't read the whole thread, but personally I pick $99. I expect this performs the best against the field of 'random people picking'.

I gain $0 instead of $2 against perfectly rational Nash-equilibrists, but gain $101 against GMontag and others. This seems like a good strategy as I expect there are more of the latter than the former.
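
One quick way to pressure-test that expectation is to score every claim against an opponent who picks uniformly at random from 2 to 100. This is only a toy model of 'random people picking' (the real field's distribution is anyone's guess), and the payoff function below is the standard Traveler's Dilemma rule assumed from the article's setup, not anything posted in this thread:

def payoff(mine, theirs):
    # equal claims are paid as written; the lower claim earns a $2 bonus
    # and the higher claim is paid the lower claim minus $2
    if mine == theirs:
        return mine
    return mine + 2 if mine < theirs else theirs - 2

claims = range(2, 101)
ev = {c: sum(payoff(c, o) for o in claims) / len(claims) for c in claims}
best = max(ev, key=ev.get)
print(best, round(ev[best], 2))
# under this uniform model the best claim comes out in the mid-90s,
# close to (though not exactly) 99

Against a field skewed toward high claims the best response drifts upward, and against a field of strict Nash-equilibrists it drops to 2, which is the whole argument of the thread.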

Surely no-one can justify picking $100, a number which *always* does worse than picking $99 - no matter what number the opponent picked. I just cannot imagine any credible explanation for $100. Every other number makes sense in that it can do well against opponents with a certain range, but $100 can never be correct.

I also assume that everyone who picked $100 would also choose 'keep silent' in the prisoner's dilemma, even though it's a dominated strategy?

Would be interesting to hear the justification of anyone who would pick $100 here, but choose 'Betray' in prisoner's dilemma.

GMontag
06-23-2007, 10:53 AM
[ QUOTE ]
Didn't read the whole thread, but personally I pick $99. I expect this performs the best against the field of 'random people picking'.

I gain $0 instead of $2 against perfectly rational Nash-equilibrists, but gain $101 against GMontag and others. This seems like a good strategy as I expect there are more of the latter than the former.

Surely no-one can justify picking $100, a number which *always* does worse than picking $99 - no matter what number the opponent picked. I just cannot imagine any credible explanation for $100. Every other number makes sense in that it can do well against opponents with a certain range, but $100 can never be correct.

I also assume that everyone who picked $100 would also choose 'keep silent' in the prisoner's dilemma, even though it's a dominated strategy?

Would be interesting to hear the justification of anyone who would pick $100 here, but choose 'Betray' in prisoner's dilemma.

[/ QUOTE ]

The justification depends on the opponent. In this situation, the problem stipulates "infinite" rationality (i.e. I know he's rational, he knows I know he's rational, I know he knows I know he's rational, ad infinitum) for both players. The normal formulation of the Prisoner's Dilemma doesn't make such a stipulation. If it did, the correct answer there would also be to keep silent.

wazz
06-23-2007, 10:59 AM
[ QUOTE ]
Surely no-one can justify picking $100, a number which *always* does worse than picking $99 - no matter what number the opponent picked. I just cannot imagine any credible explanation for $100. Every other number makes sense in that it can do well against opponents with a certain range, but $100 can never be correct.

[/ QUOTE ]

Incorrect! See if you can work out why.

CallMeIshmael
06-23-2007, 01:41 PM
PTB, GM:

Imagine a situation in which YOU are playing this game, against someone who is not only perfectly rational, but assumes you are, assumes that you know that he is...

Basically, you have somehow convinced him/her that you are perfectly rational. Perhaps it's the shirt you are wearing.

What do you write?

I'm assuming 99, since you seem to be under the impression that a perfectly rational being writes 100.

Is it odd that a less than perfect being does better than a perfect being against the same opponent in this game?

GMontag
06-23-2007, 02:06 PM
[ QUOTE ]
PTB, GM:

Imagine a situation in which YOU are playing this game, against someone who is not only perfectly rational, but assumes you are, assumes that you know that he is...

Basically, you have somehow convinced him/her that you are perfectly rational. Perhaps it's the shirt you are wearing.

What do you write?

I'm assuming 99, since you seem to be under the impression that a perfectly rational being writes 100.

Is it odd that a less than perfect being does better than a perfect being against the same opponent in this game?

[/ QUOTE ]

Why would it be odd that a person with faulty knowledge does worse than a person with correct knowledge?

CallMeIshmael
06-23-2007, 02:20 PM
They both have correct knowledge.



Let's say there are 3 players: A, B and C.


Let's say A is perfectly rational, and so is C, but B is not. However, C assumes they are both perfectly rational.


So, according to you, when the game is AC, they both play 100.

And, when the game is BC, C also plays 100.


Now, assuming you are B, you play 99, yes?


And, to note, NEITHER A nor B has faulty knowledge. They both, correctly, know that their opponent is perfectly rational. Yet, the less than perfect B does better than A.


It seems odd that the perfectly rational A doesn't, just this once, forget that he is perfectly rational and switch to 99.

GMontag
06-23-2007, 02:35 PM
[ QUOTE ]
They both have correct knowledge.

[/ QUOTE ]

No, you said I had managed to *convince* my opponent that I was perfectly rational and had knowledge of his infinite inductive chain. As that is not true, he was operating on faulty assumptions.

[ QUOTE ]
Let's say there are 3 players: A, B and C.


Let's say A is perfectly rational, and so is C, but B is not. However, C assumes they are both perfectly rational.


So, according to you, when the game is AC, they both play 100.

And, when the game is BC, C also plays 100.


Now, assuming you are B, you play 99, yes?


And, to note, NEITHER A nor B has faulty knowledge. They both, correctly, know that their opponent is perfectly rational. Yet, the less than perfect B does better than A.

[/ QUOTE ]

Because of the faulty knowledge of C. Not because of any "better than perfect"ness on B's part.


[ QUOTE ]
It seems odd that the perfectly rational A doesn't, just this once, forget that he is perfectly rational and switch to 99.

[/ QUOTE ]

The only reason that strategy would work better would be because then C's knowledge about A would also be faulty. If C's knowledge is not faulty, then 100 is better than 99.

The whole problem with this setup is you start out by saying "Assume you can't outthink your opponent", then you go about making an argument where you assume you can outthink your opponent. It's nonsense.

CallMeIshmael
06-23-2007, 02:40 PM
"The only reason that strategy would work better would be because then C's knowledge about A would also be faulty."

So you admit that it is in a perfectly rational player's interest to deviate, essentially pretending they are not rational, but you also believe they would opt not to do this.

OK.

GMontag
06-23-2007, 02:49 PM
[ QUOTE ]
"The only reason that strategy would work better would be because then C's knowledge about A would also be faulty."

So you admit that it is in a perfectly rational players interest to deviate, essentially pretending they are not rational, but also believe they would opt to not do this.

OK.

[/ QUOTE ]

I admit no such thing. If C's knowledge about A is faulty, then A will benefit from that, but A cannot "pretend" to be irrational while still being rational, as that would be predicted by C, being perfectly rational.

CallMeIshmael
06-23-2007, 03:03 PM
"A cannot "pretend" to be irrational while still being rational, as that would be predicted by C, being perfectly rational. "

How?

Like, how does C know exactly what A is thinking? The definition that they are both rational simply means they are going to examine the structure of the game and use logic and mathematics to deduce the optimal strategy.

Mutual assumption of rationality doesn't equal telepathy.

Is A constantly thinking "maybe I should play 99... no, [censored], wait, he just heard that... no, 100.. we gotta play 100"?

PairTheBoard
06-23-2007, 03:04 PM
On the other hand, A and B may play 100 irrationally according to you while C makes the Game Theory rational choice of 2. Now who does better after all the matchups?

According to the Scientific American article this problem remains open. It requires creative thinking. Game theory as it presently stands is simply not adequate. I'm inclined to look at the "Parasite Dilemma" for further investigation into what's going on here.

The simple Prisoner's Dilemma has a Dominating Strategy of Defection when played only once. Yet if played repeatedly in a population of opponents the constant Defect strategy does worse than ones which encourage Cooperation. It looks to me that the Traveler's Dilemma effectively telescopes repetitions of the Prisoner's Dilemma into a one time decision. The Repetitions become Virtual Ones contained in the Thinking of the participants rather than actual ones being physically transacted. The Tit-for-Tat type strategies employed in Repeated Prisoner's Dilemmas become incorporated into the one time decision of the Traveler's Dilemma. These are the kinds of concepts that need to be looked at in my opinion. I leave it to the professional game theoreticians to continue work on it.
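
A rough sketch of that repeated-game point, with conventional Prisoner's Dilemma payoffs assumed (T=5, R=3, P=1, S=0 are the usual textbook values, not numbers from the article): a pair of Tit-for-Tat players scores far more against each other than a pair of constant defectors does, even though the defector wins any single pairing.

# repeated Prisoner's Dilemma: (my move, their move) -> (my payoff, their payoff)
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strat_a, strat_b, rounds=100):
    # each strategy sees only the opponent's history of moves
    score_a = score_b = 0
    hist_a, hist_b = [], []
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = PAYOFFS[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

always_defect = lambda opp_history: 'D'
tit_for_tat = lambda opp_history: opp_history[-1] if opp_history else 'C'

print(play(tit_for_tat, tit_for_tat))      # (300, 300): cooperation holds up
print(play(always_defect, always_defect))  # (100, 100): mutual defection
print(play(always_defect, tit_for_tat))    # (104, 99): the defector wins the pairing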

Meanwhile, I'm interested in what people in SMP have to say about the Parasite Dilemma I presented on another thread. I think it may further isolate some of these concepts.

PairTheBoard

CallMeIshmael
06-23-2007, 03:11 PM
GMontag,

Back to what you said about the prisoner's dilemma

Assume that both players in the game are perfectly rational. Also assume that one player goes first, and the response (coop/defect) is written down on a piece of paper.

That paper is brought into the other person's room, but they are not allowed to look at it.


It is your belief that, regardless of what the second player chooses, it will ALWAYS be the same as what is on the paper, yes?

GMontag
06-23-2007, 03:16 PM
[ QUOTE ]
"A cannot "pretend" to be irrational while still being rational, as that would be predicted by C, being perfectly rational. "

How?

Like, how does C know exactly what A is thinking? The definition that they are both rational, simply means they are going to examine the structure of the game, and use logic and mathematics to deduce the optimal strategy.

Mutual assumption of rationality doesnt equal telepathy

Is A constantly thinking "maybe I should play 99... no, [censored], wait, he just heard that... no, 100.. we gotta play 100"?

[/ QUOTE ]

It's not telepathy. It is merely a consequence of the fact that C, being perfectly rational and having accurate knowledge of A's rationality and the rest of the infinite inductive chain, will be able to come to the same conclusion as A. That is, unless there are two different valid rational arguments that lead to different strategies.

GMontag
06-23-2007, 03:20 PM
[ QUOTE ]
GMontag,

Back to what you said about the prisoner's dilemma

Assume that both players in the game are perfectly rational. Also assume that one player goes first, and the response (coop/defect) is written down on a piece of paper.

That paper is brought into the other person's room, but they are not allowed to look at it.


It is your belief that, regardless of what the second player chooses, it will ALWAYS be the same as what is on the paper, yes?

[/ QUOTE ]

If you state that as one of the premises of the problem, as was done in this problem, then yes. It's not the rationality that does it, it's the knowledge of the other player's rationality, et al that is the condition for superrationality.

CallMeIshmael
06-23-2007, 03:35 PM
[ QUOTE ]
[ QUOTE ]
GMontag,

Back to what you said about the prisoner's dilemma

Assume that both players in the game are perfectly rational. Also assume that one player goes first, and the response (coop/defect) is written down on a piece of paper.

That paper is brought into the other person's room, but they are not allowed to look at it.


It is your belief that, regardless of what the second player chooses, it will ALWAYS be the same as what is on the paper, yes?

[/ QUOTE ]

If you state that as one of the premises of the problem, as was done in this problem, then yes. It's not the rationality that does it, it's the knowledge of the other player's rationality, et al that is the condition for superrationality.

[/ QUOTE ]

Here's the problem with this:

Under your framework, the answer to the question "what is the solution to a one shot prisoner's dilemma under the common knowledge assumption of rationality?" is "Cooperate"

But, that is not the accepted answer.

Now, it IS sort of lame to appeal to what is essentially a tautology, but it is what it is.

The "assumption of common rationality" isn't commonly used the way you want to use it. I mean, I understand what you mean, but that is not what the term means.

GMontag
06-23-2007, 04:03 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
GMontag,

Back to what you said about the prisoner's dilemma

Assume that both players in the game are perfectly rational. Also assume that one player goes first, and the response (coop/defect) is written down on a piece of paper.

That paper is brought into the other person's room, but they are not allowed to look at it.


It is your belief that, regardless of what the second player chooses, it will ALWAYS be the same as what is on the paper, yes?

[/ QUOTE ]

If you state that as one of the premises of the problem, as was done in this problem, then yes. It's not the rationality that does it, it's the knowledge of the other player's rationality, et al that is the condition for superrationality.

[/ QUOTE ]

Here's the problem with this:

Under your framework, the answer to the question "what is the solution to a one shot prisoner's dilemma under the common knowledge assumption of rationality?" is "Cooperate"

But, that is not the accepted answer.

Now, it IS sort of lame to appeal to what is essentially a tautology, but it is what it is.

The "assumption of common rationality" isn't commonly used the way you want to use it. I mean, I understand what you mean, but that is not what the term means.

[/ QUOTE ]

Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem. It is that premise that I'm basing my argument for (100,100) on, and my statement about the PD was that if you use that premise there as well, the solution would be (keep silent, keep silent).
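
The gap between the two readings is easy to state concretely. A minimal sketch, with conventional Prisoner's Dilemma payoffs assumed (T=5, R=3, P=1, S=0): under the standard reading, defecting pays more no matter what the opponent does, so it dominates; under the "both players must land on the same action" reading, only the diagonal outcomes are compared and keeping silent comes out on top.

# my payoff as a function of (my move, their move); C = keep silent, D = betray
MY_PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

# standard reading: D pays more than C against either opponent move
for their_move in ('C', 'D'):
    print(their_move, MY_PAYOFF[('D', their_move)] > MY_PAYOFF[('C', their_move)])  # True, True

# symmetric ("superrational") reading: compare only the (X, X) outcomes
best_symmetric = max(('C', 'D'), key=lambda move: MY_PAYOFF[(move, move)])
print(best_symmetric)  # 'C'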

CallMeIshmael
06-23-2007, 04:08 PM
[ QUOTE ]
Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem.

[/ QUOTE ]

Nope.

http://en.wikipedia.org/wiki/Common_knowledge_(logic)

GMontag
06-23-2007, 09:21 PM
[ QUOTE ]
[ QUOTE ]
Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem.

[/ QUOTE ]

Nope.

http://en.wikipedia.org/wiki/Common_knowledge_(logic)

[/ QUOTE ]

Then the accepted answer is wrong. Come up with a counterargument that isn't an argument from authority.

Sephus
06-23-2007, 10:48 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem.

[/ QUOTE ]

Nope.

http://en.wikipedia.org/wiki/Common_knowledge_(logic)

[/ QUOTE ]

Then the accepted answer is wrong. Come up with a counterargument that isn't an argument from authority.

[/ QUOTE ]

someone smarter than i am can probably do this better, but my best guess is there's some sort of circularity i can't quite pin down.

maybe the entire rationality of your strategy depends on it being the rational strategy.

maybe it's a problem that you expect to use the same strategy because only one solution is rational, and you use that fact to decide which is the rational one.

maybe you have to say "we will both choose the same solution given that we choose the rational one" and not just "we will both choose the same solution." (100/100) maximizes your payoff if you both choose the same strategy, which we know happens if you both choose the rational strategy, which makes 100 the rational strategy. it seems like there could be a problem somewhere.

even though you say you are sure that 100 is the only rational solution, you don't claim that you know for sure (beforehand) that your rational opponent will bid 100. it seems like it should follow from "there is only one rational solution" that you also know your opponent will play it ahead of time.

maybe once you start assuming that your opponent's bid depends on your own, you've already left rationality behind, even though the whole thing appears to be rational on the surface.

playing 100 maximizes your payoff as long as it's the rational strategy, and it's the rational strategy because it maximizes your payoff. but of course it doesn't maximize your payoff against an opponent playing "the rational strategy," but we've already proven it's the rational strategy, so it must be the case that sometimes the rational strategy does not maximize its own payoff given the rational strategy of the opponent.

i don't know, [censored] it.

CallMeIshmael
06-24-2007, 12:48 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem.

[/ QUOTE ]



Nope.

http://en.wikipedia.org/wiki/Common_knowledge_(logic)

[/ QUOTE ]

Then the accepted answer is wrong. Come up with a counterargument that isn't an argument from authority.

[/ QUOTE ]



We are talking about the definition of "the assumption of common/infinite rationality."

Your definition is wrong. I read through a book here to get a better understanding of it, and posted a wiki link (wiki isn't the greatest, but I'm sure there are better readings online available).


If we disagree on a definition, then what means do I have beyond referencing authority to give credence to the claim that my definition is superior?


I mean, imagine the following dialogue:

A: That movie was brilliant!

B: No way. It was full of plot holes, had poor acting, the camera work was terrible and the ending made no sense.

A: While all that is true, the movie's name began with the letter 'Q', which means that it must be brilliant!

B: Well, that's not really the definition of 'brilliant.' Let me get a dictionary to show you.

A: OMG. APPEAL TO AUTHORITY.

GMontag
06-24-2007, 12:52 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem.

[/ QUOTE ]



Nope.

http://en.wikipedia.org/wiki/Common_knowledge_(logic)

[/ QUOTE ]

Then the accepted answer is wrong. Come up with a counterargument that isn't an argument from authority.

[/ QUOTE ]



We are talking about the definition of "the assumption of common/infinite rationality."

Your definition is wrong. I read through a book here to get a better understanding of it, and posted a wiki link (wiki isn't the greatest, but I'm sure there are better readings online available).


If we disagree on a definition, then what means do I have beyond referencing authority to give credence to the claim that my definition is superior?


I mean, imagine the following dialogue:

A: That movie was brilliant!

B: No way. It was full of plot holes, had poor acting, the camera work was terrible and the ending made no sense.

A: While all that is true, the movie's name began with the letter 'Q', which means that it must be brilliant!

B: Well, that's not really the definition of 'brilliant.' Let me get a dictionary to show you.

A: OMG. APPEAL TO AUTHORITY.

[/ QUOTE ]

I wasn't disagreeing with the definition. I was disagreeing with the "accepted answer" to the PD given that definition. The appeal to authority I was talking about was your reference to the accepted answer.

CallMeIshmael
06-24-2007, 02:04 AM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
Then the common use of the "assumption of common rationality" is different than the "assumption of infinite rationality" that you used in setting up this problem.

[/ QUOTE ]




Nope.

http://en.wikipedia.org/wiki/Common_knowledge_(logic)

[/ QUOTE ]

Then the accepted answer is wrong. Come up with a counterargument that isn't an argument from authority.

[/ QUOTE ]



We are talking about the definition of "the assumption of common/infinite rationality."

Your definition is wrong. I read through a book here to get a better understanding of it, and posted a wiki link (wiki isn't the greatest, but I'm sure there are better readings available online).


If we disagree on a definition, then what means do I have beyond referencing authority to give credence to the claim that my definition is superior?


I mean, imagine the following dialogue:

A: That movie was brilliant!

B: No way. It was full of plot holes, had poor acting, the camera work was terrible and the ending made no sense.

A: While all that is true, the movie's name began with the letter 'Q', which means that it must be brilliant!

B: Well, that's not really the definition of 'brilliant.' Let me get a dictionary to show you.

A: OMG. APPEAL TO AUTHORITY.

[/ QUOTE ]

I wasn't disagreeing with the definition. I was disagreeing with the "accepted answer" to the PD given that definition. The appeal to authority I was talking about was your reference to the accepted answer.

[/ QUOTE ]


It really does just boil down to a definitional thing

You believe that the "common assumption of rationality" means that players are deciding between symmetrical payoffs and choosing the best option therein.

This is not part of that definition.

Nicholasp27
06-25-2007, 10:32 AM
In callme's OOT example, you have to go $2, because otherwise you get nothing, since your opponent is John Nash; you can go 100, but he'll go 2, so he'll get 4 and you'll get 0.

The only way you get money playing against Nash is to go 2, in which case you get 2.

If you were playing some random person off the street, you could go into the high 90s, but against Nash, being rational and knowing you ♥ game theory, you have to go 2.

Nicholasp27
06-25-2007, 10:43 AM
[ QUOTE ]
I say again, 99 is optimal.

The problem here is that game theory is butting its head in where it's not needed. This is a simple maths/logic question.

Every number is dominated by the number 1 lower. However, it dominates every choice 2 numbers lower

[/ QUOTE ]

Apparently, you have not properly read the question.

If you pick 100 and I pick 98, I'll get paid 100 and you'll get paid 96.


If you pick 100 and I pick 2, I'll get paid 4 and you'll get paid 0.


You seem to think that you will just lose 2 off of YOUR bid... but you lose 2 off of the lowest bid.


That's why you HAVE to go 2 against another rational player like Nash if he knows you are rational: he will go 2, so your choices in this game are to get $2 or to get $0... take your pick.
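To make the payoff rule concrete, here is a minimal Python sketch (added for illustration; it is not from the original posts) that reproduces the two splits described above.

def payoffs(a, b):
    # Both travellers are paid the lower of the two claims; the lower claimer
    # receives a $2 bonus and the higher claimer pays a $2 penalty.
    low = min(a, b)
    if a == b:
        return low, low
    return (low + 2, low - 2) if a < b else (low - 2, low + 2)

print(payoffs(98, 100))  # (100, 96): claiming 98 against 100 pays you 100, them 96
print(payoffs(2, 100))   # (4, 0): claiming 2 against 100 pays you 4, them 0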

Nicholasp27
06-25-2007, 10:50 AM
I agree that people are biased by the fact that it's "only" $2.

If you were playing against Nash, the number was between $2 billion and $100 billion, and you were wearing an "I ♥ Nash equilibrium" t-shirt, so he knew you were rational and knew game theory, would you REALLY not pick $2 billion?

I would, and I'd go home with $2 billion; whoever doesn't goes home with $0.

Nicholasp27
06-25-2007, 10:53 AM
$100 is (weakly) dominated by 99, and thus should be eliminated from consideration.


If the opponent claims: (your claim -> your payoff)

100: 100 -> 100, 99 -> 101
99: 100 -> 97, 99 -> 99
98: 100 -> 96, 99 -> 96
... and the two are equal for every lower opponent claim

Thus 100 should be eliminated.

We now look at the game as 2 -> 99.

Well, 99 is dominated by 98...

If the opponent claims: (your claim -> your payoff)
99: 99 -> 99, 98 -> 100
98: 99 -> 96, 98 -> 98
97: 99 -> 95, 98 -> 95
...

So remove 99 from your choices.

Play the game with 2 -> 98.

And keep doing that all the way down to the NE of (2, 2).

That's how game theory works; you shouldn't pick a strategy that is dominated by another, so eliminate that strategy and re-run the game without it as an option. Do this until you either find a strategy that dominates all the others or have eliminated every dominated strategy.

If you still have more than one strategy left, you may need to randomize among the options; if you have only one strategy left, you have an equilibrium.
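For anyone who wants to watch the elimination run mechanically, here is a small Python sketch (mine, not from the post) of iterated elimination of weakly dominated claims. It uses claims 2..10 to keep the output short; the same loop over 2..100 grinds down to the $2 equilibrium as well, just more slowly.

def payoff(mine, theirs):
    # Both players are paid the lower claim, with a $2 bonus to the lower
    # claimer and a $2 penalty to the higher claimer.
    if mine < theirs:
        return mine + 2
    if mine > theirs:
        return theirs - 2
    return mine

def weakly_dominated(s, candidates):
    # s is weakly dominated if some other claim does at least as well against
    # every remaining opponent claim and strictly better against at least one.
    for other in candidates:
        if other == s:
            continue
        at_least_as_good = all(payoff(other, o) >= payoff(s, o) for o in candidates)
        strictly_better = any(payoff(other, o) > payoff(s, o) for o in candidates)
        if at_least_as_good and strictly_better:
            return True
    return False

claims = list(range(2, 11))
while True:
    survivors = [s for s in claims if not weakly_dominated(s, claims)]
    if survivors == claims:
        break
    claims = survivors
    print(claims)  # the top claim is shaved off each pass, ending at [2]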

Nicholasp27
06-25-2007, 11:07 AM
the traveler's dilemma experiment with game theorists also built in a bias that changed the results of the experiment, thus changing the game

They said they'd give one randomly selected player $20 times their average payout from playing the game.

That gives people an incentive to go for higher numbers to try to win a meaningful amount of money; again, the low dollar amounts bias players. A player could claim $2 every time and average between $2 and $4 per round, so the prize if selected would be $40-$80, divided by 151 players... call it $3 per round on average, which is 60/151, about 40 cents of expected value.

Instead, players went for high amounts, and the randomly selected winner got $1700 by averaging $85...

Also, 10 of the 51 "game theorists" chose 100, which is dominated by 99, in every round.

so while interesting, that experiment was not the same game
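As a back-of-envelope check (my own arithmetic, using the figures quoted in the post) of that incentive structure:

players = 151      # number of participants, as stated in the post
multiplier = 20    # prize = $20 times the selected player's average payout

# Always claiming $2 averages roughly $2-$4 per round, so the prize if
# selected is $40-$80; with a 1-in-151 chance of being selected:
print(multiplier * 2 / players, multiplier * 4 / players)  # ~0.26 to ~0.53 dollars
print(multiplier * 3 / players)                            # ~0.40, the "40 cents" figure

# The winner reportedly averaged $85 per round:
print(multiplier * 85)                                     # 1700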

wazz
06-26-2007, 12:38 AM
apparently you haven't read anything i've said

Siegmund
06-26-2007, 04:20 AM
In trying to resolve alleged paradoxes like these, people tend to forget the details of how the equilibria are defined: the minimax solution of a game is its value against malevolent opponents, while a Nash equilibrium is a position from which you would not deviate even if you had been told your opponent's strategy in advance; in other words, it is the best that a completely non-reactive player, one who ignores your plays even after they are revealed in an iterated game, can guarantee for himself.

There is a large class of games, including the traveller's dilemma, the prisoner's dilemma, the multi-person prisoner's dilemma, and the new 100-person game in the other thread, in which all players in the game face symmetric situations.

It's easy to prove, for any of the accepted definitions of optima or equilibria, that the solution of a symmetric game (if it exists) must be symmetric.

A very natural approach, when faced with a symmetric game, is to find an optimal strategy over the sub-game where we know all players will play the same (possibly mixed) strategy - that is, a 1-D optimization along the main diagonal of the payoff matrix, instead of the usual simultaneous and opposing row and column optimizations. Finding the solution is usually computationally very easy, and it very often leads to what a lot of people believe to be the "right answer" to games like these.
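As a rough illustration of that diagonal optimization, here is a hypothetical Python sketch over pure strategies only, using the traveller's dilemma payoffs (the definition above also allows mixed strategies, which this sketch ignores):

def payoff(mine, theirs):
    # Traveller's dilemma payoff to "mine": both are paid the lower claim,
    # with a $2 bonus to the lower claimer and a $2 penalty to the higher.
    if mine < theirs:
        return mine + 2
    if mine > theirs:
        return theirs - 2
    return mine

claims = range(2, 101)

# The usual analysis optimizes rows against columns of the payoff matrix;
# the "diagonal" approach restricts attention to both players using the same
# strategy and does a 1-D optimization along the main diagonal.
diagonal_optimum = max(claims, key=lambda s: payoff(s, s))
print(diagonal_optimum)  # 100, versus the Nash equilibrium claim of 2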

I'm sure someone has studied this way of solving symmetric games and discussed solutions found in this manner before, but I don't recall reading about it. In the meantime, by analogy with the Cauchy Principal Value of an improper integral(*), I propose to call this the Diagonal Principal Value of a game, and the corresponding strategies the Diagonal Optimal Strategies.

(*) - Quick calculus refresher: the improper integral from -infty to +infty only exists if the limit of the definite integral from a to b exists as a goes to -infty and b goes to +infty. Even when the improper integral is not defined (for instance, the integral of 1/x), the Cauchy Principal Value, defined as the limit of the integral from -a to a as a goes to infty, may exist (for 1/x it is zero.)