
Interesting Mathematical Paradoxes


sirio11
03-06-2006, 05:45 AM
Please post a nice mathematical paradox that you know of.

I like this one:

Let S = 1 - 1 + 1 - 1 + 1 - 1 + 1 - 1 + ................

then S = 1 - (1 - 1 + 1 - 1 + 1 - 1 + ................)

so, S = 1 - S

therefore S + S = 1 , 2S = 1

therefore S = 1/2

so 1 - 1 + 1 - 1 + 1 - 1 + 1 - 1 + ..... = 1/2

DougShrapnel
03-06-2006, 06:08 AM
If we evaluate S by stopping after a random number of operations and averaging, the average would be .5.
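That averaging idea is the Cesàro mean. A quick Python sketch (my own illustration of the point, with an arbitrary number of terms):

```python
# Partial sums of 1 - 1 + 1 - 1 + ... oscillate 1, 0, 1, 0, ...;
# their running averages (Cesaro means) settle at 1/2.
partials, s = [], 0
for k in range(1000):
    s += 1 if k % 2 == 0 else -1
    partials.append(s)

cesaro_means = [sum(partials[:n]) / n for n in (10, 100, 1000)]
print(cesaro_means)  # [0.5, 0.5, 0.5]
```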

traz
03-06-2006, 06:23 AM
There's a similar one that goes like this:

0 = (1 - 1) + (1 - 1) + (1 - 1) + . . .

expand...
0= 1 - 1 + 1 - 1 + 1 - 1 +...

rearrange...
0= 1 + (-1 + 1) + (-1 + 1) + (-1 + 1) + . . .

simplify...
0= 1 + 0 + 0 + 0 + . . .

Therefore, 0 = 1.

ffredd
03-06-2006, 06:45 AM
I like the shooting room paradox...

You bring a person into a room and then roll two dice. If the result is two sixes you shoot him. If the result is anything else you let him go. Then you bring two new people into the room and roll the dice again. If the result is two sixes you shoot them both. If the result is anything else you let them go. Then you bring four new people into the room...

You continue to bring twice as many people into the room as last time until you roll two sixes and are forced to kill everyone in the room. Then you just stop.

What is the probability that a person who is brought into the room is going to get shot? It seems obvious that the answer must be 1/36 until you realize that more than half the people that enter the room will get shot. To be precise, if you shoot n people, only n-1 people have left the room alive, so n/(2n-1) of the people have been shot. This is more than 50% and tends to exactly 1/2 when n goes to infinity. Doesn't this mean that the probability is 1/2?

How worried should this person be?
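A quick way to see both answers side by side is to simulate the process with a cap on the number of rounds (a rough sketch; the cap and episode count are my own arbitrary choices):

```python
import random

def run_episode(max_rounds=30):
    """One shooting-room episode: groups of 1, 2, 4, ... enter until a
    double six is rolled (that whole group is shot) or the cap is hit.
    Returns (people_entered, people_shot)."""
    entered = 0
    for k in range(max_rounds):
        group = 2 ** k
        entered += group
        if random.randint(1, 6) == 6 and random.randint(1, 6) == 6:
            return entered, group
    return entered, 0  # capped episode: the process never stopped

random.seed(2006)
episodes = [run_episode() for _ in range(20000)]

# Viewpoint 1: a FIXED person, the first in line, who always enters.
# They are shot only if round 1 comes up double sixes.
p_first = sum(1 for _, shot in episodes if shot == 1) / len(episodes)

# Viewpoint 2: fraction shot among entrants, per completed episode.
# Each completed episode shoots n of its 2n-1 entrants: always over half.
fracs = [shot / entered for entered, shot in episodes if shot > 0]
p_frac = sum(fracs) / len(fracs)

print(p_first)  # near 1/36 ~ 0.028
print(p_frac)   # above 0.5
```

Both numbers are "right"; they just answer different questions, which is the heart of the paradox.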

evil twin
03-06-2006, 11:28 AM
None of these are mathematical paradoxes. They are examples of errors being made in the maths.

Nottom
03-06-2006, 11:40 AM
[ QUOTE ]
I like the shooting room paradox...

You bring a person into a room and then roll two dice. If the result is two sixes you shoot him. If the result is anything else you let him go. Then you bring two new people into the room and roll the dice again. If the result is two sixes you shoot them both. If the result is anything else you let them go. Then you bring four new people into the room...

You continue to bring twice as many people into the room as last time until you roll two sixes and are forced to kill everyone in the room. Then you just stop.

What is the probability that a person who is brought into the room is going to get shot? It seems obvious that the answer must be 1/36 until you realize that more than half the people that enter the room will get shot. To be precise, if you shoot n people, only n-1 people have left the room alive, so n/(2n-1) of the people have been shot. This is more than 50% and tends to exactly 1/2 when n goes to infinity. Doesn't this mean that the probability is 1/2?

How worried should this person be?

[/ QUOTE ]

This is a pretty bad "paradox". Obviously your chances of being shot if you enter the room are 1/36. Also, if you know n people are going to die, then you know that double 6s will be rolled for group lg(n) + 1. So in addition to all the people who left alive before, all the people who would have had to enter later also get to survive.

A more interesting problem would be something like:
Same basic set-up. Except we have a set number of people (say 1023), and we know that exactly one group of them will be shot. What are your chances of being shot?

Who Shot JR
03-06-2006, 12:01 PM
Are these problems actually paradoxes, or just mathematical sleight of hand?

Let S = 1-1 + 1-1 + 1-1 + 1-1 + ...
[If this pattern repeats infinitely, then S=0]

Then S = 1 - (1-1 + 1-1 + 1-1 + ...) = 1 - S = 1 - 0 = 1
[After rearranging it would seem that S=1]

However, this rearrangement is still actually equal to 0. If each +1 in the series has a corresponding -1 that cancels it so that the entire sum is 0, then there must be an equal (and even in total) number of positive and negative 1's in the series. If we group all but the very first 1 in parentheses, then there is an unequal (and odd in total) number of 1's inside the parentheses; specifically, there will be one more positive 1 than negative 1's.

Consider this example when there are only four 1's.

S = 1-1 + 1-1 = 0
S = 1 - (1-1 + 1) = 1 - (0 + 1) = 1 - 1 = 0

I know the point is that this is an infinite series; however, the infinite series inside the parentheses equals 1, not 0. That is, it equals 1 plus all the 1-1 pairs.

ffredd
03-06-2006, 12:28 PM
[ QUOTE ]
None of these are mathematical paradoxes. They are examples of errors being made in the maths.

[/ QUOTE ]
We know that.

BruceZ
03-06-2006, 12:32 PM
[ QUOTE ]
Are these problems actually paradoxes, or just mathematical sleight of hand?

Let S = 1-1 + 1-1 + 1-1 + 1-1 + ...
[If this pattern repeats infinitely, then S=0]

Then S = 1 - (1-1 + 1-1 + 1-1 + ...) = 1 - S = 1 - 0 = 1
[After rearranging would seem that S=1]

However, this rearrangement is still actually equal to 0.

[/ QUOTE ]

Neither one of these, without parentheses around each pair of 1s, is equal to either 0 or 1 or to any number. The ... means that this is an infinite series, and its sum, if it exists, is defined as the limit of the sequence of partial sums. For the first case, the sequence of partial sums is 1,0,1,0,1,0,... but this sequence does not converge, so the infinite series S does not have a finite sum.

(1-1) + (1-1) + (1-1) + ... = 0 + 0 + 0 + ... does converge to 0.

Who Shot JR
03-06-2006, 12:50 PM
[ QUOTE ]
For the first case, the sequence of partial sums is 1,0,1,0,1,0,...

[/ QUOTE ]

I understand how this doesn't converge if this is indeed the sequence of partial sums. But if it is assumed that for every positive 1 there is a corresponding negative 1 - which I believe is what the problem is implying - and we sum these values, how is 1,0,1,0,1,0,... the sequence of partial sums, and not 0,0,0,0,0,... ?

Since addition is associative, aren't the parentheses meaningless?

BruceZ
03-06-2006, 12:55 PM
[ QUOTE ]
[ QUOTE ]
For the first case, the sequence of partial sums is 1,0,1,0,1,0,...

[/ QUOTE ]

I understand how this doesn't converge if this is indeed the sequence of partial sums. But if it is assumed that for every positive 1 there is a corresponding negative 1 - which I believe is what the problem is implying - and we sum these values, how is 1,0,1,0,1,0,... the sequence of partial sums, and not 0,0,0,0,0,... ?

[/ QUOTE ]

The sequence of partial sums means:

1, 1-1, 1-1+1, 1-1+1-1, ... = 1,0,1,0,1...

where the nth element of the sequence is the sum of the first n terms.

The sequence of partial sums of (1-1) + (1-1) + (1-1) + ... is 0, 0+0, 0+0+0,...

These are two completely different series.
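A few lines of Python make the difference between the two series visible (a throwaway sketch of the definition above):

```python
# The two series distinguished above, as sequences of partial sums:
# the nth element is the sum of the first n terms.
terms_a = [(-1) ** k for k in range(8)]   # 1 - 1 + 1 - 1 + ...
terms_b = [0] * 8                         # (1-1) + (1-1) + ..., each term is 0

def partial_sums(terms):
    out, s = [], 0
    for t in terms:
        s += t
        out.append(s)
    return out

print(partial_sums(terms_a))  # [1, 0, 1, 0, 1, 0, 1, 0] -- oscillates, no limit
print(partial_sums(terms_b))  # [0, 0, 0, 0, 0, 0, 0, 0] -- converges to 0
```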

BruceZ
03-06-2006, 01:36 PM
Not a paradox, but a good logical fallacy:

An announcement is made that a fire drill will occur next week. The announcement states that it will occur between the hours of 9 AM and 5 PM on one of the days Monday through Friday, but you will not know on which day it will occur until it happens.

You reason that it can't happen on Friday, since you will know that when it doesn't occur by 5 PM Thursday. Since Friday is ruled out, the last day it can happen is Thursday. Then you reason that it can't happen on Thursday since you will know that when it doesn't occur by 5 PM Wednesday. Continuing this reasoning, you proceed to rule out each of the days until you are convinced that it can't happen at all and satisfy the claim that you won't know the day in advance.

Yet on one of the days the fire alarm rings out, and you are surprised. What happened?

Who Shot JR
03-06-2006, 03:58 PM
[ QUOTE ]
[ QUOTE ]
[ QUOTE ]
For the first case, the sequence of partial sums is 1,0,1,0,1,0,...

[/ QUOTE ]

I understand how this doesn't converge if this is indeed the sequence of partial sums. But if it is assumed that for every positive 1 there is a corresponding negative 1 - which I believe is what the problem is implying - and we sum these values, how is 1,0,1,0,1,0,... the sequence of partial sums, and not 0,0,0,0,0,... ?

[/ QUOTE ]

The sequence of partial sums means:

1, 1-1, 1-1+1, 1-1+1-1, ... = 1,0,1,0,1...

where the nth element of the sequence is the sum of the first n terms.

The sequence of partial sums of (1-1) + (1-1) + (1-1) + ... is 0, 0+0, 0+0+0,...

These are two completely different series.

[/ QUOTE ]

I see what you are saying, and that makes sense if the series is (-1)^k. The difference in our interpretation is that I see the series as 0 expressed as 1-1 (read: one minus one) repeated. That is, there are exactly as many +1's as there are -1's, so the partial sums are always 0.

Perhaps the problem just isn't defined rigorously enough.

BruceZ
03-06-2006, 04:45 PM
[ QUOTE ]
I see what you are saying, and that makes sense if the series is (-1)^k. The difference in our interpretation is that I see the series as 0 expressed as 1-1 (read: one minus one) repeated.

[/ QUOTE ]

If you write it as (1-1) + (1-1) + (1-1)..., then that IS zero, and the partial sums are zero. If you leave off the parentheses, it means something completely different which does not converge. Without parentheses the partial sums change to reflect the left to right summation, which bounces back and forth between 1 and 0.


[ QUOTE ]
That is, there are exactly as many +1's as there are -1's, so the partial sums are always 0.

[/ QUOTE ]

That is completely false for an infinite series. The sum of an infinite series depends on the order that you sum the terms. For a finite series your statement would be correct by the associative and commutative laws. I told you that the sequence of partial sums is the sequence for which the nth term is the sum of the first n terms. How can you say that this sequence is all zeros when every other element is 1?

Again, the sequence of partial sums for 1 - 1 + 1 - 1 ... is

1,
1 - 1 = 0,
1 - 1 + 1 = 1,
1 - 1 + 1 - 1 = 0,
...


For (1-1) + (1-1) + (1-1) + ... the sequence of partial sums is 0,0,0,0, as you suggest. I told you that this is a different series.


[ QUOTE ]
Perhaps the problem just isn't defined rigorously enough.

[/ QUOTE ]

What I am giving you IS the rigorous definition of this infinite series from real analysis, which is the branch of pure mathematics that forms the basis of calculus.

billygrippo
03-06-2006, 05:15 PM
not a paradox but this always blows my mind:

there are infinite numbers between 1 and 2. there are infinite numbers between 2 and 3. there are 2x infinite numbers between 1 and 3?

ive heard all kinds of explanations of this. not sure which one (if any) is right.

pzhon
03-06-2006, 05:46 PM
[ QUOTE ]
[ QUOTE ]

What is the probability that a person who is brought into the room is going to get shot?

[/ QUOTE ]

This is a pretty bad "paradox". Obviously your chances of being shot if you enter the room are 1/36.

[/ QUOTE ]
It's also obvious that the conditional probability is more than 1/2, which makes it a good paradox.

BruceZ
03-06-2006, 05:49 PM
[ QUOTE ]
The sum of an infinite series depends on the order that you sum the terms.

[/ QUOTE ]

How you group the terms is what I meant to say here. It is also true for some infinite series (not this one) that the sum depends on the order in which you sum the terms, and there are infinite series of real numbers for which it is possible to obtain any real number by simply rearranging the series.
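That rearrangement fact is the Riemann series theorem. As a rough Python illustration (the target value 0.25 and term count are my own arbitrary choices), a greedy rearrangement of the conditionally convergent alternating harmonic series can be steered to any target:

```python
def rearranged_sum(target, n_terms=200000):
    """Greedily rearrange 1 - 1/2 + 1/3 - 1/4 + ...: take the next unused
    positive term while at or below the target, the next negative term
    while above it. The running sum homes in on the target."""
    next_pos, next_neg = 1, 2   # next unused odd / even denominator
    s = 0.0
    for _ in range(n_terms):
        if s <= target:
            s += 1.0 / next_pos
            next_pos += 2
        else:
            s -= 1.0 / next_neg
            next_neg += 2
    return s

print(round(rearranged_sum(0.25), 3))  # ~0.25, though the usual order sums to ln 2 ~ 0.693
```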

threeonefour
03-06-2006, 05:56 PM
[ QUOTE ]
not a paradox but this always blows my mind:

there are infinite numbers between 1 and 2. there are infinite numbers between 2 and 3. there are 2x infinite numbers between 1 and 3?

ive heard all kinds of explinations of this. not sure which one (if any) is right.

[/ QUOTE ]

the above is not right. there ISN'T '2x infinite' between 1 and 3.


here is a really weird (but absolutely true) 'paradox'.


the natural numbers are just the positive integers.

1,2,3,4,5,6,..........


the integers are the naturals plus their additive inverses (the negatives) and zero.


........... -6,-5,-4,-3,-2,-1,0,1,2,3,4,5,6........


the rational numbers are expressed as n/k where n and k are integers and k !=0.



all those sets of numbers are the same size! there are as many natural numbers as there are integers as there are rationals. it makes absolutely no sense if you only use your intuition, but it's all mathematically proven (real proofs, not mathematical voodoo).
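One such real proof is just exhibiting a bijection. Here's a small Python sketch of the standard pairing between the naturals and all the integers (starting the naturals at 0 for convenience, which is my choice, not the post's):

```python
def nat_to_int(n):
    """Pair natural n (0, 1, 2, ...) with an integer: even naturals go to
    0, 1, 2, ... and odd naturals go to -1, -2, -3, ..."""
    return n // 2 if n % 2 == 0 else -(n + 1) // 2

def int_to_nat(z):
    """The inverse pairing, from an integer back to its natural."""
    return 2 * z if z >= 0 else -2 * z - 1

print([nat_to_int(n) for n in range(11)])  # [0, -1, 1, -2, 2, -3, 3, -4, 4, -5, 5]
print(all(int_to_nat(nat_to_int(n)) == n for n in range(1000)))  # True
```

Every natural hits exactly one integer and vice versa, which is what "same size" means for infinite sets.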

BruceZ
03-06-2006, 06:06 PM
[ QUOTE ]
not a paradox but this always blows my mind:

there are infinite numbers between 1 and 2. there are infinite numbers between 2 and 3. there are 2x infinite numbers between 1 and 3?

[/ QUOTE ]

Infinite sets are said to have the same "cardinality" (or size) if their elements can be placed in one-to-one correspondence with each other. The set of real numbers between 1 and 2 can be placed in one-to-one correspondence with the numbers between 2 and 3 by simply adding 1, so these sets have the same cardinality. The real numbers between 1 and 2 can also be placed in one-to-one correspondence with the numbers between 1 and 3 by multiplying by 2 (to get 2 to 4) and subtracting 1, so these sets also have the same cardinality, even though one is a proper subset of the other. In fact, any interval of the real numbers can be shown to have the same cardinality as the entire set of real numbers.

It can be shown that the set of integers can NOT be placed in one-to-one correspondence with the set of real numbers, so these sets do not have the same cardinality. On the other hand, any infinite subset of the integers does have the same cardinality as the entire set of integers, and this includes the positive integers, negative integers, even integers, odd integers, and prime integers. These sets also have the same cardinality as the set of rational numbers.
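The interval correspondence described above can be checked mechanically (a trivial Python sketch; the sample grid is my own choice):

```python
# The map from [1, 2] onto [1, 3]: double, then subtract 1.
f = lambda x: 2 * x - 1          # maps [1, 2] onto [1, 3]
g = lambda y: (y + 1) / 2        # its inverse, mapping [1, 3] back onto [1, 2]

xs = [1 + k / 100 for k in range(101)]                 # sample points in [1, 2]
in_range = all(1 <= f(x) <= 3 for x in xs)             # image lands in [1, 3]
round_trip = all(abs(g(f(x)) - x) < 1e-12 for x in xs) # invertible, so one-to-one
print(in_range, round_trip)  # True True
```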

billygrippo
03-06-2006, 06:12 PM
that's why it had a "?" at the end :)

Sharkey
03-06-2006, 06:23 PM
[ QUOTE ]
An announcement is made that a fire drill will occur next week. The announcement states that it will occur between the hours of 9 AM and 5 PM on one of the days Monday through Friday, but you will not know on which day it will occur until it happens.

[/ QUOTE ]

Is the announcement considered infallible?

holmansf
03-06-2006, 08:39 PM
[ QUOTE ]
There's a similar one that goes like this:

0 = (1 - 1) + (1 - 1) + (1 - 1) + . . .

expand...
0= 1 - 1 + 1 - 1 + 1 - 1 +...

rearrange...
0= 1 + (-1 + 1) + (-1 + 1) + (-1 + 1) + . . .

simplify...
0= 1 + 0 + 0 + 0 + . . .

Therefore, 0 = 1.

[/ QUOTE ]

I like 1 = 0 "proofs," and this is a pretty good one. Of course they range from the blatantly incorrect to the ones that take a while to figure out.

For example, at the bad end of the spectrum we have:

Let x = y. Then x - y = 0, so 1 = (x - y)/(x - y) = 0/(x - y) = 0

Pretty lame. Getting a little better we have the classic one using the magic of complex numbers where i = Sqrt(-1):

-1 = i^2 = Sqrt(-1)*Sqrt(-1) = Sqrt(-1*-1) = Sqrt(1) = 1
Therefore -1 = 1 => 0 = 1.

Another complex one:

1 = e^(2*pi*i). Taking the natural log of both sides gives 0 = 2*pi*i => 1 = 0.

One of my favorites, for those who are familiar with integration by parts try integrating 1/x by parts:

Int(1/x) = x(1/x) + Int(x/x^2) = 1 + Int(1/x)
Subtracting Int(1/x) from both sides gives 1 = 0.
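Python's cmath makes it easy to watch the two complex "proofs" break (a small sketch; the tolerances are arbitrary). Both sqrt and log return a principal branch, which is exactly where the identities fail:

```python
import cmath

# Sqrt(-1)*Sqrt(-1) vs Sqrt(-1 * -1): sqrt(a)*sqrt(b) = sqrt(a*b)
# is NOT valid for negative/complex inputs.
lhs = cmath.sqrt(-1) * cmath.sqrt(-1)   # i * i
rhs = cmath.sqrt((-1) * (-1))           # sqrt(1)
print(lhs, rhs)  # roughly -1 versus 1: the identity silently failed

# Same story for log: exp(2*pi*i) == 1, but the principal log of 1 is 0,
# not 2*pi*i, so "taking the log of both sides" is the illegal step.
z = cmath.log(cmath.exp(2j * cmath.pi))
print(abs(z) < 1e-9)  # True
```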

TomCollins
03-06-2006, 09:12 PM
[ QUOTE ]
[ QUOTE ]
An announcement is made that a fire drill will occur next week. The announcement states that it will occur between the hours of 9 AM and 5 PM on one of the days Monday through Friday, but you will not know on which day it will occur until it happens.

[/ QUOTE ]


Is the announcement considered infallible?

[/ QUOTE ]


The announcement is a paradox in itself. It is impossible to be uncertain of which day it will happen.

madnak
03-06-2006, 09:19 PM
[ QUOTE ]
The announcement is a paradox in itself. It is impossible to be uncertain of which day it will happen.

[/ QUOTE ]

It's impossible to be certain of being uncertain. But it's not impossible to be uncertain.

ffredd
03-06-2006, 10:16 PM
About the shooting room paradox, described above...

[ QUOTE ]
This is a pretty bad "paradox".

[/ QUOTE ]
I think you're just missing the point. I will try to explain it.

[ QUOTE ]
Also if you know n people are going to die then you know that double 6s will be rolled for the lg n-th group. So in addition to all the people who left alive before, all the people who would have to enter later also get to survive.


[/ QUOTE ]
We obviously don't know what n is in advance, and we're not concerned about the people who might have to enter later. We only care about the probability that a person who has already entered the room will get shot.

[ QUOTE ]
Obviously your chances of being shot if you enter the room are 1/36.

[/ QUOTE ]
I agree, but I think you're dismissing the argument for a probability of at least 50% a little too easily.

Consider this problem: 52 people are each given one card from a randomly shuffled deck of cards. You are one of those people and you haven't looked at your card yet. What is the probability that your card is an ace?

This problem is easy to solve. You know that 52 cards have been dealt and that 4 of them are aces. To get the probability you just divide 4 (the number of cards with the property of "being an ace") with 52 (the total number of cards) and get the answer 1/13.

This method of calculating the probability is obviously correct, right? And it's not a method that only applies to playing cards. Whenever we need to know the probability that an object chosen at random, from some group of objects, has a certain property, all we need to do is to divide the number of objects that have that property by the total number of objects in the group.

Now consider the thoughts of a person who just entered the shooting room. Let's assume that he and everyone else who enters the shooting room are blindfolded, so they have no way of knowing how many others were brought in with them.

I know that n people will eventually end up being shot, for some value of n. When this happens, 2n-1 people will have entered the room. n of them will have the property of "being dead". Hence the probability that I am one of them is n/(2n-1) = 1/(2-1/n) > 50%.

This argument is much too strong to be dismissed without motivation. Do you see what's wrong with it?

I actually don't know the complete answer myself, but I'm convinced that it has to have something to do with the fact that there's no upper bound on how big the number n can be.

gumpzilla
03-06-2006, 11:52 PM
I thought this one was cute the first time I saw it.

x^2 = x + x + x . . . + x ; x x's on the right hand side

Differentiating both sides,

2x = 1 + 1 + 1 . . . + 1

2x = x

2 = 1.

Nottom
03-07-2006, 12:09 AM
[ QUOTE ]
I know that n people will eventually end up being shot, for some value of n. When this happens, 2n-1 people will have entered the room. n of them will have the property of "being dead". Hence the probability that I am one of them is n/(2n-1) = 1/(2-1/n) > 50%.

This argument is much too strong to be dismissed without motivation. Do you see what's wrong with it?

I actually don't know the complete answer myself, but I'm convinced that it has to have something to do with the fact that there's no upper bound on how big the number n can be.

[/ QUOTE ]

The problem is it doesn't count the people that never enter the room.

Say there are 5 billion people waiting. Eventually N of them are shot, but before then about N of them survived, and assuming this wasn't the last possible group, there are at least 2N (and likely many, many more) left alive outside.

BonusPros
03-07-2006, 01:18 AM
i like math

ffredd
03-07-2006, 05:26 AM
[ QUOTE ]
The problem is it doesn't count the people that never enter the room.

Say there are 5 billion people waiting. Eventually N of them are shot, but before then about N of them survived, and assuming this wasn't the last possible group, there are at least 2N (and likely many, many more) left alive outside.

[/ QUOTE ]
No, that's not it. If you count those 5 billion people, that still doesn't make [number of people with the property of being dead]/[total number of people] = 1/36. Besides, 5 billion people aren't enough. You need a countably infinite number of people, since there is no upper bound on how many times you may have to roll the dice. The infinity is the problem. I just don't see exactly how.

If you don't like the idea of an infinite supply of people imagine that we're talking about e.g. small pieces of iron that we reshape into cubes when we bring them into the room, and that we "kill" them by reshaping them into spheres. What is the probability that a cube in the shooting room will be reshaped into a sphere?

It's still 1/36 of course, but how does that not contradict that [number of spheres]/[total number of cubes and spheres] > 50%?

Isura
03-07-2006, 11:02 AM
Banach-Tarski Paradox:
A solid ball can be decomposed and reassembled into two balls the same size as the original.

Nottom
03-07-2006, 11:57 AM
[ QUOTE ]
[ QUOTE ]
The problem is it doesn't count the people that never enter the room.

Say there are 5 Billion people waiting, eventually N of them are shot, but before then about N of them survived and assuming this wasn't the last group then there are at least 2N (and likely many many more) left alive outside.

[/ QUOTE ]
No, that's not it. If you count those 5 billion people, that still doesn't make [number of people with the property of being dead]/[total number of people] = 1/36. Besides, 5 billion people aren't enough. You need a countably infinite number of people, since there is no upper bound on how many times you may have to roll the dice. The infinity is the problem. I just don't see exactly how.

If you don't like the idea of an infinite supply of people imagine that we're talking about e.g. small pieces of iron that we reshape into cubes when we bring them into the room, and that we "kill" them by reshaping them into spheres. What is the probability that a cube in the shooting room will be reshaped into a sphere?

It's still 1/36 of course, but how does that not contradict that [number of spheres]/[total number of cubes and spheres] > 50%?

[/ QUOTE ]

My thoughts are that

a) number of dead people (often significantly) < 50% of total people.
b) this number has nothing to do with 1/36.

If I were sitting in the room waiting to get shot or not, the only thing I know is that IF I go into the room I have a 1/36 chance of being shot. But since I don't have a 100% chance of entering the room (unless I am the first person in line), my actual chance of dying is actually less than 1/36.

By doubling the number of people each time you are essentially doing the same thing a gambler does when using a martingale system, but I see no real paradox in the situation.

ffredd
03-07-2006, 02:38 PM
[ QUOTE ]
since I don't have a 100% chance of entering the room (unless I am the first person in line) my actual chance of dying is actually less than 1/36.


[/ QUOTE ]
The problem doesn't ask for the probability that a person who hasn't been brought into the room will get shot, so that's not relevant.

By the way, that probability isn't just "less than 1/36". It is exactly zero, since only a finite number of people will get shot and an infinite number of people will never enter the room. (Yes, probabilities get weird when infinities are involved).

The problem asks for the probability that a person who has entered the room will die.

[ QUOTE ]
IF I go into the room I have a 1/36 chance of being shot.

[/ QUOTE ]
There is a very strong argument for that, but there's also a very strong argument that says the probability is at least 50%. You haven't explained why that argument is wrong.

[ QUOTE ]
a) number of dead people (often significantly) < 50% of total people.


[/ QUOTE ]
The number of dead people is 0% of the total until the moment that n people get shot. Then the number of dead people is 1/(2-1/n) of the total, if you only count people who have entered the room. But it's exactly 0% of the total if you also count everyone who hasn't. Neither of these numbers is equal to 1/36 (unless n=-1/34, which it clearly can't be).

And why would you include people who haven't even entered the room? That's like including playing cards that haven't been dealt in the calculation of the probability that your card is an ace in the easy problem I described above. Suppose I destroy one playing card that hasn't been dealt. This obviously can't change the probability that your card is an ace, but if you include all cards that exist in the calculation, the result is no longer exactly 1/13. It's obvious that in the easy card problem, the "total" to be used in the calculation is the 52 cards that have been dealt, not every card in existence.


[ QUOTE ]
I see no real paradox in the situation.


[/ QUOTE ]
Every person who enters the room has a 1/36 chance of getting shot, and yet most of them will get shot. You don't think that's even a little bit weird?!

fluorescenthippo
03-07-2006, 03:05 PM
[ QUOTE ]
I thought this one was cute the first time I saw it.

x^2 = x + x + x . . . + x ; x x's on the right hand side

Differentiating both sides,

2x = 1 + 1 + 1 . . . + 1

2x = x

2 = 1.

[/ QUOTE ]


so how can this be true if it's not true? I'm missing the flaw in this math

BruceZ
03-07-2006, 03:12 PM
[ QUOTE ]
[ QUOTE ]
I thought this one was cute the first time I saw it.

x^2 = x + x + x . . . + x ; x x's on the right hand side

Differentiating both sides,

2x = 1 + 1 + 1 . . . + 1

2x = x

2 = 1.

[/ QUOTE ]


so how can this be true if it's not true? I'm missing the flaw in this math

[/ QUOTE ]

x^2 = x + x + x . . . + x

is only true for integer x. These are different functions which are only equal for integer x, so their derivatives do not have to be equal. The function on the right is defined only for integer x, so it is not even differentiable.
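A quick numeric check of this point (a sketch; the sample point x = 5 is an arbitrary choice): the "paradox" differentiates the terms of x + x + ... + x but forgets that the NUMBER of terms also varies with x. Writing the sum as count(x) * x, the product rule gives x + x = 2x.

```python
def numeric_derivative(f, x, h=1e-6):
    """Central finite-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

both_vary = numeric_derivative(lambda t: t * t, 5.0)       # x*x honestly: ~2x
count_frozen = numeric_derivative(lambda t: 5.0 * t, 5.0)  # term count pinned at 5: ~x
print(round(both_vary), round(count_frozen))  # 10 5
```

The missing x in the bogus derivation is exactly the contribution from the changing term count.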

Nottom
03-07-2006, 05:40 PM
[ QUOTE ]
Every person who enters the room has a 1/36 chance of getting shot, and yet most of them will get shot. You don't think that's even a little bit weird?!

[/ QUOTE ]

No, but maybe it's just because I understand what is happening.

Like I said, the best comparison is a martingale progression. Given infinite time and money along with no max bet, you will eventually win your 1 bet despite the casino having an advantage. I wouldn't call this a paradox, yet it is basically the exact same thing.
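A rough simulation of a capped martingale shows exactly this tension (a sketch; the 18/38 win probability is red at double-zero roulette and the 10-doubling cap stands in for a finite bankroll, both my own choices): you almost always win your 1 bet, yet the EV stays negative.

```python
import random

def martingale(p_win=18/38, max_doublings=10):
    """One martingale run on a wager with a house edge: double the stake
    after each loss, stop on a win or when the cap forces a quit.
    Returns the net result in units."""
    stake, lost = 1, 0
    for _ in range(max_doublings):
        if random.random() < p_win:
            return stake - lost   # a win always nets exactly +1
        lost += stake
        stake *= 2
    return -lost                  # busted: down 2**max_doublings - 1 units

random.seed(42)
results = [martingale() for _ in range(200000)]
win_rate = sum(r > 0 for r in results) / len(results)
ev = sum(results) / len(results)
print(win_rate)  # nearly 1
print(ev)        # negative: the rare bust loses 1023 units at once
```

Remove the cap (infinite bankroll) and the bust probability goes to 0, which is the infinite case being argued about above.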

felson
03-07-2006, 06:15 PM
[ QUOTE ]
Not a paradox, but a good logical fallacy:

An announcement is made that a fire drill will occur next week. The announcement states that it will occur between the hours of 9 AM and 5 PM on one of the days Monday through Friday, but you will not know on which day it will occur until it happens.

You reason that it can't happen on Friday, since you will know that when it doesn't occur by 5 PM Thursday. Since Friday is ruled out, the last day it can happen is Thursday. Then you reason that it can't happen on Thursday since you will know that when it doesn't occur by 5 PM Wednesday. Continuing this reasoning, you proceed to rule out each of the days until you are convinced that it can't happen at all and satisfy the claim that you won't know the day in advance.

Yet on one of the days the fire alarm rings out, and you are surprised. What happened?

[/ QUOTE ]

You would have been surprised even if the fire alarm rang on Friday, since you have already ruled out that possibility. So the entire line of reasoning is in error. I think the fallacy lies in believing that the "surprise" can be guaranteed by the authority. Strangely, it does happen, but I don't think it can be logically guaranteed.

felson
03-07-2006, 06:18 PM
Here's an excellent paradox that was discussed in a game theory book I read. It takes some explanation so I will just post a link.

Newcomb's problem (http://en.wikipedia.org/wiki/Newcomb%27s_problem)

madnak
03-07-2006, 06:23 PM
[ QUOTE ]
No, but maybe it's just because I understand what is happening.

Like I said, the best comparison is a martingale progression. Given infinite time and money along with no max bet, you will eventually win your 1 bet despite the casino having an advantage. I wouldn't call this a paradox, yet it is basically the exact same thing.

[/ QUOTE ]

Every wager you make is -EV. But the sum of your wagers is +EV. I've never understood this about an infinite Martingale progression. How can it be +EV when every wager is -EV? Given any finite number of wagers you have a -EV situation; what changes with infinity? Is it just the monkey/typewriter effect in action?

I understand the basis is that the diminishing chance of losing as your bankroll goes up approaches 0. And since the probability of losing is 0, you will win. But it doesn't seem intuitive that infinity changes a clearly -EV situation into a clearly +EV situation. Then again, it doesn't seem intuitive that the proportion of numbers that don't include the digit 3 is 0, since I can come up with distinct examples of such numbers. Infinity really destroys the results of finite thinking.

_brady_
03-07-2006, 06:55 PM
[ QUOTE ]

I know that n people will eventually end up being shot, for some value of n. When this happens, 2n-1 people will have entered the room. n of them will have the property of "being dead". Hence the probability that I am one of them is n/(2n-1) = 1/(2-1/n) > 50%.

This argument is much too strong to be dismissed without motivation. Do you see what's wrong with it?

I actually don't know the complete answer myself, but I'm convinced that it has to have something to do with the fact that there's no upper bound on how big the number n can be.

[/ QUOTE ]

Maybe my thinking is wrong, but here it goes:

If you don't know what group you'll be placed in, and assuming that we have unlimited people to do this with, then I think at the beginning your chance of dying is 50%. However, once you are in the room your chance of dying is 1/36. So I don't think 1/36 is the number you want to use.

Since we know that one group of people has to die eventually then the dice are basically irrelevant, and we should think of it more like this:

Let's say every person on earth is assigned to a group in sizes of 1+2+4+8+16+... until everyone has a group, and one group is randomly chosen to die. If you know that you are the lucky person in your own group then your chance of dying is 1/n where n = Total Groups. However, if you don't know what group you are in then your chance of dying is much much higher, and I think it would be 50%. Not exactly sure right now how to prove that mathematically.

I hope it makes sense what I'm trying to get across. I always have a hard time making sense when I try to explain what I'm thinking when it comes to math.

Edited because n=Total Population was wrong; it's n=Total number of groups.

Ah! Now I'm thinking it wouldn't be 50%, but still greater than 1/36. This has got my head spinning! I think I'm on the right path though with regards to completely disregarding the dice.

Double Down
03-07-2006, 06:55 PM
OK, so I think I have found a couple of flaws in the logic. First of all, you're trying to figure out the chances of a person being shot. Well, as opposed to the card example, the number of people in this case IS infinite; just because it stops after a certain point doesn't mean that those who would've gone next are not in consideration. They are.

Every roll of the dice is independent, and it does not matter how many people are in the room with you or have been in the past or in the future. Right now, you only have a 1/36 chance of being shot.

As long as you might need an infinite number of people to complete this task, they need to be considered, even if they end up not going into the room. So YES, more than half of the people who eventually enter the room will be shot, but the ratio of people who are shot compared to all people involved is much less.

In other words, when you walk into the room, it is illogical to tell yourself that you have a better than 50% chance of being shot just because more than half of those who walk into the room are shot. You have to reason: wait a minute, what about the people who come in after me?

Actually, yes, the Martingale system is a perfect example of this logical flaw.

housenuts
03-07-2006, 07:30 PM
how about this classic:

You and 2 friends go to a hotel. The man at the front desk charges you $30, so you each pay $10. The front desk then realizes he should have only charged you $25, so he sends the bellboy up to give you a $5 refund. Instead, the bellboy gives you each $1 back and keeps $2 for himself.

So now you've each paid $9 ($9 x 3 = $27) and the bellboy kept $2, where is the extra $1 to make the original $30?

_brady_
03-07-2006, 07:47 PM
[ QUOTE ]
how about this classic:

You and 2 friends go to a hotel. The man at the front desk charges you $30, so you each pay $10. The front desk then realizes he should have only charged you $25, so he sends the bellboy up to give you a $5 refund. Instead, the bellboy gives you each $1 back and keeps $2 for himself.

So now you've each paid $9 ($9 x 3 = $27) and the bellboy kept $2, where is the extra $1 to make the original $30?

[/ QUOTE ]

The math is done at the wrong time. The guests' $27 already includes the bellboy's $2, so adding the $2 to the $27 counts it twice:

- $30 paid ($10 each)
- $5 returned, so $25 paid to the hotel ($8.33 each)
- they each take a dollar back and effectively split the bellboy's $2 tip three ways ($2/3 is about $0.67)
- they each paid $8.33 + $0.67 = $9, and 9 * 3 = $27 = $25 (hotel) + $2 (bellboy); add the $3 refunded and you're back at the original $30
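The same bookkeeping as a quick sketch (variable names are mine), making both consistent sums explicit:

```python
paid_per_guest = 10          # $30 collected up front, $10 each
refund_per_guest = 1         # the bellboy returns $1 to each guest
net_paid_each = paid_per_guest - refund_per_guest   # $9

hotel_keeps = 25
bellboy_keeps = 2
refunded_total = 3

# The $27 the guests are out is the hotel's $25 plus the bellboy's $2...
assert net_paid_each * 3 == hotel_keeps + bellboy_keeps == 27
# ...so the consistent total is $27 paid + $3 refunded = the original $30.
assert net_paid_each * 3 + refunded_total == 30
print("no dollar is missing")
```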

madnak
03-07-2006, 07:48 PM
But this doesn't address the problem that on entering the room, your chance of death is 1/36. However, more than 50% of those who enter will die.

It is essentially identical to a Martingale system in which you have a ~2.8% chance to double your investment and a ~97.2% chance to lose it. Despite the massive -EV nature of each transaction (only a 1/18 payoff), it is still a +EV situation overall (more than a 1:1 payoff ratio).

Incidentally, if you're part of the infinite pool of people, your chance to die will be exactly 0. No matter how many people are actually killed. Even if you are one of the people who are killed, the probability of your having been such will have been 0.

I think the idea has something to do with the fact that if your bankroll is infinite, winning a bet won't change your bankroll. Neither will losing a bet. By definition, no matter what happens your bankroll remains unchanged. Yet, over the course of infinite bets made according to a Martingale progression, those bets will have a positive outcome. (Any finite number of bets will have a negative outcome)
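A hedged sketch of the Martingale comparison (the 18/38 win probability is an American-roulette assumption of mine, not from the thread): with any finite bankroll the progression wins one unit almost every session, yet the average result is negative. Only an infinite bankroll removes the bust branch.

```python
import random

def martingale_session(p_win=18 / 38, bankroll=1000, base_bet=1):
    """Double the bet after each loss until one win (net +base_bet),
    or until the next doubled bet no longer fits in the bankroll."""
    bet = base_bet
    balance = bankroll
    while bet <= balance:
        if random.random() < p_win:
            return balance + bet - bankroll   # the win recoups everything
        balance -= bet
        bet *= 2
    return balance - bankroll                 # the progression busted

random.seed(7)
results = [martingale_session() for _ in range(100_000)]
win_rate = sum(1 for r in results if r > 0) / len(results)

print(win_rate)                     # nearly every session wins one unit
print(sum(results) / len(results))  # but the mean result is negative
```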

ffredd
03-07-2006, 08:28 PM
[ QUOTE ]
Every roll of the dice is independent, and it does not matter how many people are in the room with you or have been in the past or in the future. Right now, you only have a 1/36 chance of being shot.


[/ QUOTE ]
This is the part we all agree on.

[ QUOTE ]
As long as you might need an infinite number of people to complete this task, they need to be considered, even if they end up not going into the room. So YES, more than half of the people who eventually enter the room will be shot, but the ratio of people who are shot compared to all people involved is much less.


[/ QUOTE ]
To be more precise, it is exactly zero.

[ QUOTE ]
In other words, when you walk into the room, it is illogical to say to yourself that you have a better than 50% chance of being shot just because more than half of those who walk in the room are shot. You have to reason that wait a minute, what about the people who come in after me?


[/ QUOTE ]
Is it any more logical to conclude that there's exactly a 0% chance of getting shot? Because that's the result your reasoning just produced. /images/graemlins/smile.gif

I think a lot of people get confused by thinking about people who never enter the room. I've spent a lot of time thinking about this too. That's why I suggested a modified version of this problem in a previous post, with iron cubes instead of people. These cubes aren't created until they are brought into the room. (Imagine that when it's time to bring a new group into the room, we cut the necessary number of pieces from an infinitely long block of iron, bring them into the room, and finally shape them into cubes before we roll the dice.) Instead of getting shot, they get reshaped into spheres. After the experiment is over, there are n spheres and n-1 cubes, and there are NO pieces of iron that never entered the room.

ButTheyWereSuitd
03-07-2006, 08:38 PM
A friend asked me this. I'm sure there is a logical explanation, but I don't know what it is.

If you want to walk a distance of 100', you must first walk 50', but before that you must cover 25', and so on. Obviously it can be done in a set amount of time, but I would like to see the logical reasoning behind this.

Does it have something to do with our perception of time?

(I hope this isn't too lame for my first post.)

ffredd
03-07-2006, 08:44 PM
[ QUOTE ]
Like I said, the best comparison is a Martingale progression. Given infinite time and money along with no max bet, you will eventually win your 1 bet despite the casino having an advantage. I wouldn't call this a paradox, yet it is basically the exact same thing.

[/ QUOTE ]
Yes, I thought about this too a couple of years ago when I first heard of the shooting room paradox, and it made me feel a little bit more comfortable, even though I still didn't see how the paradox could be resolved. The analogy with the Martingale system (I had no idea that's what it's called) is what made me think that the solution must be related to the fact that there's an infinity involved in this.

madnak
03-07-2006, 08:45 PM
But there must be an infinite number of potential spheres in order for the scenario to work.

Nottom
03-08-2006, 12:16 AM
[ QUOTE ]
A friend asked me this. I'm sure there is a logical explanation, but I don't know what it is

If you want to walk a distance of 100', you must first walk 50', but before that you must cover 25' and so on. Obviously it can be done in a set amount of time, but I would like to see the logical reasoning behind this.

Does it have something to do with our perception of time?

(I hope this isn't too lame for my first post.)

[/ QUOTE ]

This is a classic example of an infinite series. (http://en.wikipedia.org/wiki/Infinite_Series)
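As a small sketch of that series (the 5 ft/s walking speed is an arbitrary assumption of mine): the infinitely many half-distances sum to the full 100 feet, and at constant speed the infinitely many half-times sum to a finite total time in exactly the same way.

```python
total_distance = 100.0   # feet
covered = 0.0
remaining = total_distance
for _ in range(50):      # after 50 halvings the gap is below 1e-13 ft
    step = remaining / 2 # cover half of what's left: 50, 25, 12.5, ...
    covered += step
    remaining -= step

print(covered)           # converges toward 100.0

speed = 5.0                          # ft/s (assumed constant)
total_time = total_distance / speed
print(total_time)                    # 20.0 -- finite, despite the
                                     # infinitely many sub-walks
```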

holmansf
03-08-2006, 04:41 PM
[ QUOTE ]
Banach-Tarski Paradox:
A solid ball can be decomposed and reassembled into two balls the same size as the original.

[/ QUOTE ]

Yet another reason not to believe the Axiom of Choice /images/graemlins/wink.gif

pzhon
03-08-2006, 05:16 PM
[ QUOTE ]
Banach-Tarski Paradox:
A solid ball can be decomposed and reassembled into two balls the same size as the original.

[/ QUOTE ]
... using finitely many pieces, with each piece moved rigidly.

Dazarath
03-10-2006, 07:49 AM
[ QUOTE ]
I like the shooting room paradox...

You bring a person into a room and then roll two dice. If the result is two sixes you shoot him. If the result is anything else you let him go. Then you bring two new people into the room and roll the dice again. If the result is two sixes you shoot them both. If the result is anything else you let them go. Then you bring four new people into the room...

You continue to bring twice as many people into the room as last time until you roll two sixes and are forced to kill everyone in the room. Then you just stop.

What is the probability that a person who is brought into the room is going to get shot? It seems obvious that the answer must be 1/36 until you realize that more than half the people that enter the room will get shot. To be precise, if you shoot n people, only n-1 people have left the room alive, so n/(2n-1) of the people have been shot. This is more than 50% and tends to exactly 1/2 when n goes to infinity. Doesn't this mean that the probability is 1/2?

How worried should this person be?

[/ QUOTE ]

Someone already pointed it out, but your 1/2 isn't the probability of a person being shot once they've entered the room. I think we can all agree that for any particular person who enters the room, the chance of being shot is 1/36. Now, you could also ask the question, "what is the chance of a given person being shot at all?" Assuming an infinite pool of people, the answer is 0, for reasons already stated earlier by someone else. The 1/2 is basically (the number of people shot)/(the number of people who entered the room), something that can only be computed in retrospect. So you're really asking, "of the people who will have entered the room, what's the probability that a given one was shot?" That's a different question from the one a person walking into the room should care about; you're matching the right number to the wrong question.