
Something about probability




mike_g
03-08-2008, 09:05 AM
I was talking to this guy that loves gambling the other day, and he had this theory about how to win lots of money.

Using a roulette example (ignoring 0 for simplicity's sake), he said that each time one colour follows a colour of the same type, the probability of the ball landing on the opposing colour doubles. E.g. 3 blacks in a row means that on the next spin there is supposedly only a 1 in 8 chance of another black, so by betting red you would win.

I can't see how that would work. The way I see it, you would still have a 1 in 2 chance (or perhaps only a tiny fraction less), as each spin is independent of whatever happened before the moment you place the bet.

His argument was that it was due to the law of averages, which to him meant that he was right. AFAIK the law of averages was made up to describe the behaviour of atoms when there was no accurate way of measuring it, and it only really applies over very large numbers of trials.

Anyone know what's right here?

vart
03-08-2008, 09:11 AM
The right answer is: each event does not depend on the previous one, so red is still just as probable as black...

Things change when you talk about cards, however - each card removed from the deck changes the probabilities of the cards left... of course not in as drastic a way as he wants to believe...

mike_g
03-08-2008, 09:18 AM
Ok, cool. I was pretty sure I must have been right. Shame I couldn't seem to convince the dude though. I guess he's going to have to keep paying his stupid people's tax then.

vart
03-08-2008, 09:33 AM
lottary is just another tax for people that do not know mathematics

abh!shek
03-08-2008, 09:41 AM
lottary is just another tax for people that do not know mathematics

That's a good quote :)

whiteflags
03-08-2008, 10:44 AM
Interestingly, I heard that there are more permutations resulting from shuffling Western playing cards than there are stars in the universe.

CornedBee
03-08-2008, 10:57 AM
Good quote indeed, but it's "lottery".

"Dice have no memory."

You can try to explain it to your friend this way: if I flip a coin 10 times, probability says that the most likely outcome is 5 heads and 5 tails. The probability of the same result on all ten throws is 1:2^9 == 1:512; the probability of specifically heads or tails on all ten throws is 1:2^10 == 1:1024.
If I throw 9 heads, what's the probability that the tenth throw is tails? Your friend would think it's higher, but it's not. The probability is 1:2, just as for any individual throw. The thing is, if the tenth throw is heads, I've thrown 10 heads in a series, with a probability of 1:1024, BUT as I'm about to throw it for the tenth time, I've already thrown 9 heads, with a probability of 1:512. That was the hard part, so to say.

Or to put it another way: there are 1024 possible series of heads or tails for 10 throws, like hhhhhhhhhh, hthhttthtt, or something like that. Every specific series has a probability of 1:1024. In this sense, the series hhhhhhhhht (nine heads followed by a tail) is just as unlikely as hhhhhhhhhh (ten heads). Therefore, if we've reached the nine heads, the last throw must be equally likely heads or tails, because otherwise hhhhhhhhht would be more likely than hhhhhhhhhh.

Or: throwing one tail and 9 heads, regardless of order, has a probability of 10:1024 (there are ten possible series of throws that contain exactly one tail, each with a probability of 1:1024). However, having that tail land exactly in the last place is exactly as unlikely as throwing only heads.



Roulette is the same, except for the 0, which guarantees that the bank always wins in the long run.
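
For anyone who'd rather run the numbers than argue about them, here is a minimal simulation sketch in C++ (the trial count and seed are arbitrary, picked just for illustration): it generates coin flips, keeps the series that start with nine heads, and checks how often the tenth flip comes up tails.

// Sketch: condition on nine heads in a row and check the tenth flip.
#include <iostream>
#include <random>

int main()
{
    std::mt19937 rng(12345);                  // fixed seed, arbitrary
    std::bernoulli_distribution coin(0.5);    // true = heads

    long long nineHeadSeries = 0, tenthWasTails = 0;

    for (long long trial = 0; trial < 10000000; ++trial)
    {
        int heads = 0;
        for (int i = 0; i < 9; ++i)
            if (coin(rng)) ++heads;

        if (heads == 9)                        // the first nine flips were all heads
        {
            ++nineHeadSeries;
            if (!coin(rng)) ++tenthWasTails;   // the tenth flip
        }
    }

    std::cout << "Series starting with 9 heads: " << nineHeadSeries << '\n'
              << "Fraction tails on the 10th:   "
              << double(tenthWasTails) / nineHeadSeries << '\n';
    // Prints a fraction around 0.5 -- the streak buys you nothing.
}

About one series in 512 starts with nine heads, and of those the tenth flip is tails roughly half the time, exactly as the 1:2 argument above says.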

Oysterman
03-08-2008, 11:05 AM
Interestingly, I heard that there are more permutations resulting from shuffling Western playing cards than there are stars in the universe.
52! is about 8 x 10^67, so if there are fewer stars than that in the universe, it would indeed be correct.


Roulette is the same, except for the 0, which guarantees that the bank always wins in the long run.
Not to mention that many casinos have roulette wheels with a 00 field, as well.

ting
03-08-2008, 11:39 AM
he had this theory about how to win lots of money

Can you ask the smart dude how much he has actually won using this method?

He is right about one thing: if the first 9 spins all come up red/even, the tenth spin is more likely to be black/odd. If the payout is x2 and he has put 1 dollar on each spin, he is very likely to win 2 dollars on the tenth spin, but he has already lost 9 dollars to get there.

And he forgot about the 00, where the house automatically wins.

Tell the smart dude to play blackjack instead; at least a blackjack pays 1.5x the stake, and the player has some control to bring the odds close to even against the house.

--TING

vart
03-08-2008, 11:46 AM
Can you ask the smart dude how much he has actually won using this method?

He is right about one thing: if the first 9 spins all come up red/even, the tenth spin is more likely to be black/odd.
Even if you'd like it to be so, the chance is the same.

ting
03-08-2008, 01:04 PM
I think I get what the smart dude means: he has to sit beside the roulette table and wait for, say, 5 odds to show up, then start betting on even, hoping the 6th spin comes up even because it's supposedly more probable according to the "law of averages".

Why does it sound logical and illogical at the same time? I learned probability before, but I want to disprove this smart dude's "law of averages" using plain logic and reasoning. Besides, I never believed probability theory until I studied it myself. The smart dude probably won't be convinced by probability alone.

I think the problem is that he forgets that averages refer not just to chance but also to the pattern; if his theory were true, then in a 10-spin bet, if all of the first five spins come up even, the next 5 spins must all come up odd to make up for the lost probability in the first 5. That sounds ridiculous, but it's what the smart dude's "law of averages" dictates.
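
Here is a rough sketch of that system in C++, just to see what it actually does (an American wheel with 18 red, 18 black and 2 green pockets; the streak length, spin count and seed are arbitrary): wait for five reds in a row, then bet black on the next spin, and count how those bets do.

// Sketch: "wait for a streak, then bet the other colour" on a 38-pocket wheel.
#include <iostream>
#include <random>

int main()
{
    std::mt19937 rng(7);                                // fixed seed, arbitrary
    std::uniform_int_distribution<int> pocket(0, 37);   // 0-17 red, 18-35 black, 36-37 green

    auto isRed   = [](int p) { return p < 18; };
    auto isBlack = [](int p) { return p >= 18 && p < 36; };

    long long betsPlaced = 0, betsWon = 0;
    int redStreak = 0;

    for (long long spin = 0; spin < 10000000; ++spin)
    {
        int p = pocket(rng);
        if (redStreak >= 5)                  // the system says: bet black now
        {
            ++betsPlaced;
            if (isBlack(p)) ++betsWon;
        }
        redStreak = isRed(p) ? redStreak + 1 : 0;
    }

    std::cout << "Bets placed after 5 reds: " << betsPlaced << '\n'
              << "Fraction of bets won:     "
              << double(betsWon) / betsPlaced << '\n';
    // Comes out near 18/38 (about 0.47) -- the same as betting black at random.
}

The bet placed after a streak wins at the same rate as any other black bet, so waiting for streaks changes nothing except how long you sit at the table.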

--TING

DavidP
03-08-2008, 01:32 PM
The human mind wants to believe that in a 50:50 situation, the more often you get one outcome, the higher the chance of getting the opposite outcome the next time around.

That is false, however. There is always a 50:50 chance on every iteration. It is just our nature to want to believe otherwise.



It is true, however, that although you always have a 50:50 chance of getting one or the other - you have less of a chance of getting a long string of the same thing.

For example, take a coin. If you have a 50% chance of getting heads, and you flip the coin 5 times, then you have a:

0.5 * 0.5 * 0.5 * 0.5 * 0.5 = 0.03125

3.125% chance of getting 5 heads in a row. That does not change the fact that you have a 50% chance each individual toss.
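
The same arithmetic in a few lines of C++ (the run lengths are picked arbitrarily for illustration):

// Sketch: the probability of n heads in a row is 0.5 multiplied by itself n times.
#include <cmath>
#include <iostream>

int main()
{
    for (int n = 1; n <= 10; ++n)
        std::cout << n << " heads in a row: "
                  << std::pow(0.5, n) * 100.0 << "%\n";
    // n = 5 prints 3.125%, matching the 0.5^5 = 0.03125 above.
}

Each individual toss is still 50:50; it is only the whole run, taken together, that gets unlikely.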

DavidP
03-08-2008, 01:35 PM
By the way, I don't want to create too large of a tangent, but in my statistics class my professor showed us a clip from a movie in which a guy kept flipping a coin 150 times and he got heads every single time.

Does anyone know what movie that is?

Mario F.
03-08-2008, 01:49 PM
Well, there is nothing wrong with the laws of probability if I throw a fair 6-sided die 10^64 times and it lands on 1 every single time. There is always the chance that the next 10^64 * 6 throws will even out the result. Meanwhile, my next throw has a 1:6 chance of being a 1.

Or, if you want... if we somehow counted all dice ever thrown by mankind ever since they were invented to this day, the odds of the next throw being a 1 would still be 1:6.

Daved
03-08-2008, 02:21 PM
>> Does anyone know what movie that is?
Rosencrantz and Guildenstern Are Dead? Fiction, though.

The gambler probably isn't hurt by his bad probabilities because the odds of it hitting black are still the same whether he picks it for good or bad reasons.

----------------------

Reminds me of another probability question (that I often ask in interviews):


On the game show Let's Make a Deal, the host is Monty Hall. Monty shows the final contestant three doors, 1, 2 and 3. Behind one of the doors is $100,000 and behind the other two are donkeys.

Monty will ask the contestant to pick a door. Then he will open a different door to reveal a donkey, leaving two doors still closed (one of which is the door picked by the contestant). Monty then offers this deal to the contestant: "If you want to switch to the other closed door, you can do it for free." Once the contestant decides, the doors are opened and the contestant wins what is behind the one that was chosen.

If you're the contestant, should you switch doors? Should you stay on your door? Does it matter?

laserlight
03-08-2008, 02:24 PM
If you're the contestant, should you switch doors? Should you stay on your door? Does it matter?
I will switch to the opened door. Each of those golden donkeys is worth $200,000.

CornedBee
03-08-2008, 02:32 PM
I will switch doors. After one of the wrong doors has been removed from play, the chance that I get the right door by switching equals the chance of hitting a wrong door on the first try, i.e. 66%.

Mario F.
03-08-2008, 02:44 PM
It really doesn't matter. MY chance of hitting the right door is 1:2 whether I switch or not.

laserlight
03-08-2008, 02:51 PM
It really doesn't matter. MY chance of hitting the right door is 1:2 whether I switch or not.
So you disagree with CornedBee's reasoning? Remember, the host responds to what you originally chose.

Incidentally, I'll probably have to keep to my chosen door since the fellow does not allow me to grab the donkey he revealed :p

mike_g
03-08-2008, 02:57 PM
I'd go berserk, rip the doors out, grab the money, mount a donkey and charge off.

Mario F.
03-08-2008, 03:00 PM
So you disagree with CornedBee's reasoning? Remember, the host responds to what you originally chose.

CornedBee's reasoning is interesting, but it falls into the same trap we have been discussing. What generates the probability event is that last instant when I'm given the choice of switching or not.

Since one door has a donkey and the other the money, the odds are 1:2.

EDIT: As for the host, no matter what door I choose initially, the host can always open a door with a donkey behind it.

Daved
03-08-2008, 03:03 PM
The discussion/reasoning is an important part of the interview question.

Let's hear what others have to say before we settle it.

laserlight
03-08-2008, 03:10 PM
The discussion/reasoning is an important part of the interview question.
That's what I thought, since this problem has been circulating for years. Just blurting out a "model answer" should not be good enough.

Daved
03-08-2008, 03:15 PM
For the people who have heard it, I ask them to explain the answer to me. The ones who have not often get it wrong, so I listen to their reasoning and also pay attention to how well it seems they understand the answer when I explain it.

It's actually better if they've heard it before, because I really get a feel for how well they can express and explain something (which is very important when coding in teams). It's harder to tell if they really understand you when you explain it to them.

Mario F.
03-08-2008, 03:18 PM
I haven't heard it before and just to be safe I went back and checked for the possibility of it being a trick question :)

It doesn't seem so. I stand by my answer.

mike_g
03-08-2008, 03:21 PM
Yeah, I'm with Mario on this. My reasoning would be exactly the same.

robwhit
03-08-2008, 03:40 PM
I'd keep my choice, but I wouldn't tell the host. That way, I'd always be right.

Schrödinger's cat theory is stupid.

laserlight
03-08-2008, 03:43 PM
I haven't heard it before and just to be safe I went back and checked for the possibility of it being a trick question
I do not think it is a trick question when phrased this way, though on a game show it probably would be a trick question (which is why I would never want to be a game show contestant... I will probably freak out on stage when presented with trick questions).

My reasoning would be based on counting:

Let A be the winning door.

Suppose you pick door A. If you switch, you lose.
Suppose you pick door B. If you switch, you win.
Suppose you pick door C. If you switch, you win.

Since you pick the door at random and switching wins 2/3 of the time, you should switch to win. Although the host has the choice of opening either B or C if you pick A, his choice does not matter since you already played your move (pick A then switch).

The reason this happens is that the host responds to your choice by eliminating a losing door ("after one of the wrong doors has been removed from play"), so it is a matter of counting the number of wins versus losses when a switch is made after each of the original three choices.

If the host eliminates a door at random (e.g., instead of opening the door, he claims that the closed door he selected is a poor choice, but he does not know what is behind the door), then indeed it would make no difference.
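
The counting argument can also be checked by brute force. Here is a simulation sketch in C++ (door labels, trial count and seed are arbitrary): a host who always opens a losing, unpicked door, and a comparison of "always stay" against "always switch".

// Sketch: Monty Hall with a host who knowingly opens a losing door.
#include <iostream>
#include <random>

int main()
{
    std::mt19937 rng(42);                              // fixed seed, arbitrary
    std::uniform_int_distribution<int> pickDoor(0, 2);

    const long long trials = 1000000;
    long long stayWins = 0, switchWins = 0;

    for (long long t = 0; t < trials; ++t)
    {
        int prize  = pickDoor(rng);                    // door hiding the money
        int choice = pickDoor(rng);                    // contestant's first pick

        // Host opens a door that is neither the prize nor the pick.
        // (When the pick IS the prize, which losing door he opens doesn't matter.)
        int host = 0;
        while (host == prize || host == choice) ++host;

        // Switching means taking the door that is neither the pick nor the opened one.
        int switched = 0;
        while (switched == choice || switched == host) ++switched;

        if (choice   == prize) ++stayWins;
        if (switched == prize) ++switchWins;
    }

    std::cout << "Stay wins:   " << double(stayWins)   / trials << '\n'   // ~1/3
              << "Switch wins: " << double(switchWins) / trials << '\n';  // ~2/3
}

Changing the host so that he opens one of the two unpicked doors at random, and throwing away the runs where he accidentally reveals the money, brings both strategies back to 1/2, which is the "no difference" case described above.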

mike_g
03-08-2008, 03:52 PM
Actually that makes sense now, since the host would not open the winning door; that would not make good TV. So by switching you would be moving from a 1/3 to a 2/3 chance of winning. So yeah, I get it now :D

CornedBee
03-08-2008, 03:59 PM
I hate statistics, but I love probability.

Mario F.
03-08-2008, 04:06 PM
Actually that makes sense now, since the host would not open the winning door; that would not make good TV. So by switching you would be moving from a 1/3 to a 2/3 chance of winning. So yeah, I get it now :D

This should be interesting...

Step 1. What are the odds of choosing the right door on your first try? 33%

Step 2. The host eliminates a door and essentially asks you to choose again from 2 doors.

Step 3. What are the odds of choosing the right door? 50%

The whole point of the exercise is to confuse the player and probably get a few giggles out of his indecision. But the fact of the matter is that the game was all about choosing one of two doors.

mike_g
03-08-2008, 04:13 PM
Step 3. What are the odds of choosing the right door? 50%
Yes, but because this is a game show we can assume the host knows the winning door. Now, he is not going to open a door with the treasure behind it; what would be the point in making a choice then? It's either lose or lose. So by switching you do in fact move to a 2/3 chance of winning.

If it were a non-game-show scenario and the door to be revealed was picked at random, it would still be a 50:50 chance.

Neo1
03-08-2008, 04:18 PM
I think Mario is right; the presented problem is not about choosing one of three doors, the last door is irrelevant.

Whichever door you choose, one of the doors will be removed. Since the door with the treasure won't be removed, and the one you pick won't be removed either, we know that one of the doors with a donkey behind it gets scrapped. It's a 50% chance all the way through.

Mario F.
03-08-2008, 04:22 PM
So by switching you do in fact move to a 2/3 chance of winning.

Not really, lol.

I'm getting confused now too, hehe. I'm struggling to find the words here... This is one of those times I wish English were my mother tongue.

Ok... Let's see....

Probabilities are non-deterministic in the sense that events prior to the one being calculated don't affect the outcome, and our result won't affect the next event. It's the coin toss thing we were discussing before. Just because I tossed 10^64 coins and they all fell on heads, it doesn't mean there's a higher probability of the next toss falling on tails.

Now...

The host naturally knows the winning door. But we don't. And it is we who are calculating the odds, not the host. So from our point of view we have two doors and we can choose one of them. Our odds of winning are 1:2.

Whatever happened before that is irrelevant, because our probability event is right there: the decision to swap or not.

Swap or not swap. 1:2. That is the final result. Consequently, since the odds are even, it doesn't matter what we do.

laserlight
03-08-2008, 04:24 PM
Whichever door you choose, one of the doors will be removed. Since the door with the treasure won't be removed, and the one you pick won't be removed either, we know that one of the doors with a donkey behind it gets scrapped. It's a 50% chance all the way through.
How would you explain why, by mere counting, a strategy of "pick at random then switch" wins 2/3 of the time?

Mario F.
03-08-2008, 04:27 PM
How would you explain why, by mere counting, a strategy of "pick at random then switch" wins 2/3 of the time?

Because you are making the mistake of assuming your first choice mattered. It didn't. By eliminating one door, the rules have changed. You are now picking one of two doors.

CornedBee
03-08-2008, 04:30 PM
You're still using intuition to counter counting. That won't work. The entire purpose of the puzzle is that it goes against intuition.

mike_g
03-08-2008, 04:30 PM
Whichever door you choose, one of the doors will be removed. Since the door with the treasure won't be removed, and the one you pick won't be removed either, we know that one of the doors with a donkey behind it gets scrapped. It's a 50% chance all the way through.
Look at it this way: when you start, you pick a door. Let's split the doors between you and the host:

You: door
Host: door, door
We don't know what's behind any of the doors; only the host does.

At this point in time your door has a 1/3 chance of winning, and the host's doors have a 2/3 chance if you could pick both of them.

Now the host has to reveal a door. If your 1/3 choice was correct, it wouldn't matter which door he reveals; if you switch, you lose.

On the other hand, if either of the doors he holds has the money, you will win by switching, because he is not going to reveal the winning door. So out of the 2 doors he has, you get the one (if any) that is the winner.

robwhit
03-08-2008, 04:35 PM
Look at it this way: what's the probability that a given door will be the one the host chooses? One half, one half, and zero. So the chance that the door you don't choose is one that you can switch to is 1/2.

Which door the host chooses is part of the problem from square one.

Mario F.
03-08-2008, 04:39 PM
No. The host knows the winning door. The problem specifically states that the host chooses a losing door. And he can always choose a losing door no matter what your initial choice is.

laserlight
03-08-2008, 04:47 PM
Because you are making the mistake of assuming your first choice mattered. It didn't. By eliminating one door, the rules have changed. You are now picking one of two doors.
The first choice does not matter. The strategy matters. I am presenting two possible strategies:
1. Switch. (That is, pick the other door.)
2. Don't switch. (That is, pick the door you first chose.)

Based on the fact that there are two choices of strategy, you are proposing that the strategies have an equal chance of winning.

Now, suppose that all contestants, when faced with this problem, choose strategy #2. They do not switch. We would expect that, in the long run, half of them would win.

Suppose also that door A is always the winning door, and that the contestants choose the door at random (i.e., they do not suspect that A is always the winning door).

So, a billion contestants choose door A and stick with it, another billion contestants choose door B and stick with it, and yet another billion contestants choose door C and stick with it. However, only 1/3 of them win, since only 1/3 of them chose door A.

This is a contradiction, which implies that the original assumption is false. The strategies do not have an equal chance of winning.

robwhit
03-08-2008, 04:50 PM
No. The host knows the winning door. The problem specifically states that the host chooses a losing door. And he can always choose a losing door no matter what your initial choice is.
How does that contradict my post?

mike_g
03-08-2008, 04:59 PM
Mario, robwhit: try splitting the doors into 2 categories.

One door you have chosen.
Two doors you have not.

Taken together, those two doors have a 2/3 chance of hiding the money.

Now make a truth table of what the host can have:

1: donkey, donkey
2: money, donkey
3: donkey, money
The host HAS to reveal one door. The host will NOT reveal the money. So let's look at what we have left:

donkey
money
money
Make sense?
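
The same table can be ground out exhaustively in a few lines of C++ (door numbering is arbitrary; by symmetry it doesn't matter which door is picked first):

// Sketch: fix the first pick at door 0 and enumerate the three equally likely prize positions.
#include <iostream>

int main()
{
    const int pick = 0;
    int switchWins = 0, stayWins = 0;

    for (int prize = 0; prize < 3; ++prize)
    {
        // Host opens a door that is neither the pick nor the prize.
        int host = 0;
        while (host == pick || host == prize) ++host;

        // Switching means taking the remaining closed door.
        int switched = 0;
        while (switched == pick || switched == host) ++switched;

        // The prize is behind either the pick or the switch door (never the opened one).
        if (switched == prize) ++switchWins; else ++stayWins;
    }

    std::cout << "Switching wins in " << switchWins << " of 3 cases\n"   // 2 of 3
              << "Staying wins in "   << stayWins   << " of 3 cases\n";  // 1 of 3
}

Two of the three equally likely cases are wins for switching, which is just the donkey/money/money column above.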

Perspective
03-08-2008, 05:01 PM
CornedBee et al. are correct. Mario, think of it this way:

The probability that you picked the right door first is 1/3 (I don't think there is any argument here). Thus, the probability that the other door is correct *must* be 2/3 (probabilities sum to one, and the eliminated door has a probability of 0 for being correct).

Mario F.
03-08-2008, 05:05 PM
So, a billion contestants choose door A and stick with it, another billion contestants choose door B and stick with it, and yet another billion contestants choose door C and stick with it. However, only 1/3 of them win, since only 1/3 of them chose door A.

This is a contradiction, which implies that the original assumption is false. The strategies do not have an equal chance of winning.

Oh, no. You can't do it that way. You proved there is a 1/3 chance of winning if nothing else happens; that is, the number of available doors is kept at 3.

But something changed. The host removed a losing door and asked you to start all over again. Choose one of two doors. That is your event, laserlight. The previous choice will have no consequence on the outcome of this new choice.

CornedBee mentioned this exercise goes against intuition and I've been wracking my brain ever since trying to see where I'm failing. However I can't. This is really that simple. In the end you are just asked to choose from two doors.

For your odds to become 66% you would have to be able to pick 2 doors out of three and not know what's behind the 3rd door.

robwhit
03-08-2008, 05:05 PM
Make sense?
No...

Daved
03-08-2008, 05:10 PM
>> The previous choice will have no consequence on the outcome of this new choice.
The previous choice does have a consequence on what door the host opens. That is how the outcome is affected by the original choice, and that is the piece that I believe you are missing in your analysis.

Unlike the gambler's situation that started this thread, in this case the previous events do have an effect on later events. If you pick a door with a donkey, that forces the host to open the other door with a donkey. Therefore, your choice affects what the final options will be, and it is that effect that shows up in the different probability.

Mario F.
03-08-2008, 05:13 PM
Yes Daved. But the host always opens a losing door, no matter my choice. So his action has a null value.

I'm still firmly convinced of my answer, but can't ignore the overwhelming majority. So, I'll give this some hard thought before I post again.

I just think you guys are trying to mix events and forgetting that the one event that matters is that last instant where the contestant is faced with two doors.

Meanwhile, sorry robwhit. Didn't read your post correctly the first time.

laserlight
03-08-2008, 05:15 PM
Oh, no. You can't do it that way. You proved there is a 1/3 chance of winning if nothing else happens; that is, the number of available doors is kept at 3.

But something changed. The host removed a losing door and asked you to start all over again. Choose one of two doors. That is your event, laserlight. The previous choice will have no consequence on the outcome of this new choice.

CornedBee mentioned this exercise goes against intuition and I've been wracking my brain ever since trying to see where I'm failing. However I can't. This is really that simple. In the end you are just asked to choose from two doors.

For your odds to become 66% you would have to be able to pick 2 doors out of three and not know what's behind the 3rd door.
*sigh*
Read the question: "If you're the contestant, should you switch doors? Should you stay on your door? Does it matter?"

The question is not "which of the doors has the highest probability of winning?" The question is: "which of the two strategies is better, or are they equally viable?"

robwhit
03-08-2008, 05:18 PM
1x + 2y - 1y = 1x + 1y no matter how you look at it.