
Monty Hall Problem... For Newbies

Don't answer this poll until AFTER you read the OP!


Brian-M has it right. The probability is 1/3. I challenge anyone who is unconvinced to try a simulation.

If it was "you know the first coin toss is heads, now what is the probability of two heads?" then it would be 50/50.
Yup; the chance of two coins landing both heads is 1 in 4. If you discount the 1 in 4 chance of both being tails, the chance of two heads becomes 1 in 3.
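For anyone who'd rather run it than argue it, here's a minimal Python simulation sketch (the trial count and printout are just illustrative); it tallies both readings of the coin question:

import random

trials = 100_000                       # arbitrary; more trials gives a tighter estimate
at_least_one = both_a = 0              # tosses with at least one head / both heads among those
first_heads = both_f = 0               # tosses where the first coin is heads / both heads among those

for _ in range(trials):
    c1, c2 = random.choice("HT"), random.choice("HT")
    if c1 == "H" or c2 == "H":         # told only "at least one coin landed heads"
        at_least_one += 1
        both_a += (c1 == "H" and c2 == "H")
    if c1 == "H":                      # told "the first coin landed heads"
        first_heads += 1
        both_f += (c2 == "H")

print("P(both heads | at least one heads) ~", both_a / at_least_one)   # comes out near 1/3
print("P(both heads | first coin heads)   ~", both_f / first_heads)    # comes out near 1/2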
 
The point of looking at the variations is to figure out which assumptions are necessary to make "1/3 stay, 2/3 switch" the right answer. For example, it's fairly clear that it matters whether Monty has to open a door, but you might have thought it does not matter whether Monty has to open a goat door: as long as he does open a goat door, that is all that matters. However, it was shown above that if he randomly chooses a door to open (and so could open the car door), then it is 50/50 when he happens to open a goat door.



You might have thought that as long as Monty has to reveal a goat, it doesn't matter how he picks which goat to reveal when he has a choice. However, it was shown that if he has a known bias toward one door, then the probabilities change.



Some of the explanations given for why the answer is "1/3 stay" do not work due to these facts, but it is hard to see that without considering variations on the original problem.
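To make the random-Monty point easy to check, here's a rough Python sketch (door labels 0-2 and the trial count are just for illustration); it keeps only the games where Monty happens to reveal a goat:

import random

def random_monty(trials=100_000):
    """Monty opens one of the two unchosen doors at random and may reveal the car."""
    goat_shown = stay_wins = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            continue                   # Monty accidentally revealed the car; discard the game
        goat_shown += 1
        other = next(d for d in range(3) if d not in (pick, opened))
        stay_wins += (pick == car)
        switch_wins += (other == car)
    print("P(win by staying   | goat shown) ~", stay_wins / goat_shown)    # near 1/2
    print("P(win by switching | goat shown) ~", switch_wins / goat_shown)  # near 1/2

random_monty()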
 
Why 1/2 and not 1/3?

Maybe I'm not seeing the answer because it's almost midnight here and I need to get some sleep, but it seems to me that the following outcomes of the coin-toss would be equally likely...

Heads + Heads
Heads + Tails
Tails + Heads
Tails + Tails


Since each of the three outcomes containing heads is equally likely and only one of those outcomes has two heads, it seems to me that the chance of both coins being heads is 1/3.

But I'm assuming that the question is only going to be asked if at least one of the coins comes up heads. If you were going to report the outcome of the first coin regardless of whether it comes up heads or tails and then ask us to guess if the other coin came up the same, then the odds of the other coin being the same would be 1/2.

So maybe the people who are getting the "wrong" odds are misunderstanding the question?


Aaaah. Now I understand what the Girl/Boy Paradox is all about. Thanks.
 
Brian-M has it right. The probability is 1/3. I challenge anyone who is unconvinced to try a simulation.

If it was "you know the first coin toss is heads, now what is the probability of two heads?" then it would be 50/50.

This very puzzle (phrased differently*) once caused an absolute storm of controversy among gambling types in the letters page of the "Racing Post" back in the UK :D

It still confuses me. My letter was along the lines of "So he has a son. What were the odds that the egg leading to the birth of the other child was fertilised by an X or Y sperm?"

If the first child was a daughter then the odds are 1/2 that the next child will be a son.

If the first was a son then the odds are 1/2 that the next will also be a son.

That the first was either male or female is a given, and those eventualities are equally likely, so it's 1/2 that the other is a boy.

I suspect I might well be wrong, but I've forgotten why ;)

*eta: "Mr Jones has two children. We know one is a son. What are the odds he has two sons?" The given answer was 1/3, based on the HH, HT, TH idea.
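For reference, the given 1/3 comes straight from enumerating the four equally likely birth orders (assuming boys and girls are equally likely and independent); a quick Python sketch:

from itertools import product

families = list(product("BG", repeat=2))              # BB, BG, GB, GG: equally likely
at_least_one_son = [f for f in families if "B" in f]  # BB, BG, GB
both_sons = [f for f in at_least_one_son if f == ("B", "B")]
print(len(both_sons), "out of", len(at_least_one_son))  # 1 out of 3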
 
In practice, game shows don't normally do anything to help the contestants win the main prize. That Monty wouldn't have a predictable goat-door bias is therefore a reasonable default assumption, if not a given.


Knowing that Monty will always open Door 2 if there is a goat behind it (the "Preference Monty" variant) doesn't give the player an advantage over the standard setup, in which Monty may open either of the two goat doors when the contestant has chosen the door with the car. In either setup, "always change doors" is an optimal strategy.

Say you're playing Preference Monty and you have chosen Door 1. Then, if Monty opens Door 3, the probability is 1 that the car is behind Door 2, so switching is obviously the correct strategy. However, assuming the placement of the car was random, the probability of this scenario is 1/3. On the other hand, if Monty opens Door 2, then the probability that the car is behind your door or Door 3 is 1/2 each, so switching does you no harm. The probability of this scenario is 2/3. So the strategy "always switch" gives you a probability of winning of (1/3)(1) + (2/3)(1/2) = 2/3, the same probability as the strategy "switch if Monty opens Door 3; otherwise, don't." You can do no better, so "always switch" is an optimal strategy (though not uniquely optimal). Moreover, this is the same probability as the (uniquely optimal) "always switch" strategy in standard Monty. Thus, Preference Monty offers no advantage to the player over standard Monty.
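Here's the same calculation as a quick simulation sketch, with the player fixed on Door 1, the car placed uniformly at random, and an arbitrary trial count:

import random

def preference_monty(trials=100_000):
    """Player always picks Door 1. Monty opens Door 2 whenever it hides a goat, else Door 3."""
    games = {2: 0, 3: 0}                    # games, keyed by which door Monty opened
    switch_wins = {2: 0, 3: 0}
    for _ in range(trials):
        car = random.choice([1, 2, 3])
        opened = 2 if car != 2 else 3       # Monty's preference for Door 2
        remaining = 5 - opened              # the unchosen, unopened door
        games[opened] += 1
        switch_wins[opened] += (remaining == car)
    print("Door 3 opened: switching wins", switch_wins[3], "of", games[3])  # all of them
    print("Door 2 opened: switching wins", switch_wins[2], "of", games[2])  # about half
    print("Overall 'always switch' win rate ~", (switch_wins[2] + switch_wins[3]) / trials)  # near 2/3

preference_monty()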
 
Generally, the purpose of an illustration or analogy is to illucidate an obscure concept by reference to a more common concept. Instead, you are purporting to illucidate an obscure concept by reference to an even more obscure concept.

Elucidate?

And, as a long-time-ago very occasional bridge player, I was interested.
 
Knowing that Monty will always open Door 2 if there is a goat behind it (the "Preference Monty" variant) doesn't give the player an advantage over the standard setup, in which Monty may open either of the two goat doors when the contestant has chosen the door with the car. In either setup, "always change doors" is an optimal strategy.

Say you're playing Preference Monty and you have chosen Door 1. Then, if Monty opens Door 3, the probability is 1 that the car is behind Door 2, so switching is obviously the correct strategy. However, assuming the placement of the car was random, the probability of this scenario is 1/3. On the other hand, if Monty opens Door 2, then the probability that the car is behind your door or Door 3 is 1/2 each, so switching does you no harm. The probability of this scenario is 2/3. So the strategy "always switch" gives you a probability of winning of (1/3)(1) + (2/3)(1/2) = 2/3, the same probability as the strategy "switch if Monty opens Door 3; otherwise, don't." You can do no better, so "always switch" is an optimal strategy (though not uniquely optimal). Moreover, this is the same probability as the (uniquely optimal) "always switch" strategy in standard Monty. Thus, Preference Monty offers no advantage to the player over standard Monty.

Then the probability of winning the car if you don't switch must be 1 - 2/3 = 1/3. So even if we go through a phase where the odds are 50/50, in the end it is still the same as choosing the right door from the start.
 
Then the probability of winning the car if you don't switch must be 1 - 2/3 = 1/3. So even if we go through a phase where the odds are 50/50, in the end it is still the same as choosing the right door from the start.

I'm not sure if we're in disagreement here, but... In the Preference Monty, once Monty opens the preferred door, then your odds of having picked the right door at the start go up to 1/2.

If you use a "don't switch no matter what" strategy then your odds will be 1/3 overall. However once Monty opens a door, they change to either 0 or 1/2.
 
I'm not sure if we're in disagreement here, but... In the Preference Monty, once Monty opens the preferred door, then your odds of having picked the right door at the start go up to 1/2.

If you use a "don't switch no matter what" strategy then your odds will be 1/3 overall. However once Monty opens a door, they change to either 0 or 1/2.
Ahem...... No. The odds never go to zero unless Monty shows the car, which is not part of the Monty Hall Problem. The only way the odds can go to 1/2 is if the contents behind the doors are reshuffled after Monty removes one goat. Again, not part of the Monty Hall Problem.

In the preference Monty the only way to increase your odds is if you have prior knowledge of Monty's preference. If you don't pick Monty's preference and neither does Monty, then the car is there. If you don't pick the preference and Monty does, you still have the same 1/3 2/3 stay switch, and if you pick the preference then you also still have the same 1/3 2/3 stay switch.

So preference Monty CAN potentially be used to increase your odds if switching, but only if you have foreknowledge of the preference door and don't use it as your initial pick. Again, not part of the Monty Hall Problem.
 
Ahem...... No. The odds never go to zero unless Monty shows the car, which is not part of the Monty Hall Problem. The only way the odds can go to 1/2 is if the contents behind the doors are reshuffled after Monty removes one goat. Again, not part of the Monty Hall Problem.

We're talking about the "Preference Monty" variant.

ETA: The odds of winning when you stick go to zero when Monty does not open his preference door.
 
In the preference Monty the only way to increase your odds is if you have prior knowledge of Monty's preference.

Even then your overall odds of winning the game will still be 2/3 if you play properly. But if you reassess your odds after seeing which door Monty opens (i.e. whether or not it is his preferred) then they will either be 1/2 or 1.

If you don't pick Monty's preference and neither does Monty, then the car is there. If you don't pick the preference and Monty does, you still have the same 1/3 2/3 stay switch

No, it becomes 1/2 and 1/2. Try it.

and if you pick the preference then you also still have the same 1/3 2/3 stay switch.

Agreed. If you pick the preference then it becomes a normal Monty Hall game.
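To put numbers on the "reassessed" odds, here's an exact enumeration sketch (player on Door 1, Monty preferring Door 2, car placed uniformly over the three doors):

from fractions import Fraction

# (car position, door Monty opens) for the three equally likely car placements
cases = [(car, 2 if car != 2 else 3) for car in (1, 2, 3)]

for door in (3, 2):
    cars = [car for car, opened in cases if opened == door]
    p_stay = Fraction(sum(car == 1 for car in cars), len(cars))
    print(f"Monty opens Door {door}: P(staying wins) = {p_stay}")   # 0 and 1/2 respectively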
 
I agree, it is a reasonable default assumption by applying the Principle of Indifference (although it's generally not stated in the problem definition).

Whilst that assumption is almost always implicitly made when discussing the MHP, in other 'paradoxes' where it is equally applicable it is often ignored. For instance in this problem:

'Suppose I toss two fair coins, examine both of them, and truthfully tell you that at least one of them landed heads. What is the probability both coins landed heads?'

Most people will answer 1/3. But making exactly the same assumption as in the MHP gives the (correct) answer of 1/2.
I agree the correct answer is 1/2, but the confusion that causes "most" to say 1/3 is created by the misleading way the question is worded. Asking "What is the probability both coins landed heads?" reads as if the question is asking the odds of a two-coin toss before the toss is made. But it isn't. The question is asked after the two coins have been tossed and the result of one coin has been given. The question then only relates to the other coin as a single coin toss, which obviously has 1/2 odds.

It’s like asking “What’s half of two plus two?” Most people answer “two” because they wrongly bracket “two plus two” to equal four then halve it to equal two. But the question doesn’t bracket two plus two so the correct answer is “three” because half of two is one, plus two equals three.
 
If it was "you know the first coin toss is heads, now what is the probability of two heads?" then it would be 50/50.
That is exactly what you know . . .
'Suppose I toss two fair coins, examine both of them, and truthfully tell you that at least one of them landed heads.'
It's a mistake to assume the question is asking about the odds of a double coin toss.
 
Since the OP specified Monty knows where the car is, case B is irrelevant. And case C requires the use of information not in the OP, so it is also irrelevant.

Should we also explore the cases where the original car location is not random?
Or where Monty is such an incompetent game show host that his body language gives away the location of the car every time? It all gets very silly very quickly.
 
That is exactly what you know . . .

Are you saying there's no difference between "at least one of the two coins landed heads" and "the first coin landed heads"?

As I see it, "at least one of the two coins landed heads" is equivalent to: (HH or HT or TH).

And "the first coin landed heads" is equivalent to: (HH or HT).
 
Are you saying there's no difference between "at least one of the two coins landed heads" and "the first coin landed heads"?

As I see it, "at least one of the two coins landed heads" is equivalent to: (HH or HT or TH).

And "the first coin landed heads" is equivalent to: (HH or HT).

One of the coins is known to have landed heads, so:

That was the first coin and it landed heads, so it's HT or HH
or
That was the second coin and it landed heads, so it's TH or HH

50% of those have two heads (or two boys, in the Mr Jones' kids version)

I suspect it's a question of the weighting of probabilities based on partial knowledge - as we noted in the MH analysis - but wouldn't want to commit to that at 11:30 pm ;)
 
One of the coins is known to have landed heads, so:

That was the first coin and it landed heads, so it's HT or HH
or
That was the second coin and it landed heads, so it's TH or HH

There's a difference between "it is known that at least one of the coins landed heads" and "one of the coins is known to have landed heads". I'm talking about the former. In which case the possibilities are HH, HT or TH. JohnnyG's simulation supported this, also.

It might make more sense like this... You have a bag with three teddy bears inside. One of them has a green right ear and a brown left ear. One of them has a green left ear and a brown right ear. One of them has two green ears. Now, we know that if we randomly pull out a bear, it will have at least one green ear. One could argue:

"Either its right ear will be green in which case there is a 50% chance that its left ear will also be green... Or its left ear will be green in which case there is a 50% chance that its right ear will also be green. Therefore there is a 50% chance that we will pull out a bear with two green ears."

But clearly this would be wrong, because only one of the three bears in the bag has two green ears.
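A quick check of the bear version in Python, with the bag written out explicitly (ear colours as (right, left) pairs, trial count arbitrary):

import random

bag = [("green", "brown"), ("brown", "green"), ("green", "green")]   # the three bears

trials = 100_000
both_green = sum(random.choice(bag) == ("green", "green") for _ in range(trials))
print("P(both ears green) ~", both_green / trials)   # near 1/3, not 1/2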
 
Even then your overall odds of winning the game will still be 2/3 if you play properly. But if you reassess your odds after seeing which door Monty opens (i.e. whether or not it is his preferred) then they will either be 1/2 or 1.



No, it becomes 1/2 and 1/2. Try it.



Agreed. If you pick the preference then it becomes a normal Monty Hall game.
I guess the confusion lies in communication and what you mean by "reassessing" the odds.

The only change made by "reassessing" in "preferred Monty" is that two-thirds of the time you have a 50% chance and one-third of the time you have a 100% chance:
(2/3 × 1/2) + (1/3 × 1) = 2/3 for switching.

So you are right back to the 1/3 2/3 stay switch you started with in the original Monty Hall Problem, which was my point, although I probably communicated it poorly.
 