
Monty Hall Problem... For Newbies

Don't answer this poll until AFTER you read the OP!


Goats certainly are irrelevant to the MH problem from a probability perspective, but not from an emotional one. A goat represents failure or, worse, having something pulled over on you. That is, unless you love goats per the XKCD image at the start of this thread.

And of course Monty Hall is presumed to be trying to deceive you. None of this is part of the actual problem but is the baggage we come with when first seeing the problem. In many ways the Monty Hall problem is similar to mentalism, with its success dependent in part on playing to pre-existing beliefs. Restated absent these misdirections:

There are three containers. One has a rock inside. The other two are empty. A machine randomly chooses one container and shoves it out door A. Another machine that knows the contents of all three containers selects an empty container from the remaining two and discards it while shoving the remaining container out door B.

What is the probability that the container that went out door A has a rock in it? What is the probability that the container that went out door B has a rock in it?

Pretty obvious stated that way.
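
For anyone who wants to check the numbers, here is a rough simulation sketch of the container version (Python; the two machines below are just my reading of the setup above):

[CODE]
import random

def trial():
    # Three containers; exactly one holds the rock.
    rock = random.randrange(3)
    # The first machine picks a container at random and shoves it out door A.
    door_a = random.randrange(3)
    # The second machine knows the contents: from the remaining two it discards
    # an empty container and shoves the other one out door B.
    remaining = [c for c in range(3) if c != door_a]
    if rock in remaining:
        door_b = rock
    else:
        door_b = random.choice(remaining)
    return door_a == rock, door_b == rock

n = 100_000
a_hits = b_hits = 0
for _ in range(n):
    a, b = trial()
    a_hits += a
    b_hits += b
print(f"rock went out door A: {a_hits / n:.3f}")
print(f"rock went out door B: {b_hits / n:.3f}")
[/CODE]

Door A should land near 1/3 and door B near 2/3.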
 
A goat represents failure or, worse, having something pulled over on you. That is, unless you love goats per the XKCD image at the start of this thread.

And of course Monty Hall is presumed to be trying to deceive you. None of this is part of the actual problem but is the baggage we come with when first seeing the problem.

That's interesting. I love goats, for one. And for two, I always thought Monty Hall was trying to help you. Even as a child watching the show I always figured Monty was trying his best to help you, without telling you directly. Always giving you every opportunity to trade up and always warning you over and over when you had the year's supply of Eskimo pies instead of the new car.

PS Of course the show was a bit different. Instead of two goats and a car it was an A gift, a B gift, and a "goat". Monty almost always showed the goat. So you better switch if you didn't see the "goat" (actually usually a lifetime supply of Eskimo pies as stated above). That meant you had it!
 
I would posit that if you were to turn up with 100 pound notes which were legal tender, then the time machine you used to get there would be more valuable than any potential monetary win.

:D

Thank you! I actually laughed out loud.

I had written the post in dollars and went back to edit it to pounds. Not having been to the UK, I fell on that one.
 
What does "is it wrong to stick?" even mean? Are we talking about a moral question here? I don't think there's any moral implication in the Monty Hall problem, so no, it's not morally wrong to stick.


Is it wrong to go for the car when there's a perfectly lovely goat that needs a good home? :D
 
And now for something completely different:

Suppose we allow for more than three doors. Monty Hall will repeatedly open a door to expose a goat, and the contestant is allowed to change selection, until only two unopened doors remain.

  1. What is the contestant's best strategy?
  2. Suppose the contestant is required to change doors each round. What is the contestant's best strategy?
  3. Can Monty's choice of which doors he opens influence the outcome? (Well, the probabilities associated with the outcomes.)
  4. Suppose the contestant is required to select a different door each round. That is, the contestant may not select any door selected in a prior round.
 
Aren't you still going to finish up with two doors, one with a car and one with a goat? I cannot see how alternative scenarios using more doors or different rules change anything, except improving your odds when you switch at the last two doors:

Four doors - switch, Monty opens a goat door, stick, Monty opens a goat door. Switch or stick again, Monty opens a goat door. You are back to two doors. Except the odds for switching the last two doors improve your chances of winning to 75%.

Five doors - switch, Monty opens a goat door, stick, Monty opens a goat door. Switch or stick again, Monty opens a goat door. Switch or stick again, Monty opens a goat door. You are back to two doors. Except the odds for switching the last two doors improve your chances of winning to 80%.

Norm
 
Aren't you still going to finish up with two doors, one with a car and one with a goat? I cannot see how alternative scenarios using more doors or different rules change anything, except improving your odds when you switch at the last two doors:

Four doors - switch, Monty opens a goat door, stick, Monty opens a goat door. Switch or stick again, Monty opens a goat door. You are back to two doors. Except the odds for switching the last two doors improve your chances of winning to 75%.

Five doors - switch, Monty opens a goat door, stick, Monty opens a goat door. Switch or stick again, Monty opens a goat door. Switch or stick again, Monty opens a goat door. You are back to two doors. Except the odds for switching the last two doors improve your chances of winning to 80%.

Norm

I almost posted the first question as a statement: "It should be obvious the best strategy is to...". The mathematician in me thought it was obvious, anyway. Glad I went with the question.

So, your answer is that only the final switch matters? Switching or not switching before that does not matter? And, apparently, which door with a goat Monty exposes doesn't matter either?
 
And now for something completely different:

Suppose we allow for more than three doors. Monty Hall will repeatedly open a door to expose a goat, and the contestant is allowed to change selection, until only two unopened doors remain.

1. What is the contestant's best strategy?

Stick until there are only 2 unopened doors left. Then switch. Your probability of getting the car should be (k-1)/k where k is the number of doors.
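
That (k-1)/k figure matches a quick simulation. Here is a rough Python sketch (the game loop is my own encoding of the rules, with a stick-or-switch decision offered after every reveal):

[CODE]
import random

def play(k, strategy):
    # One game with k doors; strategy(selected, opened, k) returns the new
    # selection and is called after every door Monty opens.
    car = random.randrange(k)
    selected = random.randrange(k)
    opened = set()
    while k - len(opened) > 2:
        goats = [d for d in range(k) if d != selected and d != car and d not in opened]
        opened.add(random.choice(goats))
        selected = strategy(selected, opened, k)
    return selected == car

def stick_then_switch(selected, opened, k):
    # Stick until only two doors remain unopened, then take the other one.
    unopened = [d for d in range(k) if d not in opened]
    if len(unopened) == 2:
        return next(d for d in unopened if d != selected)
    return selected

k, n = 6, 100_000
wins = sum(play(k, stick_then_switch) for _ in range(n))
print(f"{k} doors, stick then switch at the end: {wins / n:.3f} (expect {(k - 1) / k:.3f})")
[/CODE]

With k = 6 it should print something close to 0.833.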

2. Suppose the contestant is required to change doors each round. What is the contestant's best strategy?

If there are four doors, the contestant will be required to switch once and then will have the option of switching or sticking at the end. They should always switch.

With more than four it's a little more tricky because I think it actually matters where you switch to in the intermediate switches (i.e. to a door you had selected previously or to a door you haven't selected previously). I believe the best strategy is to always switch back to a door that you had selected previously when possible. Then switch at the end.

3. Can Monty's choice of which doors he opens influence the outcome? (Well, the probabilities associated with the outcomes.)

If the contestant has an optimal strategy then it shouldn't matter.

4. Suppose the contestant is required to select a different door each round. That is, the contestant may not select any door selected in a prior round.

I'm pretty sure it's still best to switch at the end.

Aren't you still going to finish up with two doors, one with a car and one with a goat? I cannot see how alternative scenarios using more doors or different rules change anything, except improving your odds when you switch at the last two doors:

Four doors - switch, Monty opens a goat door, stick, Monty opens a goat door. Switch or stick again, Monty opens a goat door. You are back to two doors. Except the odds for switching the last two doors improve your chances of winning to 75%.

In the four door scenario there is only one chance to switch or stick before you are back to two doors. It's: Pick - Monty opens a door - Switch or stick - Monty opens a door - Switch or stick. You seem to have a switch before any door is opened and Monty opening three doors, which would only leave one door unopened.

As for the initial decision to stick or switch not mattering, I disagree. If you switch twice your chance of winning should go down to 62.5%, but if you stick and then switch your chance will be 75%.
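
For what it's worth, a quick simulation sketch (Python; the four-door rules below are my own encoding of the game as described) agrees with both numbers:

[CODE]
import random

def four_door_game(first_move, second_move):
    # The contestant gets two stick-or-switch decisions, one after each reveal.
    car = random.randrange(4)
    selected = random.randrange(4)
    opened = set()
    for move in (first_move, second_move):
        goats = [d for d in range(4) if d not in opened and d != selected and d != car]
        opened.add(random.choice(goats))
        if move == "switch":
            others = [d for d in range(4) if d not in opened and d != selected]
            selected = random.choice(others)
    return selected == car

n = 100_000
for plan in (("switch", "switch"), ("stick", "switch")):
    wins = sum(four_door_game(*plan) for _ in range(n))
    print(plan, f"win rate ~ {wins / n:.3f}")
[/CODE]

It should print roughly 0.625 for switch-switch and 0.75 for stick-then-switch.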
 
How can it be behind door B when he's already shown you a goat behind door B?

I'm sure that was a typo,

:o Yes, that was a typo.

but it illustrates my point: When you pick door A there's a 1/3 chance that it's behind door A and a 2/3 chance that it's not. At no point does that change. If you think it changes, please explain how, as Jiggeryqua said, telling you something you already knew (that one of the other doors has a goat) tells you something about whether there is a car behind door A. It doesn't.

I DON'T think the odds change. I think the game changes. That's why I said all this back and forth about odds is a red herring. "The Monty Hall Problem" isn't really a "The"; it's a "They", so to speak. You're playing two successive, but independent games. In the first game, you choose one of three doors, and have a 1/3 chance of choosing the one with the car. Before you finish and learn the result of that game, you begin the second game, in which you choose one of two doors (The one you chose before, or the one remaining after Monty eliminates a safe door), and you have a 1/2 chance of winning that game.

My real argument is that you cannot alter your chances of winning a single instance of the game partway through it. If you go into it knowing in advance that you will switch, then you are playing a game in which you have a 2/3 chance of winning from the outset. If you go into it having already decided that you will not switch, then you have a 1/3 chance of winning from the outset. If you don't make that decision in advance, but instead wait to make up your mind until Monty opens a safe door, then your previous choice becomes moot; you forfeit the first game and instead play a different game in which you have a 1/2 chance of winning from the outset. (If this isn't obvious, then imagine a variation in which, after you're shown a goat, you decide whether or not to switch by flipping a coin).

This is why this debate is so frustrating. The two sides are actually both right, but they're talking past each other, each arguing about the odds of a different game, depending on whether or not they have in mind that they're following a strict switch or no-switch policy from the outset.
 
I DON'T think the odds change. I think the game changes. That's why I said all this back and forth about odds is a red herring. "The Monty Hall Problem" isn't really a "The"; it's a "They", so to speak. You're playing two successive, but independent games. In the first game, you choose one of three doors, and have a 1/3 chance of choosing the one with the car. Before you finish and learn the result of that game, you begin the second game, in which you choose one of two doors (The one you chose before, or the one remaining after Monty eliminates a safe door), and you have a 1/2 chance of winning that game.

Well, no, it's really only one problem. The problem is "should you stick or switch?" That is what people mean when they refer to "The Monty Hall Problem". It's the same problem regardless of what the first pick was and what the reveal was.

It doesn't quite make sense to say you have a 1/2 chance to win. If you make a random decision to either stick or switch, then your probability of winning the car is 1/2, yes. But the person's decision isn't assigned a probability value, because the problem is "what should the person do?" not "what is the probability they will win the car if we don't know what they will do?"

My real argument is that you cannot alter your chances of winning a single instance of the game partway through it. If you go into it knowing in advance that you will switch, then you are playing a game in which you have a 2/3 chance of winning from the outset. If you go into it having already decided that you will not switch, then you have a 1/3 chance of winning from the outset. If you don't make that decision in advance, but instead wait to make up your mind until Monty opens a safe door, then your previous choice becomes moot; you forfeit the first game and instead play a different game in which you have a 1/2 chance of winning from the outset. (If this isn't obvious, then imagine a variation in which, after you're shown a goat, you decide whether or not to switch by flipping a coin).

I don't get it. How about you pick a door without thinking. Then when presented with the decision to stick or switch, you intelligently deduce that switching is better and so you switch. Why would the choice need to be random?

This is why this debate is so frustrating. The two sides are actually both right, but they're talking past each other, each arguing about the odds of a different game, depending on whether or not they have in mind that they're following a strict switch or no-switch policy from the outset.

I didn't think there was any debate over the actual probabilities at this point. I thought it was just a question of why the probabilities are what they are.
 
You're playing two successive, but independent games.
They are clearly not independent: which door Monty opens is dependent upon which door you choose (because he can't open the door you choose, regardless of whether a goat or a car lies behind it). Thus, the layout of the "second game" is dependent upon your choice in the first.

In the first game, you choose one of three doors, and have a 1/3 chance of choosing the one with the car. Before you finish and learn the result of that game, you begin the second game, in which you choose one of two doors (The one you chose before, or the one remaining after Monty eliminates a safe door), and you have a 1/2 chance of winning that game.
As I said, which two doors are available as choices is dependent upon your choice in the "first game", and thus that game does have an effect on the second one. You clearly do not have a 1/2 chance of winning.

If you disagree, rather than continuing to post various different arguments, please address what I said in the first post of mine to which you replied: you first choose a door that has a 1/3 chance of having a car behind it. How does showing you a goat behind one of the other doors (a fact which you already knew) tell you anything about whether or not there is a car behind the one you chose?

(If this isn't obvious, then imagine a variation in which, after you're shown a goat, you decide whether or not to switch by flipping a coin).

That same argument would apply if Monty opened both other doors, showing you which door had the car, then asked you if you want to switch or not. If you flip a coin you'll have a 50/50 chance of making the correct choice, but if you simply choose the car you have a 100% chance of making the correct choice. Yes, flipping a coin can destroy the effect of information that you already have.
 
In the first game, you choose one of three doors, and have a 1/3 chance of choosing the one with the car. Before you finish and learn the result of that game, you begin the second game, in which you choose one of two doors (The one you chose before, or the one remaining after Monty eliminates a safe door), and you have a 1/2 chance of winning that game.


The highlighted part is flat out wrong. In what you call the "second game" it stays at 1/3 and 2/3. It is not 50/50.

I explained what I think you are missing here:


What you are forgetting is that when he shows you a goat from one of the remaining two doors it is not only giving you new information about the one that he picks (obviously), it is giving you new information about the one that he doesn't pick as well (not so obvious).


I strongly recommend that you watch the MythBusters episode.
 
My real argument is that you cannot alter your chances of winning a single instance of the game partway through it.


Yes you can.

Look at the 100 door variation and that is clear.

You can pick any door of the 100 and have a 1/100 shot at the prize.

MH then removes 98 of the doors you did not pick, showing goats behind all of them. As I said in the above post, seeing those goats not only gives you information about the doors they were behind, but about the remaining door you did not pick.

That 99th door (the remaining door that you did not pick) almost definitely has the prize.

Stick with your original door and it is still 1/100. Yet the other door is up to 99/100 (or is it 98/100?) even though originally it would have been 1/100.

Sure there is still a 1/100 chance that the prize was in the door you originally picked. But it is nowhere near 50/50.

Your chance of winning in that instance has clearly changed "partway through it".

Either you do not understand that, or you are perpetuating the marble fallacy.

I don't remember that exactly but it goes something like this: Say you have a bag with 99 black marbles and 1 white marble. Just because there are 2 colors does not mean you have a 50/50 chance of picking the white marble.
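
A quick simulation of the 100 door variation (Python sketch; the door bookkeeping is my own shorthand) shows the same thing, and answers the parenthetical above: the other door comes out at 99/100, not 98/100.

[CODE]
import random

def hundred_doors(switch):
    car = random.randrange(100)
    pick = random.randrange(100)
    # Monty opens 98 goat doors from the 99 you did not pick, leaving one shut.
    if car != pick:
        left_shut = car
    else:
        left_shut = random.choice([d for d in range(100) if d != pick])
    return (left_shut if switch else pick) == car

n = 100_000
for switch in (False, True):
    wins = sum(hundred_doors(switch) for _ in range(n))
    print("switch" if switch else "stick ", f"win rate ~ {wins / n:.4f}")
[/CODE]

Sticking should come out near 0.01 and switching near 0.99.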
 
Here's a game in which I can alter the odds partway through:

We flip a coin, and before the flip I call heads or tails. The winner gets $1,000,000.

After the flip, you show me the coin and ask me if I want to change my choice. If I call heads and see tails (or vice versa), I certainly will choose to switch. The odds go from 0 to 1.

Clearly it's possible for the odds to be changed part-way through. But how? By the addition of information about the coin-flip. That information needn't be complete in order to be information.

ETA: And guess what, if I flip another coin and use it to decide whether to change my choice or not, my odds go back to 50/50. That's because I've just erased (the effect of) the information I have about the outcome of the coin flip.
 
:o Yes, that was a typo.



I DON'T think the odds change. I think the game changes. That's why I said all this back and forth about odds is a red herring. "The Monty Hall Problem" isn't really a "The"; it's a "They", so to speak. You're playing two successive, but independent games. In the first game, you choose one of three doors, and have a 1/3 chance of choosing the one with the car. Before you finish and learn the result of that game, you begin the second game, in which you choose one of two doors (The one you chose before, or the one remaining after Monty eliminates a safe door), and you have a 1/2 chance of winning that game.

My real argument is that you cannot alter your chances of winning a single instance of the game partway through it. If you go into it knowing in advance that you will switch, then you are playing a game in which you have a 2/3 chance of winning from the outset. If you go into it having already decided that you will not switch, then you have a 1/3 chance of winning from the outset. If you don't make that decision in advance, but instead wait to make up your mind until Monty opens a safe door, then your previous choice becomes moot; you forfeit the first game and instead play a different game in which you have a 1/2 chance of winning from the outset. (If this isn't obvious, then imagine a variation in which, after you're shown a goat, you decide whether or not to switch by flipping a coin).

This is why this debate is so frustrating. The two sides are actually both right, but they're talking past each other, each arguing about the odds of a different game, depending on whether or not they have in mind that they're following a strict switch or no-switch policy from the outset.
I'm afraid that you're going to have to explain how it makes a difference to the probabilities if you make the decision to switch before or after the revealing of the goat.

In either case you have picked a door and know that there is a goat behind at least one of the other two.

The timing of the decision to stick or switch is irrelevant.
 
I didn't intend the word holding that spot to have moral implications. It would be 'wrong' as in 'incorrect' if your goal is to take one shot at glory.
Cool, thanks for making that clear. I wasn't sure what you were getting at, so I tried to cover all possibilities that I could see.

If we want to maximise the best result, we change. If we want to minimise the worst result, we stick.
This is true whether the game has a single iteration or many.

You really weren't that confused, sorry. The comparative value is a good point too - what is the worth of the change to the door to which the undisputed and comprehensively explained MH odds apply? The odds do not dictate the action of the player, who is only a foolish woo (regardless of his ultimate choice) if he plays on the misunderstanding that his odds are anything other than 1 in 3 when he sticks (or believes Thor will intervene on his behalf, or such other nonsense as may prevail in his noggin).
Sure, anyone playing can make any choice they like for any reason they like.

I'm a little confused, though, by the distinction you're making between a one-off game and playing with several iterations. The logic is still the same in either case: if all you care about is getting the car or not, switch. You still may not get the car, but you've got a better chance, so what possible reason is there not to switch?

Not that any of this matters. I'd bet that the total amount of time in man-hours that people have wasted talking about the Monty Hall problem is worth more than the value of a good many cars.

Oh, and yes, I did. Especially the reversal in the final paragraph, that was a neat embellishment. It's a lower case j, though ;)
:)
 
Okay, here's where I put a cat amongst the pigeons.

Sort of.

I decided to run a trial of Prometheus' coin toss version of the problem.

I took three cards, one black and two red, and randomly shuffled them between trials.
I then chose one at random and tossed a coin - heads I stuck with my original choice, tails I swapped.
I then turned over the chosen card;
if it was black and I stuck it was a win,
if it was black and I swapped I lost,
if it was red and I stuck I lost,
if it was red and I swapped I won.

I performed 100 trials and the results are truly fascinating.

I got 57 heads and 43 tails, resulting in sticking with the original choice 57 times and swapping 43 times. I won 45 times and lost 55 times. It would therefore seem at first glance that it is indeed a 50/50 chance. As I was doing the trials I was starting to get worried that I had been wrong about this all along, because I could see that the number of wins and losses were roughly even.

However, I did some further tallying and the underlying numbers are even more interesting.

I also counted the number of times I won and lost when sticking or swapping. The random coin toss made me stick with the original choice 57 times, but I won only 15 of those, losing 42 times. When I swapped I won 30 of 43 trials, losing just 13 times. When sticking I lost almost 3 times as often as I won, and when I swapped I won more than twice as often as I lost.

So it appears that choosing randomly whether to stick or swap does indeed result in a 50/50 outcome overall, but underneath that interesting outcome is the undeniable result that swapping gives you a greater chance of winning than sticking does.

The maths works out because what you're doing by randomly selecting whether to stick or swap after the choice is made is averaging out the 2/3 winning probability of swapping with the 1/3 winning probability of sticking.

Regardless of whether it appears to be a 50/50 choice and outcome, the numbers show that swapping is always the better option to maximise your chance of winning, whether you choose after the door is opened or before.

Think of it this way Prometheus, what does it matter whether you make that random choice to stick or swap before or after the door is opened? You know that MH is going to open a door to reveal a goat, that's what he always does. Randomising the trial at the two door stage makes no difference. You could toss the coin before you even choose your original door and the results would be the same. All the coin toss does is average out the 2/3 and 1/3 winning probabilities of swapping and sticking.
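
For anyone who wants to re-run the experiment with bigger numbers, here is a rough Python version of the same card-and-coin procedure (the discard of a red card is implicit, exactly as in the rules above); numerically the averaging is just (1/2)(2/3) + (1/2)(1/3) = 1/2.

[CODE]
import random

n = 100_000
overall_wins = 0
stick_wins = stick_games = 0
swap_wins = swap_games = 0

for _ in range(n):
    # Three cards, one black (the prize) and two red; pick one at random.
    black = random.randrange(3)
    pick = random.randrange(3)
    # Coin toss: heads stick with the original pick, tails swap to the other
    # remaining card (the host has implicitly discarded a red one).
    swap = random.random() < 0.5
    win = (pick != black) if swap else (pick == black)
    overall_wins += win
    if swap:
        swap_games += 1
        swap_wins += win
    else:
        stick_games += 1
        stick_wins += win

print(f"overall win rate:       {overall_wins / n:.3f}")
print(f"win rate when sticking: {stick_wins / stick_games:.3f}")
print(f"win rate when swapping: {swap_wins / swap_games:.3f}")
[/CODE]

It should show roughly 1/2 overall, 1/3 when sticking, and 2/3 when swapping, just like the card trial.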

Swapping doors gives you a better chance of winning, 2 out of 3 on average.

Well, that's my bit of scientific investigation for the day, I'm off for a cup of tea.
 
I don't remember that exactly but it goes something like this: Say you have a bag with 99 black marbles and 1 white marble. Just because there are 2 colors does not mean you have a 50/50 chance of picking the white marble.

Hey, good example for our Monty Hall problem :-)

Monty hands you a bag with 99 black marbles and one white marble. If you manage to draw the white marble, you win the car; otherwise you get an angry goat. You draw one marble, but you do not look at it yet (you keep it in your closed hand). Monty takes the bag and removes 98 black marbles, leaving in the bag either the white marble or, if you happen to have it in your hand, a black one. Of course you do not get to see which one he leaves in the bag.

He then offers to let you take the bag with the one marble left in it instead of the one marble you have in your hand. Which do you choose?

When you took out the first marble, it had a 1% chance of being the white one, so there is a 99% chance that the white marble is still in the bag. Removing the 98 black marbles does nothing to reduce that chance or give you any new information; you already knew that there would be at least 98 black marbles in there no matter which marble you have in your hand. So there is still a 1% chance that you are holding the white marble, and a 99% chance that it is in the bag.
 
I first read about the MH problem in an excellent book which I would like to thoroughly recommend -

The Drunkard's Walk - How Randomness Rules Our Lives

Fascinating read on probabilities in life. Cannot recommend it highly enough.

Fantastic book. I try to recommend it often.

The author's tale about being erroneously diagnosed as HIV positive is riveting and shows how even smart people still have a problem with statistics.
 
