
Deal or No Deal: Two Cases

rwp
I have been thinking about this one for a while now:

On Deal or No Deal, if a contestant gets down to two cases and the top prize is still on the board, what is the probability that the case he or she holds contains the top prize?

My first thought was to relate this to the Monty Hall puzzle and claim that there is a 1/26 probability (there are 26 cases to start); however, I have since changed my mind and now agree with 1/2.

Here is why:


I tried a "brute force" method on this and produced all the possible outcomes for a Deal or No Deal game with only four cases to start. With four cases there are fewer outcomes to enumerate (4! = 24). Each outcome below is listed in the order:

Chosen case, next pick, next pick, last case remaining.

The letter "A" represents the top prize.

ABCD
ABDC
ACBD
ACDB
ADBC
ADCB

BACD
BADC
BCAD
BCDA
BDAC
BDCA

CABD
CADB
CBAD
CBDA
CDAB
CDBA

DABC
DACB
DBAC
DBCA
DCAB
DCBA

Of course you will only choose the "A" case 1/4 of the time and will not choose it 3/4 of the time. [P(Pick A on first try) = 6/24 = 1/4]

There are 12/24 times when you are down to two cases and A is left in play. 6/12 or 1/2 are winners, and 6/12 or 1/2 are losers. That seems to definitively answer the question for me.
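The same count can be checked by machine. Here is a short sketch (my addition, not part of the original post) that enumerates the 24 permutations and conditions on "A" surviving to the final two cases:

```python
from itertools import permutations
from fractions import Fraction

# Brute-force check of the four-case game above. Each permutation
# (chosen, pick 1, pick 2, last case) records where the letters fall;
# "A" marks the top prize.
outcomes = list(permutations("ABCD"))                 # 4! = 24 orderings

# Games that reach the final two with A still in play: A is either
# the chosen case (index 0) or the last unopened case (index 3).
down_to_two = [o for o in outcomes if "A" in (o[0], o[3])]
wins = [o for o in down_to_two if o[0] == "A"]

print(len(outcomes), len(down_to_two), len(wins))     # 24 12 6
print(Fraction(len(wins), len(down_to_two)))          # 1/2
```

This reproduces the 12 "down to two" games and the 6 winners, giving the same 1/2.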

Is there anything wrong with that analysis?
 
You're going to have to explain the game in more detail. I have no idea what you're talking about.
 
You're going to have to explain the game in more detail. I have no idea what you're talking about.

Basically, the contestant picks one of N urns, each of which contains a prize of a different amount.

The contestant then opens successive unchosen urns, and the prizes inside them are revealed.

At various points, the MC will offer the player the chance to swap the prize inside his (chosen) urn for a sum of money related to the average of the prizes that have not yet been seen.

Since the player is choosing urns without any information about their contents, the Monty Hall information "paradox" does not arise; if the last two prizes unaccounted for are A and B, then the chosen urn has a 50/50 chance of having A or B in it, irrespective of the values of A and B or the starting number N.
 
Since the player is choosing urns without any information about their contents, the Monty Hall information "paradox" does not arise; if the last two prizes unaccounted for are A and B, then the chosen urn has a 50/50 chance of having A or B in it, irrespective of the values of A and B or the starting number N.

It is making sense to me now. The difference between the two scenarios is the fact that the host doesn't just get rid of all of the losing cases and leave one behind in Deal or No Deal as he would in the Monty Hall game.
 
It will just be 50%. The Monty Hall problem introduces an extra variable in that the quizmaster gives you additional information after you have chosen your box. When you have just two boxes on the table you have no additional information to change the probabilities from 50:50.
 
Basically, the contestant picks one of N urns, each of which contains a prize of a different amount.

The contestant then opens successive unchosen urns, and the prizes inside them are revealed.

At various points, the MC will offer the player the chance to swap the prize inside his (chosen) urn for a sum of money related to the average of the prizes that have not yet been seen.

Since the player is choosing urns without any information about their contents, the Monty Hall information "paradox" does not arise; if the last two prizes unaccounted for are A and B, then the chosen urn has a 50/50 chance of having A or B in it, irrespective of the values of A and B or the starting number N.


Since the revealed urns have been chosen at random, yes, exactly.
 
Yes. I also have little idea how the game is played, other than that contestants pick random briefcases with listed amounts of money in them.

Vorticity said:
You're going to have to explain the game in more detail. I have no idea what you're talking about.

1. A player chooses 1 case from 26. Each case has a different monetary value, with the lowest being $0.01 and the highest being $1,000,000. The player is not allowed to look in his or her case.

2. The player then eliminates case by case, which are opened to reveal their values.

3. The question is, if the player luckily manages to keep the top prize on the board until there are two cases left (the one on the board plus the one he or she holds), what are the odds he or she holds the winning case?

I say it is 50/50 but was tricked earlier by recent analysis of Monty Hall puzzles and similar situations.
 
1. A player chooses 1 case from 26. Each case has a different monetary value, with the lowest being $0.01 and the highest being $1,000,000. The player is not allowed to look in his or her case.

2. The player then eliminates case by case, which are opened to reveal their values.

3. The question is, if the player luckily manages to keep the top prize on the board until there are two cases left (the one on the board plus the one he or she holds), what are the odds he or she holds the winning case?

I say it is 50/50 but was tricked earlier by recent analysis of Monty Hall puzzles and similar situations.

Yeah, once down to two cases the probability is 50/50. As others have stated, this is not analogous to the Monty Hall problem, because the contestant chooses cases to eliminate without knowing their contents, and can eliminate the top value (as you know happens quite often if you watch the show).

The real probabilistic tool for analyzing this show is expected value, not just raw probability. A simplified formula for it at any given point in the show is
E = S/n
where
E = expected value if play continues,
S = sum of all values still on the board,
n = number of cases (including the one you chose) left in play.

Once this is calculated, just compare it to the banker's offer: if E is higher, continue to play; if the banker's offer is higher, take the offer.

In my experience of watching the show the banker's offer is typically between 60% and 70% of E so it is almost never the best course of action to accept the banker's offer.

Using this formula, if you are left with two cases and have the $1,000,000.00 and the $0.01 remaining, then E ≈ $500,000. If the banker offers you less, you are better off opening your case; if the banker offers more, you should take the offer.

What makes the show so interesting is that it is real money on the line and it is psychologically difficult for most people to pass up a sure $350,000.00 for a 50-50 shot at $1,000,000.00
 
The real probabilistic tool for analyzing this show is expected value, not just raw probability. A simplified formula for it at any given point in the show is
E = S/n
where
E = expected value if play continues,
S = sum of all values still on the board,
n = number of cases (including the one you chose) left in play.

What makes the show so interesting is that it is real money on the line and it is psychologically difficult for most people to pass up a sure $350,000.00 for a 50-50 shot at $1,000,000.00


Well, one could argue that that's because expected value is NOT the most appropriate probabilistic tool for analyzing this show, precisely because the psychological "payoff" is not linear with the dollar value. One of the fundamental assumptions underlying game theory is that players are "risk-neutral," but most studies show that real people are somewhat risk averse.

And if you think about it -- how would $1,000,000 change your life that $350,000 would not? I could pay off all my debts, including my house, with either and still have a comfortable chunk of change in the bank afterwards. There comes a point where money is no longer useful as money; it's just a way of keeping abstract score. Bill Gates is worth more than three times as much as Larry Page -- but they're both deca-billionaires.

What can you imagine that extra $30+ billion buying? A nice car? An evening on the town? A professional sports team? What does Larry want that he can't afford?
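That nonlinearity can be made concrete with a toy calculation (my own numbers, using log utility purely as one common textbook model of risk aversion, not something from this thread):

```python
import math

# With a concave utility such as log(wealth), a sure $350,000 beats a
# 50/50 gamble on $1,000,000 vs. $0.01 in expected utility, even
# though the gamble wins on expected dollars.
def expected_utility(outcomes, probs, utility=math.log):
    return sum(p * utility(x) for x, p in zip(outcomes, probs))

sure = expected_utility([350_000.0], [1.0])
gamble = expected_utility([1_000_000.0, 0.01], [0.5, 0.5])

print(sure > gamble)                           # True: the risk-averse player deals
print(0.5 * 1_000_000 + 0.5 * 0.01 > 350_000)  # True: yet expected dollars favor the gamble
```

Any concave utility function gives the same qualitative result; log is just a convenient choice for illustration.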
 
In my experience of watching the show the banker's offer is typically between 60% and 70% of E so it is almost never the best course of action to accept the banker's offer.

Using this formula, if you are left with two cases and have the $1,000,000.00 and the $0.01 remaining, then E ≈ $500,000. If the banker offers you less, you are better off opening your case; if the banker offers more, you should take the offer.

What makes the show so interesting is that it is real money on the line and it is psychologically difficult for most people to pass up a sure $350,000.00 for a 50-50 shot at $1,000,000.00

I'd take the 100% guaranteed $350,000 rather than toss a coin for a million and have an equal chance of getting nothing but a penny. $350,000 is a lot of money. If someone offered me a choice on the street of 350K or that I could flip a coin and heads was a million, I'd take the cash (unless I could flip the coin in which case I would use trickery to guarantee a head). ;)
 
Analyzing this further, it appears that in my simple version of the game, where there are 4! outcomes, your odds of picking the "A" prize increase as follows:

You pick a case: 1/4 chance of being "A"
You eliminate a case: 1/4 chance of it being "A"
You eliminate another case: 1/3 chance of it being "A"
You eliminate the final case: 1/2 chance of it being "A"


Since I just used "brute force" to see these probabilities, I'm not sure how to state it in the language of math. Is it safe to assume that in the real version of Deal or No Deal (26! outcomes) the next case opened has an increased probability of being the top prize, in the following manner:

1/(t - n), where n is the number of cases that have been opened and t is the total number of cases?

So the probability that you will open the million on the next case is 1/(remaining cases)?
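As a sanity check (my addition), that formula can be computed directly and compared against the four-case brute-force pattern:

```python
from fractions import Fraction

# With t cases at the start and n cases already opened, the next case
# opened reveals the top prize with probability 1/(t - n), given that
# the top prize hasn't appeared yet.
def p_next_is_top(t, n):
    return Fraction(1, t - n)

# Four-case toy game: matches the brute-force 1/4, 1/3, 1/2 pattern.
print([str(p_next_is_top(4, n)) for n in range(3)])   # ['1/4', '1/3', '1/2']

# Real show (26 cases): opening the final unchosen case, with only it
# and the held case left unrevealed, is a coin flip.
print(p_next_is_top(26, 24))                          # 1/2
```

The key point is that the denominator counts all unrevealed cases, including the one the contestant holds.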
 
Well, one could argue that that's because expected value is NOT the most appropriate probabilistic tool for analyzing this show, precisely because the psychological "payoff" is not linear with the dollar value. One of the fundamental assumptions underlying game theory is that players are "risk-neutral," but most studies show that real people are somewhat risk averse.

And if you think about it -- how would $1,000,000 change your life that $350,000 would not? I could pay off all my debts, including my house, with either and still have a comfortable chunk of change in the bank afterwards. There comes a point where money is no longer useful as money; it's just a way of keeping abstract score. Bill Gates is worth more than three times as much as Larry Page -- but they're both deca-billionaires.

What can you imagine that extra $30+ billion buying? A nice car? An evening on the town? A professional sports team? What does Larry want that he can't afford?
That's really the only reason I've watched the show a few times -- I try to estimate what the banker's offer will be. It pretty quickly became apparent that the offers were somewhat lower than the expected value of the outstanding prizes, but I haven't been able to formulate an algorithm to determine how much lower. It doesn't help that I haven't calculated the expected values more precisely than I can do in my head by eyeballing the board, which isn't very precise at all until very late in the game.

Does anybody know whether the value is calculated formally and the same way for all participants (i.e. always offering the same buyout price for a given suite of unopened cases)? Or is the "risk aversion deduction" somehow adapted to individual players' demeanors? Or randomized?
 
I'd take the 100% guaranteed $350,000 rather than toss a coin for a million and have an equal chance of getting nothing but a penny. $350,000 is a lot of money. If someone offered me a choice on the street of 350K or that I could flip a coin and heads was a million, I'd take the cash (unless I could flip the coin in which case I would use trickery to guarantee a head). ;)

You and drkitten both make a very valid point about the psychology of why to take the money. I was pointing out that, mathematically, the offer is almost never the better choice.

The other problem I know of with using expected value is that it describes the pattern of results over a large number of trials. A contestant on the game gets only one trial.

Looking at it from a one-trial perspective, you have about a 1/5 chance of holding one of the big-money cases ($100k+), so if you can get an offer over $100k it may be best to take the sure thing rather than risk it for more, since, as drkitten asked, will the extra money change your life that much more than 'just' the $100k?
 
In other news, my friend's boyfriend won $40k on the Aussie Deal or No Deal.

She was pretty happy with that.
 
That's really the only reason I've watched the show a few times -- I try to estimate what the banker's offer will be. It pretty quickly became apparent that the offers were somewhat lower than the expected value of the outstanding prizes, but I haven't been able to formulate an algorithm to determine how much lower. It doesn't help that I haven't calculated the expected values more precisely than I can do in my head by eyeballing the board, which isn't very precise at all until very late in the game.

Does anybody know whether the value is calculated formally and the same way for all participants (i.e. always offering the same buyout price for a given suite of unopened cases)? Or is the "risk aversion deduction" somehow adapted to individual players' demeanors? Or randomized?

I can only talk about the Australian version of the show, where the top prize is A$200,000. I've watched it a few times (I usually get home after work in time to watch the final half of the show), and they don't seem to be using a definite algorithm to pick the value of the offers; at least not a very simple one.

In most cases the offer is lower than what would be expected from the average of the prizes still on the board, but sometimes it is higher, and sometimes it is significantly lower. It looks to me like they take into account the flow of the game to set the offer, not simply what prizes are still possible. A participant who just had a very good round (eliminating only low values) seems to get a higher offer than would be warranted, while one who had a bad round gets a lower offer than expected.

Also, no formal analysis... but the Freakonomics blog has some suggestions (in the comments, not the main post) at http://freakonomics.blogs.nytimes.com/2005/12/22/the-sad-thing-about-deal-or-no-deal/
 
I've watched the UK version a few times (top prize £250k) and it is clear that the Banker offers about 2/3 of the expected value of the box, with a significant amount of specific fiddling with the offered amount to take into account the particular circumstances of the current contestant; sometimes, for instance, the Banker offers sums whose digits are 'neat' in some way.

The expected return at the outset is about £15,000.

The interesting behaviour I have noted is that contestants' tactics alter through the course of the game. I've never seen anyone 'deal' early, but if the offer significantly exceeds £15,000 they probably should, because with a lot of boxes left, the expected value of what remains is likely to show regression to the mean and fall back somewhere near £15,000. But dealing early would make for a shorter and (even) duller show on those occasions.

The other behaviour seems to be 'going for broke'. If the really high-value boxes have gone then people seem to care less whether they get £5,000 or 1p and seem to play the game out just for the heck of it.

It would be interesting to turn those casual observations into testable hypotheses, but I would guess that if an early offer rises significantly above £15,000, say to £20-30,000, it should be accepted, because the long-run outcome of not accepting at that stage is too often worse even than £15,000. I wonder, even, whether the show's producers rely on contestants' desire to keep playing the game, and therefore stay on stage for a good length of time, not to exploit this early bail-out strategy.

In the background, of course, is that the success of the game depends on the boxes containing prizes ranging from amounts that most people would find 'life-changing' down to trivial sums, but the transition point from 'life-changing' to merely 'worth having' varies among contestants. As has been pointed out, the possibility of winning £250k seems to lead to behaviour that flies in the face of the probability of winning £250k.
 
Well, one could argue that that's because expected value is NOT the most appropriate probabilistic tool for analyzing this show, precisely because the psychological "payoff" is not linear with the dollar value. One of the fundamental assumptions underlying game theory is that players are "risk-neutral," but most studies show that real people are somewhat risk averse.

Quite. The expected value calculation makes sense when you're playing an unbounded number of games where the randomness evens out. But you're not. You get to play exactly one game.
 
