
Dumb probability question?

Badly Shaved Monkey

Anti-homeopathy illuminati member
Joined
Feb 5, 2004
Messages
5,363
OK, I'll await the howls of derisive laughter, but bear with me...

John Maynard Keynes is much in the news at the moment as we try to spend our way out of financial catastrophe. An anecdote about him, reported in a news bulletin a few days ago, said that while staying in Monte Carlo he heard that one of the casinos had removed the zero from its roulette wheel, so it had become a completely fair bet, and because of that he went off to spend the evening gambling.

Am I right in thinking that this was a rational thing to do, because if a roulette bet is completely fair then you can always win? I don't think I'm being completely stupid, and I think Keynes' actions were justified.

If a bet is completely fair, but you know that you will stop once you have achieved some specified level of winnings, then you must win: although the long-run expectation is a tie between you and the casino, if you stop playing at the point where there is a sufficient excursion from that tie, you can permanently bank the win. So, in my estimation, it's not actually a fair competition between you and the casino, because the punter has an extra piece of information - the knowledge that he will stop once a specified excursion from a tie has been achieved.

But, if I can do that, does it mean the casino will be broken by other individuals adopting the same strategy?

As far as I can see, the only problem is deciding how big an excursion from a tie you aim at and to calculate how long on average that excursion will take to appear. Also you must have enough liquid funds to cover excursions in the casino's direction that occur before your winning point is reached.

I think that seems correct.

Where my head begins to hurt is when I ask what happens if I ever return to the roulette table. Common sense says I can't repeat my tactic, because at some stage the long-run fairness must mean that the tendency is for me to tie with the casino. Yet we know that the gambler's fallacy is just that, a fallacy: the wheel has no memory of past outcomes, so if I come back for a new series of turns their outcomes are independent of my past play, and I should be able to repeat my strategy of waiting for my predetermined excursion. That does not seem right, so there must be a flaw in my argument.
 
If a bet is completely fair, but you know that you will stop when you have achieved some specified set of winnings then you must win because, although the long-run probability is of a tie between you and the casino, if you stop playing at the point when there is a sufficient excursion from the tie then you can permanently bank that win.

That works whether or not the odds are in your favor. Just bet red or black and double the bet every time you lose. The problem is that when you reach the table limit or run out of money, you're done, and done badly.
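To make that concrete, here is a minimal sketch in Python of the doubling strategy against a table limit; the bankroll, limit and target figures are illustrative assumptions, not anything from the thread:

```python
import random

def martingale_session(bankroll, table_limit, target, rng):
    """Play a fair red/black game, doubling the bet after each loss.
    Stop at +target profit, or when the next bet would exceed the
    table limit or the cash in hand."""
    profit, bet = 0, 1
    while profit < target:
        if bet > table_limit or bet > bankroll + profit:
            return profit            # busted: walk away with the loss
        if rng.random() < 0.5:       # win: bank the bet, reset to 1
            profit += bet
            bet = 1
        else:                        # lose: double and try again
            profit -= bet
            bet *= 2
    return profit                    # reached the target

rng = random.Random(42)
results = [martingale_session(1000, 500, 10, rng) for _ in range(10_000)]
wins = sum(r == 10 for r in results)
print(f"sessions ending at +10: {wins / len(results):.1%}")
print(f"average profit per session: {sum(results) / len(results):+.2f}")
```

Almost every session banks the small target, but the rare session that hits the limit loses roughly 500 units at once, which is why the average stays near zero - exactly the "done badly" above.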
 
The flaw in your argument is to assume that at some point the profits must be even in order to tend to what you would expect by chance.
The table has no memory, whether it is continuous play or not. You can play for a very long time before your "luck" changes; every outcome has a probability associated with it, and if you play long enough the outcome of you winning a million dollars will eventually happen, and so will the outcome of losing another million.
Although any outcome with non-zero probability can happen, it is safer to say that the outcomes most expected by chance will most likely happen. It doesn't matter if you won a million times more than you lost - you can still be falling within what is expected by chance if it took you a trillion plays to do it.
 
That assumes the casino pays out amounts directly equivalent to the odds, but let's say it does. Let's also suppose everyone stops the moment they have made a profit.

And, to simplify, the game is a straight 50:50 and all wagers are 1 currency unit (CU).

Start with 128 people

Round 1: 64 are now up one CU and walk away; 64 are down one.

Round 2: 64 have left; 32 are level and 32 are two CU down.

Round 3: 64 have left; 16 are up one CU (and walk away), zero are level, 32 are one CU down and 16 are three CU down.

Round 4: 80 have left; 16 are level, zero are one CU down, 24 are two CU down and 8 are four CU down.

Round 5: 88 have left; 20 are one CU down, 16 are three CU down and 4 are five CU down.


So after 5 rounds, 88 are in profit by one CU, 20 are one CU down, 16 are three CU down and 4 are five CU down; the trend by this stage is fairly clear.


The casino is level, having paid out 88 but taken in 20 + (16×3) + (4×5) = 88.

The average amount made per person is zero. So, in effect, by following the strategy most people will eventually win, but only at the risk of significant losses: to get a high chance of winning a small amount, you have to follow a strategy that occasionally loses a large one.
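The round-by-round tally above can be checked by exact enumeration rather than simulation. This Python sketch (exact fractions, so nothing is rounded) tracks how many of the 128 players sit at each profit level, removing anyone who reaches +1:

```python
from collections import Counter
from fractions import Fraction

def run_rounds(n_players=128, n_rounds=5):
    """Split each group of players 50:50 between win and lose each
    round; players reaching +1 CU walk away."""
    still_in = Counter({0: Fraction(n_players)})
    departed = Fraction(0)
    for r in range(1, n_rounds + 1):
        nxt = Counter()
        for profit, count in still_in.items():
            nxt[profit + 1] += count / 2
            nxt[profit - 1] += count / 2
        departed += nxt.pop(1, Fraction(0))   # winners leave with +1
        still_in = nxt
        print(f"after round {r}: {departed} have left, in play: {dict(still_in)}")
    return departed, still_in

departed, still_in = run_rounds()
# casino takings from losers exactly cover the 1 CU paid to each winner
print("casino balance:", sum(-p * c for p, c in still_in.items()) - departed)
```

After five rounds this reproduces the figures above: 88 departed with +1 CU each, and 20 + (16×3) + (4×5) = 88 CU sitting in losses, so the casino is exactly level.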
 
If a bet is completely fair, but you know that you will stop when you have achieved some specified set of winnings then you must win because, although the long-run probability is of a tie between you and the casino, if you stop playing at the point when there is a sufficient excursion from the tie then you can permanently bank that win. So, in my estimation, actually it's not a fair competition between you and the casino, because the punter has an extra piece of information- the knowledge that he will stop once a specified excursion from a tie has been achieved.

You have this more or less backwards. The asymmetry is not in information, it's in the availability of funds. Assuming the casino has infinite funds available and you don't, you will always lose in the long run no matter what betting strategy you follow.

Think of it like this. Since each bet is independent of every other bet, and since each bet is even odds, no matter what strategy you follow your expected earnings are zero. But, you object, I can quit while I'm ahead, and only when I'm ahead. That's true - but the odds are just as good that you'll end up behind. If you keep playing, you might get back into the black, or you might sink lower - and at some point you are out of money and you've lost everything.

A good way to phrase this mathematically is the gambler's ruin formula, which states that in a fair game with one opponent, played until either you or your opponent is out of money, your odds of winning all the money are a/(a+b), where a is your total funds at the beginning and b is hers. Note that this goes to zero as b goes to infinity. There is a related formula that tells you the optimal amount to bet given the odds, your stake, and the time you have. You must have better than even odds to have positive expected earnings (for the reason above), and moreover you don't want to bet too much - otherwise the odds are that a bad run will come along and wipe you out.
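A rough empirical check of that a/(a+b) figure, in Python; the stakes (3 units vs 7) are arbitrary choices for illustration:

```python
import random

def player_takes_all(a, b, rng):
    """Fair unit-bet game until one side is broke; True if the
    player who started with a units ends up holding a + b."""
    stake = a
    while 0 < stake < a + b:
        stake += 1 if rng.random() < 0.5 else -1
    return stake == a + b

rng = random.Random(0)
a, b, trials = 3, 7, 20_000
wins = sum(player_takes_all(a, b, rng) for _ in range(trials))
print(f"empirical: {wins / trials:.3f}   formula a/(a+b): {a / (a + b):.3f}")
```

With a = 3 and b = 7 the formula gives 0.3 and the simulation lands close to it; pushing b up while holding a fixed drives the player's chance toward zero, as noted above.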
 
The flaw in your argument is to assume that at some point the profits must be even in order to tend to what you would expect by chance.

The Drunkard's Walk theorem says that that will happen.

More accurately, the Drunkard's Walk theorem says that at some point, the profits will (with probability 1) be EVERY possible number. At some point you'll be even, you'll be up a million, you'll be up a billion, or you'll be down seven bucks.

The problem is which happens first: you run out of money or the casino does. (Alternatively, which happens first: you hit your upside limit where you walk away, or you hit your downside limit where you walk away.) The relevant math says that the amount you win (in all the cases where you win) exactly balances the amount you lose (in those cases).

Are you willing to risk losing your house to win $10? Apparently Keynes was. If you are, too, then go for it.
 
The Drunkard's Walk theorem says that that will happen.

More accurately, the Drunkard's Walk theorem says that at some point, the profits will (with probability 1) be EVERY possible number. At some point you'll be even, you'll be up a million, you'll be up a billion, or you'll be down seven bucks.

The problem is which happens first : you run out of money or the casino does. (Alternatively, which happens first, you hit your upside limit where you walk away, or you hit your downside limit where you walk away.) The relevant math says that the amount you win (in all the cases where you win) exactly balances the amount you lose (in those cases).

Are you willing to risk losing your house to win $10? Apparently Keynes was. If you are, too, then go for it.

Certain and almost certain outcomes are not exactly the same.
Not even the Drunkard's Walk theorem gives me an absolute guarantee that this will happen in finite time (although its probability converges to 1).
But that was not the point of my statement; the point was that converging to the expected odds is not exactly the same as being at zero distance from them.
 
This is often referred to as the Martingale betting system, which has been around for centuries.
 
This is often referred to as the Martingale betting system, which has been around for centuries.

And it works very well in computer games like Grand Theft Auto that not only have casinos, but also have save and load functions.
 
John Maynard Keynes is much in the news at the moment as we try to spend our way out of financial catastrophe. An anecdote about him, reported in a news bulletin a few days ago, said that while staying in Monte Carlo he heard that one of the casinos had removed the zero from its roulette wheel, so it had become a completely fair bet, and because of that he went off to spend the evening gambling.

What about 00?
 
The Monte Carlo casino has not had a double zero since the 19th century. That's a disgusting Americanism (one that exists only to increase the house edge).

Ugh, I feel violated. Somehow having Ketchup and Dill Pickle flavoured chips doesn't seem to compensate for this...
 
Makes perfect sense to me. Assume infinite budget on the part of the house and the player. Now assume a finite limit of money which the player wishes to win. Now assume infinite time.

It is an absolute certainty that the imbalance will be in the player's favor at some point in time.

Now change that to a finite budget on the player's part and finite time. It's very simple to calculate an acceptable risk of ruin and an acceptable time scale and choose a sensible amount to gain for your budget.

The house, in this case, is counting on the fact that most gamblers don't have an amount of money they want to win. They have an amount of money they're willing to lose. And given infinite time (which the house has) and infinite money (which the house basically has, for all intents and purposes) this is a total winner in their book.

Also remember they're counting on other factors too. Roulette is an incredibly BORING activity. Many Roulette players will choose to also play other games, games which tilt in the house's favor. If they make winnings on the Roulette wheel, they will then tend to play other games with their 'free money.'

P.S. Note that the casino will not lose out from sensible players on the Roulette wheel, because eventually those sensible players will hit their 'risk of ruin' percentile, and given the budget that needs to be involved for a sensible risk of ruin, that will counter all the smaller amounts made.

P.P.S. The house always wins, unless you're counting the cards.
 
I have seen payout charts for slot machines (I was a bench tech for the largest casino in the U.S.) where they started out lower than expected, which happens a lot, and never got higher than expected until finally reaching the expected payout. That means that if someone had started playing the machine the day it was put in, they would never have been ahead at any point, even though at the end the machine was at the expected payout. Many machines would go above expected, then below, then above again until settling down to expected. This is not a guarantee though, as shown with the original examples.

This can also be applied to your roulette scenario, where the expected payout is even money. You could lose your first bet and then never reach a point where you come out ahead, playing for a year just to get back to breaking even. I would say that the chances of this happening are slim, but so are the chances of making any substantial amount of money.
 

Thanks for all the various comments.

Clearly I made some implicit assumptions, specifically infinite time and money.

So, as I see it now, I was right that if you have limitless resources and set a finite target win at which point you will walk away with the money in your pocket, you will always achieve that win, at some point.

But, given finite resources on the part of each player and the casino, some players will fail to achieve their target win before they run out of money and will have to walk away. The number of players out of a population who fail is, I'd guess, related to the ratio between the players' resources and the casino's. Effectively, the proportion who fail is a way of measuring the relative size of the players' and the casino's resources.

Is that a fair summary?
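That guess can be made precise with the gambler's ruin formula quoted earlier in the thread: in a fair game, a player with bankroll a who aims to bank a profit of t before going broke succeeds with probability a/(a+t), so the failure fraction t/(a+t) directly measures the target against the bankroll. A small illustrative table in Python (the bankroll figures are arbitrary):

```python
def p_success(bankroll, target):
    """Probability, in a fair game, of banking +target profit
    before losing the whole bankroll (gambler's ruin formula)."""
    return bankroll / (bankroll + target)

target = 10
for bankroll in (10, 100, 1000):
    p = p_success(bankroll, target)
    print(f"bankroll {bankroll:4d}, target +{target}: "
          f"succeed {p:.3f}, fail {1 - p:.3f}")
```

The bigger the bankroll relative to the target, the smaller the fraction who fail - but the successes bank t each while the failures lose a each, and these exactly cancel, so the casino still expects to break even.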
 
Thanks for all the various comments.

So, as I see it now, I was right that if you have limitless resources and set a finite target win at which point you will walk away with the money in your pocket, you will always achieve that win, at some point.

No. Some people here have said that, but it is untrue. The only thing close to a certainty is that you will break even eventually, but that doesn't mean that at any time you are guaranteed to be above that break even point.

I have seen a whole display of results on the roulette table end up black or red (I know it is anecdotal, but still plausible, right?), which means some were winning big and some were losing big. The losers will win some eventually, but may never win enough to get back over the break-even point.
 
No. Some people here have said that, but it is untrue. The only thing close to a certainty is that you will break even eventually

If there is no table limit, then with unlimited resources you can eventually win any amount you choose by betting red or black and doubling the bet every time you lose.

If you are betting constant amounts, it is not close to a certainty that you will break even eventually. The distance from even as a percentage of the total bets will tend to decrease, but the absolute distance from even will not.
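That distinction between relative and absolute distance can be seen in a small Python simulation: the mean absolute distance from even grows (roughly like √(2n/π) after n fair unit bets), while its share of the total number of bets shrinks. The trial counts here are arbitrary:

```python
import math
import random

rng = random.Random(1)

def mean_abs_distance(n_bets, trials=500):
    """Average |winnings| after n_bets fair unit bets."""
    total = 0
    for _ in range(trials):
        pos = sum(1 if rng.random() < 0.5 else -1 for _ in range(n_bets))
        total += abs(pos)
    return total / trials

results = {}
for n in (100, 10_000):
    results[n] = d = mean_abs_distance(n)
    print(f"n={n:6d}: mean |distance| ≈ {d:6.1f} "
          f"(theory ≈ {math.sqrt(2 * n / math.pi):.1f}), "
          f"fraction of bets: {d / n:.4f}")
```

On average, a constant-stake player's absolute distance from even keeps growing even as it shrinks as a fraction of the total amount wagered.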
 
No. Some people here have said that, but it is untrue. The only thing close to a certainty is that you will break even eventually, but that doesn't mean that at any time you are guaranteed to be above that break even point.

The Drunkard's Walk theorem says that (with an infinite bankroll and infinite time) you will, with probability 1, achieve EVERY finite return. You will, with probability 1, eventually win as much as you like.

If you're trying to make some subtle argument about the difference between "always" and "probability 1," you're failing to express it. If you are asserting that there is a finite, nonzero probability that you will never win any arbitrary amount, you're simply wrong.
 
No. Some people here have said that, but it is untrue. The only thing close to a certainty is that you will break even eventually, but that doesn't mean that at any time you are guaranteed to be above that break even point.

Think about what you said: it's certain you'll break even eventually, but not certain that you'll ever be above even. That makes no sense - if you'll be at even once, you'll be at even arbitrarily many times (remember, we have an infinite amount of time and an infinite amount of money). So start from one of those times: half the time you'll go up, half the time you'll go down.

As drkitten says, you'll hit every amount with probability 1.
 
