Wikipedia:Reference desk/Archives/Mathematics/2016 June 26

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


June 26

Would any of those gambling schemes delay the total loss of the gambler's stack?

A gambler, like many, will stop betting on roulette only once he's out of chips. Would playing by any gambling "strategy" delay the total loss (like doubling the stake, or adding 1 chip to it, after a loss)? Would a "flat" gambling "strategy" (that is, 1 chip on red/black every spin) make him play longer?--Hofhof (talk) 17:33, 26 June 2016 (UTC)

I haven't worked it out, but I would think that you keep chips longest by minimizing your bet, that is, by making only the minimum possible bet every time. (For convenience, we will call the minimum bet one.) What doubling the stake after every loss does is create two possible outcomes for any particular betting sequence. Either, after some number of bets, you win and have a net gain of one, or you hit a run of bad luck and eventually tap out your stack, losing everything. This strategy, commonly known as the Martingale, reduces the expected length of time that you will stay in the game, because it exposes you to the risk of a run of bad luck. I don't know if there is some other strategy, other than always betting one, that can maximize the time that you stay in the game. Robert McClenon (talk) 20:19, 26 June 2016 (UTC)
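The two outcomes of a Martingale sequence can be checked with a short exact calculation. This is a sketch under assumed numbers not given above: a European single-zero wheel (so each even-money bet loses with probability 19/37), a base bet of 1, and a bankroll of 63, which covers exactly six doublings (1+2+4+8+16+32 = 63).

```python
# Sketch: exact outcome probabilities for one Martingale sequence.
# Assumed parameters (mine, not from the thread): European wheel,
# even-money bet, base bet 1, bankroll 63 -- six straight losses
# exhaust it.

q = 19 / 37            # probability of losing one even-money bet
p_ruin = q ** 6        # six straight losses -> bankroll gone
p_win = 1 - p_ruin     # otherwise the sequence ends with a net gain of 1

ev = 1 * p_win - 63 * p_ruin   # expected value of one full sequence

print(f"P(ruin) = {p_ruin:.4f}, P(net +1) = {p_win:.4f}, EV = {ev:.4f}")
```

The sequence nets +1 about 98% of the time, but the rare wipeout is so expensive that the expected value stays negative (roughly −0.17 units per sequence), matching the point that the strategy trades frequent small wins for occasional total loss.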
Bet the minimum on both red and black. You would last the longest. 175.45.116.105 (talk) 22:43, 26 June 2016 (UTC)
You'll last for twice as many spins on average if you just bet the minimum on one color. (But if betting on red and black counts as two bets, then you'll last for the same number of bets.) -- BenRG (talk) 00:07, 27 June 2016 (UTC)
I don't think that's correct. In a limiting case where there is no house advantage, betting on both will let you play forever, and playing just one at a time will guarantee a loss within finite time. So for a small house advantage, betting on both should still be better.
Betting on both red and black is a bet against the zero. You will lose both bets when the zero comes up. So that isn't a way to last forever. Robert McClenon (talk) 23:26, 27 June 2016 (UTC)
That's why I said "in a limiting case where there is no house advantage". The zero is the house advantage. -- Meni Rosenfeld (talk) 23:35, 27 June 2016 (UTC)
I think you haven't taken into account the anticorrelation between multiple bets on the same spin. Of course, anticorrelating your bets eliminates the chance of profit as well. -- Meni Rosenfeld (talk) 01:46, 27 June 2016 (UTC)
If the odds and payoff are both 1:1 and you always bet one token, the expected number of bets before you go broke is infinite ([1]). I suppose this is a variant of the St. Petersburg paradox. I think what I said about "average" (expected) survival time is correct, but you're right that the distribution of survival times can be different. -- BenRG (talk) 23:08, 27 June 2016 (UTC)
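The infinite expectation can be made plausible with a rough simulation. The assumptions here are mine: a perfectly fair even-money bet, a starting stack of a single token, and runs truncated at 1600 plays. The point is that the chance of still being in the game after N plays decays only like 1/sqrt(N), a tail too heavy for the mean to be finite.

```python
import random

# Sketch: survival probabilities in a fair even-money game, starting
# with a single token. Ruin is certain, but P(still playing after N
# plays) shrinks only like 1/sqrt(N), so the expected ruin time diverges.

random.seed(4)
TRIALS = 20000
CAP = 1600  # truncate runs here; CAP + 1 means "survived past CAP"

def ruin_time():
    stack, t = 1, 0
    while stack > 0 and t < CAP:
        t += 1
        stack += 1 if random.random() < 0.5 else -1
    return t if stack == 0 else CAP + 1

times = [ruin_time() for _ in range(TRIALS)]
surv = {}
for n in (100, 400, 1600):
    surv[n] = sum(t > n for t in times) / TRIALS
    print(f"P(still playing after {n:4d} plays) ~ {surv[n]:.4f}")
```

Each time N quadruples, the survival probability should roughly halve; summing a tail that decays like 1/sqrt(N) gives a divergent mean, which is the sense in which the expected number of bets is infinite.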
The discussion about betting on black and red, and an expected number of bets of infinity, overlooks the zero. Since the zero does come up, the number of bets reflects the house edge, which depends on whether the wheel is a European wheel or an American wheel. Robert McClenon (talk) 13:18, 28 June 2016 (UTC)
Actually, if the house has finite resources then the expected time until you or the house goes broke is probably finite, while betting on red and black can still continue forever, so I guess things are more complicated than I thought. -- BenRG (talk) 23:22, 27 June 2016 (UTC)
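The "twice as many spins on average, same number of bets" claim above is easy to sanity-check by simulation. Assumptions are mine: a European single-zero wheel (a single-color bet wins with probability 18/37; the zero, probability 1/37, loses both a red and a black bet), a starting stack of 100, and a minimum bet of 1.

```python
import random

# Monte Carlo sketch of the "one color vs. both colors" comparison.
# Assumed parameters (mine): European single-zero wheel, stack 100,
# minimum bet 1, even-money payout.

random.seed(1)
TRIALS = 1000

def spins_one_color(stack=100):
    """Bet 1 on red every spin until broke; return the number of spins."""
    spins = 0
    while stack > 0:
        spins += 1
        stack += 1 if random.random() < 18 / 37 else -1
    return spins

def spins_both_colors(stack=100):
    """Bet 1 on red AND 1 on black every spin (needs 2 units). Only the
    zero loses both bets; any other number is a wash."""
    spins = 0
    while stack >= 2:
        spins += 1
        if random.random() < 1 / 37:
            stack -= 2
    return spins

one = sum(spins_one_color() for _ in range(TRIALS)) / TRIALS
both = sum(spins_both_colors() for _ in range(TRIALS)) / TRIALS
print(f"one color: {one:.0f} spins avg; both colors: {both:.0f} spins avg")
print(f"bets made: one color {one:.0f}, both colors {2 * both:.0f}")
```

Under these assumptions the single-color player should average about twice as many spins, but since the both-colors player places two bets per spin, the total number of bets comes out roughly equal, as claimed.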
Your expected loss on each bet is the amount of the bet times the house's edge on that type of bet. If you can independently minimize those, then intuitively it should be best to minimize them both. But there are cases where that isn't true: for example, if you have 3 tokens left and the minimum bet is 2, you should bet 3. If the legal bets are the positive integers, then always betting 1 on any minimum-expected-loss outcome is probably optimal, but I don't see how to prove it. -- BenRG (talk) 00:07, 27 June 2016 (UTC)
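The 3-token edge case can be checked numerically, taking "should bet 3" to mean it keeps you in the game longer. This sketch uses assumed parameters not fixed above: even-money bets that win with probability 9/19, a minimum bet of 2, and a start of 3 tokens; the only difference between the two policies is whether you bet 2 or all 3 when the stack is exactly 3 (a loss ends the game either way, since a lone leftover token can't meet the minimum).

```python
import random

# Sketch: "3 tokens, minimum bet 2" -- does betting 3 at a stack of 3
# keep you playing longer than betting 2? Assumed parameters (mine):
# win probability 9/19, even money, minimum bet 2, starting stack 3.

random.seed(2)
P_WIN = 9 / 19
TRIALS = 20000

def plays(bet_all_at_3, stack=3):
    """Bets placed before the stack can no longer cover the minimum."""
    n = 0
    while stack >= 2:
        bet = stack if (stack == 3 and bet_all_at_3) else 2
        n += 1
        stack += bet if random.random() < P_WIN else -bet
    return n

avg_a = sum(plays(False) for _ in range(TRIALS)) / TRIALS  # bet 2 at 3
avg_b = sum(plays(True) for _ in range(TRIALS)) / TRIALS   # bet 3 at 3
print(f"bet 2 at stack 3: {avg_a:.1f} plays avg; bet 3: {avg_b:.1f} plays avg")
```

The intuition matches the simulation: losing at a stack of 3 ends the game under either policy, but winning leaves you at 6 tokens instead of 5, so betting 3 dominates for time in the game. A hand calculation under these same assumptions gives expected play counts of about 19 (bet 2) versus 28 (bet 3).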
Rather than get into the theoretical explanations, I whipped out Excel and ran a simulation. I assumed an initial stack of 100 and a game where the house wins 10/19 of the time and you win 9/19 of the time, but the payoff is even money, so the house has an expected win of about 5%. With a strategy of betting 1 until you run out of money, it took an average of about 1900 plays to go broke. With a strategy of doubling your previous bet after a loss, else betting 1, it took an average of only about 310 plays. (The details of the strategy were: (a) initial bet 1; (b) if out of money, don't bet; (c) if you lost the previous bet, bet twice the previous bet or your current stack, whichever is smaller; (d) if you won the previous bet, bet 1.) --RDBury (talk) 13:50, 27 June 2016 (UTC)
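The Excel simulation described above is easy to re-run as a script. This is a rough sketch using the same stated parameters (stack 100, win probability 9/19, even money) and the same strategy rules; the function names and trial count are mine.

```python
import random

# Rough re-run of the simulation described above: stack 100, win
# probability 9/19, even-money payout. "Flat" bets 1 every play;
# "Martingale" doubles after a loss (capped at the current stack)
# and resets to 1 after a win.

random.seed(3)
P_WIN = 9 / 19
TRIALS = 1000

def plays_flat(stack=100):
    n = 0
    while stack > 0:
        n += 1
        stack += 1 if random.random() < P_WIN else -1
    return n

def plays_martingale(stack=100):
    n, bet = 0, 1
    while stack > 0:
        bet = min(bet, stack)          # never bet more than you have
        n += 1
        if random.random() < P_WIN:
            stack += bet
            bet = 1                    # reset after a win
        else:
            stack -= bet
            bet *= 2                   # double after a loss
    return n

flat = sum(plays_flat() for _ in range(TRIALS)) / TRIALS
mart = sum(plays_martingale() for _ in range(TRIALS)) / TRIALS
print(f"flat: {flat:.0f} plays avg; martingale: {mart:.0f} plays avg")
```

The flat-betting average should land near 1900 (which is also the theoretical n/(q−p) = 100 × 19 for this walk), and the Martingale average should be several times smaller, consistent with the Excel figures.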
If you want to stay in the game longer, walk away from the roulette table, which has too large a house edge, and go to the craps table and bet with the shooter, where the house edge is less than 1%. Robert McClenon (talk) 23:27, 27 June 2016 (UTC)

Refraining from gambling is the way to avoid being ruined by gambling. Bo Jacoby (talk) 11:14, 28 June 2016 (UTC).

You're right. However, the OP didn't ask how to avoid being ruined, but rather how to delay the (inevitable) total loss as long as possible. --CiaPan (talk) 15:02, 28 June 2016 (UTC)