Mathletes: Two card-odds questions (blackjack and side bet)

This:

Should be: 1-[(1/38)(20+35)/20 + (37/38)(1/38)(19+35)/20 + … + (37/38)^19 (1/38)(1+35)/20 + (37/38)^20 (0)] = 4.13 %

I’m not clear on what it is you are calculating here, but it is not Expectation as the term is commonly used in reference to gambling games. And I don’t think your idea of examining 21 events is quite kosher, either; you only have 20 events … failing the proposition doesn’t seem to call for its own term, although your final term is zero … I don’t get what you’re doing here.

Ah ha. I got it. You must examine all 38 events. But I still don’t get what you are calculating.

House Advantage = (36/38)-1 = -0.0526
Expectation = ($Wager)(House Advantage)

For a $1 bet E = 1(36/38) = $0.947
For two $1 bets E = 2(36/38) = $1.895
For three $1 bets E = 3(36/38) = $2.842

Each bet is an independent trial and each bet has the same expectation per dollar wagered. The total expectation for a series of bets is simply the total dollar amount wagered times the expected return per dollar.
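A minimal check of that linearity, assuming the standard American wheel (38 pockets, a single-number win pays 35-to-1 plus the returned stake):

```python
# Quick check of linearity: each $1 single-number bet on a 38-pocket wheel
# returns 36 (35-to-1 payout plus the stake) with probability 1/38.
def expected_return(total_wagered):
    return total_wagered * (36 / 38)  # expected dollars returned

house_advantage = 36 / 38 - 1  # about -0.0526 per dollar wagered
```

The expected return scales with the total wagered, exactly as described.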

The error is in calculating the house edge by dividing the EV by the initial bankroll. That’s incorrect; the definition of house edge is dividing the EV by the total of all individual wagers.

In your strategy, the total of all bets is often less than the initial bankroll, so the expected loss will often be less than the initial bankroll. In that case, using the initial bankroll to calculate the house edge will give an incorrect result. If you go back and redo the calculation to divide the EV by the actual total of all bets (statistical expected value of all bets, in this case), you should find the same house edge as any single bet.
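To make the two denominators concrete, here is a sketch for the strategy under discussion (start with $20, bet $1 on a single number each spin, stop at the first win or after 20 straight losses):

```python
# Strategy: start with $20, bet $1 on one number each spin,
# stop at the first win or after 20 straight losses.
p, q = 1 / 38, 37 / 38

# Final bankroll is 56 - n dollars if the first win comes on spin n; $0 otherwise.
ev_final = sum(q ** (n - 1) * p * (56 - n) for n in range(1, 21))
expected_loss = 20 - ev_final

# Expected total action: n dollars wagered if the first win is on spin n,
# all $20 wagered if every spin loses.
expected_action = sum(q ** (n - 1) * p * n for n in range(1, 21)) + 20 * q ** 20

edge_vs_bankroll = expected_loss / 20              # divides by initial bankroll, about 4.13%
edge_vs_action = expected_loss / expected_action   # standard house edge, exactly 1/19
```

Dividing by the initial bankroll gives the smaller figure only because the expected action (about $15.71) is less than the $20 bankroll; dividing by the action recovers the familiar 5.26%.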

Consider this: by starting with an initial bankroll of $40, and using the same strategy except quitting after 20 losses, your expected value is $20 more, and your initial bankroll is $20 more, so the house edge is now halved, according to your calculation method.

Or, more simply, consider this strategy: start with $20 and don’t bet any of it. Your EV is $20, and the initial bankroll is $20. Does that mean the house edge at roulette is zero?

Each spin of the wheel is not independent in the strategy that I laid out. In particular the nth spin does not occur if I win on the first spin through the (n-1)st spin.

If the game was just to play a single number 20 times you guys would be right, but this is not the game I am playing.

Think of it this way, what is the probability that I win the nth spin?

(37/38)^(n-1) is the probability that I have an nth spin at all. Multiply that by (1/38) the probability that I win once we get to that spin to get (1/38) (37/38)^(n-1) .

How much do I win in this case?

I lost 1 on each of the first n-1 spins so I have (21 - n) and then I win 35 on the spin for a total of (56 - n) or (56 - n)/20 times my initial stake.

Multiply and add to get expected value.
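That multiply-and-add, written out (assuming the $20 starting bankroll and the (56 - n)/20 multiple derived above):

```python
# Multiply and add: P(first win on spin n) times the final-stake multiple.
p, q = 1 / 38, 37 / 38
ev = sum(p * q ** (n - 1) * (56 - n) / 20 for n in range(1, 21))  # lose-all term is 0
print(1 - ev)  # about 0.0413, i.e. the 4.13% figure
```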

While this is correct, Lance did not make an error. Your error lies in not understanding the problem, or in not understanding Lance’s arithmetic explanation of his solution. Let me repeat the problem:

Puzzle: You have $1000, but want $2000 and have access to an American-style roulette game. … What should you do to maximize that chance?

The fact underlying your post gives a hint as to why Lance’s approach is correct. Do you see it?

Yes, it is. This is a fact.

I am not arguing that the result of a given spin depends on the results of previous spins. You are correct that those things are independent.

My argument is that the occurrence of a given spin depends on the results of previous spins. I’m not sure you could argue against that in the strategy I’ve laid out. We must know the result of the first spin before we consider the second. We must know the result of the second before we consider the third. And so on.

Absolutely. And the argument that “the nth spin doesn’t occur if I win a previous spin, therefore the spins are dependent” is fallacious - of course the nth spin occurs, you’re just not betting any money on it. The amount of a bet is of course at the player’s discretion, but that’s not what is meant in stochastics when two events are said to be independent of each other. Independence in that sense means that the outcome of one chance event does not affect the probability distribution of the other chance event, and that is absolutely the case for Roulette (but not blackjack).

Just so I’m sure I’m following along, you guys are essentially saying that you’re playing slow so that you don’t have to expose your latter money to the house edge? That is, you can quit once you have your victory condition, thus protecting your money?

And of course, that’s only because 1.99xstack is considered a loss, right?

(If this turns into an interminable and useless discussion I apologize for the hijack.)

Rather than parsing Lance Turbo’s post incorrectly in hopes of scoring a debating point, it would be best to use any new insights and re-answer the posed puzzle explicitly:

Puzzle: You have $1000, but want $2000 and have access to an American-style roulette game. … What should you do to maximize that chance?

(a) Bet $1000 on Red is as good as anything.
(b) The only best strategy is to begin by betting $28.50 on a single number, and, after a loss, to continue in similar fashion.
(c) Other.

Yes. But “protecting your money” may be misleading, as you do plan to go down to zero if necessary. It’s just that the expected total action by you is less than $1000. (BTW, one way of understanding the “paradox”, if that’s what this is, is that any bet is $P[layer’s action] vs $H[ouse’s action] (where, e.g., H=35*B for a single-number roulette bet), but vigorish is a constant portion of P, not H.)

Casinos understand this implicitly of course. That’s one reason why Keno tickets have higher vig than roulette. It’s why European casinos leave only even-money bets “en prison” when roulette’s zero appears. (That complication is why I specified American roulette.)

:confused: :confused: This I don’t follow. If you’re suggesting the extra win chance with correct play is smallish, you’re right but IIRC it’s larger than your 1.99 suggests.

Your description of option b (“continue in similar fashion”) is exceedingly vague. I suppose what you mean is:

If you win the bet (which occurs with a probability of 1/38), you reap $1026 in winnings, and still have $971.50 in money not yet bet, giving you a total of $1997.50 - which actually does not fulfil your requirement of having $2000. But this is probably just some rounding difference, and in fact you want to bet an amount slightly higher than $28.50, which ensures that if you win you’ll have a total of $2000. The casino will probably not let you bet this amount, since casinos typically only admit a discrete set of bets (such as whole dollars), but let’s ignore this point for argument’s sake.
If you lose your $28.50 bet, you make a second bet on a single number, which is slightly higher so that in the case of winning, the winnings compensate for the losses sustained so far, pushing your total money to $2000. The equation to calculate the amount of your second bet (x) is:

971.50 - x + 36*x = 2000

If you win, you reap the money and happily leave the casino with $2000 in your pockets. If you lose, you keep going, slightly increasing your bet again using the same logic.
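Solving that equation for the second bet is a one-liner (amounts taken from the post above):

```python
# Second-bet size from 971.50 - x + 36*x = 2000, i.e. 971.50 + 35*x = 2000:
x = (2000 - 971.50) / 35
print(round(x, 2))  # 29.39
```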

Is that what your option b is?

If yes, then you should realize that option b is just another useless variation of the Martingale system, for which so many players who were certain they had found a “system” have fallen before: it’s just a tool for increasing bets in the case of losses. Agreed, your option b would have the stake grow less fast than the Martingale with its doubling, but that doesn’t mean it’s more useful - you cannot evade the fundamental principle hard-wired into the rules, namely, that the casino enjoys a 5.26 % advantage.

So option a is as close as it gets to a correct answer to your puzzle, even if you won’t believe it.

Chessic was simply asking for a clarification of the rules of your puzzle. If somebody manages to find a “strategy” maximizing the probability of leaving the casino with $1990, would you accept that as a solution to your puzzle, or is (as the puzzle has been presented so far) $1990 just as much a failure as $0 or $10 or any other amount smaller than $2000 would be?

No, I don’t want clarification on the rules, I want clarification on the solution. This theory only works because having 1.99x is considered a failure and you have to keep spinning. If you could walk away with 1.99 and be happy with yourself, then the whole theory breaks down.

But I understand the question whether 1.99 is considered a failure to be a rule question. The solution presented to meet this requirement, imposed by the rules, is a wholly different thing.

Edit: By “rules”, I mean not the rules of roulette but those of septimus’ puzzle: What does it take to be considered having solved the puzzle - is 1.99 sufficient, or does it have to be 2? I think that’s the source of the confusion.

Maybe I wasn’t clear. I was referring to Lance’s calculation of a house edge of 4.15% for his ‘Bet $1 at a time until you win or go broke’ strategy. I wasn’t trying to address whether the strategy is a good one for doubling the original bankroll. And my point was that his calculation of the house edge doesn’t use the same quantities as the typical calculation, and the reason he appeared to have decreased the house edge is merely because of his non-standard calculation (or, you could say, non-standard definition of house edge).
But as far as the best strategy for maximizing the chances of doubling one’s original bankroll, I don’t think that strategy is actually a good one. Let’s compare the simple one-bet-for-the-whole bankroll strategy versus the $1-at-a-time.

First, look at betting all $20 on black. If we win, we double our stakes, if not, we’re done, game over.
Chance of winning the single bet on black = 18/38 = 0.473, so that’s the chance of doubling the stake.

Now let’s look at the $1 at a time strategy. Starting with $20, bet $1 at a time on one number, until we win or go broke. Now if we don’t win by the time the bankroll gets down to $4, we can’t double the bankroll at all (because if we win when the bankroll is $4, we end up with $39, less than twice the original stake). Chance of winning any single bet is 1/38. Easy probability here, the chance of losing all of the first 16 bets is (1-1/38)^16 = 0.653, so the chance of winning at least once in the first 16 bets, therefore doubling the original bankroll, is 1 minus that, or 0.347.
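The comparison in the two paragraphs above, as a quick check (a sketch assuming the $20 bankroll and the 16-spin cutoff just described):

```python
# Chance of doubling a $20 bankroll under each plan.
p_black = 18 / 38                  # single even-money bet on black
p_dollar = 1 - (37 / 38) ** 16     # $1 at a time: must hit within the first 16 spins
# p_black is about 0.4737, p_dollar about 0.3473
```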

So in fact, the strategy ends up being worse than one bet on black. Which actually makes sense if you think about it: the one-bet-on-black strategy will only end up with $40 total in the best case, whereas the $1 at a time strategy results in $55 in the best case. Higher reward always means higher risk, and the reverse. You could improve the $1 at a time strategy slightly by changing it from ‘quit as soon as you win or go broke’ to ‘quit as soon as you double the original stake or go broke’, but it won’t improve things much – you’re not going to win in the last four bets often enough to make a big difference.

In fact, I don’t think it’s possible to do better than ‘bet it all on black’ if the goal is to maximize chances of doubling the original stake. If the goal is only to win a certain fraction of the original stake, then you can arrange things so that you have a good chance of winning, say, a tenth of the original stake, with a small chance of losing it all (in fact, that’s exactly what Martingale systems do), but since you can’t go lower than zero, there’s no way to do a Martingale if you want to double your stake.

Once again my game is not the same as septimus’s game. I thought my game would be simpler to understand since I would be betting a constant amount.

I used Schnitte’s definition (logically equivalent to mine) of house edge later in the thread and arrived at the same answer. How do you define house edge? I’ll use your definition and get the same answer.

In particular show me how you would calculate the house edge for a bet in the field on a single roll on the craps table (pays 1:1 on 3, 4, 9, 10, 11; 2:1 on 2; and 3:1 on 12). I will use your method to get my answer.
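For what it’s worth, the standard calculation (expected loss divided by the amount wagered) for that field bet works out like this; a sketch assuming a $1 single-roll bet with exactly the payouts stated:

```python
from fractions import Fraction

# Field bet, one roll, $1: pays 1:1 on 3,4,9,10,11; 2:1 on 2; 3:1 on 12.
payout = {2: 2, 3: 1, 4: 1, 9: 1, 10: 1, 11: 1, 12: 3}
ev = Fraction(0)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        ev += Fraction(payout.get(d1 + d2, -1), 36)  # -1 means the wager is lost

house_edge = -ev  # expected loss divided by the $1 wagered
print(house_edge)  # 1/36, about 2.78%
```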

Now for septimus’s game the optimum strategy (explained thoroughly and completely in the linked thread) is to bet 1/35 of the difference between your goal and your stack if possible. If not possible bet everything you have. This will result in your doubling your bankroll a little more than 48% of the time. The house edge in this case is slightly less than 4%.

I assume (as Schnitte explained) that your (b) strategy is to always bet exactly the amount that will bring your total to $2000 if you win. That is, your first bet is $28.57, since $28.57*35 = $1000. If you lose that, your second bet is $29.39, since you now need to win the 28.57 you just lost, in addition to the 1000 you’re trying to profit.

I just created a quick Excel file, and there I get that using this strategy you will play up to 24 spins of the wheel. If you lose 24 times in a row, you only have $33.80 left, but you would need to bet over $50 to continue your strategy. The chances of losing 24 straight spins is (37/38)^24 = 0.5273.

So, 47.27% of the time, you win somewhere in the 24 spins. The other 52.73% of the time, you have $33.80 left. So, bet that on a number. 1/38 of the time, you will hit to bring your total to $1216.92, and from there, we can start the (b) strategy again. The chances of winning at this point are at least the 47.27% we had with $1000.

Therefore:

  • Chance we win on one of the 24 spins: 47.27%
  • Chance we lose 24 in a row, then hit the 25th, and then win is at least: 0.5273 * 1/38 * 0.4727 = 0.006559 = 0.66%

Total chance of doubling our stake using this strategy is at least: 47.93%

The chance of doubling on a single even-money bet is 18/38 = 47.37%

A person playing strategy (b) will double their money more often than one playing strategy (a)*.

  • Unless my math is wrong, which is certainly possible.
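The chain of numbers above can be checked in a few lines; this sketch reproduces the 24-spin count, the roughly $33.80 leftover, and the 47.93% lower bound:

```python
# Verify the chain: bet (2000 - bankroll)/35 on a single number after each loss.
bank, spins = 1000.0, 0
while bank >= (2000 - bank) / 35:   # stop when the required bet exceeds the bankroll
    bank -= (2000 - bank) / 35
    spins += 1

p_bust_series = (37 / 38) ** spins        # lose every spin in the series
p_win_series = 1 - p_bust_series
# Desperation bet with the leftover, then (at least) the same win chance again.
p_total = p_win_series + p_bust_series * (1 / 38) * p_win_series
# spins == 24, bank left about 33.80, p_total about 0.4793 vs 18/38 = 0.4737
```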

In the case where we miss 24 then hit number 25 we have enough money to continue betting 1/35 of what we need for 28 more spins before a second desperation spin.

This process continues ad infinitum pushing our overall chance of success to over 48%.

Yeah, I didn’t want to continue calculating through the diminishing returns after I got to the point where the overall success was better than the single even-money bet. I have to say that I did not expect this to be the result when I started thinking about it.