Mathletes: Two card-odds questions (blackjack and side bet)

I am stunned that this fact has not yet come up in this thread.

A counterquestion to Lance Turbo: Suppose a player, A, walks into a casino intending to play the septimus pattern. He keeps losing for a number of spins, but he’s not worried about that because he has calculated the entire thing through in advance and knows the probabilities of winning and losing.

About halfway through, player A changes his mind and decides to quit the system. He takes whatever money he has left and leaves the casino without finishing the pattern.

Suppose now we have player B, who enters the casino with the same bankroll but knows nothing about the septimus method. He just places bets on the roulette table at random, but by pure coincidence he always bets the same amounts as player A has so far. After a few spins, he decides to leave, and coincidentally he leaves after the same number of lost bets as player A.

I would be interested to know what Lance thinks the house edge for player B is. The obvious, and correct, answer is 5.26 %, and I don’t think Lance would say otherwise, since all B did was place individual bets with a 5.26 % house edge, without any system.

But to all practical purposes, the betting pattern of A and B is identical. Can they have different house edges? Hardly. Can player A’s mental reservation to keep betting according to a predetermined pattern change that? No. Was player A playing with a house edge of less than 5.26 % before quitting, a figure that magically hikes to 5.26 % the second the decision to leave comes to A’s mind? No. What counts is the wager on the table. This is the money that is subject to the house edge, this is the money that will be lost if the player doesn’t hit the right number, and this is the money that will be multiplied by 36 if the player hits. That’s all there is to it. The player’s mental reservation to keep betting according to a given pattern is immaterial.
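To make that concrete, here is a rough Monte Carlo sketch (my own illustration, not anything from earlier posts): it simulates players who bet arbitrary amounts on single numbers and quit after some arbitrary number of losing spins, the same kind of behavior A and B show above. However they size their bets and whenever they quit, the money they lose works out to about 2/38 of the money they put on the table.

```python
import random

# Illustrative sketch only: each straight-up bet on a double-zero wheel
# loses 2/38 of the wager in expectation, so the expected total loss is
# 5.26% of the total money wagered, regardless of bet sizing or stopping rule.

def play_session(rng):
    """One player: haphazard bet sizes, quits after some number of losing
    spins (like players A and B above). Sizes and quitting point are
    arbitrary choices made purely for illustration."""
    quit_after_losses = rng.randint(1, 20)
    losses, wagered, net = 0, 0.0, 0.0
    while losses < quit_after_losses:
        bet = rng.choice([5, 10, 25, 50])   # arbitrary bet sizes
        wagered += bet
        if rng.randrange(38) == 0:          # the single number hits
            net += 35 * bet                 # straight-up pays 35 to 1
        else:
            net -= bet
            losses += 1
    return wagered, net

rng = random.Random(1)
total_wagered = total_net = 0.0
for _ in range(500_000):
    w, n = play_session(rng)
    total_wagered += w
    total_net += n

print(-total_net / total_wagered)   # converges toward 2/38 = 0.0526
```

The same ratio comes out no matter what bet sizes or quitting rules you plug in, which is the point: the edge attaches to each wager on the table, not to the plan in the player’s head.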

Reading this entire thread makes my head hurt, but it seems that some folks may be confusing house edge with house hold. For every dollar wagered in roulette, a player should expect to lose 5.26 cents. A casino expects to win 5.26 cents per dollar wagered on each spin. A casino will pay out comps based on this number.

Naturally, small samples, as opposed to an infinite number of plays, will show abnormal results because of higher variance. This is why some people win at roulette. The casino doesn’t care if you walk away with a $1000 profit. They know that over the long term, they’re going to keep 5.26% of every bet made at the roulette table. They’re going to keep even more of your money if there’s a side bet and you play it.

ETA: And one other thing: no betting system in roulette will produce better results than the 5.26% expected player loss over an infinite number of plays. There is no way to overcome the house odds by betting on any pattern or combination of numbers (although there are some combos that carry an even greater house edge, such as 0, 00, 1, 2, 3).
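For what it’s worth, here is the per-dollar arithmetic behind those numbers (my own sketch; the 0, 00, 1, 2, 3 combination is the five-number “basket” bet, which pays 6 to 1):

```python
# Expected loss per dollar wagered for a few double-zero-wheel bets,
# using the standard US payouts.

def edge(ways_to_win, payout_to_1, slots=38):
    p_win = ways_to_win / slots
    return -(p_win * payout_to_1 - (1 - p_win))   # minus the EV of a $1 bet

print(edge(18, 1))    # even-money (red/black etc.): 2/38 ~ 0.0526
print(edge(1, 35))    # straight-up single number:   2/38 ~ 0.0526
print(edge(5, 6))     # basket 0-00-1-2-3 at 6 to 1: 3/38 ~ 0.0789
```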

It is almost meaningless to refer to the expectation of a known outcome; your example seems to be describing one particular outcome rather than a situation to which probability applies.

If you mean it exactly as a one-time event, the house has all the money, so 100% is the house edge.

If you posit some kind of repeated application, where A always gives up after some particular number of losses, then you’d need to specify how many to come up with a meaningful value. 2/38 or worse can be claimed, but there’s really not enough information for that to be useful. There are other ways you could go - for example, the time limit I floated earlier (A plays only x spins).

If instead you’re asking whether the expectation changes, from less than 2/38 when A enters to something greater when he decides (perhaps after the first few losses) to give up at some future point before losing all his money, then of course it does. He’s changing the game being played.

If A walks into the casino intending to play slots (say, with a 3% house edge), what’s the house edge for him?

If he instead goes off to play blackjack for a few hours, what’s the house edge then?

I haven’t been back to this thread for a while, but reading through the last set of posts, the disagreement seems to just be a definitional one. How about this, maybe:

From the house’s perspective, the house edge is the normal 5+ percent. They don’t know or care about the betting pattern that players might be using.

From the player’s perspective, he can consider the house edge to be the ~4 percent figure. The player can consider his entire trip into the casino, or whatever he likes, in figuring his overall chances.

Anyway, happy holidays everyone :).

panamajack answered your question already. It doesn’t make much sense to compute house edge after the fact. Also, a player who does not complete the Kathmandu Martingale process wasn’t really playing the Kathmandu Martingale, and computing the house edge against such a player brings us no closer to computing the house edge against the Kathmandu Martingale.

A player playing the Kathmandu Martingale makes no meaningful decisions once the process starts and will not make any until he is either up $1000 or has lost $1000.

Now your question has been answered. Will you answer any of mine?

Well, not really. Suppose our player stops one spin short of completing the Kathmandu Martingale. Was he on track to successfully cut down the house edge until almost the very end? Was the house edge the reduced figure for the first n-1 spins, with only the last one, all of a sudden, hiking it up to the unreduced 5.26 %? At which point in time does the house edge magically increase? Is it a sudden one-time increase for the last spin?

Which one do you mean? The one at the end of post 139? The answer is no, for the reasons you yourself emphasised in the question: We don’t know the betting pattern.

The discussion seems to have arrived at a dead end here. Some people, notably Chessic Sense and myself, agree with your maths, but disagree with your interpretation of the maths, in the sense that what you compute is not the house edge - it is another figure, arrived at by measuring the payouts against the player’s bankroll rather than the player’s wagers. This is not what the term house edge is defined as. Note that the originator of the pattern, septimus, himself seems to agree with this position.

Of course you can play down the disagreement to a mere question of terminology - should we call the result of your maths “house edge” or not? There is nothing preventing us from establishing a new parameter which we compute by measuring the payouts against the player’s bankroll rather than against the wagers. You can call this parameter anything you like, including “house edge” if you will. But it is not what the house edge in the commonly understood meaning of the word is, and it does not have the same content and explanatory power as the term house edge has in its proper meaning.

I’m not going to explicitly calculate this, but the house edge against such a player is huge. Stopping one spin short of the end of the Kathmandu Martingale introduces a new stopping rule that is extremely unfavorable to the player. Something like 98% of the time this is equivalent to the player losing every dollar he risked, for a whopping house edge of 100%. After-the-fact probability questions often get results like that.

Betting pattern is not part of the definition of house edge. To compute house edge we need to know possible outcomes, the probability that those outcomes occur, and the amount risked.
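Just to make that definition concrete, here it is as a small function (a sketch of my own, nothing more), applied to a single $1 straight-up bet:

```python
# House edge from the definition above: expected loss divided by the
# amount risked, given the possible outcomes and their probabilities.

def house_edge(outcomes, probabilities, amount_risked):
    """outcomes are the bettor's net results (+ for a win, - for a loss)."""
    expected_value = sum(o * p for o, p in zip(outcomes, probabilities))
    return -expected_value / amount_risked

# A single $1 straight-up bet: win $35 with probability 1/38, else lose $1.
print(house_edge([35, -1], [1/38, 37/38], 1))   # 2/38 ~ 0.0526
```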

Also could you answer my questions from posts 133 and 137?

This has come up a few times in this thread and it is factually incorrect. I am computing house edge directly from the definition of house edge. At no point am I comparing payouts to bankroll; I am comparing payouts to money risked, because that is how house edge is defined.

You believe that the player is not risking $1000 but the only possible negative outcome is losing $1000. I literally do not understand how facing the possibility of losing $1000 is not risking $1000. Look up the definition of risk if you have to.

You seem to think that if the player wins on the first spin he only risked $28.57. The mistake you are making there is comparing an endpoint of the process (player wins) to an intermediate step of the process (player has a slightly greater chance of losing a total of $1000). Another way to say it is that when the player doesn’t win on the first spin he is out $28.57, but he is also now forced to play a game where he must turn $971.43 into $2000, a worse situation than the one he was in at the start. When you look at an individual spin you only see the $28.57, but when you look at the process as a whole the risk to the $971.43 is very real.
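If I’m reconstructing the progression correctly - and this is my assumption, not something spelled out in the thread - each spin the player stakes (target - bankroll)/35 on a single number, so that a 35-to-1 hit lands him exactly on $2000 (which is where the $28.57 first bet comes from). Walking that through a worst-case losing streak shows how much of the $1000 really ends up on the table:

```python
# My reconstruction of the progression (an assumption, not a spec from the
# thread): stake (target - bankroll)/35 on one number each spin, so that a
# 35-to-1 win brings the bankroll to exactly the $2000 target.

bankroll, target, spins = 1000.0, 2000.0, 0
while bankroll >= (target - bankroll) / 35:   # can still cover the next bet
    bet = (target - bankroll) / 35
    spins += 1
    print(f"spin {spins:2d}: bankroll ${bankroll:8.2f}, bet ${bet:6.2f}")
    bankroll -= bet                           # assume the worst: another loss
print(f"after {spins} straight losses, the remaining ${bankroll:.2f} "
      f"no longer covers the required bet")
```

On this reading, a maximal losing streak pushes roughly $966 of the $1000 across the table before the player can no longer cover a bet, which is the sense in which essentially the whole bankroll is exposed.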

What I agree with is that “house edge” might be definable in more than one way, and that when two people are clearly using different, but internally consistent, definitions, it’s silly to argue, especially if they both agree on the math. Checking which definition comes closer to one found in a dictionary might be interesting to some; it doesn’t interest me.

If you try to double your bankroll at the roulette table using only even-money bets, your best strategy is to bet it all at once; your chance of success is 18/38 = .47368. The house vigorish is 2/38 = 0.05263. There is a simple arithmetic relationship between these two numbers.

If you try to double your bankroll at the roulette table using the Kathmandu Martingale, your chance of success is .48089. (Are we all in agreement so far?) To assert that the “house edge … with explanatory power” is, again, 0.05263 is counterfactual since, in this case, the “explanatory power” of the vigorish is its clear relation to the chance of bankroll-doubling. In this case, .03822 would be the meaningful measure of “vigorish” which “explains” (or quantifies) the player’s disadvantage.
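To spell out the arithmetic relationship in both cases (treating each attempt as ending either exactly doubled or exactly busted, which is how the thread has framed it): if p is the chance of doubling the bankroll B, the expected loss is B·(1 − 2p), so the loss expressed as a fraction of the bankroll is simply 1 − 2p.

```python
# Expected loss as a fraction of bankroll for a double-or-bust attempt:
# the outcome is +B with probability p and -B with probability 1 - p,
# so E[loss] / B = 1 - 2p.

def loss_fraction(p_double):
    return 1 - 2 * p_double

print(loss_fraction(18 / 38))   # single all-in even-money bet: 0.05263
print(loss_fraction(0.48089))   # Kathmandu Martingale:         0.03822
```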

Might this conflict with someone’s “standard” definition, however useful it might be in other contexts, of “house edge”? Yes. So what?

It would seem that the best strategy is to go to Europe and place a single even-money bet at a casino that offers “en prison” betting. In this situation the house edge is only 1.35%.
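For reference, the 1.35% figure comes from the single-zero wheel’s half-back treatment of even-money bets when zero hits (strictly speaking that is the “la partage” rule; “en prison”, where the stake is instead imprisoned for one spin, is usually quoted at the same 1.35%). A quick check:

```python
# Even-money bet on a single-zero wheel where half the stake is returned
# when zero hits (la partage; en prison is commonly quoted the same way).
p_win, p_lose, p_zero = 18/37, 18/37, 1/37
ev = p_win * 1 + p_lose * (-1) + p_zero * (-0.5)
print(-ev)   # 1/74 ~ 0.0135, i.e. about 1.35%
```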

A higher-payoff bet can be better than a lower-payoff one, even with a higher vigorish. In your example, an even-money bet would give you the best chance of doubling your bankroll, but, despite its higher vigorish, the single-number bet would give you a better chance of quintupling it.

But since the best strategy at random roulette, given a linear or sub-linear utility function, is not to bet at all, I hope you have other plans to recoup your airfare in your scenario! :smiley:

(Because of their peculiar procedure, the player had a big advantage at a roulette-like game in Kathmandu casinos when I visited many years ago. But I suppose they’d change the procedure were someone to try to take big advantage of it.)