Mathletes: Two card-odds questions (blackjack and side bet)

Is that a disagreement with statement 1), 2), or 3)?

I have to say I’m embarrassed now. I initially thought that there would be significant deviations from the pattern of increasing probability with increasing payoff. As it stands, there are points in between that vary (if you use non-integer payoffs). So the earlier guesses I posted are probably all wrong. I would hazard, though, that with each “step” the probability of success increases. That is, a game that allows exactly 25 spins has a higher probability of success than one that allows 24 spins. Could be wrong on that.

I knew my script worked with payoff odds, but I misread your 36:1 and 35.13:1 as being the same, septimus, which is why that was the last value I put in there. The tolerance was a stab at trying to compare the number of spins required to reach the “same point”. It’s not quite correct, but I put it up since that’s what I used.

Since there seem to be two discussions going on here, I’ll comment on the other one:

There are two “games” being played here. In one, you always bet all your money. In the other, you will end up betting all your money with probability less than 1. (I think Chessic Sense put a value on this upthread). Is it not obvious that the “house edge” will be lower if you don’t have to bet all your money?

Yes, indeed.

My comments are from the point of view of a professional gambler … so from my point of view it is not kosher to consider only certain subsets of the game. You cannot consider just one “run” of the betting progression. You cannot assume that the game is over once the progression completes. You cannot assume that each game is from a discrete bankroll. If you win and play again, you cannot start with a “new” bankroll; your BR is your BR and if you continue to play, your results will approach Expected Value.

While it does appear true that playing this system once will achieve the stated results, if you enter the real world and the long-run, the house edge of 5.26% applies. The fact that playing this system (and playing it only once) has a different EV than the game of roulette is simply because you are not actually playing roulette, you are looking at only a subset of the game of roulette.

Please don’t be. I’m very delighted you cleared up my ignorance. To recapitulate, the linked page makes conjectures a, b, c, d. a and b seem easy to prove, c plausible but as yet unproven; panamajack demonstrated d to be false.

I’m doing some experiments. For example, in the “double your $1000 using only 3-for-1 bets” scenario, I find that a starting bet of $55.555 gives exactly the same success probability as $500. (The fact that its probability is equal to, but not greater than, the other’s gives empirical support to conjecture c.) I also see that, when both 2-for-1 and 3-for-1 bets are allowed, the panamajack solution outperforms the septimus solution only for a very limited range of goals.

BTW, given the popularity of recreational gambling, I do find it curious that the High Payoff is Better fact is not more generally appreciated. (It’s certainly not completely unknown; you can find “Playing in casino” books that specifically encourage smallish single-number bets at roulette, rather than the lower payoff bets.)

AFAICT, the other thread is quibbling about semantics (not reconciling different definitions of vigorish). I prefer to quibble about mathematics. :cool:

Is your claim now that playing the septimus method once has a house edge of 4%, but if I play it a second time it is different? That is so bizarre that I don’t even know how to respond to it.

What I disagree with is the fourth word in your statement 3.

I agree with statement 1.

I also agree with statement 2.

I also agree with statement 3 in so far that a bet that gives you a payout of 2 with a probability of 0.48 and a payout of 0 with a probability of 0.52 has an expected value of 0.96 and thus a house edge of 4 %. I just don’t think (and I’ve stated that over and over again) that the septimus betting pattern is equivalent to such a bet.

Statement 3) follows directly from statements 1) and 2) and the rules of the septimus betting pattern.

If you disagree you must be getting different answers than I am to the following two questions:

What are the possible values of the profit/loss variable for the septimus betting pattern?

What are the probabilities that we attain those values?

I suggest that, given sufficient time and thought, your proper response would be “Oh. Yeah.”

You’re the guy who dissed using sims for this stuff a while back, right? I assure you that a sim would … never mind. I’m going to bed now. Good night.

The sim supported the calculated house edge. It didn’t refute it.

I will quote a professional gambler to support my point:

So applying the septimus betting pattern a number of times over the long run should return the total amount wagered times the expectation of 96 %. Right?

To calculate that, we’d have to set up a tree diagram where, for each spin, we split the tree into one branch indicating a successful bet (with probability 1/38) and another indicating a lost bet (with probability 37/38). The associated value at each branch would be 36 times the wager in the case of a successful bet and 0 for an unsuccessful one. At each stage, the tree would end in the case of a successful bet, and would split into two more branches one level later for another spin in the case of a lost one. In the end, we would have a big tree diagram with lots and lots of ends, representing all possible outcomes of the pattern, and we would multiply the value at each end by the corresponding probability and add up all the products.

It would be cumbersome, and we don’t yet have the information to do it, since septimus has, so far, not given us the precise table of the amounts he’d bet at the various stages of his pattern in case of a sequence of losses. But it could be done. Nonetheless, I can tell you now that the result would be a 5.26 % house edge for the entire tree: The tree would be made up of individual branches, and at each level we have a branch multiplying 36 by the probability of winning the bet (which is 1/38), giving you an expected value of 36/38 (the other branch would have an expected value of 0, since the payout would be 0 in that case). The entire tree is made up of individual bets, each with an expected value of 36/38 and thus a corresponding house edge of 2/38 = 5.26 %.
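That linearity argument (each individual bet has expected value 36/38 of its wager, so the whole tree does too) can actually be checked exactly without drawing anything. Here’s a sketch in Python using exact fractions; the `bet_size` rule is deliberately made up and arbitrary, to show the ratio doesn’t depend on it:

```python
from fractions import Fraction

P = Fraction(1, 38)  # probability a single-number bet hits

def bet_size(history):
    # Any sizing rule at all -- this one is deliberately arbitrary.
    return Fraction(1 + len(history) + sum(history), 7)

def enumerate_tree(depth, history=()):
    """Return (expected profit, expected total wagered) over the full tree."""
    if depth == 0:
        return Fraction(0), Fraction(0)
    bet = bet_size(history)
    win_p, win_w = enumerate_tree(depth - 1, history + (1,))
    lose_p, lose_w = enumerate_tree(depth - 1, history + (0,))
    # This node: win pays 35*bet profit, loss costs bet; then the subtrees.
    e_profit = P * (35 * bet + win_p) + (1 - P) * (-bet + lose_p)
    e_wager = bet + P * win_w + (1 - P) * lose_w
    return e_profit, e_wager

e_profit, e_wager = enumerate_tree(6)
print(e_profit / e_wager)  # exactly -1/19, i.e. a 5.26 % house edge
```

Whatever you plug into `bet_size`, expected profit divided by expected total wagered comes out at exactly -2/38, because every node contributes -2/38 of its own bet.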

We have already done precisely that, with the exception of putting it in tree form. We can do this because we know the values of the leaf nodes and how many leaf nodes take each value. What did you think the calculations upthread were about?

Furthermore, we know the precise amount bet at every stage of the pattern: (goal - stack)/35 when we can afford it, the entire stack when we can't. This has been stated repeatedly.

Also, Chessic ran a sim that supported the above calculations. I’m not sure how you missed that.
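I can’t speak for Chessic’s exact code, but a sim of this kind takes only a few lines. Here’s a sketch, assuming the bet rule just stated ((goal - stack)/35 when affordable, otherwise the whole stack) and the hypothetical double-your-$1000 scenario:

```python
import random

def play_septimus(bank=1000.0, goal=2000.0, rng=random):
    """One run of the pattern; returns (final stack, total amount wagered)."""
    stack, wagered = bank, 0.0
    while stack > 0 and goal - stack > 1e-9:
        bet = min((goal - stack) / 35, stack)
        wagered += bet
        if rng.random() < 1 / 38:   # single number hits, pays 35:1
            stack += 35 * bet
        else:
            stack -= bet
    return stack, wagered

rng = random.Random(1)
trials = 100_000
results = [play_septimus(rng=rng) for _ in range(trials)]
total_lost = sum(1000.0 - s for s, _ in results)
total_wagered = sum(w for _, w in results)
print(total_lost / total_wagered)                  # hovers around 2/38 = 0.0526
print(sum(s > 1000 for s, _ in results) / trials)  # succeeds roughly 48 % of the time
```

Total lost divided by total wagered converges on the usual 5.26 %, even though the success rate of the compound wager is around 48 %: both claims at once.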

Let’s calculate the house edge for a simpler betting pattern using a tree. I’m not actually going to draw the tree, but one can easily be filled in from the information that follows.

Here are the rules:

I’m going to bet $1 on a single number on the first spin.
If I hit the first spin, I’ll bet $0 on the second spin; otherwise I’ll bet $1 on a single number on the second spin.

The leaves of my tree represent:

Win - Win
Win - Loss
Loss - Win
Loss - Loss

My profits/losses are:

35
35
34
-2

The probabilities are:

(1/38)(1/38)
(1/38)(37/38)
(37/38)(1/38)
(37/38)(37/38)

So the house edge is:

[35(1/38)(1/38)+35(1/38)(37/38)+34(37/38)(1/38)-2(37/38)(37/38)]/2 = -0.0519391 => (5.19% house edge)

OK, I’ve found the source of confusion and the point where your interpretation of your (correct) calculations is wrong. Let’s do the maths for your whole thing. Your tree has three ends:

  1. You win the first bet and don’t bet any money on the second spin. Probability: 1/38. Payout if you win: $37 ($36 from the bet you’ve won, plus the one dollar of your initial bankroll you never bet). Product of probability and payout, in dollars: 37/38

  2. You lose the first spin, but win the second one. Probability: (37/38) * (1/38) = 37/1444. Payout: $36 (the first dollar is lost, the second dollar gives you $36 back). Product: 1332/1444

  3. You lose both spins. Probability: (37/38) * (37/38) = 1369/1444, payout 0, product 0.

Notice that the three probabilities 1/38 ( = 38/1444), 37/1444 and 1369/1444 add up to 1, so the tree is complete.

Adding up the three products for the expected value gives you, in dollars:

37/38 ( = 1406/1444) + 1332/1444 + 0 = 2738/1444.

But this is dollars, so now you’re inclined to argue that since we started with $2, we need to halve that to convert it into an expected value in terms of the amount bet. This calculation would give you an expected value of 1369/1444, which indeed falls about 5.19 % short of 1, as you calculate.

But: THIS IS NOT THE HOUSE EDGE.

The house edge is the percentage of your wagers reaped by the house in the long run. What you calculate here is the percentage of your bankroll reaped by the house in the long run. With the betting pattern you describe, you’re not actually betting your entire bankroll of $2 on average. In some cases (namely, the 1/38 of cases in which you win the first spin) you bet only $1 in total, so your average total wager is less than $2. If you then divide by the bankroll of $2, both sides of the division refer to your bankroll rather than to your wagers (the $37 payout in the first case likewise includes a dollar you never bet). Measured purely on the wagers, the expected return is 36/38 of what you bet, and the missing 2/38 is the house edge.

That does not mean your betting pattern is dumb (the smartest option would, of course, be not to bet any money at all); it simply means that it does not reduce the house edge, correctly understood. What you’re doing is confusing the house edge as the percentage of wagers reaped by the house with the absolute dollar amounts reaped by the house. To illustrate: Consider person A betting $100 on any bet in American Roulette. The expected value is $94.74, the expected loss $5.26. Person B bets $1000: expected loss $52.60. Does that mean person B is suffering from a house edge ten times as high? No, the house edge is the same as a percentage; it’s just that person B bets more money which is subject to the house edge.
In our case, a person who places all of his money on an even money bet will subject, on average, more of his money to the house edge than someone following septimus’ pattern, since septimus will, on average, bet less. The house edge as a percentage is the same for both, however.
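The two divisions can be put side by side in a few lines of exact arithmetic for the two-spin pattern: profit over bankroll gives the 5.19 % figure, while profit over expected wager gives the familiar 1/19. (A sketch; the leaf table just restates the tree above.)

```python
from fractions import Fraction

p = Fraction(1, 38)          # single-number hit
q = 1 - p
# Leaves of the two-spin pattern: (profit, probability, total wagered)
leaves = [
    (35, p,     1),   # win spin 1, sit out spin 2
    (34, q * p, 2),   # lose spin 1, win spin 2
    (-2, q * q, 2),   # lose both spins
]
e_profit = sum(v * pr for v, pr, _ in leaves)   # -75/722, about -$0.104
e_wager  = sum(w * pr for _, pr, w in leaves)   # 75/38, about $1.97 -- not $2

print(float(-e_profit / 2))   # 0.0519... : loss as a share of the bankroll
print(-e_profit / e_wager)    # 1/19      : loss as a share of the wagers
```

Same expected loss, two different denominators; only the second one is the house edge.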

Yes, and this is where you’re screwing up. You’re not putting the right number into “amount wagered”. In some runs of the septimus, you’re not wagering the entire bankroll the way the colors bettor does; you’re wagering less. So yeah, they win 48% of the time, but that tells us nothing about the house edge!

No. House edge takes into account how much you bet. If you don’t bet anything, then the house edge is “Undefined”, which is decidedly NOT lower.

Not exactly, no. It would be iterations × stake × 0.96. It’s hard to say how much was wagered in the long run, but I’m sure a computer could give a number fairly easily.

I’ve not been following this part of the discussion, but this comment sounds very wrong. Both panamajack’s calculator and my own implicitly trace every leg of the “tree diagram.” (Although neither of us makes an effort to display the entire tree in human-viewable fashion. :smack: )

And, as it happens, essentially the only strategies septimus has discussed in this thread have the property that you either bet your entire bank (terminating perforce if you lose) or bet enough to reach your goal (thereby terminating if you win); thus the “tree” is quite trivial.

Even if, counterfactually, the tree were complicated, building it within a computer program and making measurements on it is very straightforward. (Please take my word for this!)
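To make that concrete, here is a sketch of such a program (assuming the bet rule quoted upthread: (goal - stack)/35 when affordable, the whole stack when not, on 35:1 single-number bets). Because each node has exactly one continuing branch, the “tree” collapses into a loop:

```python
from fractions import Fraction

P_WIN = Fraction(1, 38)      # single number, American wheel
PAY = 35                     # 35:1 payoff odds

def walk_tree(bank, goal, tol=Fraction(1, 10**12)):
    """Exact walk of the betting 'tree'; each node has one terminal
    and one continuing branch, so this is just a loop."""
    stack, goal = Fraction(bank), Fraction(goal)
    p_here = Fraction(1)     # probability of still being on the chain
    p_goal = Fraction(0)     # probability of reaching the goal
    e_wager = Fraction(0)    # expected total amount wagered
    while stack > 0 and p_here > tol:
        bet = min((goal - stack) / PAY, stack)
        e_wager += p_here * bet
        if stack + PAY * bet >= goal:
            p_goal += p_here * P_WIN     # a win hits the goal and stops
            stack -= bet                 # a loss continues the chain
            p_here *= 1 - P_WIN
        else:
            stack = 36 * stack           # all-in win falls short, keeps going
            p_here *= P_WIN              # (an all-in loss busts and stops)
    e_loss = Fraction(bank) - goal * p_goal  # terminal stacks are goal or 0
    return float(p_goal), float(e_wager), float(e_loss / e_wager)

p, w, edge = walk_tree(1000, 2000)
print(round(p, 4), round(w, 2), round(edge, 4))  # edge comes out at 2/38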

Hope this helps. :smiley:

I didn’t see this before post #114, but you’re right. Lance, it’s not your math that’s wrong. It’s your terminology. Rather, it’s the term you’re trying to plug into the math that’s incorrect: amount wagered. As Schnitte says, what you’re calculating “IS NOT THE HOUSE EDGE”.

However, the latter paragraph isn’t really correct. He’s suggesting that your mistake is in looking at absolute losses, but it’s not. I’m just pointing that out so we can nip that tangent in the bud right now. We know that’s not where your error is. It’s in assuming you always bet the entire bankroll.

Some of these discussions could be short-circuited if we gave each other more credit here. A should know that B understands a definition; B should know that A knows B knows it; and so on.

But … at the risk of adding fuel to these flames ( :smiley: ) … let me point out that the “compound wager” (e.g. bankroll-doubling using single-number bets) fashioned by the so-called septimus strategy smells like a roulette wager, tastes like a roulette wager, and quacks like a roulette wager. The wager consists of nothing but placing bet(s) at the roulette table, and at the end of the (fashioned compound) wager you’ve either lost your entire bank or doubled it. The amount the casino wins, when it wins, is precisely your entire bank.

And the “vigorish” computed on that compound wager, if you agree to call it a wager, is not the usual 1/19.

I see you can’t resist the temptation to poke fun at my slightly overdone exclamation. Sorry about that - I just wanted to draw attention to this point, since I think it lies at the heart of the discussion.

Quoted for truth.

You guys, Chessic, Schnitte, et al., are not looking at the compound wager as a whole and are paying unnecessary attention to irrelevant intermediate conditions.

Consider the septimus one more time. I think we all agree that it can end in exactly two ways: either up $x or down $x. We certainly have $x at risk every time we play. How is that different from wagering $x?

Or how about this?

Chessic wrote a sim of applying the septimus. If I bet Chessic $1000 at even money that the next single trial of the sim will produce a winner, what would Chessic’s house edge be?