Mathletes: Two card-odds questions (blackjack and side bet)

Or how about this?

Player A walks up to the roulette table with $1000 and plays the septimus system.

Player B makes a $1000 even money side bet with the casino that player A will succeed.

The results for each player are clearly the same. Are the house edges against each player the same or different?

So far you’re right. Let’s assume X is $1000. You start with a bankroll of $1000 and want to leave $1000 ahead, meaning we’re X dollars up. Or we lose everything, in which case we’re X dollars down.

No, we don’t. At no time are we betting the full $1000 on a single spin. Nor are we betting the full $1000 over the course of all the spins combined, on average, since in some cases we win and leave before our bets total an aggregated $1000.

Can you explain how having a 52% chance of losing $1000 is different from having $1000 at risk?

Never mind. Post 121 is pretty convincing. Any response to that?

I gave you a response to that. The point is: With septimus’ betting pattern, you are, on average, not betting $1000 when you play the scheme since in some of the cases you quit before reaching a total of $1000. Of course, in some cases you do bet all of your money, and in some of the cases when you do this you lose the $1000. But on average your aggregated stakes are less.

That doesn’t really answer the question posed in post 121.

Also, as an aside, there is a non-zero chance that the total of all your wagers in the septimus method will be greater than $1000.

I think it covers that pretty well. The question was how having a 52% chance of losing $1000 is different from betting $1000, and the answer is that with your scheme, you’re not betting a total of $1000, at least not on average (sometimes you will, if you keep losing until you run out of cash).

That chance is indeed zero under the schedule of the method, and claiming otherwise indicates that you haven’t really understood the scheme. At each stage of the scheme, you have two possibilities:

  • You win the spin, in which case you quit
  • You lose a spin, in which case you keep betting with more money which you take out of your bankroll

The scheme does not allow you to add money to your bankroll. You either win a spin, in which case the game is over for you, or you lose, in which case you keep betting, but since you haven’t won before (if you had won before you wouldn’t be betting again!) the money for the new bet has to come out of your initial bankroll, which decreases correspondingly.

Actually, if you win on the 25th spin you are neither broke nor at your goal so you continue to place bets.
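To make that concrete, here’s a minimal Monte Carlo sketch of the structure being described. The exact stake schedule was never spelled out in this thread (a $28 first bet is mentioned later, while my sizing rule gives $29), so treat the sizing as my assumption: straight-up bets, each the smallest stake whose 35-to-1 win would reach the $2000 goal, capped by what’s left in the bankroll.

```python
import random

POCKETS = 38   # American wheel: 0, 00, 1-36
PAYOUT = 35    # a straight-up (single number) bet pays 35 to 1

def play_once(rng, bankroll=1000, goal=2000):
    """One run of the scheme. Stake sizing is my assumption: the smallest
    straight-up bet whose win would reach the goal, capped by the bankroll.
    Keeps spinning until the bankroll reaches the goal or hits zero.
    Returns (final_bankroll, total_wagered)."""
    total_wagered = 0
    while 0 < bankroll < goal:
        need = goal - bankroll
        stake = min(bankroll, -(-need // PAYOUT))   # ceiling division
        total_wagered += stake
        bankroll -= stake
        if rng.randrange(POCKETS) == 0:             # our number hits: 1 in 38
            bankroll += stake * (PAYOUT + 1)
    return bankroll, total_wagered

rng = random.Random(1)
runs = 200_000
doubled = net = wagered = 0
for _ in range(runs):
    final, w = play_once(rng)
    doubled += final >= 2000
    net += final - 1000
    wagered += w

print(f"success rate:        {doubled / runs:.3f}")   # roughly 0.48
print(f"avg total wagered:   {wagered / runs:.0f}")   # well under $1000
print(f"house take / action: {-net / wagered:.3f}")   # close to 2/38 = 0.0526
```

Note that a win on a late spin can leave the bankroll above zero but below the goal, in which case the loop simply keeps going, exactly as the 25th-spin case describes.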

No, it doesn’t cover it. Here’s post 121 again.

Of course you can keep repeating “No it doesn’t” like a parrot, but that doesn’t change the point: In the septimus pattern, you are not, on average, betting a total of $1000. But you calculate the house edge on the basis of money wagered. Calculating the “house edge” on the assumption that you bet a total of $1000 is, therefore, misleading and will not give you an accurate result.

I’m going to assume that your last response indicates that the house edge against player B is not the same as against player A in post 121, since B quite clearly wagers $1000.

Do you agree that the house edge against player B is 4%?

Oh, and I wasn’t just repeating, “No it doesn’t,” like a parrot. I was trying to understand your response to the question in 121 in order to reach a mutually agreeable statement. If you’ll answer my question in the previous post, I think we’re almost there.

I’ll agree to that. The house edge is 4% against player B. Now please tell us: What does “house edge” have to do with the chances of winning? Answer: Nothing. The odds of success have absolutely nothing to do with the house edge. That’s not a variable in the calculation.

How much does B put down on the table? $1000.
How much does A put down on the table? <$1000.
Difference between A’s winnings and B’s winnings? $0.

Remember back on page two when I said it’s all semantics? Well if you two would just agree on a definition of house edge, we could move past this petty bickering.

You are talking about two different things. There is the game of roulette, in which every bet has a house advantage of 5.26%. Distinct from that, there is the septimus Proposition Bet, which is a wager on how often during the sequence you will get lucky and win an arbitrarily chosen amount.

If you play the septimus system a million times, the house will win 5.26% of the total money wagered. If a million people each play the system once, the house will win 5.26% of the total money wagered. The fact that 48% of the people will win their prop bet is irrelevant when calculating the house edge of the game of roulette.

The calculation of the house edge of the game of roulette and the calculation of the probability of success of the prop bet are distinct and separate calculations. The fact that the house edge at roulette is a factor in calculating the probability of success of the prop bet does not in any way alter the house edge of the game of roulette.
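One way to see that the two calculations coexist: take the thread’s rounded numbers (48% success for the prop bet, a 2/38 house edge at the wheel) and work out what each implies. The $760 figure below is just what these assumptions imply for A’s average total action; it’s not a number anyone here has quoted.

```python
p_win = 0.48      # thread's rounded success probability for the scheme
stake = 1000      # B's side bet, and A's bankroll

# Both players have the same expected result:
expected_result = p_win * stake - (1 - p_win) * stake   # -$40

# B wagers the full $1000 once, so the edge on his bet is 40/1000:
edge_B = -expected_result / stake                       # 0.04 -> 4%

# A's expected loss must also equal 2/38 of his average total
# action at the wheel, which pins that average down:
roulette_edge = 2 / 38
avg_action_A = -expected_result / roulette_edge         # $760

print(edge_B, avg_action_A)
```

Same expected result for A and B, but divided by different amounts of action, hence the two different percentages being argued over.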

If, instead of choosing the condition of success as 2xBankroll, you chose 1.5xBankroll or 3xBankroll, the house edge still remains the same, while the probability of success of the prop bet would change.

In a game of independent trials in which the house has a mathematical advantage on each bet, there is no betting method which will alter the house advantage. You might (or might not) enjoy reading the chapter in Gambling Theory and Other Topics by Mason Malmuth entitled The Extremely Silly Subject of Money Management.

Ok, I think we can agree on three statements with respect to players A and B from post 121.

  1. Player A is playing the septimus system.

  2. The results of the actions of players A and B are exactly the same.

  3. Player B is placing an even money bet with a house edge of 4%.

Do we agree?

Thank you, Lance. I’ve not been scrutinizing the subthread but I have a strong sense that you guys may agree on the mathematics, and there’s just a refusal to adapt terminology. In addition to Lance’s requests for agreement, I’d like to see all participants agree simply that:

  1. Expected house vigorish is always 1/19 of expected total player action.

  2. Use of the Kathmandu Martingale does increase the probability of doubling one’s money at the roulette wheel compared with any single even-money bet.
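On point 2, the baseline is worth writing down: a single even-money bet doubles your money exactly when it wins. A sketch of the comparison, taking the 48% figure quoted earlier for the scheme as given:

```python
p_single_bet = 18 / 38   # one even-money bet on an American wheel
p_scheme = 0.48          # the thread's rounded figure for the Kathmandu Martingale

print(f"{p_single_bet:.4f} vs {p_scheme:.4f}")   # 0.4737 vs 0.4800
```

A small edge, but an edge, which is the whole point of disagreement two.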

Let’s look for areas of agreement and relax to have a Very Merry Christmas (and/or participate in other exciting threads here at SDMB!)

Oh. Please stop calling the method “septimus.” If it must have a name, let’s go with Kathmandu Martingale. (In honor of the occasion I did win money with a girl named Jenny at a roulette wheel in Kathmandu, albeit with a system other than the Kathmandu Martingale. :cool: )

I have to disagree with this. House edge is a weighted average of all values attained by the profit/loss variable, where the weights are the probabilities of attaining each value. House edge is expressed as a percentage of the amount bet.

What we’re disagreeing about now is the amount bet.

I disagree that the amount A has bet is determined by how much money touches the roulette table. A has put $1000 at risk by initiating the process, and looking at intermediate values of the amount A is risking is misleading. Saying that A only risked $28 when he wins on the first spin ignores the fact that winning on the first spin also removed any chance of losing the other $972.

Or more simply:

How much does A lose when he loses?
How much does B lose when he loses?

When we think of the process as one compound bet, the amount A is betting is precisely the maximum amount he can lose. Looking at anything else is a red herring.

::sigh:: Then there’s no hope for you.

Where do you think this boost in performance by playing the numbers comes from? By keeping money off the roulette table! That’s the only reason it works in the first place! If you say “Player A essentially bets $1000 even when he doesn’t” then you’re just making stuff up. If the player exposed his entire stack to the house vigorish, then you wouldn’t be seeing the bump up to 48% in the first place!

By that logic, I could just walk into a casino, bet nothing, and then thumb my nose at the croupier, saying “Haha! I’ve reduced your house edge to zero!” Or maybe I can just bet one dollar, and if I lose, I walk away. Then I can just say “Well, I lost $1 but I have $999 left, so the house’s edge is only 0.1%!”

I really, honestly, can’t come up with another way to explain it to you. For some reason, you think the player’s bankroll matters. You think “amount bet” includes amount he hasn’t bet. There’s no way to argue through that.

There is hope for you.

I’ve been saying that the boost in performance comes from keeping money off the roulette table for over a week now. I don’t see how you can believe that there is a boost in performance and yet still believe that there is no difference in house edge. I also don’t see how one can think that player A is not risking $1000 by playing when the only possible bad result player A can get is losing $1000.

I’m going to try one more time.

Put a little table between player A and the roulette table. Player A puts his $1000 on the little table. Before each spin the dealer takes the predetermined amount of money off the little table and places it on the number of player A’s choosing. If player A wins a spin, the winnings go back to the little table, but player A cannot take money off the little table until there is a total of $2000 there.

Is player A risking $1000? Note that I am not asking whether player A is risking $1000 on the roulette table; I am asking: is $1000 at risk?

Careful now. A lot of system hucksters are going to be awfully angry with you for revealing the powerful secret of the Little Table Method. I heard a rumor that John Patrick was going to include it in his next book for $29.95.

Player C walks up to a table and plunks down $1000. Some process determines if he wins or loses, but we don’t know what that process is. We do know, however, that 48% of the time he wins $1000 and 52% of the time he loses the $1000.

Do we have enough information to calculate the house edge against player C?
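Under the terms just stated, the arithmetic would be (a sketch, using only the numbers given for C):

```python
p_win, p_lose, stake = 0.48, 0.52, 1000

expected_loss = p_lose * stake - p_win * stake   # $40 per play
house_edge = expected_loss / stake
print(round(house_edge, 4))                      # 0.04, i.e. 4%
```

Nothing about the process behind the table enters the calculation, only the probabilities and the amount on the table.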

Well, yes, but only because of the rules you have set forth for when the money can come off the table.

If the player wins 5 spins in a row on an even money bet at $200 per bet, he will have only risked $200 of his original bankroll, but will have risked $1000 total getting to the threshold for leaving.
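Spelled out (the even-money bookkeeping for that case; trivial, but it shows the two senses of “risked”):

```python
bankroll, stake = 1000, 200
total_wagered = 0
for _ in range(5):          # five straight wins at even money
    total_wagered += stake  # each spin's stake counts toward total action
    bankroll += stake       # a 1-to-1 win adds one stake to the bankroll
print(bankroll, total_wagered)   # 2000 1000
```

Only the first $200 stake ever comes out of the original bankroll; every later stake is covered by accumulated winnings.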

For what it’s worth, the house edge in American roulette is 2/38 or approximately 5.26%, in European roulette it is 1/37 or approximately 2.7%. Roulette has some of the worst odds of any game in Vegas.

Edit: On second thought, that’s only for even money bets. It’s easier to reach the goal using 2-1 bets or 35-1 bets. But then each individual bet is less likely to hit, increasing the likelihood of running through your bankroll before making your goal.