Deadly Sixes Game: Explain the apparent probability paradox to me

It’s a sequence: 1, 10, 100, 1000, 10000, 100000, 1000000, …

If this sequence could produce an infinite number, then there would have to be a last finite number somewhere in the sequence: a finite number whose value is one tenth of infinity. That’s obviously absurd. Every number in the sequence will be a finite multiple of the finite number that preceded it.

Ahh, if the number of players is finite, there is a chance that there are x winners and roughly 9x losers, and a chance that the game runs out of players and everyone wins.

So in a universe of fewer than 1000 people:

1/36 of the time: 0 winners, 1 dead
35/36 x 1/36 of the time: 1 winner, 10 dead
35/36 x 35/36 x 1/36 of the time: 11 winners, 100 dead
35/36 x 35/36 x 35/36 of the time: 111 winners, 0 dead

With an infinite supply of people or money, the last game must be a loss. But with a finite supply, the last possible game is a victory with probability 35/36.
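Here’s a quick numerical check of that table, as a sketch of the sub-1000-person universe where the groups are 1, 10, and 100 and a fourth group of 1000 can’t be formed:

```python
from fractions import Fraction

P_BOXCARS = Fraction(1, 36)   # chance of double sixes on any single roll
P_SAFE = Fraction(35, 36)     # chance of anything else

# (probability, winners, dead) for a universe of fewer than 1000 people:
# groups of 1, 10, and 100 are possible, a fourth group of 1000 is not.
outcomes = [
    (P_BOXCARS,              0,   1),    # boxcars on roll 1
    (P_SAFE * P_BOXCARS,     1,   10),   # boxcars on roll 2
    (P_SAFE**2 * P_BOXCARS,  11,  100),  # boxcars on roll 3
    (P_SAFE**3,              111, 0),    # no boxcars in three rolls: everyone wins
]

assert sum(p for p, _, _ in outcomes) == 1   # the four cases cover every possibility
for p, winners, dead in outcomes:
    print(f"p = {float(p):.4f}   winners = {winners:3d}   dead = {dead}")
```

The probabilities sum to exactly 1, and the single most likely outcome, at (35/36)^3 ≈ 0.92, is the default win for all 111 players.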

Okay, that part makes sense.

How about this angle: Who says the game is guaranteed to resolve? With an infinite player pool, there is a non-zero chance that it never resolves and nobody ever dies.

I’d say the scenario where you roll two dice forever and never roll a double six is a pretty extreme fringe case. I’m not sure what the mathematical rule here is but I don’t think such an unlikely event invalidates the overall odds.

I agree that in a real world situation, the game is likelier to end due to running out of people rather than having a losing group. The eleventh roll of the dice would require you to have ten billion people in the group, which is not possible in the real world. So the game would end by default after ten rolls. And if my quick calculation is correct, you only have about a 25% chance of rolling a double-six in ten rolls, which means that 75% of the time such a game would end in default.
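That quick calculation holds up; here’s a two-line check (just the arithmetic, nothing specific to the game):

```python
# Chance of never rolling boxcars (double sixes) in ten rolls of two fair dice
p_default = (35 / 36) ** 10
print(f"{p_default:.3f}")   # ~0.754, so roughly 75% of such games end in default
```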

But it’s nonzero, and the result of that extremely unlikely case is an infinite number of winners, right?

This is just a variation on a common gambling fallacy. It’s basically the same idea as the claim that if you keep doubling your bet at the roulette table you are guaranteed to win. Just like the deadly sixes game, at each stage you increase the “bet” exponentially: the deadly sixes stakes go 1, 10, 100, 1000, etc., and the roulette bets go 1, 2, 4, 8, 16, 32, 64, 128, etc. Both games end when the improbable event occurs (double sixes, or hitting your number on the roulette wheel). While the double sixes game has a very bad result at that point (lots of deaths) and the roulette game has a good result (you win lots of money), it’s the same basic principle either way.

The problem is that real world limits make it impractical. In the deadly sixes game, you reach the entire population of the world in only 11 rolls. In roulette, you reach the betting limit of the table (or you run out of money) very quickly.
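For what it’s worth, here’s a rough sketch of how fast the doubling progression in the roulette version hits those limits. The $1 starting bet, $500 table limit, and $1000 bankroll are made-up numbers purely for illustration:

```python
# Martingale-style doubling on an even-money bet: every loss doubles the next stake.
bet, total_lost = 1, 0
TABLE_LIMIT, BANKROLL = 500, 1000   # illustrative limits, not from any real casino
losses = 0

while bet <= TABLE_LIMIT and total_lost + bet <= BANKROLL:
    losses += 1
    total_lost += bet
    print(f"loss #{losses}: bet {bet}, cumulative loss {total_lost}")
    bet *= 2

print(f"After {losses} straight losses the progression can no longer continue.")
```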

It’s a fun exercise in how probabilities work, but these sorts of things almost never have any real world application. Exponential growth is a bitch.

No, as I posted above, each group will have a finite number of players in it. There will never be a group that has an infinite number of players.

Yes but my intent in the OP was not to address the real world version of this game. I made it clear that I was discussing the odds in the idealized version of the game.

It’s zero. The probability of going forever without ever getting boxcars is zero.

However, at the same time, the expected number of players is infinite, as can be seen by the series:
\frac{1}{36} \cdot 1 + \frac{35}{36} \cdot \frac{1}{36} \cdot 11 + \left(\frac{35}{36}\right)^2 \cdot \frac{1}{36} \cdot 111 + \left(\frac{35}{36}\right)^3 \cdot \frac{1}{36} \cdot 1111 + \cdots
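A quick numerical look at the partial sums (the term for roll k is the probability the game ends on roll k, times the 1, 11, 111, … people who would have played by then) shows them blowing up:

```python
# Partial sums of the expected-number-of-players series.
# Term k: (35/36)^(k-1) * (1/36) * (10^k - 1)/9, where (10^k - 1)/9 = 1, 11, 111, ...
partial = 0.0
for k in range(1, 51):
    players_so_far = (10**k - 1) // 9
    partial += (35 / 36) ** (k - 1) * (1 / 36) * players_so_far
    if k % 10 == 0:
        print(f"first {k:2d} terms: {partial:.3e}")
```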

Infinity is weird.

Speaking of weird, the probability of going forever without ever getting boxcars is zero. Yet it’s still possible to continue rolling and never get boxcars.

If you never rolled a 12, there would be an infinite number of groups.

I continue to maintain that there is no paradox because infinity is introduced into the premise. It’s essentially like introducing division by zero into an algebra problem. You can “prove” that 1 = 2, but you have to divide by zero to do it. Which invalidates the entire premise. Similarly, I feel that introducing the infinity into the deadly sixes problem invalidates the entire premise.

The probability of never rolling 12 is zero, though. If it weren’t, it would have to be some number greater than zero. But \left(\frac{35}{36}\right)^N is smaller than any strictly positive number you can come up with, for sufficiently large N. Therefore it must be zero.

Think of a random number between 0 and 10. Assuming you are using a flat distribution, the odds of you picking \pi are zero. And yet \pi is a real number between 0 and 10. That it is in the set of possibilities doesn’t mean the probability is non-zero.

Right, that’s where the contradiction comes in. The moment there’s an infinite supply of people, then we are in a world in which probability works differently, if at all.

In other words, you effectively said:
(1) Here’s a description of a game
(1a) The game takes place in a hypothetical with an infinite supply of people
(2) Here’s one argument about what the odds are of winning the game, with some reasonable-sounding math
(3) Here’s a different argument about the odds of winning the game, with some reasonable-sounding math
(4) But the probabilities I got from steps 2 and 3 are different!
(5) Can anyone explain the paradox?

And the seeming paradox is hidden in plain sight in 1a… because lots of fairly basic-seeming probability just stops working if there’s an infinite/unbounded population. And in fact, I can prove it doesn’t work, by precisely the steps above. Nothing is wrong with the basic math that a zillion people have looked at in steps 2 and 3. The above argument DOES demonstrate that the odds of winning the game are both 1/10 and 35/36. And those numbers are not the same. But you can go over the math in (2) and (3) forever and not find a glaring error. So, yes, something must be wrong. And more or less by process of elimination, the only thing that can be wrong is the assumption that it’s OK to add 1a without anything else changing.

No, the paradox was resolved long ago in this thread. It’s because people are not rolling individually; they’re rolling as members of a group. So each die roll will affect a large group of people (and the exact sizes of those groups will differ substantially).

This means that while the odds of a group rolling a double-six might be one out of thirty-six, the odds of an individual being in a group for which a double-six is rolled is nine out of ten.

Here’s another example that might make this distinction clear. Suppose you have four groups of people. These groups have two members, three members, six members, and twelve members. A standard coin is flipped for each group and based on the flip, the group (and all of the people in the group) will be designated as heads or tails. What are the odds that the game will result in an equal amount of heads people and tails people?

If you just look at the coin flips, you might think there are four coin flips, so there are sixteen possible outcomes: HHHH, HHHT, HHTH, HHTT, HTHH, HTHT, HTTH, HTTT, THHH, THHT, THTH, THTT, TTHH, TTHT, TTTH, and TTTT. Six of those sixteen outcomes split the flips evenly between heads and tails, so you’d conclude the odds are 37.5%.

But the odds are actually zero percent. Look at the numbers of heads people and tails people that result from these flips:

HHHH 23 H - 0 T
HHHT 11 H - 12 T
HHTH 17 H - 6 T
HHTT 5 H - 18 T
HTHH 20 H - 3 T
HTHT 8 H - 15 T
HTTH 14 H - 9 T
HTTT 2 H - 21 T
THHH 21 H - 2 T
THHT 9 H - 14 T
THTH 15 H - 8 T
THTT 3 H - 20 T
TTHH 18 H - 5 T
TTHT 6 H - 17 T
TTTH 12 H - 11 T
TTTT 0 H - 23 T

As you can see, there is no possible set of coin flips that will result in an equal number of heads people and tails people. The odds of the coin flips are not the same as the odds of the people because the coin flips do not apply to equal sized groups of people. Just like the dice rolls in the deadly sixes game.
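And here’s a brute-force check of that table, for anyone who wants to verify it (same group sizes of 2, 3, 6, and 12; with 23 people in total, an odd number, an even split is impossible):

```python
from itertools import product

GROUP_SIZES = [2, 3, 6, 12]   # 23 people in total

even_splits = 0
for flips in product("HT", repeat=len(GROUP_SIZES)):
    heads = sum(size for flip, size in zip(flips, GROUP_SIZES) if flip == "H")
    tails = sum(GROUP_SIZES) - heads
    if heads == tails:
        even_splits += 1

print(f"flip patterns giving equal heads and tails people: {even_splits} of 16")
# Prints 0 of 16: the flip-level odds (6/16) and the people-level odds (0) differ.
```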

This has nothing to do with infinity.

And yet if you apply a fixed cutoff to the original example, the paradox disappears, because just before the cutoff you have a large group of people who are very likely to survive, and that counteracts all the smaller cases where almost everyone dies.

When you take the game to allow infinite sizes, there is no “last” group. So there’s no consistent way to account for it, even though it’s the most important one.

The paradox disappears in the original question as I just explained.

I don’t think anyone on the thread agrees with your line of reasoning that the size of the group controlled by a single dice roll matters. I certainly don’t.

Again, if you work the numbers for any fixed cutoff, there’s no paradox, despite the fact that you still have an increasing group size. It’s only when you let the group size go to infinity that there’s a problem.
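Here’s one way to work those numbers, as a sketch: I’m interpreting “fixed cutoff” as a maximum number of rolls after which any remaining group wins by default, and measuring a random participant’s risk as expected deaths divided by expected participants. The helper name and the specific cutoffs are just for illustration:

```python
from fractions import Fraction

P = Fraction(1, 36)    # boxcars on a given roll
Q = Fraction(35, 36)   # no boxcars

def death_rate(cutoff):
    """Expected deaths / expected participants when the game is capped at `cutoff` rolls."""
    expected_dead = sum(Q**(k - 1) * P * 10**(k - 1) for k in range(1, cutoff + 1))
    expected_players = sum(
        Q**(k - 1) * P * ((10**k - 1) // 9) for k in range(1, cutoff + 1)
    ) + Q**cutoff * ((10**cutoff - 1) // 9)
    return expected_dead / expected_players

for cutoff in (3, 10, 20):
    rate = death_rate(cutoff)
    print(f"cutoff {cutoff:2d} rolls: expected deaths / expected players = {rate}")
```

Measured that way, every cutoff prints exactly 1/36: the huge final group of default winners offsets the 90%-dead groups, so the dice-level answer and the population-level answer agree. It’s only in the uncapped, infinite-population version that the two answers come apart.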