Deadly Sixes Game: Explain the apparent probability paradox to me

Slightly more accurately, infinity and the fact that the weighted round size is divergent.

If the round size doesn’t scale at all, there’s no paradox, even for an infinite game. But it’s also not very interesting. We look at the fraction of players who die for each possible game length and see that it’s 100%, 50%, 33%, 25%, etc. The limit is 0%, so all we can conclude is that the real probability is somewhere above that, which is consistent with the real answer of 1/36. You can also scale the round size down without a problem.

You can also scale the round size up by a little without creating a paradox. But not by so much that the growth factor exceeds 1.0 when multiplied by the chance of progressing to the next round (35/36). Otherwise the weighted series diverges, and the non-existent “last” round of the infinite game dominates and screws up the rest of the calculation.
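A quick way to see where that cutoff sits (a sketch, using a hypothetical per-round growth factor g, where g = 1 is the constant-size case and g = 10 is the original game): round n is reached with probability (35/36)^(n-1) and holds g^(n-1) times as many players as round 1, so its expected weight is (g·35/36)^(n-1), a geometric series that converges only when g < 36/35.

```python
# Sketch: expected "player weight" of each round for a hypothetical growth factor g.
# Round n is reached with probability (35/36)**(n-1) and holds g**(n-1) times as
# many players as round 1, so its expected weight is (g * 35/36)**(n-1).
def round_weights(g, rounds=60):
    return [(g * 35 / 36) ** (n - 1) for n in range(1, rounds + 1)]

for g in (1.0, 1.02, 1.04, 10.0):   # 36/35 ~ 1.0286 is the dividing line
    w = round_weights(g)
    print(f"g = {g:5.2f}: weight of round 60 = {w[-1]:.3g}, sum of first 60 = {sum(w):.3g}")
# Below g = 36/35 the weights shrink and the running sum settles down; at or above
# it (including the g = 10 of the original game) later rounds keep growing and the
# sum never stops increasing, which is the divergence described above.
```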

Not sure what you mean by “wrong”.

I think here’s the key thing with respect to infinity: “infinity” is not a number. Sometimes it’s useful to talk about and use infinity as a limit. As a very simple example: what are the chances of picking the number 1 at random from the numbers 1…infinity? Intuitively, we want to say “zero” (with some squabbling about whether “zero” means “impossible” or just “lower than any calculable chance”).

Now, I keep saying that probability math gets funny near infinity. But I’m reasonably happy saying that the chance of randomly picking 1 from 1…infinity is zero. Why? Because of the limit. What’s the chance of randomly picking 1 from 1…10? From 1…100? From 1…100000? From 1…googolplex? Each of those chances is a calculable rational number, and they get smaller and smaller as the range of numbers gets larger and larger. The limit of that chance, as the range goes to infinity, is zero.
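The same thing as a couple of lines of arithmetic (just an illustration of that limit):

```python
# The chance of picking 1 uniformly at random from 1..N is exactly 1/N,
# and it keeps shrinking as N grows; the limit as N goes to infinity is 0.
for n in (10, 100, 100_000, 10 ** 100):
    print(f"N = 10^{len(str(n)) - 1}: chance of picking 1 = {1 / n:.3g}")
```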

BUT, compare that to the current question under discussion.
What is the chance of someone who plays the game winning, calculated with the “total group size” method, for a population of 100? 35/36.
What is it for a population of 10,000? 35/36.
What is it for a population of a googolplex? 35/36.

Therefore if you’re claiming that that probability, at infinity, is 9/10, despite the fact that the limit as it approaches infinity (and in fact the answer for every finite n) is 35/36, well, then something fishy is going on… which suggests that the intuitive “well, as long as there are infinite players, then the game MUST finish, then we can do (some math) and…” approach just doesn’t work.
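For what it’s worth, here’s a rough numerical version of that limit argument (a sketch; I’m assuming round sizes of 1, 10, 100, … capped by the population, and counting a player as a winner if their round is never reached, or is reached and survives the roll):

```python
# Unconditional chance that a randomly chosen member of the whole pool loses,
# for a game capped at R rounds with round sizes 1, 10, 100, ...
def p_lose(R):
    sizes = [10 ** k for k in range(R)]
    total = sum(sizes)
    # a round-k player loses iff rounds 1..k-1 are safe AND round k rolls double six
    return sum(size / total * (35 / 36) ** k * (1 / 36)
               for k, size in enumerate(sizes))

for R in (3, 4, 10, 80):              # pools of 111, 1,111, ~10^9, ~10^79 players
    print(f"{R:2d} rounds: P(win) = {1 - p_lose(R):.4f}")
print(f"35/36 = {35/36:.4f}")
# Every finite pool gives a winning chance at or slightly above 35/36; nothing
# in the sequence drifts toward 9/10 as the pool gets bigger.
```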

Suppose your pool allowed only games of up to 5 rounds. If you only look at games with 1-4 rounds, it indeed looks like the death rate is >90%. But then take the 5-round games into account, where there is a huge pool of people who mostly survive, and the stats go back to the expected 1/36.

And this is true for any round limit. Have enough people for 100 rounds, and if you only count games with 1-99 rounds, the death rate is >90%. Then you account for the 100-round game and you get the expected 1/36.
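Here’s that argument as numbers (a sketch; I’m measuring the death rate as expected deaths per expected participant, i.e. per person who was actually subject to a roll):

```python
# Death rate for a game capped at R rounds (round sizes 1, 10, 100, ...),
# measured as expected deaths per expected participant, computed two ways:
# (a) counting only games that end before the last possible round, and
# (b) counting every game, including ones that reach the last round or never end.
def death_rate(R, early_games_only):
    p_end = [(35 / 36) ** k * (1 / 36) for k in range(R)]   # game ends at round k+1
    cum = [(10 ** (k + 1) - 1) // 9 for k in range(R)]      # players through round k+1
    if early_games_only:
        deaths = sum(p_end[k] * 10 ** k for k in range(R - 1))
        played = sum(p_end[k] * cum[k] for k in range(R - 1))
    else:
        deaths = sum(p_end[k] * 10 ** k for k in range(R))
        played = sum((35 / 36) ** k * 10 ** k for k in range(R))  # a round is played iff reached
    return deaths / played

for R in (5, 100):
    print(f"cap of {R} rounds: early-ending games only: {death_rate(R, True):.1%}, "
          f"all games: {death_rate(R, False):.1%}  (1/36 = {1/36:.1%})")
```

Restricting yourself to the games that end early is exactly the “ignore the last round” move, and it is what inflates the number to roughly ninety percent.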

For infinite rounds, there is no “last” round. And yet accounting for it is crucial. Therefore there’s a paradox.

If the influence of later rounds became negligible, we could ignore them. It’s pretty much why limits work. You can keep raising the cutoff, but the contribution of subsequent terms keeps decreasing.

But that doesn’t happen for the proposed game. The contribution of each subsequent round keeps increasing. It doesn’t matter what cutoff you choose; there’s a later round that has even more influence than the ones you already counted.
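To put numbers on “keeps increasing” (a sketch, measuring a round’s influence as the expected number of deaths it contributes):

```python
# Round n contributes (35/36)**(n-1) * (1/36) * 10**(n-1) expected deaths.
# Each term is (35/36) * 10 ~ 9.72 times the previous one, so the terms grow
# without bound and no finite cutoff ever captures most of the series.
for n in range(1, 9):
    term = (35 / 36) ** (n - 1) * (1 / 36) * 10 ** (n - 1)
    print(f"round {n}: expected deaths ~ {term:10.2f}")
print("ratio between consecutive rounds:", 35 / 36 * 10)
```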

Ah! In that case, the answer is -1/12.

ETA: A friendly DM pointed out my mistake. Thanks, WaM!

I wonder if the idea is actually less insane than it sounds. The contribution of those divergent, infinite terms is whatever is required to reduce the probability from the 90% from ignoring the games longer than the cutoff down to the expected 1/36. So it ends up having a weird negative contribution, just like in the -1/12 case. But negative probabilities are even stranger than negative numbers.

OK, I guess we’re in agreement.

For a game that is guaranteed to complete, you lose 9/10 of the time. And because infinity is very large, it seems intuitively obvious that a game with an infinite population is “guaranteed to complete”. But it isn’t, at least not in the sense we need it to be here… even though the probability of it completing approaches 1 as the number of rounds goes to infinity.

Yep, I was just giving some additional perspective on it. There are a few slightly different valid interpretations here, but ultimately they come down to it being an infinite game where the later rounds have more influence than the early ones. All the finite cases work as expected (for some version of what’s meant by a finite version of the game).

Here’s an interesting variation…

There’s a fixed population of players. But the dice rolls, while generated by random and fair dice, are known ahead of time, and kept secret.

So, 1,111 players. Max 4 rounds, of 1, 10, 100, 1000. Dice will be rolled four times, results written on slips of paper, kept by those Academy Awards accountants in a hidden briefcase, opened as necessary when the game begins.

Does this change anything? So far, no. All players’ odds are still 35/36.

But, add a new wrinkle: you have an “in” with the person who rolled the dice, and he will tell you ahead of time either “yes, the game will complete” or “no, the game will not complete”. Obviously, if he says “no, the game will not complete”, then your odds of winning are 100%.

But… what if you also know what your “competitor number” is? So, if it’s “1”, you will be the sole player in round #1; if it’s 2 through 11, you’ll be one of the next 10; etc. If you know your competitor number but also know that the game will complete, what are your odds of winning?

I think this is yet a third set of numbers… neither 35/36 nor 9/10. Because if you’re in the final group, then you have zero chance of winning. If you’re in one of the earlier groups you could win or could lose but… I don’t think it’s either 35/36 or 9/10.

I’d have to think about that more, but one consideration is that knowing your competitor number should change your belief about how many rounds the game will go for. Clearly it does if you find out you’re #1111, since that fixes the round count at 4. But I think the same is true (to a lesser extent) if you’re #1 as well.
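For what it’s worth, here’s a quick sketch of those conditional odds. I’m assuming “lose” means your round actually gets played and rolls the double six, and that “the game completes” means a double six shows up in one of the four rounds:

```python
# P(you lose | the game completes), by which of the four groups you're in.
# Group k (sizes 1, 10, 100, 1000) loses iff rounds 1..k-1 are all safe and
# round k rolls the double six.
p_complete = 1 - (35 / 36) ** 4        # a double six shows up somewhere in 4 rounds
print(f"P(game completes) = {p_complete:.3f}")
for k in range(1, 5):
    p_lose = (35 / 36) ** (k - 1) * (1 / 36)
    print(f"group {k}: P(lose | game completes) = {p_lose / p_complete:.3f}")
# Roughly 0.26, 0.25, 0.25, 0.24 -- a losing chance that is neither 1/36 nor
# 9/10, and (under these assumptions) it barely depends on which group you're in.
```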

It depends on how you define player. If a player is someone who actually was subject to a dice roll, then if the game ended we know that a random player has a 90% chance of having lost.

But the assumption is that there are an infinite number of people who agreed (or were forced) to participate. These are really the players. Regardless of when the game ended there are n winners, 10n losers, and an infinite number of players who sat on the bench. So your chance of losing, if you agreed to be a player, is essentially 0.

This is it. You just have to apply the numbers.

There are 1,111 players in this game. They are divided into groups of one, ten, one hundred, and one thousand.

If the 35/36 odds were correct, then you would expect the likeliest result to be 1,080 players winning and 31 players losing. Yet it’s easy to see that there’s no possible combination of dice rolls that will produce those numbers.

If the first three groups had rolled a safe number and the final group had rolled a double six, then 111 players would have won and 1000 players would have lost. Which is 9.99% of the players winning and 90.01% of the players losing.

There it is; approximately ninety percent of the players in this game lose. And that percentage holds true regardless of how many rounds the game goes on for before a double-six is rolled, as long as you play the game until it’s completed.

You’re ignoring the 90% of the time that all four rounds win.
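Spelling that out for the 1,111-player, four-round version (a quick sketch of all five possible outcomes):

```python
# The five possible outcomes of the 1,111-player, four-round game, with their
# probabilities and how many of the 1,111 players lose in each.
outcomes = [((35 / 36) ** (k - 1) * (1 / 36), 10 ** (k - 1)) for k in range(1, 5)]
outcomes.append(((35 / 36) ** 4, 0))   # no double six in any of the four rounds
for p, losers in outcomes:
    print(f"P = {p:.4f}: {losers:4d} of 1111 players lose ({losers / 1111:.1%})")
expected = sum(p * l for p, l in outcomes)
print(f"expected losers = {expected:.1f} of 1111 ({expected / 1111:.1%})")
# The "90% lose" outcomes are each only a ~2-3% slice of the probability; the
# single most likely outcome, at ~89%, is that nobody loses at all.
```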

I guess it’s just going in circles, but this last paragraph is where the impasse exists - “no matter how many rounds are played” and “as long as you play the game until it’s completed” are not, strictly speaking, mutually compatible.

As above, it ignores the cases where you have only 1111 players yet the game does not end, which is most of the time. And this will be the case if you specify a set maximum number of potential players. Make it 111111111111111111 players, and it will still have the problem of ignoring the cases where the game is won in every round.

This issue exists if you specify a maximum number of players (thus fixing the maximum number of potential rounds). But it disappears if you let the game continue arbitrarily long until the double sixes are rolled.

I made it 10^80 players (turning each particle in the universe into a player) and about 97% of the players win. Math continues to work!

Good point. I guess you don’t need infinity to end up close to the expected result. Just something large enough compared to the 1/36 chance. And then run that version enough times for Bayes to matter.

I’m talking about the game I described in the OP. If you didn’t roll a double-six in the fourth round, then you’d have a fifth round. If it didn’t end in the fifth round, you’d have a sixth round. And so on. You would keep playing until there was a final round where a double-six was rolled and the players in that final round would be the losers.

As I’ve said, if people want to discuss different games, they can do so. But don’t take the odds from those different games and argue that they apply to the original game. They don’t. The odds in the original game say that ninety percent of the players lose - regardless of how many rounds occur.

It’s basically a twisted version of the lottery, so I still have trouble seeing where the disconnect is.

If nobody wins the lottery, the prize rolls over and they try again until somebody does. A single ticket/player has the same fixed chance of winning each drawing (very low), which is different from the chance that the lottery will eventually resolve in a winner (very high), and different again from the odds of which drawing will produce the winner (variable for the lottery, since it depends on the number of players). It’s the same principle here, even if the specific numbers/events are different.
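To put illustrative numbers on that (hypothetical figures, just to separate the three quantities):

```python
# Hypothetical lottery: each drawing gives your single ticket a 1-in-1,000,000
# chance, and there's a 60% chance that *some* ticket wins a given drawing.
p_ticket = 1e-6     # your fixed chance in any one drawing you enter (very low)
q_resolve = 0.60    # chance a given drawing produces a winner (depends on sales)

print(f"your chance in a single drawing: {p_ticket}")
for n in (1, 5, 20):
    print(f"P(somebody has won within {n} drawings) = {1 - (1 - q_resolve) ** n:.4f}")
for n in (1, 2, 3):
    print(f"P(the winning drawing is #{n}) = {(1 - q_resolve) ** (n - 1) * q_resolve:.3f}")
# Three different questions, three different numbers -- the same distinction
# being drawn for the dice game.
```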

But that’s introducing infinity into the premise, invalidating the entire thing.

No, it doesn’t. The losing rate remains approximately ninety percent no matter how many rounds occur.

Let me try one more time.

What is the value of this expression: 2 + 3 + X - X

It’s 5.

It’s always going to be 5. It doesn’t matter what the value of X is. X can be 1. X can be 42. X can be 593,277,360,838,276,268,155. X can even be infinity. But the value of the expression will always be 5.

So if a person claimed you can’t know the value of the expression because X might be infinity, that person would be wrong. And if that person said they don’t agree with you that the answer is always 5, they would still be wrong.

If that person said that you can’t determine the value of 2 + 3 + X they would be right. But if they said that means you can’t determine the value of 2 + 3 + X - X then they’d go back to being wrong because the math that applies to one expression doesn’t apply to the other.

Let’s call the number of rounds X. For all positive integers, there is no paradox. Name any positive integer for X, and no paradox will be shown. The only value of X that shows a paradox is infinity.

Also, I do not know that it is a given that 2 + 3 + infinity - infinity = 5. Can any math experts confirm? (My understanding is that infinity + 1 = infinity.)