Deadly Sixes Game: Explain the apparent probability paradox to me

Yes, you do have a ninety percent chance of losing the game if you play it. You can predict probabilities before events occur.

Not these events. How do you calculate the probability? The number of people who died divided by the number of people who played. But we cannot know the number of people who played (or died) until after the game ends.

While true on the surface, it is a meaningless statement unless you can also say what the probability is that you will play the game.

This times 10000000!

Round number (infinity) has a 1/36 chance of delivering (infinity) losers and a 35/36 chance of delivering (infinity) winners.

Here’s a vastly simplified version to demonstrate the concept:

Flip a coin until you get heads. What was the chance you flipped heads on the last flip?

Is the chance you flipped heads on the last flip 100%? Is that a paradox? Why isn’t it 50%?
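A quick simulation makes the sample-space point concrete (a Python sketch; the function name and trial count are mine): the last flip is heads by construction, while a flip chosen without regard to its position is heads about half the time.

```python
import random

random.seed(1)  # reproducible

def flips_until_heads():
    """Flip a fair coin until heads comes up; return the full sequence."""
    flips = []
    while True:
        flips.append(random.choice("HT"))
        if flips[-1] == "H":
            return flips

trials = [flips_until_heads() for _ in range(100_000)]

# Conditioned on being the last flip, heads is certain -- by construction.
last_heads = sum(t[-1] == "H" for t in trials) / len(trials)

# But an individual flip, taken without knowing whether it was the last,
# is heads about half the time.
all_flips = [f for t in trials for f in t]
any_heads = all_flips.count("H") / len(all_flips)

print(f"P(heads | last flip) ~ {last_heads:.2f}")   # 1.00
print(f"P(heads | any flip)  ~ {any_heads:.2f}")    # ~0.50
```

Same coin, two different sample spaces, two different answers; neither is a paradox.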

I like this example!

Yes. As I said before, you need to clearly define the sample space for a probability question.

@Little_Nemo, when you talk about a “random player,” what are you considering as the sample space?

You can’t just say “The set of all players,” because that’s ambiguous.

It might mean “The set of all people who ever entered the room after one playthrough of the game, which ended after n rounds (where n is some positive integer, only known after the game is concluded) when double sixes were rolled.”

It might mean “The set of all people who ever enter the room in all of a large number of times that the game was played repeatedly.” (Here I use “a large number” to mean either an infinite number, or a finite number large enough to give a good representative sample of results, though if the number of rolls is unbounded, that second interpretation might lead to problems.)

Or it might mean “The set of all people who, before the game begins, have a chance of being called into the room.” This set is infinite in the original formulation, finite but extremely large in the reformulation from Post #216.

And maybe I’ve missed some possibilities.

The original, pure, game?

Incorrect. You have a 35/36 chance of winning the game, if you play it. Because your winning/losing depends on a roll of the dice and nothing else.

The following statements are all true:
(1) The chance of any individual player who plays the game winning is 35/36
(2) If the game is actually played, with any finite population of potential players (and thus potentially stopping without double sixes ever being rolled if there aren’t enough players for the next round), 35 out of 36 players, on average, will win
(3) Every time the game is played AND ENDS, only 1/10 of players will win and 9/10 will lose
(4) As the number of rounds of a game approaches infinity, and as the potential population of players approaches infinity, the probability that the game will end approaches 1
(5) Putting (3) and (4) together, it sure SEEMS like that means that “with an infinite number of players”, the probability of a player winning is 1/10
(6) But that’s wrong/meaningless/incorrect. In particular, the probability of a player winning is still 35/36 for any finite population, even as that population grows larger and the probability of the game ending approaches 1. So, to the extent that it means anything to ask “what’s the probability of a player winning if there is an infinite population?”, the most reasonable answer is still 35/36
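Claims (2) and (3) can both be checked by simulation. Here is a sketch of a finite version of the game, assuming (as the 1/10 figure in (3) suggests) that each batch of players is ten times larger than the last; the pool size and game count are mine.

```python
import random

random.seed(2)  # reproducible

# Finite version: batches of size 10**k play in rounds; double sixes
# (prob 1/36) makes the current batch lose and ends the game; the game
# also stops quietly if the pool can't fill the next batch.
POOL = 11_111_111     # enough players for batches 10**0 .. 10**7
GAMES = 100_000

total_players = total_winners = 0
ended_players = ended_losers = 0   # totals for games that hit double sixes

for _ in range(GAMES):
    played = winners = 0
    batch = 1
    ended = False
    while played + batch <= POOL:
        played += batch
        if random.randrange(36) == 0:   # double sixes: this batch loses
            ended = True
            break
        winners += batch                # whole batch wins; bigger batch next
        batch *= 10
    total_players += played
    total_winners += winners
    if ended:
        ended_players += played
        ended_losers += played - winners

# Claim (2): pooled over all runs, ~35/36 of players win.
print(f"pooled winner fraction:           {total_winners/total_players:.4f}"
      f"  (35/36 = {35/36:.4f})")
# Claim (3): among games that actually end, ~9/10 of players lost.
print(f"loser fraction in games that end: {ended_losers/ended_players:.4f}")
```

Both numbers come from the same runs; they differ because they average over different sample spaces, not because either calculation is wrong.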

As far as I can tell, all the arguments/examples anyone has given about comparing-population-sizes and whatnot are possibly interesting diversions, but in no way resolve the apparent paradox… as we can easily see from the fact that there is no paradox at all without infinity being introduced. The paradox only shows up with “infinite players”, but there are plenty of cases with large-but-finite players where you can count groups of different sizes, yada yada yada, but in none of those cases is there an apparent paradox.

And taking a step back, here’s a game:
(1) Some things happen that don’t involve you
(2) Then you might get to play. If you DO get to play, then dice are rolled, and you win on anything but double sixes. Your participation in the game now ends
(3) Then other things that don’t involve you may or may not happen

What are your chances of winning the game, assuming you actually get to play it? 35/36. Period. Doesn’t matter what is happening in (1) and (3), or how many people are involved in (2) along with you, or what the beginning/ending conditions are of the game overall. If YOUR winning/losing is determined by a die roll, then your chances of winning are determined by that die roll, period.

This is not a math problem, it’s an English problem. Things like “player” are not well defined. Is a player someone who is offered a chance to play in an ongoing game? In that case he has a 35/36 chance of winning.

Or, is a player someone who participated in a completed game? In that case, a random player has a 90% chance of having lost.

The math for these two scenarios is unambiguous. It just gets confusing when you describe it in English.

I think it’s a bit more complicated than you’re implying… the contradiction wouldn’t arise if infinity weren’t involved.

There are plenty of games in which those two definitions, and those two ways of thinking about and approaching the calculation, lead to the same number: basically, any purely random game which is guaranteed to complete. What’s pernicious about this case, what makes it so tricky, is that it certainly seems guaranteed to complete, since the probability of completing approaches 1 as the number of players increases… but a probability that approaches 1 is not quite the same as a guarantee.

Let’s simplify it and make a game where a player flips a coin and wins a dollar if it is heads and gets nothing if it is tails. Each player plays only once, and the game ends after the first tails.

If the game ends after the first round, 100% of players lost, even though each player had a 50% chance of winning. Only a game that ends after two rounds gives the “intuitive” result: 50% of players won. If the game ends after 100 rounds, 99% of players won, even though each player had a 50% chance of winning.

In that last case would you say that a player had a 99% chance of winning? Clearly each player, at the time they played, had a 50% chance of winning.

This game is also unbounded and could go on indefinitely.
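Putting numbers on that: a sketch simulating this coin game many times (function name and run count are mine). Pooled over all runs, half the players win; conditioned on a game of length n, the winner fraction is (n−1)/n, exactly the pattern described above.

```python
import random

random.seed(3)  # reproducible

def play_game():
    """One run: each round a new player flips; first tails ends the game.
    Returns (players, winners)."""
    winners = 0
    while random.random() < 0.5:    # heads: this player wins, next one flips
        winners += 1
    return winners + 1, winners     # the tails-flipper played and lost

totals = {}                          # game length -> [players, winners]
players = winners = 0
for _ in range(200_000):
    p, w = play_game()
    players += p
    winners += w
    t = totals.setdefault(p, [0, 0])
    t[0] += p
    t[1] += w

# Pooled over all runs, every flip was a fair coin: ~50% of players win.
print(f"pooled winner fraction: {winners/players:.3f}")
# Conditioned on game length n, the winner fraction is (n-1)/n.
for length in sorted(totals)[:4]:
    tp, tw = totals[length]
    print(f"length {length}: {tw/tp:.2f} of its players won")
```

So "what fraction of players in this completed game won?" and "what chance did a player have at the moment they flipped?" are different questions with different answers, even in this tame, finite setting.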