Mathematical paradox?

You start with $100. You’re playing a game where you wager $1 at a time on a coin toss. If the coin comes up heads, you win $1, if it’s tails, you lose $1.

Lucky for you, this is a biased coin. It comes up heads 2/3 of the time, and tails 1/3 of the time.

Even more lucky for you, you’re allowed to play this game infinitely, as many coin tosses as you want!

Question:

Given that over an infinite number of trials, every possible scenario with a probability of greater than zero must eventually happen, and given that the probability of going bust in this game, no matter how much money you’ve made, is always greater than zero, does this mean that you are guaranteed to eventually lose all of your money if you are committed to playing this game forever?

Start with $100, flip the coin 100 times, and get tails each time. Game OVER. :confused:

The problem with your problem is that an infinite number of trials cannot actually be played, so no answer can be given about your final winnings. You can only answer that if you stop the game after a finite number of flips. After any finite number of flips you have some probability of being ahead, and a nonzero chance of having gone bust… but you are not guaranteed to go bust.

Infinity is not a number and you can’t use normal procedures to give an answer. There are numerous “paradoxes” that rest on this, one of which we discussed in Run for your life…it’s the Switch Flippin’ Demon!!! The proper answer for that is that it can’t be answered.

There are some interesting results about the amount of time you stay positive or negative with a fair coin discussed in If you flip a coin forever, are you guaranteed to eventually flip an equal number of heads and tails?. That answer is yes, but meaningless. The process, however, is full of deep theory and requires deep investigation.

No, you are not guaranteed to lose every time. Let P(x) be the probability that you’ll eventually lose when your stake is x. When your stake is x, 2/3 of the time you will win, increasing your stake to x+1, and 1/3 of the time you will lose, decreasing your stake to x-1, so

P(x) = (2/3)P(x+1) + (1/3)P(x-1).

This is a difference equation. I’m sure there must be a Wiki page about them in general. But all we need here is the solution, which is

P(x) = A + B(1/2)[sup]x[/sup]

for any values of A and B. You can verify that by substitution. You also know that P(0) = 1 and P(inf) = 0. This means A = 0 and B = 1. So the final answer is

P(x) = 2[sup]-x[/sup]

If you start with x = 100, your chances of eventually losing are about 7.9 × 10[sup]-31[/sup].
Technically we don’t know that P(inf) = 0, but we know that A = 0 so that P(x) = B*(2)[sup]-x[/sup] and that P(x) must be between 0 and 1. Therefore B cannot be any bigger than 1. A smaller B would say the chances of eventually losing were even smaller than I just computed.
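OldGuy’s two claims are both easy to sanity-check in code. The snippet below (a sketch; the function name `ruin_estimate` and the trial/toss counts are my own choices, and the simulation is necessarily capped at a finite number of tosses, so it slightly underestimates the true ruin probability) verifies the general solution by substitution and then compares small-stake Monte Carlo estimates against 2[sup]-x[/sup]:

```python
import random

# 1) Verify by substitution: P(x) = A + B*(1/2)**x satisfies
#    P(x) = (2/3)*P(x+1) + (1/3)*P(x-1) for ANY constants A, B.
A, B = 0.3, 0.7                      # arbitrary values
P = lambda x: A + B * 0.5 ** x
for s in range(1, 20):
    assert abs(P(s) - ((2 / 3) * P(s + 1) + (1 / 3) * P(s - 1))) < 1e-12

# 2) Sanity-check P(x) = 2**-x by Monte Carlo for small stakes.
#    Toss count is capped (we can't simulate forever), which slightly
#    *under*estimates the true probability of eventual ruin.
def ruin_estimate(stake, trials=10_000, max_tosses=400):
    busts = 0
    for _ in range(trials):
        money = stake
        for _ in range(max_tosses):
            money += 1 if random.random() < 2 / 3 else -1
            if money == 0:
                busts += 1
                break
    return busts / trials

for x in (1, 2, 3):
    print(f"stake {x}: simulated {ruin_estimate(x):.3f}, predicted {2**-x:.3f}")
```

Starting from $1 the simulated ruin probability comes out near 1/2, from $2 near 1/4, and so on, matching the formula.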

What do you mean by “scenario” here? For example, losing all your money on the first $100 flips is a scenario with probability 1/3^100. Positive, but far from 1.

I’ll also point out that probability 1 is not the same as guaranteed (put another way, probability 0 is not the same as forbidden). In fact, any particular infinite sequence of heads and tails you could specify will have probability 0, even though some infinite sequence of heads and tails happens in the end.

But your biggest problem is this:

Your probability of going bust eventually is not 1 (see OldGuy’s post above). You are probably reasoning as though going bust is like flipping a head: you keep taking a shot at it, and each time your probability of having it happen is the same, no matter what has happened before. Events of this sort with positive probability on each trial DO have probability 1 of eventually occurring (because if you have probability q of failing on any individual attempt, then after N attempts, you have only a q^N chance of having failed them all, which will get arbitrarily small over time).
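To make the q[sup]N[/sup] point concrete, here is the decay for a hypothetical per-attempt failure probability of q = 0.99 (the number is just for illustration):

```python
# With independent attempts, each "failing" with fixed probability q < 1,
# the chance of N straight failures is q**N, which decays toward zero.
q = 0.99
for N in (10, 100, 1000, 10000):
    print(f"P(all {N} attempts fail) = {q**N:.3g}")
```

Even a 99%-likely-per-attempt event becomes astronomically unlikely to fail every time over enough independent attempts. The point of the posts above is that going bust is *not* such an event: its per-step likelihood depends on your current stake.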

But that’s not how going bust works at all: the more money you’ve acquired, the harder it will be to go bust. The more money you’ve lost, the easier it will be to go bust.

If you want to be more rigorous about it, you can compute P(x,n), the probability of losing within the next n tosses if your stake is x, and then take the limit as n -> inf. This is a much harder problem, though, as the difference equation is now a partial difference equation, one in two variables, but I’m pretty sure it has a known solution. I know that the diffusion approximation to it certainly does.
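Even without the closed-form solution, P(x,n) can be computed exactly by dynamic programming on the recurrence P(x,n) = (2/3)·P(x+1,n−1) + (1/3)·P(x−1,n−1), with P(0,n) = 1 and P(x,0) = 0 for x > 0. A sketch (the function name `bust_within` is mine; the table is wide enough that the artificial upper boundary cannot influence the answer within n steps):

```python
def bust_within(x, n):
    """Exact probability of hitting $0 within n tosses, starting from stake x,
    via dynamic programming on P(x, n) = (2/3) P(x+1, n-1) + (1/3) P(x-1, n-1)."""
    width = x + n + 2            # stakes beyond x + n are unreachable in n tosses
    P = [1.0] + [0.0] * (width - 1)   # n = 0 row: bust only if already at $0
    for _ in range(n):
        P = ([1.0]
             + [(2 / 3) * P[s + 1] + (1 / 3) * P[s - 1] for s in range(1, width - 1)]
             + [0.0])
    return P[x]

for n in (10, 100, 1000):
    print(f"P(1, {n}) = {bust_within(1, n):.6f}")
```

Starting from $1, the values creep up toward the limiting answer 1/2 as n grows, in line with OldGuy’s formula.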

Huh? One fact we know for sure is that P(inf) = A [as the B(1/2)[sup]inf[/sup] term vanishes], so knowing that P(inf) = 0 is equivalent to knowing A = 0, from which, by our initial condition P(0) = 1, we can compute B = 1.

Why do you say we don’t know that P(inf) = 0 but we do know that A = 0?

Also, we can actually reason, right from the beginning: The probability of going bust starting from $x is the probability of first losing 1 dollar, then losing 1 dollar again, then losing 1 dollar again, etc., x many times. Thus, if we call this probability P(x), we have that P(x) = P(1)[sup]x[/sup]. At this point, the recurrence noted by OldGuy (that P(x) = 2/3 P(x + 1) + 1/3 P(x - 1), for x > 0) tells us the base of this exponentiation must be either 1 or 1/2.

But it doesn’t seem easy to rule out that P(x) is constantly 1. At this point, I would invoke the sledgehammer of a law of large numbers. Is there perhaps an easier approach?

Yes, but then you could just keep playing and win money back. The problem is with the setup of the game: when you reach zero, you have no money to bet. The game is biased towards the zero situation.

I think the real hang-up here, for starters, is that the OP’s question simply isn’t well-formed. I think we all know what he’s trying to ask (I think), and some of the above responses might be trying to answer what I think he’s trying to ask, or maybe trying to answer what someone else thinks he’s trying to ask.

Let me just take a stab at re-writing the question the way I think the OP meant it, and then let’s see who has what to say:

We won’t talk of playing the game infinitely long, but instead we will talk of playing indefinitely long, or until the player goes bust. That is, we aren’t asking what happens as t → ∞ but instead, we’re asking what might be happening at any finite point along the way.

So:
If he plays this game indefinitely long, is it guaranteed that the player will go bust at some finite time along the way?

As I so succinctly stated in post #2. :slight_smile:

Actually, if you rephrase it to “are you guaranteed to eventually lose all of your money if you are committed to playing this game until you do?” I think the answer is “yes”.

On the natural interpretation of the setup, you may try and fail to lose all of your money (e.g., if you get heads on every flip).

Even speaking probabilistically, you have a 1 - 1/2^100 probability, as noted above by OldGuy, of obtaining a sequence of flips in which you never go bust. It is, in fact, very difficult to go bust.

The coin is biased towards rewarding you. As you accumulate rewards, it gets more and more difficult to go bust. The overall effect of this is very strong.

Oh, sure: We can expand out x = 1/3 + 2/3 x^2 into a summation yielding x, which, it seems to me, by its very nature, will both be the probability we are looking for, AND be the minimal solution to x >= 1/3 + 2/3 x^2 over [0, infinity]. Thus, as 1/2 is the smaller of the possible bases, it is in fact the correct one.
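For what it’s worth, the minimality claim is easy to check numerically: iterating the map x → 1/3 + (2/3)x² starting from 0 gives an increasing sequence that converges to the smaller root, 1/2, rather than the root at 1 (a quick sketch, not a proof):

```python
# Iterate x -> 1/3 + (2/3) x^2 from x = 0.  Since the map is monotone
# and f(0) > 0, the iterates increase toward the MINIMAL fixed point.
# The two fixed points are 1/2 and 1; the iteration settles at 1/2.
x = 0.0
for _ in range(200):
    x = 1 / 3 + (2 / 3) * x * x
print(x)
```

The iterates are exactly the truncations of the expanded summation, so converging to 1/2 identifies 1/2 (not 1) as the ruin probability from a $1 stake.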

This is a standard exercise in probability theory. I’m not going to type up the full derivation, but if you start with $M, the probability that you never go broke is 1 - (1/2)^M. So if you have $100, you do have positive probability of hitting zero, but it’s not something you realistically have to worry about. If you want to see the details, check Theorem 5.7.7 in Durrett.

No, you showed that it’s possible to lose this game. But the question was whether losing is inevitable.

To me, the answer is clear. Think of it as a simple algorithm:

  1. If your total amount of money > 0, then play again.
  2. If your total amount of money = 0, then stop playing.

Given that playing can raise or lower your total amount of money, it’s inevitable that at some point (assuming you can play “forever”) you’ll hit zero. It’s the only endpoint in the game.

But to assume you can play forever, you have to assume that you won’t run out of dollars to win. We’ll assume you need real physical currency to pay off bets but we’ll allow other denominations. So the upper limit on this game is the M0 supply: $3,948,679,000,000. That’s all the dollars in physical existence.

That part is false. Your paradox is itself one reason to believe it isn’t true. In any case, it’s absurd that “every possible scenario must happen”; there are simply too many scenarios, and it takes an analysis of infinities to prove that “every possible scenario must occur” is wrong.

Mathematicians have the “law of large numbers,” which says that the average will work out given enough trials… however, that law doesn’t prevent a very unlucky run from sending you bankrupt.

Apologies if I didn’t phrase the question well, but yes, that is what I am trying to ask.

To clear up some other confusion, let’s also assume once you hit $0, the game is over. Let’s assume that while your balance isn’t $0, there is no limit to how many times you can wager on the coin toss. The money supply will never run out, time won’t ever run out, etc.

No matter how big your $ balance from winnings grows, on every single coin toss that comes up tails, there is a greater than 0 chance that this is the beginning of your downswing all the way to zero. This isn’t disputed.

So my Q is: does that mean busting back to $0 is inevitable if you keep playing indefinitely?

No; as demonstrated above, there is only a 1/2^100 probability of ever busting back to $0 (starting from $100) if you keep playing indefinitely.

Playing the game infinitely many times is only one trial. So while this is true under fairly general circumstances, it’s completely irrelevant.