Mathematical paradox?

Let’s explore the general case where P(heads) = p and P(tails) = 1 - p = q.

Define P(x,y) to be the probability of the player losing when the player starts with x coins and the house starts with y coins.

P(1,1) = q and P(1,n) = q + p P(2,n-1) = q + p (P(1,n-1) )^2.

The equations P(1,1) = q and P(1,n) = q + p (P(1,n-1) )^2 define a recursive sequence. It is not difficult to show that the sequence is monotonically increasing.

We can use induction to show that this sequence is bounded above by 1.

P(1,1) = q <= 1.
P(1,n) = q + p (P(1,n-1) )^2 <= q + p = 1.

Furthermore, we can show that when q/p <= 1 the sequence is bounded above by q/p.

P(1,1) = q < q/p.
P(1,n) = q + p (P(1,n-1) )^2 < q + p (q/p)^2 = (pq + q^2) / p = q/p.

Since all monotonic bounded sequences have limits, we can define P(1) = lim(n->inf) P(1,n) = lim(n->inf) P(1,n-1).

Taking limits of both sides of P(1,n) = q + p (P(1,n-1) )^2 yields:

P(1) = q + p (P(1))^2

Solving for P(1) gives two solutions, q/p and 1. So given our upper bounds from above we know P(1) = q/p when q/p < 1 and P(1) = 1 when q/p >= 1.

Thus we always know which of the two solutions is correct.
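
As a quick numerical sanity check (a sketch in Python, with an assumed illustrative bias of p = 0.6 that is not from the thread), the quadratic p x^2 - x + q = 0 that comes from rearranging P(1) = q + p (P(1))^2 does have exactly these two roots:

```python
import math

# Assumed sample bias for illustration: p = 0.6 (heads), q = 0.4 (tails).
p, q = 0.6, 0.4

# P(1) = q + p*(P(1))^2 rearranges to p*x^2 - x + q = 0.
disc = math.sqrt(1 - 4 * p * q)        # discriminant
roots = sorted([(1 - disc) / (2 * p), (1 + disc) / (2 * p)])

print(roots)
assert math.isclose(roots[0], q / p)   # smaller root is q/p
assert math.isclose(roots[1], 1.0)     # larger root is 1
```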

Finally, we also have P(x) = A + B (q/p)^x. We know P(0) = 1 for all p, and P(1) = q/p or P(1) = 1, so we can solve for A and B in both cases.

When q/p < 1 we have A = 0 and B = 1 thus P(x) = (q/p)^x.
When q/p >= 1 we have A = 1 and B = 0 thus P(x) = 1.

So if the coin is fair or favors tails, ruin is certain no matter our starting stake; otherwise there is a non-zero probability of our stake growing without bound.
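
This limit is easy to watch numerically. The sketch below (assuming an illustrative bias of p = 0.6, my choice, not from the thread) computes the exact ruin probability P(x,y) for growing house bankrolls y by repeatedly sweeping the recurrence ruin[i] = q ruin[i-1] + p ruin[i+1], and shows it climbing toward (q/p)^x:

```python
# Illustrative bias (an assumption): p = 0.6, q = 0.4.
p, q = 0.6, 0.4
x = 2                                   # player's starting stake

for y in (2, 5, 10, 30):
    N = x + y                           # total coins in play
    ruin = [1.0] + [0.0] * N            # ruin[0] = 1 (player broke), ruin[N] = 0 (house broke)
    for _ in range(20000):              # sweep until the interior values settle
        for i in range(1, N):
            ruin[i] = q * ruin[i - 1] + p * ruin[i + 1]
    print(y, ruin[x])                   # climbs toward (q/p)^2 = 4/9 as y grows
```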

p = 0 is not really a solution to this difference equation since the equation is supposed to apply for x = 1, 2, … When x = 1 the p=0 “solution” has the undefined quantity 0[sup]0[/sup].

Even if the bankroll is reduced to a mere $10, the odds of busting within 60 trials is about .001 and gets smaller as the game continues, though the online Poisson calculator I used lacks the necessary precision.

This doesn’t specifically address the OP’s paradox (I’m not sure the term is strictly correct) I admit, but I’m more of an engineer than a mathematician.

Considering where p^(x - 1) came from in the equation (e.g., the reasoning that the probability of X many blahs is the X-th power of the probability of a single blah, calling that single-blah probability p), we can deduce that the relevant value for 0[sup]0[/sup] will be 1. But this will still rule out p = 0 from being a solution to the equation.

That’s fine. But I’m still confused as to what a “Poisson calculator” calculates. Could you clarify what sort of inputs you are giving it and what it does with them?

Also, the odds of busting should only get larger, rather than smaller, as the game goes longer. We do know that the overall odds of busting eventually with a bankroll of $10 are 1/2^10 = 1/1024, so your calculation does seem reasonable so far, apart from that comment about the direction in which the odds move.

One more thing to note. The characteristic polynomial of the difference equation P(x) = q P(x-1) + p P(x+1) has two roots, 1 and q/p, and we are using both in the solution; however, the root 1 is somewhat obscured.

We have used P(x) = A + B (q/p)^x but this is the same as P(x) = A (1)^x + B (q/p)^x.

Perhaps this is clearer if we consider the Jacobsthal sequence, 0, 1, 1, 3, 5, 11, …, defined by J(0) = 0, J(1) = 1, and J(n) = J(n-1) + 2 J(n-2).

The difference equation J(n) = J(n-1) + 2 J(n-2) has characteristic polynomial x^2 - x - 2 = (x+1)(x-2) which has roots -1 and 2.

This means J(n) = A (-1)^n + B (2)^n. We use J(0) = 0 and J(1) = 1 to get the equations:

A + B = 0
-A + 2B = 1

Which we solve to see that A = -1/3 and B = 1/3. So J(n) = -(1/3) (-1)^n + (1/3) (2)^n. The fact that we are using both roots is much more obvious here.
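
A few lines of Python confirm the closed form, which simplifies to J(n) = (2^n - (-1)^n)/3 (a quick check, not part of the derivation):

```python
# Verify J(n) = (2^n - (-1)^n) / 3 against the recursion itself.
def jacobsthal(n):
    a, b = 0, 1                 # J(0), J(1)
    for _ in range(n):
        a, b = b, b + 2 * a     # J(n) = J(n-1) + 2*J(n-2)
    return a

for n in range(20):
    assert jacobsthal(n) == (2 ** n - (-1) ** n) // 3

print([jacobsthal(n) for n in range(7)])  # -> [0, 1, 1, 3, 5, 11, 21]
```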

I could have used the exact same technique to derive an explicit formula for the more familiar Fibonacci sequence (Binet’s formula), but it has messier roots which would have been a pain to type.

Everything you write is great, but I’ll re-emphasize that we can tackle the general case in the same simple way we tackled this case:

We know off the bat that the probability of losing a bankroll of x dollars is the x-th power of the probability of losing 1 dollar. This probability will be the limiting value of the sequence defined by d[sub]n+1[/sub] = T(d) and d[sub]0[/sub] = 0, with T(d) = q + p d^2 [note: d[sub]n[/sub] is the probability of a “dollar-loser” of “rank” < n]. As T is an order-preserving continuous operator on probabilities, this will be the least probability satisfying d = T(d), which is min(q/p, 1).

Putting together the first sentence and the last, the probability of losing a bankroll of x dollars is min(q/p, 1)[sup]x[/sup].
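
The least-fixed-point argument is easy to watch numerically; iterating T from d[sub]0[/sub] = 0 with a couple of illustrative biases (p = 0.6 and p = 0.4 are my assumptions here) climbs to min(q/p, 1):

```python
# Iterate d_{n+1} = T(d_n) = q + p*d_n^2 from d_0 = 0.
def least_fixed_point(p, iters=10000):
    q = 1 - p
    d = 0.0
    for _ in range(iters):
        d = q + p * d * d       # T is order-preserving, so d climbs monotonically
    return d

print(least_fixed_point(0.6))   # heads-favoring coin: limit is q/p = 2/3
print(least_fixed_point(0.4))   # tails-favoring coin: limit is 1
```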

Most of this was in your post, but I feel like it’s worth isolating this argument as particularly simple. Let me ask, though:

The second argument to P here is essentially the “rank” I discussed before, but I’m curious why you call these “house coins”. I wouldn’t imagine a house normally playing with money of its own according to these particular rules (though of course we can force a theoretical house to do whatever we theoretically want it to).

Missing subscript reinstated in bold.

I’m curious about your curiosity. If you flip heads on your first flip the house pays you one coin. Where did this coin come from? The house. Where else would it come from?

I thought we had pretty much established that there is no paradox here. Our resident mathematicians have launched into a frenzy of interesting probability calculations, but I think the main point regarding the actual OP (remember the OP? :smiley: ) is that whatever that probability may be, the probability of going bust in an infinite series of coin tosses in the described scenario is the same as it is for any large finite number, for the reasons already given.

So the presumptions that either (a) you must eventually go bust because… infinity! or (b) the question has no answer are both incorrect. If I’ve missed something here, I’m all ears.

What I mean is, why do you say P(2,n-1) = P(1, n - 1)[sup]2[/sup]? The way I would normally think of “house coins” as working would be via P(2, n - 1) = P(1, n - 1) * P(1, n) [the house gaining a coin as I lose my first one].

(Both recursions lead to the same ultimate limit, of course, but different sequences along the way.)

You are correct. I made a sloppy error. I should have P(2, n - 1) = P(1, n - 1) * P(1, n), and that does change some of the details in the sketches of proofs I have as well, but the overall idea remains intact.

Let’s bring this back to the OP. Suppose you were faced with such a game with a coin that favored heads and you could vary your bet. How much of your bankroll, B, should you bet each time to grow your bankroll the fastest?

The answer is B (p - q). We get this from the Kelly criterion.
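
A rough numerical illustration (with an assumed p = 0.6, not a derivation of the Kelly criterion): the expected log-growth per bet, p log(1+f) + q log(1-f), peaks at the betting fraction f = p - q:

```python
import math

# Assumed illustrative bias: p = 0.6, q = 0.4.
p, q = 0.6, 0.4

def growth(f):
    # Expected log-growth per even-money bet of fraction f of the bankroll.
    return p * math.log(1 + f) + q * math.log(1 - f)

# Grid-search the betting fraction; the peak lands on p - q = 0.2.
best = max((f / 1000 for f in range(1000)), key=growth)
print(best)
```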

Define P(x,y) to be the probability of the player losing when the player starts with x coins and the house starts with y coins.

P(1,1) = q and P(1,n) = q + p P(2,n-1) = q + p P(1,n-1) P(1,n) so P(1,n) = q / (1 - p P(1,n-1) )

The equations P(1,1) = q and P(1,n) = q / (1 - p P(1,n-1) ) define a recursive sequence. It is not difficult to show that the sequence is monotonically increasing.

We can use induction to show that this sequence is bounded above by 1.

P(1,1) = q <= 1.
P(1,n) = q / (1 - p P(1,n-1) ) <= q / (1 - p) = q / q = 1.

Furthermore, we can show that when q/p <= 1 the sequence is bounded above by q/p.

P(1,1) = q < q/p.
P(1,n) = q / (1 - p P(1,n-1) ) < q / (1 - p (q/p) ) = q / (1 - q) = q/p.
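
Iterating this corrected recursion numerically (with a few sample biases I've assumed for illustration) shows the same limits as before, min(q/p, 1):

```python
# d plays the role of P(1,n): start from P(1,1) = q and iterate
# P(1,n) = q / (1 - p*P(1,n-1)).
def limit(p, iters=100000):
    q = 1 - p
    d = q
    for _ in range(iters):
        d = q / (1 - p * d)
    return d

print(limit(0.6))   # heads-favoring: approaches q/p = 2/3
print(limit(0.5))   # fair coin: creeps toward 1 (slowly, since both roots are 1)
print(limit(0.4))   # tails-favoring: approaches 1
```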

Whoa.

So, is the following assertion false, for a coin that’s biased towards the player?

That is, what’s the probability that you will ever see X more tails than heads? If it’s 1, then Little Nemo is correct, but that contradicts what all the experts are saying. If I interpret it correctly, the experts agree that the probability of this is zero.

If I understand what Indistinguishable said just a bit upthread, if the coin is fair or biased against the player, the probability of the above is 1, which makes sense. For any other case, I trust the experts are right but don’t quite follow the math.

For one thing, I’m still struggling with the x^100 thing. I understand it when they have to be consecutive, but I don’t quite follow the ability to factor out win/loss pairs that are interspersed in the series. I think it’s explained above and I’ll just need to reread and think a bit harder. No way I’ll follow the Difficult Equations interpretation. I got a C in that many decades ago, and it still buggers my imagination that I didn’t get an E, because I distinctly remember that I had no clue what I was doing on the final! I must have passed on the curve, and everyone else skipped most of the lectures or were idiots. Or my BS-fu was strong that day.

Not zero; just small (in the case where the coin is biased towards heads).

There’s an on-point chapter in Martin Gardner’s Mathematical Circus (which I highly recommend) called “Random Walks and Gambling”. Among other examples, it describes a coin-toss game between players A and B similar to the OP’s proposal, in which heads means B gives A a dollar and tails means A gives B a dollar. If each player starts with a finite bankroll, the odds of either winning at any given moment are equal to their share of the total stake, i.e. if A has $7 and B has $10, A’s chances are 7/17 and B’s are 10/17. Gardner adds that if one player has an infinite bankroll, the one with the finite stake is “almost certain” to face “eventual ruin”.

This example is based on even odds with each event, though, not the biased coin the OP proposes. Given that advantage, I suppose it’s possible that if the game continues long enough, the player is more likely to win every dollar on Earth before he goes bust.

Here’s another way to think of the win/loss factoring whatever thing:

There’s a big room of indistinguishable slot machines. Any time you sit down at one, it opens a tab for you to start playing with. Your profit can climb and fall in various ways as you play, but if you ever owe the machine a dollar, it’s Game Over; the machine collects your dollar, closes your tab, and won’t let you play anymore. [So you can only lose at most one dollar to any given machine]

You are a compulsive gambler, so you sit there and you play, over and over. Your addiction is so much that you can never quit. You sit and you play at a machine for as long as you can. Whenever you hit Game Over at one machine, you just hop over to another machine, and start all over.

How could you possibly lose X dollars in this scenario? Well, you’d have to first sit at a machine and reach Game Over, losing your first dollar. Then you’d have to get up, go to another machine, and reach Game Over again, losing your second dollar. And so on, X many times.

At any given machine, there’s some probability p that you will eventually get a Game Over playing it (and some complementary probability that you will be able to sit there playing it forever). Since you have to do this X many times with X many independent machines, the probability of pulling it off is p^X.

Finally, though I’ve phrased this in terms of a bunch of different machines, since the machines all act the same anyway, it’s not any different from just using the same machine the whole time (of the sort that doesn’t lock you out no matter how bad your tab gets).
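
The factoring can also be checked deterministically (a sketch, assuming a heads bias of p = 0.6 and using a large-but-finite bankroll as a stand-in for the infinitely rich house): the chance of ever going X dollars down comes out as the X-th power of the chance of ever going one dollar down.

```python
# Assumed illustrative bias p = 0.6; N stands in for the "infinitely rich house".
p, q = 0.6, 0.4
N = 60

ruin = [1.0] + [0.0] * N               # ruin[i]: chance of ever going broke from stake i
for _ in range(20000):                 # sweep ruin[i] = q*ruin[i-1] + p*ruin[i+1]
    for i in range(1, N):
        ruin[i] = q * ruin[i - 1] + p * ruin[i + 1]

for X in (2, 3, 5):
    print(X, ruin[X], ruin[1] ** X)    # the two columns agree
```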

It’s not quite the same for infinity tosses as it is for a large finite number of tosses, since no matter how far up you go, there’s always a nonzero chance that you’ll still lose it. However, as the finite number of tosses goes up, the probability approaches the probability for an infinite number of tosses, and the convergence is extremely rapid, so saying they’re the same is a very good approximation.

Put another way, in the (already unlikely) event that you do eventually bust, it’s overwhelmingly probable that you’ll do so within the first few hundred tosses.

Just for laughs, I wrote a quick C# program to simulate a game. The player’s holdings at any given moment seem to consistently approximate one-third of the number of games played:



Games=100000000
Stake=33328637

Games=200000000
Stake=66656231

Games=300000000
Stake=99999411

Games=400000000
Stake=133317211

Games=500000000
Stake=166657863

And so forth. The current run is up to 10 billion games with the player at around $3.3 billion. I could let this run all night, I suppose, though I don’t expect any dramatic surprises.
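
For anyone who wants to reproduce this without C#, here is a rough Python sketch of the same simulation (the 2/3 heads bias is my assumption, inferred from the one-third ratio above; the seed is arbitrary, fixed so the run is repeatable):

```python
import random

rng = random.Random(12345)           # fixed seed for a repeatable run
games, stake = 1_000_000, 0
for _ in range(games):
    # Heads (probability 2/3) pays a dollar; tails costs one.
    stake += 1 if rng.random() < 2 / 3 else -1

print(games, stake, stake / games)   # the ratio hovers near p - q = 1/3
```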