coin tossing to infinity

Perhaps I wasn’t clear. In a random binary sequence of infinite length, runs of arbitrary length are expected to occur. Not just of decent length, but of any length.

This would not be the first random walk problem I’ve seen where you end up at the origin with a high probability that doesn’t depend on where you start from, despite the fact that the odds appear to be stacked against you.

No, I understood. My point is that your run – however long – is likely to be too late to do much good.

Not if it’s long enough.

Crozell, press the quote button in the bottom right of my post to see how I did the coding.

I agree with lucwarm on this. As time goes on, the mean of the probability distribution increases linearly with N, the number of coin flips. However, the standard deviation increases as N[sup]1/2[/sup]. So, the number of sigmas below the mean that you need in order to get to the answer is increasing with time, whereas in the typical random-walk problem it’s decreasing.
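
To put rough numbers on that (assuming, as elsewhere in the thread, a start of 1000 with heads = +2 and tails = −1):

[code]
# How many sigmas below the mean is zero?  Assumptions (from this
# thread): start at 1000, heads = +2, tails = -1, fair coin.
import math

START = 1000
MEAN_STEP = 0.5    # E[X] for X in {+2, -1}, each with probability 1/2
SD_STEP = 1.5      # sqrt(E[X^2] - E[X]^2) = sqrt(2.5 - 0.25)

for n in (1_000, 10_000, 100_000, 1_000_000):
    mean = START + MEAN_STEP * n     # expected position after n flips
    sd = SD_STEP * math.sqrt(n)      # standard deviation after n flips
    print(f"N = {n:>9,}: mean = {mean:>9,.0f}, sd = {sd:>7,.0f}, "
          f"sigmas to zero = {mean / sd:.1f}")
[/code]

The last column grows like N[sup]1/2[/sup], which is the point.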

This can’t be right. The very second term of your series is 1003 × 2[sup]-1003[/sup], which is roughly 2[sup]-993[/sup].

Yes, I should have multiplied by 1000, so instead of 10[sup]-300[/sup] it ought to be 10[sup]-297[/sup]. I guess the multiplier of the second term ought to be 1002, since, as observed, the one head cannot be the last (else it would have already been counted). And the remaining terms should also be a bit smaller. However, the analysis I used ignored the first few thousand terms of the series, so the numbers ought to be larger; but it doesn’t change the fact that the odds are very tiny.

I agree that the odds are pretty tiny, but I still think your estimate is off by several orders of magnitude. The terms increase in size for a while - the fifth term is C(1012, 4) × 2[sup]-1012[/sup] ≈ 10[sup]-294[/sup]. I know the terms in my slightly different series keep increasing up until term #443.
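
If I’ve reconstructed the series correctly from the examples quoted (term k covers k − 1 heads, so 1000 + 3(k − 1) flips in all, ignoring the first-passage correction), a few lines of Python will locate the peak:

[code]
# Term k (my reconstruction): C(1000 + 3(k-1), k-1) * 2^-(1000 + 3(k-1)).
import math

def log10_term(k):
    h = k - 1             # heads in term k
    n = 1000 + 3 * h      # total flips in term k
    return math.log10(math.comb(n, h)) - n * math.log10(2)

peak = max(range(1, 2001), key=log10_term)
print(f"term #5:   about 10^{log10_term(5):.0f}")   # cross-check vs C(1012,4) x 2^-1012
print(f"peak term: #{peak}, about 10^{log10_term(peak):.0f}")
[/code]

For this version the peak lands in the mid-440s, close to the #443 quoted above for the slightly different series.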

IANAM, but if we’re talking about infinity, isn’t a sequence of 1000 or 10,000 or 1,000,000 tails relatively small, theoretically?

Subscripts: put the n between sub tags to get x[sub]n[/sub]. And if you ever want to see how a poster coded something, just hit the quote button below the post.

Have a look at my approach. I’m sure I’ve made a mistake somewhere, but I think the approach might be useful.

Let P(n;event) be the probability of the event, starting with n. Let p[sub]m,n[/sub] be the probability that, starting from m, you ever get to n.

By symmetry p[sub]n+1,n[/sub]=p[sub]1,0[/sub].

p[sub]2,0[/sub]
=P(2; hit 0)
=P(2; hit 0 | hit 1) . P(2; hit 1) + P(2; hit 0 | don’t hit 1) . P(2; don’t hit 1)
=p[sub]1,0[/sub]p[sub]2,1[/sub]+0.(…)
=p[sub]1,0[/sub][sup]2[/sup]

But p[sub]1,0[/sub] = [sup]1[/sup]/[sub]2[/sub] + ([sup]1[/sup]/[sub]2[/sub])p[sub]3,0[/sub] = [sup]1[/sup]/[sub]2[/sub] + ([sup]1[/sup]/[sub]2[/sub])p[sub]1,0[/sub][sup]3[/sup], since from 1 a tail takes you straight to 0 and a head takes you up two, to 3.

Solving the cubic gives p[sub]1,0[/sub] = 1/phi = (√5 − 1)/2 ≈ 0.618, so p[sub]1000,0[/sub] = 1/phi[sup]1000[/sup].
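
A quick sketch checking this two ways (step sizes +2 for heads, −1 for tails, as above; the Monte Carlo setup is mine):

[code]
# 1. Iterate p <- 1/2 + p^3/2 from p = 0.  The iterates climb to the
#    smallest non-negative root, which should be the hitting probability.
# 2. Monte Carlo the walk itself.  Capping each walk at 1000 flips
#    undercounts returns very slightly.
import random

p = 0.0
for _ in range(200):
    p = 0.5 + 0.5 * p ** 3
print(f"fixed point: {p:.6f}   (1/phi = {2 / (1 + 5 ** 0.5):.6f})")

def returns_to_zero(start, max_flips=1000):
    pos = start
    for _ in range(max_flips):
        pos += 2 if random.random() < 0.5 else -1   # head: +2, tail: -1
        if pos == 0:
            return True
    return False

phi = (1 + 5 ** 0.5) / 2
trials = 20_000
for start in (1, 6):
    hits = sum(returns_to_zero(start) for _ in range(trials))
    print(f"p({start},0) ~ {hits / trials:.4f}   (1/phi^{start} = {phi ** -start:.4f})")
[/code]

Note that 1/phi[sup]6[/sup] ≈ 0.0557, which comes up again below.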

<fotosbyfrank>

Let’s see - this was covered in a paper I wrote in 1986 . . .

Little’s Law of Large Numbers states that as the sample set increases, the deviation from the ideal distribution decreases exponentially.

For example, the deviation from a flat curve for 1000 runs of a lottery may be 10%. For 1,000,000 runs, the deviation decreases to 1%. However, because of the larger sample set, each percentage is larger, thus no assumptions can be made as to any following outcomes to fit the expected ideal curve.

I leave this as an interesting exercise for the reader to work out. I have an interesting original proof of this myself, but the margins on the message board are too small to contain it.

</fotosbyfrank>

Look at it this way . . . how many flips would you have to do to have a reasonable chance of getting 1000 tails in a row? It’s gotta be a LOT more than 5000.

After 5000 flips, if nothing special happens, you’ll be somewhere in the vicinity of 3500 spaces away from home. So now, if 1000 tails in a row come along, it’s not anywhere near what you need to get home. You need another 2500 tails.

How long do you have to wait to get 2500 tails in a row? It’s gotta be a LOT more than 10,000 flips. etc. etc.


Now, there are other ways of getting “home” that don’t involve an uninterrupted streak. But it just seems to me that this is not like a situation where you are waiting for some particular event to happen – like drawing a straight flush in poker. Instead, as the experiment continues, the bar keeps getting higher and higher – very quickly.
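
Some rough numbers for this (same assumptions as before: start at 1000, heads = +2, tails = −1; the expected wait for a run of k tails with a fair coin is 2[sup]k+1[/sup] − 2 flips):

[code]
# The bar rises: expected position drifts up 0.5 per flip, while the
# expected wait for a k-tail run is 2^(k+1) - 2 flips.
import math

START, DRIFT = 1000, 0.5

for flips in (0, 5_000, 50_000, 500_000):
    position = START + DRIFT * flips   # expected position after `flips`
    needed = math.ceil(position)       # tails run needed to get home from there
    print(f"after {flips:>7,} flips: expected position {position:>7,.0f}, "
          f"need a {needed:,}-tail run (expected wait ~ 2^{needed + 1} flips)")
[/code]

At 5000 flips that’s the “vicinity of 3500” above: you’d need a 3500-tail run, with an expected wait of around 2[sup]3501[/sup] flips.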

I’m now leaning to the idea that return to zero is certain, and I have thus covered all bases from “it can’t happen” to “it has to happen.”

My reasons are based on ultrafilter’s point, and I think that you will end up at the origin from any starting point.

We have a random walk on a line that runs from zero on the left to as far as necessary (no limit was set in the OP) on the right. So no matter how far the throwing of heads moves you to the right, sooner or later there has to be a run of tails that will bring you back to zero.

Yes, but it IS possible to get 1000 tails right at the start. Improbable, but it is clearly possible. So the answer to the original poster’s question is an obvious “yes, it is possible”. I really don’t see what infinity has to do with it. It’s always possible, but it gets increasingly less probable as the number of tosses continues. Quite simple really.

The point was that, as I see it, if you tossed the coin an infinite number of times you would definitely get back to zero, as no matter how many tosses you have made there is always a chance of getting back to the start. Therefore if you keep flipping coins for an infinite length of time, eventually that event will occur, even though the probability of it happening decreases faster than it is increased by the multiple attempts. I suppose this all depends on your definition of infinity.

Another way I thought of expressing sort of the same problem, which might make for less complicated maths, is as follows. Suppose you were to repeatedly attempt to do something, starting with a 1/1000 chance of success which halved with each subsequent attempt (1/1000, 1/2000, 1/4000, etc.). If you kept on trying an infinite number of times, would you eventually succeed? I would say that this is a much clearer-cut yes than the above example, even though the probability decreases faster with each attempt (although obviously the probability of success in the first few attempts is much higher).

As regards the starting number/probability, I agree that for any sequence of a defined number of flips/attempts the starting point affects the probability of success. I don’t think this is the case with an infinite series, though: I think there is a 100% probability of success no matter what the starting point is, but I can’t really explain why (and I now have a headache).

I think this has to be right. However, another valid solution to your cubic is p[sub]1,0[/sub] = 1, and I can’t think of a good reason to discount this solution.

I found an omission in my earlier analysis, the one that gave p[sub]6,0[/sub] = 0.0691. I don’t know how to fix it, but I do now believe that value to be too high. Just from that, I like your value of p[sub]6,0[/sub] = 0.0557.

Good point. I’ve an informal argument on that matter:

If you change the probabilities so the probability of subtracting one is slightly greater than half, then p[sub]1,0[/sub] must increase.

The 1/phi root increases slightly if the cubic is so adjusted, whereas the 1 root decreases. So the 1/phi root must be right.

I disagree. To see this, let’s consider the probability of at least 1 success in two attempts.

That probability is 1 − (1 − 1/1000)(1 − 1/2000), which is approximately equal to 3/2000, but strictly less.

It seems to me that in general, the probability of at least 1 success is strictly less than the sum of the probabilities for each attempt.

The series 1/1000 + 1/2000 + 1/4000 . . . converges to 1/500.

So it seems to me that the probability of eventual success is something less than 1/500 (but greater than 1/1000, of course).

Put another way, there is no number “N” such that the probability of at least one success after N attempts is greater than 1/500.
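
A quick numerical check of this (the cutoff at 200 attempts is mine; later terms are far too small to matter):

[code]
# Attempt k succeeds with probability (1/1000) / 2^(k-1).  The chance
# of eventual success is 1 minus the product of the failure probabilities.
p_all_fail = 1.0
for k in range(1, 201):
    p_all_fail *= 1 - (1 / 1000) / 2 ** (k - 1)

print(f"P(eventual success) = {1 - p_all_fail:.8f}")
print(f"1/500               = {1 / 500:.8f}")
[/code]

The computed value comes out a hair under 1/500, as argued.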

Has anyone else looked at this as a Markov chain with 0 as an absorbing state? I played around with it a little, but it’s been too long since I did this sort of thing, and I’m making a mistake that I can’t catch.
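
For what it’s worth, here’s one way the setup could look (a sketch: 0 absorbing, the chain truncated at N = 200, with “reached N” treated as never returning, which undercounts very slightly):

[code]
# Hitting probabilities with 0 absorbing: for 0 < n < N,
#     h(n) = 0.5 * h(n - 1) + 0.5 * h(n + 2)
# with h(0) = 1 and h(n) = 0 for n >= N (the truncation).  Plain
# fixed-point iteration climbs to the solution from below.
N = 200

h = [0.0] * (N + 2)
h[0] = 1.0
for _ in range(20_000):
    for n in range(1, N):
        h[n] = 0.5 * h[n - 1] + 0.5 * h[n + 2]

phi = (1 + 5 ** 0.5) / 2
for n in (1, 2, 6):
    print(f"h({n}) = {h[n]:.4f}   (1/phi^{n} = {phi ** -n:.4f})")
[/code]

This lands on 1/phi[sup]n[/sup], matching the cubic’s smaller root rather than the root at 1.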

Your argument sounds good, but I don’t see the math for it. For instance, if there’s a probability of 0.6 of decreasing, the cubic becomes:

p[sub]1,0[/sub] = 0.6 + 0.4p[sub]1,0[/sub][sup]3[/sup]

This has roots at 0.823 and 1. So, the 1/phi root increases, and the 1 root is constant.
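
That makes sense: p = 1 is a root for any down-probability q, since the right-hand side equals 1 there. Dividing out (p − 1) leaves a quadratic, which makes the small root easy to track (a sketch; the function name is mine):

[code]
# For down-probability q:  p = q + (1 - q) * p^3.
# Factoring out the ever-present root p = 1 leaves
#     (1 - q) * p^2 + (1 - q) * p - q = 0.
import math

def small_root(q):
    # positive root of the quadratic above
    return (-1 + math.sqrt(1 + 4 * q / (1 - q))) / 2

for q in (0.5, 0.55, 0.6):
    print(f"q = {q:.2f}: roots {small_root(q):.4f} and 1")
[/code]

At q = 0.5 this gives 1/phi ≈ 0.618, and at q = 0.6 it gives 0.823, matching the numbers above.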