The question relies on a false premise. It is true that in a random infinite sequence any finite string should (theoretically) be found. Define any string, say HHTHTTHTHHTTT, and probability says it will be found somewhere in the infinite sequence.
This does not apply to infinite strings. To see this, note that a string is infinite only if it doesn’t have an end. Thus, “an infinite string of H’s” means that after some position p all the tosses are H’s. By this definition, there can’t be both an “infinite string of H’s” and an “infinite string of T’s” in the sequence; they are completely incompatible. There is at most one or the other, a logical fact contrary to the finite case, where any string imaginable can be found buried somewhere in the sequence.
An infinite string of H’s does itself contain an infinite number of HHH… sequences: the whole string is one such sequence, the string starting with the second H is another, the one starting with the third H yet another, and so forth. However, all of these infinite strings are essentially the same. A case like HTHTHTHT… embeds two definable infinite sequences, HTHT… and THTH…, each an infinite number of times. More interesting is an infinite string of the form HTHHTTHHHTTTHHHHTTTT…, where the H’s and T’s alternate in longer and longer runs. In this case there are an infinite number of definable infinite strings, all different from each other.
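As a sketch, that construction is easy to generate mechanically (purely illustrative; the function name is my own):

```python
from itertools import count, islice

def alternating_runs():
    """Yield the string HTHHTTHHHTTT...: runs of H's and T's of growing length."""
    for n in count(1):
        yield from "H" * n
        yield from "T" * n

print("".join(islice(alternating_runs(), 20)))  # HTHHTTHHHTTT HHHHTTTT
```

Since this string is not eventually periodic, every distinct starting position gives a genuinely different infinite tail, which is what makes it interesting.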
Such a string is possible, although the probability of it arising in a random series is zero. More seriously, you could never know whether such a string existed, because you could never finish testing whether the pattern holds. Testing would require infinite time.
That’s the problem with all suggestions about infinite strings in a series. Any finite string is theoretically testable; no infinite string is. Since we postulated that the progression is randomly generated, we could never be sure of any result.
Ah, you’re talking about incompressible strings. That’s a slightly different notion of randomness. Assuming that you have only a countable number of coin flips, any sequence with an infinite run of heads or tails is compressible: print out the finite initial segment before the run begins, and then continue printing the appropriate outcome forever.
Of course, there’s nothing actually forcing independent flips of unbiased coins to produce incompressible/Kolmogorov random/uncomputable/whatever strings. There’s nothing even forcing a fair coin away from producing tails on the first flip, tails on the second flip, tails on the third flip, and so on, all tails, no heads. At least, on the usual understanding of fair coin-flipping, there’s nothing preventing this.
In these discussions, one really has to hammer home, as ultrafilter and others were mentioning, the distinction between “happens with probability 100%” and “guaranteed to happen”. With probability 100%, every particular finite string (like HTHTHHHTHTH) will appear at some point in the flip sequence; however, there’s no guarantee of it (because the coin could, after all, just produce all tails or all heads or alternate or whatever). A good analogy might be to think of tossing the Earth as a giant, infinitely-sided die; the probability of landing on exactly the North Pole is 0, of landing on the South Pole is 0, the center of Detroit is 0, etc. In fact, every point has probability 0 of being landed on. Yet, all the same, some point does end up landed on, since probability 0 is not the same as impossible (and, dually, probability 100% is not the same as guaranteed).
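To make the finite-string half of that concrete, here is a little simulation sketch (the function name and pattern choice are mine): in practice the target string always shows up eventually, even though nothing logically guarantees it.

```python
import random

def waiting_time(pattern):
    """Flip a simulated fair coin until `pattern` first appears; count the flips."""
    window, flips = "", 0
    while window != pattern:
        window = (window + random.choice("HT"))[-len(pattern):]
        flips += 1
    return flips

# Every run of this terminates in practice, though no run is *guaranteed* to.
times = [waiting_time("HTHTHHHTHTH") for _ in range(100)]
print("average flips until HTHTHHHTHTH appears:", sum(times) / len(times))
```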
To even begin to answer the question, you have to clarify it. I assume that you begin at some time and then go on indefinitely from there. This has two consequences. First, the string of throws is countably infinite (so it does not have the cardinality of the real numbers, for example). Second, it has the order type of the natural numbers, usually denoted omega. The set of all integers, negatives included, has the cardinality of the natural numbers, but not the same order type. Yet another order type is the one usually denoted omega + 1, meaning: go through all the natural numbers and then do one more. Not to be confused with 1 + omega, which means: do it once, then do it omega-many times. A moment’s thought will show that, unlike omega + 1, 1 + omega = omega. Then there is omega + omega, followed by omega + omega + 1, … and many, many more.
But the OP undoubtedly means the order type omega, and with that reading the question can be answered. With probability 1 any particular finite string will appear (even infinitely often), but you cannot say that it must. Inevitably, probability involving infinity is not going to be identical to finite probability, and having probability 1 does not imply certainty. To put it technically, the set of counter-examples has measure 0, but is not empty.
One of the defining properties of omega is that every initial segment is finite. This means that at no point have you tossed the coin infinitely often. Therefore you cannot have had an infinite string of heads (or any other infinite substring) at any point.
To put things in strictly finite perspective, what it all means in the end is that as you go on, the probability that some specific string does not appear becomes smaller and smaller, eventually less than any preassigned value, no matter how small (and no matter how long the original string is). But it is never 0 at any finite stage of the tossing.
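Those shrinking probabilities can be computed exactly rather than just asserted. Here is a sketch (my own construction, using a KMP-style prefix automaton) that computes the chance a given string has not yet appeared after n tosses:

```python
def prob_missing(pattern, n_flips):
    """Exact probability that `pattern` never occurs in n_flips fair tosses."""
    m = len(pattern)

    def step(s, c):
        # Longest prefix of `pattern` that is a suffix of pattern[:s] + c.
        t = pattern[:s] + c
        for k in range(min(len(t), m), -1, -1):
            if t.endswith(pattern[:k]):
                return k

    dist = [0.0] * (m + 1)   # dist[s] = P(currently matched prefix has length s)
    dist[0] = 1.0
    for _ in range(n_flips):
        new = [0.0] * (m + 1)
        new[m] = dist[m]      # state m is absorbing: pattern already seen
        for s in range(m):
            for c in "HT":
                new[step(s, c)] += dist[s] / 2
        dist = new
    return 1.0 - dist[m]

for n in (20, 100, 500, 2000):
    print(n, prob_missing("HHTH", n))  # shrinks toward 0 but is never 0
```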
Anyway, if one flips unbiased independent coins in a sequence indexed by the natural numbers (c_0, c_1, c_2, c_3, …), probability theory tells us that the probability of getting an infinite string of consecutive heads is 0, as noted by severus. This is the best answer to the OP.
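Spelled out (a sketch under the usual independence assumptions; the notation is mine): for a run of heads starting at any fixed position k, the event is contained in “n heads in a row from position k” for every n, and a countable union of probability-0 events still has probability 0:

```latex
\Pr(c_k = c_{k+1} = \cdots = \mathrm{H})
    \le \Pr(c_k = \cdots = c_{k+n-1} = \mathrm{H}) = 2^{-n}
    \quad\text{for every } n, \text{ hence } = 0,
\qquad
\Pr\Bigl(\bigcup_{k \ge 0} \{c_k = c_{k+1} = \cdots = \mathrm{H}\}\Bigr)
    \le \sum_{k \ge 0} 0 = 0.
```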
However, it’s interesting to ponder generalized questions. For example, what if you have an infinite two-dimensional grid of unbiased independent coins, indexed by pairs of natural numbers in the obvious way; what’s the probability that one of the rows of your grid ends up producing a sequence that ends in an infinite trail of Hs? Well, as it happens, the answer is again 0 in conventional probability theory, because the Kolmogorov axioms for probability impose countable additivity: given a countable collection of mutually incompatible events, the probability of “one of them happens” is the sum of the individual probabilities of each one. Thus, since each particular row has probability 0 of being the first one to produce an infinite trail of Hs, the probability that any of them do is 0.
But the Kolmogorov axioms don’t impose uncountable additivity (if they did, it would be impossible to have a “uniform” distribution on an uncountable space; indeed, the countable additivity condition of conventional probability theory tells us that we can’t have a probability distribution on a countably infinite space in which all points are equiprobable). And so we can consider the question: what if you have an uncountable number of rows; for example, let’s say you have a row for each real number, and a column for each natural number. All the coins are stipulated to be unbiased and independent, of course. What’s the probability that one of the rows ends up producing an infinite trail of Hs?
The interesting thing is that the axioms of probability don’t say, because that description didn’t in fact pin down a particular unique probability distribution. That is, there’s actually more than one probability distribution on this space in which all the coins are unbiased and independent, and, in fact, these different distributions will give different answers as to the probability that one of the rows ends up producing an infinite trail of Hs (you can get it to be 0, 1, or anything in between). I think that’s somewhat interesting.
I’m curious what definition you’re using of “unbiased” or “fair coins”, to be consistent with this statement. And if your definition is “probability of 1/2 each for heads or tails”, I’m curious about your definition of probability. I would define the probability of a coin coming up heads as the limit as n -> infinity of n[sub]heads[/sub] / n, where n is the number of flips. If I had a coin which produced an infinite number of consecutive tails, then that limit is zero, and I would then say that that coin therefore has a 0 probability of coming up heads, and is therefore not a fair coin.
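For what it’s worth, here is what that frequentist limit looks like empirically for a simulated fair coin (purely illustrative; a finite simulation can’t settle the definitional question either way):

```python
import random

heads = 0
for n in range(1, 1_000_001):
    heads += random.random() < 0.5          # one simulated fair flip
    if n in (10, 1_000, 100_000, 1_000_000):
        print(f"after {n:>9,} flips: heads/n = {heads / n:.5f}")
```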
I’m using the conventional mathematical definition of probability as given by a distribution of possible outcomes, rather than a property of the actual outcome. As for “unbiased” or “fair coin”, I just mean to refer to a probability distribution satisfying the property that both heads and tails are equiprobable. I don’t mean to tie anything I say to any frequentist interpretation of probability (one which I would find quite problematic), or any other particular interpretation; I just mean to talk about the conventional mathematical concept itself.
I’m more of a practical mathematician (being an engineer) and the theoretical concept of infinity throws me. In trying to put the OP in a tangible frame of reference, I calculated that probably less than 500 quadrillion seconds have elapsed since the origin of the universe. (The age is possibly as much as 15 billion years, according to Space.com.)
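The arithmetic, for anyone who wants to check it (my own back-of-the-envelope version):

```python
seconds_per_year = 365.25 * 24 * 60 * 60   # about 3.16e7
age_in_seconds = 15e9 * seconds_per_year   # 15 billion years
print(f"{age_in_seconds:.2e}")             # ~4.7e17, under 500 quadrillion
```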
That’s a lot smaller number than I had expected. That’s also a lot smaller than infinity. As a corollary to what others have said in this thread, the problem is that infinity is not very practical.
At what point in the course of the infinite number of tosses do you need to replace the coin because the act of tossing it will wear off the head and tail markings? (Either by the tossing apparatus or the friction of air molecules. How about neutron decay?) I suppose that you would need to replace it an infinite number of times.
This is similar to the conclusion I came to when we had a previous discussion about infinite numbers of typing monkeys and Shakespeare’s works.
Each time you toss a real coin, you use some energy. If you do it an infinite number of times, then you’ve used an infinite amount of energy, and there’s no reason to believe that the universe contains an infinite amount of energy. So, yes, this is a question of pure mathematics, with probably no real application.
It’s a thought experiment, just like Einstein used to develop relativity or any gazillion other examples. No actual coins were worn in the making of this problem. Why would anybody even think that an actual physical object is being talked about? It’s just a figure of speech used to put the problem in ordinary terms instead of mathematical jargon.
I’m not happy with the excuse that you’re an engineer, either. Saying that engineers can’t understand the difference between a theoretical problem and an actual one is an insult to all other engineers.
What I was actually trying for was a segue or introduction into my realization of how many seconds had elapsed since the Big Bang and how that compared to the really huge number and infinite concepts being thrown around. Yes, the discussion is a thought experiment. I was just looking to add a different spin to all the coin tossing.
I did not say that engineers can’t understand the difference between a theoretical problem and an actual one. Instead I indicated that I have a preference for more practical problems. As for understanding the theoretical, I am able to do it and would believe that any engineer worth his/her salt would as well. I could calculate the amount of blacktop necessary to pave a theoretical four lane road to the moon, but the exercise would cause me to :rolleyes: . Mega-engineering projects such as the Channel Tunnel or the bridge between Sweden and Denmark are much more fascinating for me precisely because they were probably once considered as unobtainable as infinity. YMMV and obviously it does.
Sorry, I didn’t mean to piss on the parade because it has been an interesting discussion.
Why would each sequence be finite? In an infinite number of sequences, you’re going to have an infinite number with 384 tails in a row, an infinite number with 9,302,385,103,488 tails in a row, and an infinite number with an infinite number of tails in a row. Right?
The probability of this happening may be 0 if you do one sequence of tosses, or even if you do a finite number of sequences, but I don’t think it’s zero if you perform an infinite number of sequences. It may not be one, either. My brain hurts.
I have no idea. I went back and looked at that after I wrote it and realized a sequence of coin tosses is not analogous to a line.
Yeah, I still remember my physics prof from 30 years ago who presented us with problems involving “infinitely smooth elephants” (to avoid friction) and measured velocity in units like furlongs per fortnight.
Like I basically said above with the two-dimensional grid example, if you have a countably infinite number of coins (i.e., the coins can be labelled with natural numbers), and toss each in an omega-sequence (i.e., each coin’s tosses can be ordered by natural numbers in the usual way), the probability of “At least one of these coins will produce a sequence that ends in an infinite string of tails” is still 0, according to the conventional mathematics of probability.
When you have an infinite number of sequences, isn’t there a probability of 1 that every conceivable sequence will happen at least once? And why did you add “countably infinite”? Did I say that, or imply it?
The answer to your first question is a flat “No.” If you were to present your argument for the contrary, I could be more specific about where it goes wrong, which would be more helpful. As a specific counterexample, though, let’s suppose you have a countably infinite bunch of sequences s0, s1, s2, s3, …, each generated by coin flips in the obvious way. What’s the probability that at least one of these sequences is all tails? Well, it’s the probability that s0 is the first all-tails sequence + the probability that s1 is the first all-tails sequence + the probability that s2 is the first all-tails sequence + … And this is clearly less than or equal to the probability that s0 is an all-tails sequence + the probability that s1 is an all-tails sequence + the probability that s2 is an all-tails sequence + … And this is 0 + 0 + 0 + 0 + … = 0. Thus, the probability that at least one of the sequences is all tails is 0, countering your conjecture.
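In symbols, that’s just the countable union bound from the Kolmogorov axioms (same s_i notation as above):

```latex
\Pr\Bigl(\bigcup_{i=0}^{\infty} \{s_i \text{ is all tails}\}\Bigr)
    \le \sum_{i=0}^{\infty} \Pr(s_i \text{ is all tails})
    = \sum_{i=0}^{\infty} 0
    = 0.
```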
I mentioned the countably infinite case just because it was the easiest (and because it seems like most of the discussion here has been about countable infinities; e.g., the infinite sequences we’ve been discussing have been ones whose elements are indexed by the natural numbers). I mentioned above in post #26 the trickiness of the uncountably infinite case, where the answer is underdetermined: the laws of probability do not uniquely specify an answer if you have an uncountable number of sequences.