Probability and Infinity. A rather incoherent question.

Can one apply probabilities at all if an infinite number of events are considered?

As discussed in the Monkeys do Shakespeare thread: given infinite time, an (ideal) randomly typing monkey will produce Hamlet, etc. My intuition was that there would be an infinite number of ways to get Hamlet wrong, and so the monkey might never get it right; I am told this isn’t so.

Is it that, even if there are an infinite number of ways of getting it wrong (the monkey types out Pi first), with infinite time the monkey can work his way through all those versions (like the Horatio/Elvis one) before getting it right? Maybe I would be better off considering an infinite set of monkey texts: one could have Pi to infinite places, another the complete works of A. A. Milne, another the libretto to What’s Opera Doc?

Another example: in an infinite number of dice throws you will see a million sixes in a row. Even in a billion billion (finite) dice throws the odds of this happening are beyond astronomically small; I am more likely to win the lottery three times in a row (and I don’t play).
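Here is a rough back-of-the-envelope check of that claim (my own sketch, assuming ideal fair dice and simply counting starting positions for a run, so the numbers are only order-of-magnitude):

[code]
import math

# Probability that a specific block of k consecutive throws is all sixes: (1/6)^k.
# In n throws there are roughly n starting positions, so the expected number of
# such runs is about n * (1/6)^k -- a slight overcount, but fine for orders of magnitude.

k = 1_000_000   # a million sixes in a row
n = 10**18      # "a billion billion" throws

# Work in log10 so nothing underflows.
log10_expected_runs = math.log10(n) - k * math.log10(6)
print(f"expected runs ~ 10^{log10_expected_runs:.0f}")
# -> roughly 10^-778133: effectively zero in any finite experiment, yet with
#    infinitely many throws the run still turns up (with probability 1).
[/code]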

What I’m picturing is a graph where the probability of the monkey getting Hamlet down is as near zero as makes no difference for any finite number, no matter how huge, but shoots up to one as soon as you plug in infinity. Is this the case, or is it a failure of imagination on my part? Am I missing something?
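That graph can be sketched directly, assuming independent attempts each with some fixed success probability p (the toy value of p below is mine; the real Hamlet probability is vastly smaller, which only pushes the bend in the curve much further out):

[code]
import math

# Chance of at least one success in n independent trials of probability p:
#     P(n) = 1 - (1 - p)**n
# It stays tiny while n is much smaller than 1/p, then climbs toward 1,
# and tends to 1 in the limit n -> infinity for any p > 0.

p = 1e-9  # toy per-attempt probability

for n in (10**3, 10**6, 10**9, 10**12):
    prob = -math.expm1(n * math.log1p(-p))   # numerically stable 1 - (1 - p)**n
    print(f"n = {n:.0e}:  P(at least one success) ~ {prob:.6f}")
# n = 1e+03:  P ~ 0.000001
# n = 1e+06:  P ~ 0.001000
# n = 1e+09:  P ~ 0.632121
# n = 1e+12:  P ~ 1.000000
[/code]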

Is it allowable (and does it help) to consider the set of numbers as arbitrarily large, but not including infinity as an end point? It’s possible that this is mathematical bollocks; if it is, please say.

There are several threads on exactly this question, one as recently as 3-ish months ago. A little searching ought to turn them up.

The very short answer is you can’t do conventional algebra on infinities (which come in different sizes, by the way), and so conventional probability math (e.g. a 1-in-X chance over Y trials means roughly a Y/X chance of the desired outcome at least once) breaks down when your trial count goes from merely a very big number to an infinite number.
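For what it’s worth, that Y/X shortcut already creaks once Y gets anywhere near X, even before infinity enters the picture. A quick sketch of my own, comparing the shortcut with the exact expression 1 - (1 - 1/X)[sup]Y[/sup] for independent trials:

[code]
X = 1_000_000   # one-in-a-million chance per trial

for Y in (1_000, 1_000_000, 10_000_000):
    shortcut = Y / X                    # the back-of-envelope Y/X figure
    exact = 1 - (1 - 1 / X) ** Y        # chance of at least one success
    print(f"Y = {Y:>10,}:  Y/X = {shortcut:7.3f}   exact = {exact:.3f}")
# Y =      1,000:  Y/X =   0.001   exact = 0.001   (shortcut fine)
# Y =  1,000,000:  Y/X =   1.000   exact = 0.632   (shortcut overshoots)
# Y = 10,000,000:  Y/X =  10.000   exact = 1.000   (shortcut isn't even a probability)
[/code]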

There is well-defined math for dealing with the situation, but it is not routinely taught in high school or college algebra, nor is it intuitive.

Well, the thing is, the probability of a monkey typing Shakespeare isn’t zero; it’s just one-in-a-very-large-number. Suppose there were one million letters in the complete works of Shakespeare. Suppose there were 44 keys on a typewriter keyboard. Then the probability that a given run of one million random keystrokes comes out as the complete works of Shakespeare would be one in 44[sup]1,000,000[/sup].

Now that’s a horrendously big number by any ordinary standards, far in excess of the number of nanoseconds since the beginning of the Universe, so even a gigahertz processor running for ten billion years wouldn’t have made an appreciable dent in the task. But mathematicians can conceive of bigger numbers than that without working up a sweat. The one above can be easily represented by exponentiation - in one level of exponentiation, even! - and in mathematical terms, that’s hardly getting started. For really big numbers, such as Moser’s number and Graham’s number, you need to spend a page or so explaining your notation before you even get around to writing the number down.
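To put the size of that number in perspective (a quick sketch of my own, taking the Universe’s age as roughly 13.8 billion years):

[code]
import math

# Number of decimal digits in 44**1_000_000, via log10 so nothing overflows.
digits = math.floor(1_000_000 * math.log10(44)) + 1
print(f"44^1,000,000 has {digits:,} digits")
# -> 1,643,453 digits

# Nanoseconds since the beginning of the Universe (~13.8 billion years).
ns_since_big_bang = 13.8e9 * 365.25 * 24 * 3600 * 1e9
print(f"nanoseconds since the Big Bang: about 10^{math.log10(ns_since_big_bang):.0f}")
# -> about 10^27, i.e. a 27-digit number versus a 1,643,453-digit one.
[/code]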

And they’re not even on the same page as any definition of infinity.

Mathematicians wouldn’t “plug in” infinity, but they might look at the limit as something approaches infinity. Limits (which are the foundation of calculus) are the way to deal with things that get arbitrarily large/small/long.
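In symbols (my own worked example, writing p for the fixed per-attempt chance of producing the whole text):

[code]
% For any fixed p with 0 < p <= 1, the chance of at least one success in
% n independent attempts is P(n) = 1 - (1 - p)^n. Since 0 <= 1 - p < 1,
% the power (1 - p)^n shrinks to 0 as n grows, so
\[
  \lim_{n \to \infty} \bigl( 1 - (1 - p)^n \bigr) = 1 .
\]
% Nothing is "plugged in" at infinity; the limit describes what the
% probabilities approach as n grows without bound.
[/code]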

For a finite sample space, probability is just combinatorics. For an infinite sample space, you have to deal with something called measure theory, which is very rarely taught as part of an undergraduate math curriculum.

*Boink* I know about limits (I did have an engineering education, once upon a time). Meanwhile.

Searched on “infinity probability” and found a thread about a coin-tossing game with links to these: Direct Hits
