I don’t think you’ll find a satisfactory definition of randomness. I took a class in probability (a math dept class, not a statistics class), and the best they came up with was a definition of a *random experiment*: an experiment whose outcome cannot be predicted with certainty.

I think randomness should really be considered a mathematical primitive, like “line” or “length”. Try to define length for me. Everybody takes for granted that you can describe a length with a number, but nobody ever *defines* the process. Similarly, you can quantify randomness with a probability, or a probability distribution.

And that’s where some chances for misunderstanding can start. Many people will say that something’s “not random” simply because it doesn’t obey the kind of probability distribution that is expected. For example, if a gambler rolls a couple of dice and starts getting, say, twice as many double sixes as expected (if they were fair dice), he’ll claim the dice “aren’t random”. I don’t think this is quite the right thing to say.
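To make the gambler’s situation concrete, here’s a small simulation sketch. The exact bias (each face of a loaded die shows six with probability about 0.236, chosen so double sixes come up roughly twice the fair rate of 1/36) is invented purely for illustration:

```python
import random

def roll_pair(biased=False, rng=random):
    """Roll two dice; with biased=True, each die favors six.

    The bias value is made up for this sketch: 0.2357 ~ sqrt(2)/6,
    so P(double six) ~ 0.2357**2 ~ 2/36, twice the fair rate.
    """
    if biased:
        faces = [1, 2, 3, 4, 5, 6]
        weights = [(1 - 0.2357) / 5] * 5 + [0.2357]
        return tuple(rng.choices(faces, weights)[0] for _ in range(2))
    return rng.randint(1, 6), rng.randint(1, 6)

def double_six_rate(n, biased=False, seed=0):
    """Fraction of n rolls of a pair that come up double six."""
    rng = random.Random(seed)
    hits = sum(roll_pair(biased, rng) == (6, 6) for _ in range(n))
    return hits / n

# Fair dice give a rate near 1/36; the loaded dice give a rate near
# 2/36. Either way the rolls are still unpredictable -- they just
# follow a different distribution than the gambler assumed.
```

The point the simulation makes is the one above: the loaded dice are no less “random”; they simply have a different probability distribution than the fair-dice model predicted.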

Similarly, when people ask whether a string of numbers is random or not, I think what they’re really asking is whether it’s reasonable to believe the numbers were produced by a process with a uniform (or some other) distribution.
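One standard way to ask that question of a digit string is a chi-square goodness-of-fit check. This is just a sketch in plain Python, and note what it does and doesn’t tell you: a small statistic means the digits are *plausibly* uniform, not that they are “truly random”.

```python
from collections import Counter

def chi_square_digits(digits):
    """Chi-square statistic for the hypothesis that the decimal
    digits 0-9 in the string `digits` are uniformly distributed.

    Each digit is expected to appear len(digits)/10 times; the
    statistic sums the squared deviations from that expectation,
    scaled by the expectation.
    """
    n = len(digits)
    expected = n / 10
    counts = Counter(digits)
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in "0123456789")

# With 9 degrees of freedom, the 5% critical value is about 16.92:
# a statistic far above that is evidence against uniformity.
# A perfectly balanced string scores 0; a string of all sixes
# scores enormously.
balanced = "0123456789" * 100   # statistic 0.0
all_sixes = "6" * 1000          # statistic 9000.0
```

So the test answers exactly the reframed question above: not “is this random?” but “is it reasonable to believe this came from a uniform source?”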

Another thing people do is try to predict the next number in a sequence from knowledge of the prior numbers. If they can “break the code”, then they’ll say the sequence isn’t random. But if you don’t know anything about the process that produced the sequence, how can you really be sure the next number will fit your code? I could write down a thousand digits of pi, then stick my social security number in there someplace. If I started reading the numbers, you might say “hey, those aren’t random, they’re digits of pi!”. However, the next number you predict might be wrong, because my SSN is coming up.
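Here’s a toy sketch of that failure mode. The pi digits are hardcoded for illustration, and the spliced-in “SSN” is a made-up placeholder (123456789), not anyone’s actual number:

```python
# First 50 decimal digits of pi (after the "3."), hardcoded.
PI_DIGITS = "14159265358979323846264338327950288419716939937510"

def predict_next(seen):
    """Guess the next digit by betting the sequence is pi.

    Returns the predicted digit while the observed prefix matches
    pi, and None once it stops matching -- the moment the 'code'
    breaks because something else was spliced into the stream.
    """
    if PI_DIGITS.startswith(seen) and len(seen) < len(PI_DIGITS):
        return PI_DIGITS[len(seen)]
    return None

# Pi with a fake 9-digit "SSN" (invented) spliced in at position 20:
# the predictor is right up to the splice, then its model fails.
spliced = PI_DIGITS[:20] + "123456789" + PI_DIGITS[20:]
```

The predictor’s model of the sequence works perfectly right up until the splice, which is the point being made: breaking the code only tells you your model fit the data so far, not what the process will do next.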

Fascinating stuff.