Is it inherently impossible to generate truly random numbers using computers?

I used this for a simulation to get normally distributed random values from uniformly distributed ones:

I think that if your inputs are truly random, the outputs will be too, since it inverts the CDF.
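For the curious, here's a minimal sketch of the inverse-CDF approach, assuming Python (the standard library's NormalDist has supplied the inverse normal CDF since 3.8):

```python
import random
from statistics import NormalDist

def normal_from_uniform(mu=0.0, sigma=1.0):
    """Inverse-transform sampling: push a uniform variate through the
    inverse CDF (quantile function) of the target normal distribution."""
    u = random.random()          # uniform on [0.0, 1.0)
    while u == 0.0:              # inv_cdf requires 0 < u < 1
        u = random.random()
    return NormalDist(mu, sigma).inv_cdf(u)

samples = [normal_from_uniform() for _ in range(10_000)]
```

Inversion is a deterministic map, so if the uniform inputs are genuinely random, the normal outputs are too.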

Mildly OT, but there was a paper I found within the last year that postulated some phenomenon was correlated with changing nuclear decay rates. The phenomenon? Earth-Sun distance; the paper is “Evidence for Correlations Between Nuclear Decay Rates and Earth-Sun Distance”. I’m not saying it predicts them, but hey, better recalibrate your Geiger-counter-based RNGs…

Not to insult the work of some extremely talented physicists, but claiming “randomness” always struck me as an awfully convenient solution to a very difficult problem. Then again, I am posting on a message board and not publishing in the Journal of Physics G… :wink:

Yep, that’s a particularly horrid example of the point I made in my cryptography post: it is not safe to assume that the default RNG that came with your programming environment is any good. Although in this case, as explained here, the problem is not so much with the RNG itself as with the horribly flawed way in which PHP scales the RNG’s output to a desired range.
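I don't have the linked explanation handy, but the classic form of the range-scaling mistake is modulo bias. A toy illustration (in Python rather than PHP, with a deliberately tiny 3-bit RNG so the skew is obvious):

```python
from collections import Counter

RAW_RANGE = 8   # pretend the RNG emits uniform integers 0..7 (3 bits)
TARGET = 3      # we want a uniform value in 0..2

# Naive scaling via modulo: 8 isn't divisible by 3, so the raw values
# map unevenly; residues 0 and 1 each get 3 raw outputs, residue 2 only 2.
print(Counter(x % TARGET for x in range(RAW_RANGE)))
# Counter({0: 3, 1: 3, 2: 2})  <- biased

# Rejection sampling: discard raw values >= 6 and draw again, so every
# residue class is backed by the same number of raw outputs.
limit = RAW_RANGE - (RAW_RANGE % TARGET)   # 6
print(Counter(x % TARGET for x in range(RAW_RANGE) if x < limit))
# Counter({0: 2, 1: 2, 2: 2})  <- uniform
```

Real code would loop on fresh RNG draws rather than enumerating the whole raw range; the enumeration just makes the counting exact.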

Actually, most physicists find ‘randomness’ to be kind of a horrifying or depressing solution to a problem. A physicist wants to know ‘why’, and ‘because it’s random’ is saying ‘you’ll never know why’. So most physicists would admit that something is truly random only very reluctantly.
Einstein, for example, hated that quantum theory ended up with ‘random’ elements, and found it difficult to accept, saying “God does not play dice”.

This is gonna sound stupid, so here I go!

There are different kinds of infinities (who’da thunk it?)

Any theories for different kinds of randomness?

Actually, “random” means a lot of different things to different people. It can be a fairly low standard for some Stat types (a sequence passes all the tests applied to it), medium for some Computer Science types (you can’t compute the next digit from the previous digits), and quite high for some Math types (no appeal to computability at all). Variations abound.
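To make the low end of that scale concrete, here's a sketch of the frequency (“monobit”) test from the NIST SP 800-22 battery, the kind of thing a sequence has to pass (Python):

```python
import math
import random

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency test: if the bits were fair coin flips,
    the normalized sum of +/-1 values would be roughly standard normal."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))   # two-sided p-value

bits = [random.getrandbits(1) for _ in range(100_000)]
print(monobit_pvalue(bits))   # "random-looking" streams give p >= 0.01
```

A sequence that sails through this and the rest of the battery counts as random for many statistical purposes, even if a Computer Science or Math type would balk.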

Understandable.

But do the long-haired math types draw any distinctions? Or is there some wild, so-far-unaccepted theory that randomness does not exist?

If you ask Gregory Chaitin, he’d say most real numbers are random. A mildly persuasive and (very) mildly technical argument is given in his book Meta Math!: The Quest for Omega. (Omega is algorithmically random.)
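For reference, Ω is the halting probability of a prefix-free universal machine U:

```latex
\Omega_U = \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

where |p| is the bit-length of program p. Knowing the first n bits of Ω would let you settle the halting problem for every program up to n bits long, which is why its digits are algorithmically random.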

Either “never”, or a negative number. Given what we know of quantum mechanics, for radioactive decay to be predictable, it would have to be governed by nonlocal hidden variables of some sort, and we’d have to have a way to observe those variables. Being able to observe nonlocal variables would require technology which could also be used to build a time machine. Which, of course, renders questions like “How many years” somewhat tangled.

Though there is the niggle that Omega is a relative notion; just which number “Omega” is depends on the programming language (or other language for specifying numbers) you’re talking about…

Then again, this is to be expected; “random” is just as relative a notion as well.

Weird fact: while you are correct, it is always the first random number larger than the smallest uninteresting number.

Hell, I’ve found it already then. Multiple times even!

I’ve had a few classes where my test grades met that criterion!