How do they compute Pi to a trillion digits?

Nobody does.

It’s correct to say that if you examine the digits of pi statistically, they look the same as a sequence of random numbers. Of course, the value of pi itself is not random; it has a perfectly deterministic definition.

As for what qualifies as “random,” well, there are purely deterministic pseudorandom number generators that can generate sequences that are, as long as you don’t draw too many of them, statistically indistinguishable from “real” random numbers such as those derived from the timing of radioactive decay events. The highly technical computer science term for this is “random enough.” :slight_smile:
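As a toy illustration (nothing cryptographically serious), here’s a minimal sketch of one of the oldest deterministic PRNGs, a linear congruential generator; the function name is mine, and the constants are the common “Numerical Recipes” choices:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x -> (a*x + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m          # scale to [0, 1)

gen = lcg(42)
print([round(next(gen), 4) for _ in range(5)])   # same seed -> same "random" stream, every time
```

Run it twice with the same seed and you get the identical stream, which is the whole point: completely deterministic, yet random-looking for many statistical purposes.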

There are some similar formulae for non-binary bases. From this page it appears that arbitrary digits of π² can be found for base 3 and for the irrational base Φ. :smack:

To build some intuition: a sequence (of digits, say) is random if there is no rule or law structuring it – i.e. if there is no way, knowing the first (n-1) digits, to predict the nth digit with better than chance accuracy. In particular, such a sequence isn’t compressible: there is no shorter form that some computing machine could ‘unpack’ into the full sequence. But any computer program reproducing the sequence would be such a compression, provided it is shorter than the sequence itself (of course, there’s always trivially a program about as long as the sequence, something like ‘print “sequence”’). Hence, no such program can exist. This means in particular that an infinite random sequence can’t be produced by any finite computer program.
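Pi’s digits are the textbook example of why “looks random” and “is random” come apart here: a very short program can spit out as many digits as you like, so the sequence is enormously compressible. A sketch using Gibbons’ unbounded spigot algorithm (Python just as the illustration language):

```python
def pi_digits():
    """Gibbons' unbounded spigot algorithm: yields the decimal digits of pi one at a time."""
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4*q + r - t < n*t:
            yield n            # n is the next confirmed digit
            q, r, t, k, n, l = 10*q, 10*(r - n*t), t, k, (10*(3*q + r)) // t - 10*n, l
        else:                  # not enough information yet; consume another series term
            q, r, t, k, n, l = q*k, (2*q + r)*l, t*l, k + 1, (q*(7*k + 2) + r*l) // (t*l), l + 2

gen = pi_digits()
print(''.join(str(next(gen)) for _ in range(20)))   # 31415926535897932384
```

A dozen lines that generate the digits forever: that’s exactly the kind of compression a genuinely random sequence cannot have.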

Or, as John von Neumann put it: “Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.” (John von Neumann - Wikiquote)

After relentlessly nitpicking Exapno above, I feel it’d be unfair if I didn’t pick on this a little, too (:p): the most common definition of ‘random’ is, informally, what I said above – that in a random sequence there is no way to predict the next digit, or that a random sequence is incompressible. There are several ways this can be formalized, such as algorithmic randomness or Martin-Löf randomness, but the intriguing thing is that they all seem to agree, yielding sequences with the same properties we’d intuitively expect of ‘randomness’; this has been dubbed the Martin-Löf-Chaitin thesis, in analogy with the notion of computability being formalized by different but equivalent frameworks (Turing machines, recursive functions, the lambda calculus…), known as the Church-Turing thesis. So if one believes we have a good notion of computability, one should also believe the same of randomness.

Machin-like formulae yield the entire decimal expansion of pi exactly.

What is special about 239 is this: (5 + i)^4 * (-239 + i) = -114244 * (1 + i), where i denotes rotation by 1/4 of a complete revolution (which has the nice property that i^2 = -1). This implies that 4 * arccot(5) + arccot(-239) = arccot(1) = pi/4 radians.
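You can check both halves of that with a couple of lines of Python (just a sanity check, not part of any serious pi computation):

```python
import math

# (5 + i)^4 * (-239 + i) should equal -114244 * (1 + i)
print((5 + 1j)**4 * (-239 + 1j))            # (-114244-114244j)

# and the corresponding angle identity: 4*arccot(5) + arccot(-239) = pi/4
print(4*math.atan(1/5) - math.atan(1/239))  # 0.7853981633974483
print(math.pi / 4)                          # 0.7853981633974483
```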

Combining this with a method to efficiently compute arccot, we obtain a method of computing pi from the arccots of 5 and 239 [which turns out to be more efficient than computing arccot(1) directly].
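Here’s a rough sketch of what that looks like with arbitrary-precision integers (fixed-point arithmetic; the function names are mine, and the guard-digit count is only roughly tuned):

```python
def arctan_inv(x, digits):
    """arctan(1/x), scaled by 10**digits, as an integer, via the Taylor series."""
    scale = 10 ** (digits + 10)        # 10 extra guard digits against truncation error
    power = scale // x                 # scale * x^-(2k+1), starting at k = 0
    total = power
    k, sign = 1, -1
    while power:
        power //= x * x                # next odd power of 1/x
        total += sign * (power // (2*k + 1))
        sign, k = -sign, k + 1
    return total

def machin_pi(digits):
    """pi * 10**digits as an integer, via Machin's formula pi/4 = 4*arctan(1/5) - arctan(1/239)."""
    scaled = 4 * (4 * arctan_inv(5, digits) - arctan_inv(239, digits))
    return scaled // 10 ** 10          # drop the guard digits

print(machin_pi(50))   # 31415926535897932384626433832795028841971693993751
```

The reason arccot(5) and arccot(239) beat arccot(1) is convergence speed: the series terms shrink by a factor of 25 (or 239²) each time, instead of crawling along like the Leibniz series for arctan(1).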

You seem to be under the impression that a computer’s bit size limits the size of numbers it can work with.

It doesn’t.

You can go beyond that by splitting the number into parts. For example, an 8-bit computer could handle 64-bit numbers by working on them 8 bits at a time. Yes, there would be an efficiency hit compared to a 64-bit computer that could do it in one go, but it’s completely possible, and not even very difficult.
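A sketch of the idea for addition (Python here only as pseudocode-with-a-checker; a real 8-bit machine would do the same thing in assembly with an add-with-carry instruction, and the names are mine):

```python
def add_big(a, b, base=256):
    """Add two non-negative numbers stored as lists of 8-bit 'limbs',
    least-significant limb first -- the way an 8-bit CPU would see them."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = (a[i] if i < len(a) else 0) + (b[i] if i < len(b) else 0) + carry
        out.append(s % base)   # the low 8 bits stay in this limb
        carry = s // base      # overflow carries into the next limb
    if carry:
        out.append(carry)
    return out

# 1000 + 500 = 1500, in little-endian base-256 limbs:
print(add_big([232, 3], [244, 1]))   # [220, 5]  ->  220 + 5*256 = 1500
```

This is exactly how arbitrary-precision (“bignum”) libraries work, just with word-sized limbs and much cleverer multiplication.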

Right. A computer’s “bit size”, in the sense under discussion, only describes the kind of integers it has specialized hardware to do arithmetic on very quickly.

But you don’t need specialized hardware to do arithmetic. You can always describe the algorithms for addition, multiplication, and what have you in software. Even if a computer had no arithmetic unit whatsoever on its CPU (an effective “bit size” of 0), it could still do all the calculations you like. It would just be rather slow about it.
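For instance, even addition itself can be built out of nothing but bitwise logic, which is roughly all a CPU with no adder would give you. A toy version for non-negative integers (name is mine):

```python
def add_no_adder(a, b):
    """Addition using only XOR, AND, and shifts (non-negative ints only):
    XOR gives the per-bit sum, AND shifted left gives the carries."""
    while b:
        a, b = a ^ b, (a & b) << 1
    return a

print(add_no_adder(1000, 500))   # 1500
```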

If computers could only do what they had built into their hardware as primitive operations, they wouldn’t be much use to anyone.

For that matter, the Chudnovsky series yields pi exactly as well, though I don’t actually know the details of deriving it or what’s special about 545140134.
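For reference (transcribing the standard form of the series rather than deriving it, so treat this as a sketch), this is how the Chudnovsky series gets used in practice; 545140134 is the coefficient of k in the numerator, and each term adds roughly 14 correct digits:

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Chudnovsky series:
      1/pi = 12 * sum_{k>=0} (-1)^k (6k)! (545140134k + 13591409)
                             / ((3k)! (k!)^3 640320^(3k + 3/2))
    rearranged as pi = 426880*sqrt(10005) / sum_k M_k*L_k/X_k."""
    getcontext().prec = digits + 10               # working precision with guard digits
    C = 426880 * Decimal(10005).sqrt()            # = 640320^(3/2) / 12
    M, L, X, K = 1, 13591409, 1, 6                # factorial ratio, linear term, power of -640320^3
    S = Decimal(L)
    for k in range(1, digits // 14 + 2):          # ~14 digits per term
        M = M * (K**3 - 16*K) // k**3             # exact integer update of (6k)!/((3k)!(k!)^3)
        L += 545140134
        X *= -262537412640768000                  # -640320^3
        S += Decimal(M * L) / X
        K += 12
    return +(C / S)                               # unary + rounds to the working precision

print(str(chudnovsky_pi(50))[:52])                # 3.14159265358979323846...
```

The record-setting computations use essentially this series, plus binary splitting and FFT-based multiplication to keep the huge-integer arithmetic fast.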