How many digits of pi do we need? NASA typically uses 16, presumably because that's what IEEE 754 double precision carries (about 15–16 significant decimal digits).
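A quick way to see where a double runs out, as a Python sketch (mpmath is just my choice for the high-precision reference, not anything NASA uses):

```python
import math
from mpmath import mp

mp.dps = 50                    # 50 decimal digits of working precision
reference = mp.pi

print(f"{math.pi:.25f}")       # the double goes wrong after ~16 digits
print(mp.nstr(reference, 26))  # high-precision reference for comparison
print(mp.nstr(abs(mp.mpf(math.pi) - reference), 3))  # difference ~1.2e-16
```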
> For example, the Voyager 1 probe, which is now in interstellar space, is currently more than 15 billion miles (24 billion km) from Earth. If you wanted to calculate the circumference of a circle with this distance as the radius, the difference between using the first 16 digits of pi compared with hundreds of digits would be less than the width of a little finger, according to NASA.
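You can sanity-check that claim with nothing but circumference = 2·pi·r, so the error from a sloppy pi is 2·r·|delta pi| (a sketch; the 24-billion-km radius comes from the quote above):

```python
from mpmath import mp, mpf

mp.dps = 60                         # plenty of working precision
r = mpf("24e9") * 1000              # 24 billion km, in meters

pi_16 = mpf("3.141592653589793")    # pi cut to 16 significant digits
error = 2 * r * abs(mp.pi - pi_16)  # circumference error = 2*r*|delta pi|

print(mp.nstr(error, 2), "m")       # ~0.011 m: little-finger territory
```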
The maximum practical number of digits required is 38 according to the article, which I’ll round up to 40:
> For example, if you wanted to calculate the circumference of a circle that encapsulated the known universe, the radius of that circle would be around 46 billion light-years — the distance light has traveled since the Big Bang when factoring in the expansion of the universe. In this case, you would need 38 decimals of pi to get a value with the same level of accuracy with which we can currently measure the width of an atom, according to NASA.
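Same back-of-the-envelope check at universe scale (rounded constants; 46 billion light-years works out to roughly 4.35 × 10^26 m):

```python
from mpmath import mp, mpf

mp.dps = 60
light_year = mpf("9.4607e15")     # meters per light-year
r = mpf("46e9") * light_year      # ~4.35e26 m

pi_38 = mpf("3.1415926535897932384626433832795028841")  # 38 significant digits
error = 2 * r * abs(mp.pi - pi_38)

print(mp.nstr(error, 2), "m")     # ~8e-11 m, about a hydrogen atom's width
```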
An atom? What's that in Planck units? About 10^25 Planck lengths, according to Joe Blow on the internet at Quora, and that checks out: an atom is roughly 10^-10 m across and the Planck length is about 1.6 × 10^-35 m. So 38 digits for atom-width accuracy plus about 25 more puts Planck-length accuracy at around 63 digits. I say we should keep 100 digits of pi, just to be safe. Though 70 would be fine.
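That digit count as a two-liner, using the usual textbook values for the radius and the Planck length (my numbers, not the article's):

```python
import math

r = 4.35e26          # radius of the observable universe, meters
planck = 1.616e-35   # Planck length, meters

# Rounding pi to n significant digits leaves an error of at most ~10**(1 - n),
# so the circumference error is ~2*r*10**(1 - n); find the n that pushes it
# below one Planck length:
n = math.ceil(1 + math.log10(2 * r / planck))
print(n)             # 63, so 70 digits is comfortable and 100 is very safe
```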
Citation:
So… what the heck are we doing past, say, 200?
ETA: The gang in charge of calculating the fundamental constants of the universe uses the quadruple precision representation of pi, which carries about 34 significant digits. Cite. Also:
> Pi computation can be used to test computer precision, but I think this is a symptom of pi-mania rather than a legitimate need for pi. Other numbers could be used just as meaningfully, but we choose to use pi.
Not sure I buy the author's POV: I mean, using something well understood, widely recognized, and epic as your test number seems perfectly sensible.
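For what it's worth, here are both ideas in one sketch: compute pi independently via Machin's formula at quad-ish precision, then compare against mpmath's built-in value. It's a consistency check on the arithmetic rather than a real hardware test, and mpmath is my pick, not anything from the cites above:

```python
from mpmath import mp, mpf, atan

mp.dps = 40  # a bit beyond quad precision's ~34 significant digits

# Machin's formula: pi = 16*atan(1/5) - 4*atan(1/239)
pi_machin = 16 * atan(mpf(1) / 5) - 4 * atan(mpf(1) / 239)

print(mp.nstr(pi_machin, 34))              # pi to quad-precision digits
print(mp.nstr(abs(pi_machin - mp.pi), 2))  # tiny (~1e-39): they agree
```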