What is the largest number arithmetic has been done on?

There are lots of stupidly big numbers discussed in maths, e.g. Graham's number and TREE(3). These numbers have been used in proofs (in the case of Graham's number) but are far too large to do arithmetic on, or even to represent as an approximation in scientific notation.

So what is the largest number that has actually been used in arithmetic? (i.e. has been multiplied by two, had one added to it, etc.)

I’d actually be interested to hear the answer both for the largest magnitude and for the most actual digits used (is there ever a situation where there is an advantage to doing arithmetic on a bazillion and one significant figures rather than just a bazillion significant figures?)

At a guess, it would be a number used in the search for primes. Not the largest prime found, but the largest non-prime factorised.
This relates closely to the search for Mersenne primes. Candidates are of the form 2^{n} - 1, so the largest one of these that has been attempted, whether prime or not, is very likely a contender for largest number crunched.
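
For a sense of what “crunching” those candidates actually involves: as far as I know the heavy lifting is the Lucas–Lehmer test, which repeatedly squares a p-bit-ish number and reduces it mod 2^{p} - 1, so the arithmetic really is being performed on numbers of that size. A minimal sketch in Python (the exponent list at the bottom is just the small known cases, for illustration):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: 2**p - 1 is prime iff s_(p-2) == 0, for odd prime p."""
    m = 2 ** p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m          # one huge squaring + reduction per step
    return s == 0

# Small known Mersenne prime exponents as a sanity check.
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 31) if lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19, 31]
```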

Would that likely be a Mersenne prime? (I suppose other primes can be bigger than a Mersenne… I guess they are just more difficult to find.)

I have read that Graham’s Number is the biggest ever used in a calculation.

TREE(3) is much bigger, but it is a progression and not really used as a calculation.

Then you get Rayo’s number, which I think is just for fun: finding the biggest number we can define that is not “anything you say, plus 1.”

Anything you say +2 ?

There has been this (IMHO silly) search for Mersenne primes ongoing for ages. In a previous life I was involved in a high-performance computing research group, and all the spare cycles on the machines got vacuumed up by a dopey academic who was making a name for himself finding them. There have been challenges with prizes to find them. So I suspect that they step out into the number space further than other prime searches.

I doubt that the totality of Graham’s Number has ever been used in a calculation. Or ever will be. There are calculations that provide bits of the number. But that isn’t the OP’s question.

For “most actual digits”, it’s probably pi, along with whatever intermediate numbers went into the calculation. The current record is about 100 trillion digits.

Mostly this ends up being memory-limited. While there are ways to compute pi without storing the whole thing in memory at once, those algorithms are not as efficient. You need many terabytes of storage to make it practical.

I think the notion of Graham’s Number is that it was a solution to a specific mathematical question (I have heard it explained and it still eludes me). An obscure question, but a fair one. And, importantly, it is finite. Ginormous, but finite. Not like pi.

TREE(3) is not like that. Nor is Rayo’s number. (Although they are also finite and ginormous.)

Touché :slight_smile:

It’s an upper bound for the following problem:

Take an n-dimensional hypercube and join every pair of vertices to form a complete graph on 2^{n} vertices, then colour each edge either red or blue. What is the smallest n for which every such colouring must contain a single-coloured complete subgraph on four coplanar vertices?

The upper bound is now much lower than the original bound Graham found, but still insanely large, especially since the best known lower bound is 13.

Though that’s not doing arithmetic on pi itself, is it? I thought those techniques treat it as a trillion one-digit numbers rather than as one number, 3.14…, which they do arithmetic on.

Maybe the OP would prefer something closer to common experience.

A deck of cards has 52 cards in it. So, there are 52! possible orderings. That is about 8*10^67 possibilities.
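
(Quick sanity check of that figure, as a throwaway sketch using Python’s arbitrary-precision integers:)

```python
import math

n = math.factorial(52)        # exact 52!, thanks to Python's big integers
print(len(str(n)))            # 68 digits
print(f"{float(n):.2e}")      # about 8.07e+67
print(n.bit_length())         # 226 bits, i.e. roughly four 64-bit words
```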

Someone did the math to give an idea of how big that number is.

Here is a video that shows how big that number is (which is waaaaay smaller than Graham’s number, which is smaller than TREE(3), which is smaller than Rayo’s number).

This is mind-blowing, and no need for higher-dimensional hypercubes… you get this with a pack of regular playing cards:

Though that is not that big. It’s only about a 226-bit number, so it fits exactly in just four 64-bit words on a computer.

Some algorithms can work a digit at a time, but I believe for the 100T digits they stored all of them while calculating.
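
(For the curious: one well-known digit-at-a-time trick is the BBP formula, which extracts hexadecimal, not decimal, digits of pi at an arbitrary position without computing the earlier ones. This is not how the records are set, just an illustration; a rough float-precision sketch, good for the first handful of hex digits at a given position:)

```python
# BBP formula: pi = sum_k 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
# The fractional part of 16^d * pi gives the hex digits just after position d.

def bbp_hex_digits(d, n=6):
    """Return n hex digits of pi starting just after hex position d."""
    def frac_series(j):
        # fractional part of sum_k 16^(d-k) / (8k + j)
        s = 0.0
        for k in range(d + 1):                       # big powers handled by modular exponentiation
            s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
        k = d + 1
        while True:                                  # small tail terms with k > d
            term = 16.0 ** (d - k) / (8 * k + j)
            if term < 1e-17:
                break
            s = (s + term) % 1.0
            k += 1
        return s

    x = (4 * frac_series(1) - 2 * frac_series(4)
         - frac_series(5) - frac_series(6)) % 1.0
    out = ""
    for _ in range(n):                               # peel off hex digits one at a time
        x *= 16
        digit = int(x)
        out += "0123456789abcdef"[digit]
        x -= digit
    return out

print(bbp_hex_digits(0))   # pi = 3.243f6a88... in hex, so this prints 243f6a
```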

Of course, they didn’t actually do anything with pi once computed, so arguably that wouldn’t count as having arithmetic done on it. But all the intermediate numbers leading up to it would count.

A while back I wrote a program that computes pi to however many digits you want (though at the time, about 1 million was all I had the patience for). I used this algorithm, which among other things requires a full-precision (i.e., with however many digits you want for pi) calculation of \sqrt{2}. It’s actually quite easy using Newton’s method: start with a low-precision approximation, and then average x and \frac{2}{x}. This doubles the number of correct digits on each iteration. You can compute the reciprocal of a number by iterating x' = 2x - Nx^2, which again doubles the number of digits on each iteration.
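
(Not the original program, but a minimal fixed-point sketch of those two iterations using Python’s big integers; every value is stored as an integer scaled by 10^DIGITS:)

```python
DIGITS = 50
SCALE = 10 ** DIGITS          # every value below is an integer scaled by this

def sqrt2_fixed():
    """sqrt(2) * SCALE via x' = (x + 2/x) / 2; correct digits roughly double per step."""
    x = 14142135623730951 * 10 ** (DIGITS - 16)      # ~16-digit seed
    for _ in range(10):                              # plenty of doublings for 50 digits
        x = (x + (2 * SCALE * SCALE) // x) // 2      # 2/x in fixed point is 2*SCALE**2 // x
    return x

def recip_fixed(n):
    """(1/n) * SCALE via x' = 2x - n*x^2; again doubles the correct digits per step."""
    x = int(SCALE / (n / SCALE))                     # low-precision float seed for 1/n
    for _ in range(10):
        x = 2 * x - (n * x * x) // (SCALE * SCALE)
    return x

print(sqrt2_fixed())             # 14142135623730950488... (sqrt(2) scaled by 10**50)
print(recip_fixed(3 * SCALE))    # 33333333333333333333... (1/3 scaled by 10**50)
```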

So even if pi doesn’t count, for at least some of these runs, intermediate numbers like \sqrt{2} would count. And they’re easy to compute quickly.

Oh, I almost forgot: for the 100T digits, the primary algorithm they used actually computes \frac{1}{\pi}, not \pi itself. They did a final reciprocal to get pi. That would definitely count as arithmetic performed on \frac{1}{\pi}.