I like to think big. Really really big. I like to think about 7 trillion trillion trillion trillion trillion (etc.) years into the future when all matter has been isolated in black holes, stray atoms are rare, and John Tesh is popular (yes, it’s a bleak future).
Well, I got to thinking. A googol (not the search engine, silly people!) is a one followed by a hundred zeros (10 to the power of 100). A googolplex is a one followed by a googol of zeros (or is it “followed by a googol zeros”?). This page happily mentions that, assuming a conservative estimate of the rate of improvement of computer processors (Moore’s Law), it would take 564 years before computers could “write out” a googolplex in less than four years.
Anyways, I got curious (since I have a sick curiosity about the INSANELY LARGE). How much longer - assuming the same Moore’s Law progression as that webpage - would it be before computers could do that work in one second? For the sake of the thought experiment, ignore realistic constraints, like the fact that we may currently have no idea how to make a computer this fast (or maybe we do… are quantum computers theorized to be capable of achieving such speeds?).
I’m sure that there’s some really simple math equation to figure out the answer, but it’s 3:30 AM, and what little math I do know is eluding me at the moment…
Oh, crud… the title makes it seem like I’m asking how fast a machine one would need to physically print out all the zeros of a googolplex on paper… I’m not. I’m only referring to the computer writing out the entire number in its own system.
Oh, and assume sufficient hard drive space to hold a googolplex in memory…
I might be wrong here, but I have a childhood memory of Carl Sagan saying that if you wanted to write out a googolplex you’d fill up the known universe with paper. This was during an episode of Cosmos. This is a childhood memory and is a bit hazy, so someone please correct me if I am wrong.
Make of this what you will, but it seems to me that your desired calculation would be approaching the asymptotic limits of Moore’s Law, if it is even possible.
I don’t know enough about how computers work to know how processor speed relates to the speed of writing things down. But I’ll assume that Moore’s Law predicts that after eighteen months, a computer can “write” a googolplex in half the time. Also assume the OP’s cite is right that 564 years from now the process will take four years. Since four years is about 2[sup]27[/sup] seconds, it’ll take 27 halvings × 18 months ≈ 40 years before they get it down to one second. So, 604 years from now.
Also, extrapolating backward, this would mean that today, it would take 61 trillion googol years. (Here’s where I know something’s wrong, since it doesn’t take 61 trillion years to write out a single digit.)
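If anyone wants to check that arithmetic, here’s a quick Python sketch (the 365.25-day year and the 18-month doubling are my own assumptions, not from the cite):

[code]
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600
four_years_s = 4 * SECONDS_PER_YEAR             # ~1.26e8 s, close to 2**27

# Forward: halvings needed to go from four years down to one second.
halvings = math.log2(four_years_s)              # ~26.9
print(halvings * 1.5)                           # ~40 more years -> ~604 total

# Backward: undo 564 years of doublings to estimate "today's" runtime.
today_s = four_years_s * 2 ** (564 / 1.5)       # 376 doublings
print(today_s / SECONDS_PER_YEAR / 1e100)       # ~6.2e13 googol years
[/code]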
It depends on what you mean by “writing out the number in its own system”.
It only takes a few bytes and even fewer milliseconds to store the string “1e(1e100)” into memory. It’d be a fairly inefficient computer that opens a text file and starts writing a 1 and a near-endless string of 0’s to represent a number.
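To make that concrete, here’s a trivial Python sketch (my own illustration, not anyone’s actual representation):

[code]
# A googol itself is easy: Python bignums expand its 101 digits instantly.
googol = 10 ** 100
print(len(str(googol)))            # 101

# A googolplex stored symbolically, as (base, exponent): a few dozen
# bytes, versus the googol + 1 characters of the expanded decimal string.
googolplex_symbolic = (10, googol)
[/code]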
A hammerbank printer (a high-speed line matrix printer for industrial applications) is tailor-made for your spec and prints out one entire line at a time in, say, one tenth of a second. A googolplex written out is a 1 followed by a googol zeros, so if we had a hammerbank printer whose platen held the whole thing at 1/8 inch per character, the platen would be about a googol/8 inches long. Add a parsec of width on each end for the mechanisms and there you go: 1/10th of a second. All the rest is engineering.
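For scale, a back-of-the-envelope check in Python (my own numbers; it also bears out the Sagan memory above):

[code]
INCH_M = 0.0254
LIGHT_YEAR_M = 9.4607e15
OBSERVABLE_UNIVERSE_M = 8.8e26            # diameter, roughly

platen_m = (10 ** 100 / 8) * INCH_M       # ~3.2e97 m
print(platen_m / LIGHT_YEAR_M)            # ~3.4e81 light-years
print(platen_m / OBSERVABLE_UNIVERSE_M)   # ~3.6e70 observable universes
[/code]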
To write out a googolplex in one second, you’d have to write out a googol digits in one second. For this argument, I’m going to neglect energy/entropy effects and focus solely on the size of the machine required.
Quantum mechanics defines the Planck length as the smallest meaningful length. The time it takes a photon to cross this length at light-speed is called the Planck time, and is equal to about 10[sup]-43[/sup] second. In other words, 1 second is equal to 10[sup]43[/sup] Planck times.
This means that any single element of the processor could only handle 10[sup]43[/sup] operations per second… you’re going to need a massively parallel computer if you hope to get anywhere. To write a googol (10[sup]100[/sup]) digits per second at the blazing speed of 10[sup]34[/sup] GHz per element, you’d need about 10[sup]100[/sup]/10[sup]43[/sup] = 10[sup]57[/sup] single-bit “processors”. If every processor were a single electron with a rest mass of 9×10[sup]-31[/sup] kg, the computer would weigh about 10[sup]27[/sup] kg. That’s about as massive as the planet Jupiter.
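The same estimate in a few lines of Python (the constants are textbook values I’m plugging in myself):

[code]
PLANCK_OPS_PER_S = 1e43                        # one op per Planck time per element
DIGITS_PER_S = 1e100                           # a googol digits in one second
ELECTRON_MASS_KG = 9.109e-31
JUPITER_MASS_KG = 1.898e27

processors = DIGITS_PER_S / PLANCK_OPS_PER_S   # 1e57 single-bit processors
mass_kg = processors * ELECTRON_MASS_KG        # ~9e26 kg
print(mass_kg / JUPITER_MASS_KG)               # ~0.5 -- Jupiter-class, anyway
[/code]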
Up to this point, the computer’s just been writing bits to /dev/null; we haven’t stored a single digit. Naysayers tell you that to do otherwise would be impossible; after all, there are only about 10[sup]80[/sup] protons, neutrons, and electrons in the universe. I see this as a challenge. Considering that every digit you write (save one) will be a zero, why not compress in real time? Sure, it might require another dedicated computer the size of a minor star, but you’d be able to fit your output on any modern hard disk. Heck, you could even fire up your old Commodore and write it to tape.
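A toy version of that real-time compression (my own sketch; run-length encoding is the obvious choice here):

[code]
# The entire googolplex as two (digit, run_length) pairs: a 1 followed
# by a googol zeros. This fits on any modern hard disk -- or a tape.
googol = 10 ** 100
compressed = [("1", 1), ("0", googol)]
print(compressed[1][1] == googol)   # True

# Decompressing in one second is where the Jupiter-sized machine comes in.
[/code]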
You’ve got to love the work people do here. I’m kind of that way too, but I wouldn’t have taken the time to figure out the absolute minimum size of a computer that can write a googolplex in one second. The fact that a computer composed of electron processors would be the size of Jupiter is just astounding, and it really makes me love the SDMB.