How many bytes are there?

Suppose some catastrophe were headed toward Earth, and the only way to survive it would be to store some vast amount of data and reconstruct ourselves Star Trek replicator-style after the disaster. If we pooled together every hard drive, RAM stick, floppy, CD-R, DVD-R, flash RAM chip, tape backup, Zip drive, Jaz drive, and any other kind of electronic storage device on the planet, how many millions of exabytes would we have?

Anyone got an educated guess?

Can’t answer the OP, but I’ve read that if the entire Universe were a computer, it wouldn’t be big enough to forecast (accurately) the weather just on this planet.

(The Matter Myth, Davies and Gribbin, Simon & Schuster)


How many hard drives, RAM sticks, floppies, CD-Rs, DVD-Rs, flash RAM chips, tape backups, Zip drives, Jaz drives, and other electronic storage devices are there on the planet, and what is the capacity of each and every one?
The OP calls for speculation and/or opinion. It is not a factual question, nor a realistic one, and should be consigned to the dustbin… or?

A COUNTER QUESTION: Suppose the moon were made of green cheese. A colony of six male and six female mice are placed thereon. How long will it take for the colony and its progeny to eat themselves out of “house and home”?

Well, we can make a reasonable guesstimate. Restricting it to hard-drive storage alone, the revenues and unit sales of the major manufacturers are publicly available. I don’t have recent data, but a Scientific American article from 2000 claims that as of 1998 total revenue was $30bn at 4¢/MB, or 680 PB (680,000,000 GB). Extrapolating from that point leads to around $70bn at around $1/GB, or 70 EB of storage sold per year. Given that HD sales grow exponentially with a doubling time of about 12 months, all the earlier years’ sales sum to roughly one more current year’s worth (70 + 35 + 17.5 + … = 140), so you can expect about 140 EB of HD space to exist around the world.
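A minimal sketch of that extrapolation, if anyone wants to fiddle with the figures (the revenue and price numbers are the ones quoted above, not measurements of mine):

```python
# Back-of-envelope worldwide hard-drive capacity.  The factor of two
# comes from summing a sales curve that doubles every 12 months:
# 70 + 35 + 17.5 + ... converges to 140.
annual_revenue_usd = 70e9        # ~$70bn/yr in HD sales (extrapolated)
price_per_gb_usd = 1.0           # ~$1/GB (extrapolated)

sold_this_year_gb = annual_revenue_usd / price_per_gb_usd  # 7e10 GB = 70 EB
total_installed_gb = 2 * sold_this_year_gb                 # geometric series

print(total_installed_gb / 1e9, "EB")   # 140.0 EB
```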

Space on CD-R and DVD-R is harder to quantify, since it depends on the number of unburnt CDs around. All other forms of storage would be trivial compared to those two, IMHO.

I disagree. Simply disregard the speculative part and the following is the factual question:

moderator GQ

a) Well, to start with, the matter that directly constitutes all those hard drives and CD-ROMs and DVD-ROMs and other storage media would take up way, way, way more storage space to encode their physical material states in binary than they themselves have available, so if you want to store everything in that fashion, and not just the humans, you’ve got one heck of a bootstrap problem.

b) Ignoring that, there’s a basic “where you gonna store it, bub?” question that arises from the impending loss of the world’s supply of hard drives and CD-ROMs and whatnot along with the rest of the world. Unless you can get Cosmic FedEx to transport all this storage media to a safe planetoid before the cataclysm, storing everything in that fashion isn’t going to help you much, now is it?

c) I seem to recall an IBM experiment in which a subatomic particle’s relevant information was beamed to a receiver which was capable of recreating an identical subatomic particle based on that information. Here’s the Master’s discussion thereof. Cecil says:

I think 10 to the 32nd power is a hundred nonillion. Divide a hundred nonillion by eight to convert bits to bytes and you’re looking at 12 nonillion 500 octillion bytes. Convert bytes to exabytes — umm, will you forgive me if I use the decimal approximation here? — an exabyte appears to be 10^18 bytes, so we’ll just rip 18 zeros off the end. Looks like twelve and a half trillion exabytes per person, if my math is worth a damn, and if the resolution Braunstein describes as “fairly coarse” is sufficient to restore them from backup. And as Cecil points out in the above-referenced article, that’s going to put your fastest FireWire 800 card through its paces for a very, very prolonged performance test.
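The chain of conversions is easy to check with integer arithmetic (the 10^32-bit figure is Braunstein’s “fairly coarse” per-person estimate from the quoted column; the rest is straight unit conversion):

```python
# Bits -> bytes -> exabytes, per person.
bits_per_person = 10**32
bytes_per_person = bits_per_person // 8            # 1.25e31 bytes
exabytes_per_person = bytes_per_person // 10**18   # "rip 18 zeros off"

print(exabytes_per_person)   # 12500000000000, i.e. 12.5 trillion exabytes
```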

Hope this helps.

This vast international project is put off indefinitely when one asks:

“MAC or PC?”

Not enough to fit all your computer games on.
(that always happens)

Can you elaborate on their arguments?

Precisely. Please forgive my bit of whimsy. What use I have for all this storage is irrelevant, and shall remain a secret evil scheme, loosely covered by the “Postapocalyptic Reinstallation” propaganda.

I’m not interested in the feasibility of transporters, or the theoretical information limit of the universe; just curious about the sheer bulk of storage that humans have thus far produced. Shalmanese is on the right track for what I’m looking for.

The percentage of CD-Rs and DVD-Rs that have been burned vs. unburned is probably not easy to answer, but I’ll settle for a total on those. (We’ll assume that I can perfect some handwaving process to restore burned CDs to a blank state by regenerating the dye layer.)

I misunderstood the question. You were asking how many exabytes we do have, and I misconstrued you as having asked how many exabytes you would need.
In other tangents:

II Gyan II:

I’m not Jake nor have I read the specific book that Jake references, but it’s the same point made by Ed Lorenz as long ago as 1963. Google “chaos” and “weather” and “butterfly effect” and “Lorenz”.

Essentially, even the very simplified equations of the sort that would be necessary are extremely dependent on the exact initial values of all the point measurements. If, between one measured point and another taken a meter away, one unmeasured butterfly flapped its wings and changed the air current ever so slightly, it would introduce a tiny fractional difference a few seconds later, which would make a bigger difference a few iterations beyond that, until by the end of the month you have calm where your computer model predicted a hurricane.

So to overpower that problem with the brute force of data, you’d have to have data tables for temperature, airspeed, humidity, etc., for every (let’s say) cubic millimeter of atmosphere and crust (perhaps necessarily all the way down to the Earth’s core), along with sunspot activity and info on every living creature and its activity and who knows what else —an overwhelming amount of sheer information. Then you’d have to calculate the effect that every single one of those variables would have on the corresponding value of each of the others a (let’s say) tenth of a second later, a massive computational feat.
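Just to put a number on “overwhelming”: a thin-shell estimate of the cubic-millimeter cells in the atmosphere alone (the 100 km thickness is my arbitrary cutoff, roughly the Kármán line; crust and core would only add to the total):

```python
import math

# Thin-shell approximation: surface area of Earth times shell thickness.
earth_radius_m = 6.371e6       # mean Earth radius
shell_thickness_m = 1.0e5      # ~100 km of atmosphere (assumption)

volume_m3 = 4 * math.pi * earth_radius_m**2 * shell_thickness_m
cells = volume_m3 * 1e9        # 1e9 cubic millimeters per cubic meter

print(f"{cells:.1e} cells")    # ~5.1e+28 sample points, before any physics
```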

In short, it’s no easier than predicting what an as-of-yet-unborn girl will eat for lunch in the year 2047 and whether or not she’ll put the fork down before reaching for the salt shaker.

Just to continue the tangent, AHunter3’s post seems to assume a strictly Newtonian or relativistic world, and ignores the randomness of quantum mechanics. What if a wing-flapping neuron in that butterfly’s brain is just not quite firing, and a carbon-14 atom decays and pushes it over the edge?

As for the real question, all I can say is “Lots!”

Oh, I’m aware of the concept of chaotic systems. Jake’s paraphrased statement says the Universe can’t be a computer because, even if it were one, it couldn’t predict Earth’s weather. I just want to know the chain of reasoning leading to the conclusion that the Universe couldn’t be precise and accurate enough to predict the weather.

rjk, as I understand it, the formalism of QM does not definitively rule out determinism at the fundamental level, as illustrated by the Bohm interpretation or Nobel laureate ’t Hooft’s beables and changeables.

There’s only 256 different bytes. I can just write 'em all down.
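For the literal-minded, one line really does write them all down (Python here purely for illustration):

```python
# Every possible byte value, "written down" exactly once.
all_bytes = bytes(range(256))
print(len(all_bytes))   # 256
```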

This thread seems to have drifted to the computer needed for the task.

So if you want the ultimate computer to handle ALL of your computing needs for a long time…

[url=]Qubit Computer[/url]

An article on the sum total of human knowledge can be found here.

That’s just a portion of the amount of data you’d need to store.

Here’s a lecture related to the topic. Its title is “Everything Digital: Converting the World in 2 Exabytes”.