I was at a conference last week, and during one otherwise very good session, a factoid caught my eye. It's floating around the Internet; the version I keep seeing is the claim that, according to "experts," human knowledge will soon be doubling every 12 hours.
In the conference and in other sources, these “experts” are identified as “IBM.”
It’s a fun factoid. But I just taught my kids a lesson about the power of doubling, and I got lost in the math on this.
If we take the total sum of human knowledge on January 1, 2020, then a year and a half later it will have doubled a little more than a thousand times: two doublings a day for roughly 550 days. Not "will be a thousand times more," but "will have doubled" a little more than a thousand times.
With the help of Google, I found that 2 to the power of 1,000 is an integer with a little more than 300 digits. The total number of people on Earth has ten digits, meaning that over the next year and a half, each of us, including infants and those with advanced dementia, is responsible for generating more than a googol times the total sum of human knowledge that currently exists.
More than that: every single atom in the universe must store more than a googol times the total human knowledge that currently exists.
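If it helps to see the arithmetic, here is the same back-of-the-envelope calculation in Python. My assumptions: one doubling every 12 hours, roughly 8 billion people, and the common order-of-magnitude estimate of 10^80 atoms in the observable universe.

```python
from math import log10

# One doubling every 12 hours for a year and a half:
doublings = int(365 * 1.5 * 2)             # ~1,095; I rounded down to 1,000 above

# Digits in 2**1000 (my round number) and in the full 2**1095:
print(int(1000 * log10(2)) + 1)            # 302 digits
print(int(doublings * log10(2)) + 1)       # 330 digits

# Spread over ~8 billion people (a 10-digit number) or ~10^80 atoms,
# each share is still far more than a googol (10^100) times today's knowledge:
print(1000 * log10(2) - 10)                # per person: about 10^291
print(1000 * log10(2) - 80)                # per atom:   about 10^221
```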
Am I misunderstanding the factoid, or is it just patent nonsense?
I wonder if knowledge is not just about the number of facts but also about how many people know them. If I discover something new, that's "1 knowledge." If I tell that fact to another person, that's "2 knowledge."
I've read that the amount of information created in 2024 was 147 followed by 21 zeros bytes (147 zettabytes), and that the amount created each year from now on is expected to increase by 23%. No, I don't have a definition for information. No, I don't know how this is calculated. Try looking it up: Google the sentence "The amount of information created in 2024 will be 147 followed by 21 zeros bytes and it increases by 23% each year" and look at what comes up.
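Just taking those two figures at face value (147 zettabytes in 2024 and 23% compound growth), the projection would look something like this; purely illustrative, since I still can't say what "information" means here.

```python
# Compound 23% annual growth on the quoted 2024 figure.
volume = 147e21                      # 147 followed by 21 zeros, in bytes
for year in range(2024, 2031):
    print(year, f"{volume:.2e} bytes")
    volume *= 1.23
```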
I see where the OP is coming from. While I generally agree it is increasing rapidly, the numbers in the quote are not trustworthy.
Let's say the 1982 statement is approximately true, and knowledge has been doubling every year for 43 years. 2^43 is nearly 9,000,000,000,000, a factor of almost 9 trillion. If you assume a reasonable amount of knowledge existing in 1982, you are going to get a scary big number. Where is all that being stored? Forget brains; you have to be talking computer storage. There's about 150 zettabytes (a zettabyte is 10^21 bytes) of total computer storage out there. Working backwards, that would limit the 1982 knowledge to roughly 17 * 10^9 bytes, i.e., about 17 gigabytes. Um, no. That's a rack of old 9-track tapes.
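Rough numbers in Python, assuming one doubling per year since 1982 and the ~150 zettabytes of total storage:

```python
doublings = 2025 - 1982                  # 43 annual doublings since 1982
growth = 2 ** doublings                  # ~8.8e12, i.e. nearly 9 trillion
total_storage = 150e21                   # ~150 zettabytes of computer storage, in bytes

implied_1982_total = total_storage / growth
print(f"{growth:.2e}")                   # 8.80e+12
print(f"{implied_1982_total:.2e} bytes") # ~1.71e+10 bytes, roughly 17 GB
```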
And that assumes the doubling rate stayed flat at once a year, rather than accelerating.
The doubling every 12 hours, per the OP's calculation, is a number made up by someone who doesn't know arithmetic.
(FWIW, creating knowledge is basically a surefire way to increase entropy. Hence, the SDMB is not helping, entropy-wise.)
In his 1982 book Critical Path, futurist and inventor R. Buckminster Fuller estimated that if we took all the knowledge that mankind had accumulated and transmitted by the year One CE as equal to one unit of information, it probably took about 1500 years or until the sixteenth century for that amount of knowledge to double. The next doubling of knowledge from two to four ‘knowledge units’ took only 250 years, till about 1750 CE. By 1900, one hundred and fifty years later, knowledge had doubled again to 8 units. The speed at which information doubled was getting faster and faster. The doubling speed is now between one and two years.
That form of doubling curve probably made sense in 1982. No doubt knowledge, in the form of printed information, doubled and re-doubled after mass printing techniques became common. Then computers made the collection of vastly increased data possible. That information now includes all the background data that devices are collecting on us every time we use them.
Experts do not now estimate that knowledge doubles every 12 hours. From various cites, that claim seems to be a casual prediction about 2020 made by a firm that wanted to sell computers, in 2013 or perhaps earlier.
I have a copy of Buckminster Fuller's Dymaxion World. It is half biography, half catalog of his work. I have read it many times. I have a copy of Fuller's Operating Manual for Spaceship Earth. I haven't read it yet.
Fuller's vision was not always practical, but it was wonderful and awe-inspiring.
Thank you, that cite is really helpful! It does make me wonder whether his estimates were reality-based or just pulled out of his butt; and if they were based on evidence, what did that evidence look like?
The estimate for prehistory and early history seems somewhat plausible, given the need for the knowledge to be "accumulated and transmitted": that sounds like either a clear oral tradition or a written one. But I still don't know how one would quantify that.
Do we count AI-generated data as “knowledge”? If so, it’s probably going to increase exponentially both as more humans use AIs and more AIs themselves use other AIs, generating infinite permutations of texts/music/images/movies/games/whatever data, if only to capture every last $0.0001 of SEO value from every possible Google query.
It’s only a matter of time before an AI monkey accidentally eats a few stars and recreates the SDMB almost perfectly, except with Right_Hand_of_Dorkness asking this question…
While this seems possible, I think an additive increase is much likelier over the long term. If we have X data today, we're more likely to use AI to add another X every year than to double every year. And certainly if we're talking about every 12 hours, the quality of the information is irrelevant: there aren't enough atoms in the universe to support doubling beyond a few months.
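To put a number on that last point, assuming the common ~10^80 estimate for atoms in the observable universe and at least one atom per unit of stored information:

```python
from math import log2

atoms = 10 ** 80                 # rough count of atoms in the observable universe
max_doublings = log2(atoms)      # ~266 doublings before you need more atoms than exist
days = max_doublings / 2         # two doublings per day at one every 12 hours
print(max_doublings, days)       # ~265.75 doublings, ~133 days: a little over four months
```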
It wouldn’t necessarily be one-atom-one-[unit of information], though, would it? Presumably you could encode data and algorithms into more efficient arrangements of matter and energy and time and such?
Insofar as a human can produce more usable information than a rock of similar size (depending on the specific individual, I suppose), I don’t know how connected physical volume & density are to information storage potential.
Who is this "we" you are talking about? I specifically mentioned 9-track tapes, in a quantity that even the grad school I went to had in its CS department machine room before 1982. (For some stupid reason our tapes were actually 8-track, not the music format, making the tape I still have with my thesis on it purely an object of curiosity.) The university's own computer center had many times that.