There is no "Curve" in scientific progress.

Again, the curve of scientific knowledge looks exponential simply because it’s only recently that we’ve had the tools needed to really expand our knowledge.

For most of mankind’s history, we made virtually no technological progress, for several reasons. For one, without any permanent record-keeping system, knowledge could only be passed from generation to generation through apprenticeship. That kept it from spreading laterally through society: we rarely learned from each other, and eventually knowledge would simply be lost.

Another reason is that we didn’t get around much. Most people lived and died within a few miles of where they were born, so their experience and knowledge never traveled or took root elsewhere. There would be enclaves of people known for their great skill in one area or another, but as they died out, the skills were lost.

Another reason is that we were dirt poor, and people generally had no time for navel-gazing. They were too busy staving off starvation and fighting each other. High mortality meant that people died young, before they could attain true expertise in anything.

So the first great explosion of knowledge came when we began to travel great distances and pick up knowledge from other civilizations. We’d get the wheel from this bunch, sword-making from that one, textile skills from a third group, ceramics from a fourth.

Then we developed the printing press, and could finally record and mass-distribute our knowledge for future generations. But we still didn’t have the capability to build the ever-more-sophisticated tools needed to keep expanding our boundaries.

Then we hit the industrial revolution, and rapidly began to expand our capabilities. This allowed us to build more and better instruments, which helped us learn more, which in turn led to explosions in technologies like electricity, electronics, and computers. The computer age brought another huge explosion in knowledge as we leveraged the computing power of our machines to come up with completely new ways of discovering how the world works: simulations, models, brute-force experimental methods - things we couldn’t have done before without great effort.

Finally, we hit the information age. Now we have the internet to disseminate information and to create a ‘hive mind’ of sorts. That’s going to lead to another explosion in knowledge and ability, but how long that will last is not clear.

So that’s where we are today: the beneficiaries of a logical progression of new tools and new ways of collecting, storing, and sharing information. But the big question is, what next? Does this curve have to continue?

Why should it? We can already see some limiters on the horizon. For one, computer science simply has not kept up with advances in hardware. Object-oriented programming was a huge step forward, but aside from that the progress looks kind of linear to me - not exponential. Operating systems take longer and longer to be released, and the improvements are pretty incremental.

Our success rate at things like voice and character recognition doesn’t seem that much better than it was 20 years ago. In 1985 I was working with voice recognition software that could be trained to a vocabulary of hundreds of words. But there was always the annoying error rate, and it’s still there today. We still use keyboards and touch pads and mice to communicate with our computers, because they still suck at understanding speech. OCR software is better than it used to be, but still not good enough that you can feed pages of print into it and expect 100% accuracy.

We still program essentially the same way we did 30 years ago. We write methods and procedures that execute logic statements that look pretty much the same today as they did then. The ‘C’ language syntax forms the basis of many of our newest languages, and it was invented over 30 years ago.
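
To make that concrete, here’s a toy loop in C - a throwaway sketch, not code from anywhere in particular. Modulo keywords, the same braces-and-semicolons structure reads almost verbatim in Java, C#, and JavaScript:

```c
/* A toy C loop. The for (;;) header, the braces, the semicolons -
   all of it carries over nearly unchanged into Java, C#, JavaScript,
   and most of the other "new" languages. */
#include <stdio.h>

int main(void) {
    int total = 0;
    for (int i = 0; i < 10; i++) {
        total += i;                 /* sum 0..9 */
    }
    printf("total = %d\n", total);  /* prints 45 */
    return 0;
}
```

Swap printf for System.out.println or console.log and you’ve basically written the Java or JavaScript version.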

The point is that we are increasingly limited by software rather than hardware, and software development just isn’t happening on an exponential curve. That could change if we have a breakthrough in the methods we use to create new software, but I’m not seeing it so far.

Then there are other limits - bandwidth, physical constraints on chip sizes, and so on. Moore’s law can’t last forever. And as the internet grows in complexity, the demand for bandwidth may grow to the point where it becomes a choke point.
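
A quick back-of-the-envelope shows why (a sketch, taking the popular ‘transistor counts double every two years’ form of Moore’s law as the assumption):

```latex
% Assumed form: transistor count doubles every ~2 years.
N(t) = N_0 \cdot 2^{t/2}, \qquad t \text{ in years}
% Over 20 years: N(20)/N_0 = 2^{10} \approx 1000.
```

Run that doubling out indefinitely and the implied feature sizes shrink past atomic scale, so the trend has to flatten at some point.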

But here’s the real worry - the big limiter may not be what we can do, but whether we can do it before forces tear what we’ve built back down again. Can we stay ahead of the hackers and the terrorists? Can we keep growing exponentially while still preserving the planet and finding new resources fast enough?

It’s possible that we may be approaching a different kind of ‘singularity’: one in which the power of destruction each citizen holds is also growing exponentially. Once it reaches a certain level, there will be millions or billions of people capable of wreaking vast amounts of destruction on a whim. Bio-terror, nuclear terror… How about genetic terror? Nano-terror?

It used to be very difficult to break into a single bank and physically steal a million dollars; the financial damage crooks could do was on the order of a few million dollars at most. Now hackers can break into 500 banks at once, and Enron and WorldCom can game the system for billions. Can you imagine the news it would have made 50 years ago if some guy had managed to shut General Motors down for a day? Well, hackers have shut down eBay and Amazon, and worms and viruses have been released that did hundreds of millions of dollars in damage.

It’s not at all clear to me that we will be able to continue growing in power without society itself becoming unstable. It may simply be inevitable that we will expand wildly and then flame out.

Maybe that’s why we’re not finding any galactic civilizations. Maybe the ‘information explosion’ period is almost always fatal to a civilization.

And even that syntax was based on ALGOL, which dates back to the 50s.

One might argue that component-based development, as seen in VB, Delphi, and .NET, represents a new method of creating software, beyond traditional OOP. You can now grab a few unrelated packages, perhaps written in completely different languages, and slap them together in your program without needing to understand how they work internally.
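
For what it’s worth, here’s a rough sketch of that idea one level down, in C (assuming a POSIX system with libdl; libm and its cos() just stand in for any compiled third-party component): load a binary component at runtime and call it as a black box, with no idea how - or in what language - it was implemented.

```c
/* Sketch: load a compiled "component" at runtime and use it as a
   black box. Assumes a POSIX system with libdl; compile with -ldl.
   libm's cos() stands in here for any third-party component. */
#include <dlfcn.h>
#include <stdio.h>

int main(void) {
    void *lib = dlopen("libm.so.6", RTLD_LAZY);   /* the component */
    if (!lib) { fprintf(stderr, "%s\n", dlerror()); return 1; }

    /* Look up a symbol by name; we neither know nor care how
       (or in what language) it was implemented. */
    double (*cosine)(double) = (double (*)(double))dlsym(lib, "cos");
    if (!cosine) { fprintf(stderr, "%s\n", dlerror()); return 1; }

    printf("cos(0) = %f\n", cosine(0.0));
    dlclose(lib);
    return 0;
}
```

COM, VCL, and .NET assemblies are roughly the same move with more plumbing: a published interface, an opaque implementation.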

Great post, Sam.

Digital, do you wish to support the contention that technology is growing/progressing at an exponential rate? Or do you just wish to rebut specific points of mine? I think debate is more fun when both sides have a position, and it’s not just one guy saying, “This is incorrect, this is incorrect…”

Plus, I disagree with everything you say! :smiley:

Sorry – computer problems over the weekend. And fixing them took much time that I now have to make up, so I apologize for being terse.

Like I said, be clear about what you mean. Rate or volume?

I don’t know what one could use to measure tech progress. Number of patents, like Huebner? Flawed, at best. I’d suggest the number of papers being published, assuming that each contributes new knowledge, but that’s not really a good measure either.

I’d think the body of knowledge is increasing exponentially. The rate? I think it possible. Acceleration? Nah, probably not, unless you figure the number of people is growing exponentially while the contribution of any given person stays fixed.
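
To put that caveat in symbols (a sketch, assuming each person contributes new knowledge at a fixed rate c and the population grows exponentially):

```latex
% Assumption: population P(t) = P_0 e^{rt}, fixed per-person output c.
R(t) = c\,P(t) = c\,P_0\,e^{rt}                        % rate of new knowledge
N(t) = N(0) + \int_0^t R(s)\,ds
     = N(0) + \frac{c\,P_0}{r}\bigl(e^{rt} - 1\bigr)   % volume
% Note R'(t) = r\,R(t): under this assumption the acceleration is
% exponential too. Volume, rate, and acceleration only come apart
% if the growth isn't a pure exponential.
```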

Sam, could you clear up something for me? Your claim about the lack of any record-keeping system simply isn’t true. The written word is quite old, and even music had a primitive notation before Gregory. So, could you explain what you meant?

As I see it, technological history is not linear. There couldn’t have been rifles, for example, until quite a number of other technologies had developed, including the process of milling, which itself required a number of technological discoveries. A requires B and C. B requires D and E, and C requires F and G. D requires H, I, and J. And so forth.

Of course, if we do ever succeed in creating fully-functional, scalable quantum computers, it will no doubt be due in large part to advances in our understanding of physics that have been made possible by the development of classical computing. Which goes back to the point raised by Blake and others that while one particular branch of science or technology may hit a “plateau”, it can nevertheless stimulate significant growth in another area. Whether this overall growth is “exponential”, who knows? Overall scientific progress isn’t as easily quantified as microchip complexity.

I do, however, think that any effort to guess the date when we will have achieved a certain specific technological breakthrough (be it “strong” AI or whatever) is almost certainly destined to be wrong unless we’re only looking ahead to the very near future. Because in the long term there will inevitably be both roadblocks and breakthroughs which could never have been anticipated in advance. To say any specific technological benchmark will follow an exponential curve indefinitely seems quite unreasonable to me.

There may or may not be a limit on computing power. A single chip may eventually hit a hard size limit, but with the advent of the IBM/Sony/Toshiba Cell chip and dual cores from Intel and AMD, distributed computing becomes an increasing possibility. The SETI and Cure for Cancer distributed-computing programs have already shown the possibilities.
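
As a toy sketch of the divide-the-work idea behind those projects (assuming a POSIX system with pthreads; compile with -pthread), here’s a range being brute-forced across four threads - the volunteer projects do the same split across machines on the internet instead of cores in one box:

```c
/* Toy divide-and-conquer: split a brute-force sum over NTHREADS
   worker threads, then combine the partial results. */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 4
#define N 1000000L

static long long partial[NTHREADS];          /* one slot per worker */

static void *worker(void *arg) {
    long id = (long)arg;
    long long sum = 0;
    for (long i = id; i < N; i += NTHREADS)  /* this thread's slice */
        sum += i;
    partial[id] = sum;
    return NULL;
}

int main(void) {
    pthread_t tid[NTHREADS];
    for (long t = 0; t < NTHREADS; t++)
        pthread_create(&tid[t], NULL, worker, (void *)t);

    long long total = 0;
    for (long t = 0; t < NTHREADS; t++) {
        pthread_join(tid[t], NULL);
        total += partial[t];                 /* combine the results */
    }
    printf("total = %lld\n", total);         /* 499999500000 */
    return 0;
}
```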

What’s stupid is pronouncing some apparent short-term trends as inviolate “laws”. Moore’s Law? “The Curve” as an “untamable force”? Gimme a break. The burden of proof is upon those who make such sweeping statements, not upon those who say balderdash to them. “Certainly the curve exists”? Prove that, please, with more than a few recent examples: with some mathematical modelling, and with some testable predictions of how much information will be available in, say, genetic engineering five years hence.

Historically, scientific progress has not looked at all like a smooth curve, but like fits and starts, with occasional significant backward steps as well. To even attempt to apply declarations like “Laws” to something like scientific progress requires some kind of numerical metric to measure it with, and we don’t even have that.

It is a lousy metaphor at best. Just plain silly.

It is you who are making the sweeping assertions without evidence. Most of the people who have made these predictions about technology did it PRECISELY because they looked at the evidence: they looked at the charts and graphs and numbers and said, “OK, if this trend continues, then X will likely happen.” They also generally qualified their predictions with, “Of course, it is possible that factors may intervene to prevent X from occurring.” Granted, the assumptions that are being made by following these curves to their conclusions can lead to some startling results, but that doesn’t mean the curve doesn’t exist.

I personally expect some surprising things to happen in the next few years if the techno-curve predictions turn out to be accurate. I’m not willing to put a tight time frame on it, or to say exactly what form the surprises might take, but the numbers indicate that there will be some surprises, hopefully happy ones.

Following the logic can be very productive: Einstein is said to have come up with his relativity theory because he looked at anomalous findings regarding the behavior of light and asked, “Well, the indications are that those findings are true. So what does this say about the nature of light and the universe?”

I’m not giving up all my worldly possessions and waiting for the AI gods to rapture my ass, but I am paying attention to new tech. Seems reasonable.