Here’s an article in the Washington Post about the so-called “Curve” in technological progress (actually, it’s the standard book review that is so detailed you never actually have to read the book). Here’s a quote (from the article, not the book):
This idea is really stupid. First, let’s deconstruct Moore’s Law. Computing power has doubled as it has because it started from zero just 60 years ago. It is not too hard to double the power of, say, 144 transistors. What tech boosters don’t tell you is that high-end chip plants also double in cost at about the same pace. Last time I checked (I worked in the semiconductor equipment industry for about two years), a high-end chip plant cost about $3B, and there are only about 10 companies in the world that can afford to build such facilities. I’m sure the numbers have only grown in the last year.
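Just to show how brutal that compounding is, here’s a quick back-of-envelope sketch in Python. The 144 transistors and the $3B fab are the figures above; the 18-month doubling period is the usual Moore’s Law rule of thumb, which I’m assuming applies to both:

```python
# If transistor counts double every 18 months and fab costs double
# at about the same pace, the price tag compounds just as fast as
# the "progress" does. Starting figures are from the post; the
# 18-month period is an assumed rule of thumb.

DOUBLING_PERIOD_YEARS = 1.5   # ~18 months per doubling (assumed)

transistors = 144             # a trivially small early chip
fab_cost = 3e9                # ~$3B for a high-end fab today

years = 15
doublings = years / DOUBLING_PERIOD_YEARS   # 10 doublings
factor = 2 ** doublings                     # 1024x

print(f"After {years} years ({doublings:.0f} doublings): {factor:.0f}x")
print(f"Transistors: {transistors * factor:,.0f}")
print(f"Fab cost:    ${fab_cost * factor:,.0f}")
```

If fab costs really do track the same doubling, a leading-edge plant would cost on the order of $3 trillion in 15 years. That alone tells you the curve cannot simply keep going.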
Creating new devices has become monstrously expensive. A single mask used to create just one layer of, for example, an Intel Pentium chip can cost millions of dollars, and many such masks are required per chip. And the mask is just one tool in one of the hundreds of process steps needed to produce a working chip.
The process also stretches human organization to its limits. These chips are the most complicated and advanced things humans have ever created. Only a handful of people understand, even at a high level, how the entire system fits together, and no single person fully understands every process that goes into making them.
We are about to hit two walls in chip-making. The first is geometric: the circuits will soon be so small that electrons will tunnel right through them (i.e., the damn walls of the circuit will be too thin, even if we develop the technology to make them, which is far from certain). We are only a few orders of magnitude away from this happening (perhaps a more knowledgeable poster can nail down the details for me). Second, building those devices will become orders of magnitude more expensive.
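For scale, here are my own ballpark figures (roughly 2005-era; happy to be corrected by that more knowledgeable poster): today’s leading-edge node is around 90 nm, the silicon lattice constant is about 0.54 nm, and gate oxides are already down around 1.2 nm, where tunneling leakage is a known headache:

```python
import math

# Rough scale check on the "tunneling wall". All three figures are
# my own ballpark assumptions, not from the article:
feature_size_nm = 90.0   # assumed leading-edge node, ca. 2005
lattice_nm = 0.543       # silicon lattice constant
gate_oxide_nm = 1.2      # assumed gate-oxide thickness at that node

print(f"Feature size vs. atomic scale: "
      f"{math.log10(feature_size_nm / lattice_nm):.1f} orders of magnitude")
print(f"Gate oxide is already ~{gate_oxide_nm / lattice_nm:.0f} "
      f"lattice constants thick")
```

That works out to only about two orders of magnitude between feature size and the atomic scale, and the thinnest layers are already just a couple of atoms deep, which is why I say the wall is close.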
And, when we do finally make chips that are 1,024 times faster than they are now (10 more doublings, and with 18 months per doubling that is 15 years away), we will still have added only three digits to our processing power, and we will presumably still be running Microsoft Windows. Or is there any evidence to the contrary? (BTW, Moore’s Law never really stated that computing power itself doubles. It had to do with chip complexity. Cite.)
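For what it’s worth, the arithmetic there checks out; a trivial sanity check, assuming nothing beyond the numbers already in that paragraph:

```python
import math

doublings = 10
months_per_doubling = 18  # the classic Moore's Law figure

factor = 2 ** doublings                       # 1024
years = doublings * months_per_doubling / 12  # 15.0

print(f"{doublings} doublings = {factor}x in {years:.0f} years")
print(f"log10({factor}) = {math.log10(factor):.2f} -> about three digits")
```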
One of the big gurus of this “we’re at the knee of the logarithmic curve” talk was Ray Kurzweil, much quieter of late than when his last big book, The Age of Spiritual Machines, hit the shelves in 1999. I read it and thought it was neat-o. In those giddy, gung-ho Internet Bubble days, anything seemed possible. He predicted big changes even by 2009, when all manner of chips would be making our lives more convenient, blah blah. Well, it’s 2005, six years later, and we’ve only seen incremental change.
Implicit in the linked article and explicit in Kurzweil’s theorizing is the notion that, although we humans are too dumb to do it ourselves, we’ll soon invent intelligent machines that will carry us into the next technological era. But the reality is that strong AI has made essentially zero progress toward its goals. We’ve created some great chess programs, even some pretty powerful expert systems (note: at great cost in money and manpower), but we have yet to build even the most rudimentary machine that can take over the hard work of thinking things through and designing new machines (which is what the model requires). The problem isn’t processing power. Rather, we simply have no idea how to write the program for such a machine.
The common-sense model that reflects actual history:
First, nowhere in history has logarithmic change been observable. To the contrary, new technologies typically require a great deal of support and maintenance. It has been speculated that only recently have computers added to the overall productivity of the economy: while they have created value over the last 50 years, the labor and capital required to create, maintain, program, and use them kept their net value negative.
I have little doubt that the transaction is now of positive value, but the fact remains that computers add incremental, not revolutionary, value to the economy. And so it goes with most technologies. Extraordinary tech requires extraordinary support. We reap some rewards from the trade, but we do not progress logarithmically.
I think there are two main types of inventions: those based on knowledge, and those based on making the hard-to-make. Knowledge-based inventions can spark true revolutions, since their technology is not difficult to create or maintain. The steam engine is a good example. When we eventually figure out how to do fusion, it will be the same thing: free, clean energy for all. But the computer chip is a much different beast: figuring out how to make chips is extremely difficult, actually making them is extremely difficult, and writing the programs that run on them is difficult and time-consuming too.
So here’s my common-sense view: new technology pushes along in fits and starts, and constant new invention is required for progress. Nothing is guaranteed or automatic, and even regression is common. The Space Shuttle is crappy, fault-prone technology based on a flawed 1960s design. It is far more complicated tech than the Apollo program’s, but it is not better tech. Far from representing logarithmic change, it’s been a dead end.
We have been stuck in a Windows mess since the early 1990s. The Internet is a great thing, revolutionary in its own way (more for culture and business than for the tech itself), but where is the computer revolution? Mind-blowing virtual reality, speech recognition, AI that does half the work for you, flying electric cars? Well, wait a hundred more years and maybe it will happen.
And, just as a general observation, I would say the pace of technological change was much faster in the 19th century than in the 20th. You pretty much started with nothing and ended with the basics of today’s technology and a view of the universe not too far from our current one. The 20th was a century of highly important but incremental change (and, of course, true revolutions in select areas).
So, no logarithmic change is in store for us. Any Dopers disagree?