Ray Kurzweil's 'Law of Accelerating Returns'

I’m starting to get the impression from comments like this:

That people are not reading the Original Post and are largely ignorant of who Ray Kurzweil is. We’re talking about predictions from one of the greatest inventors in history, someone on par with Edison and Tesla, not some schmoe in his dorm-room.

Sometimes progress is exponential and sometimes it isn’t.

Based on the progress that aviation made between 1910 and 1960, it would have been easy for someone 50 years ago to imagine we were on the verge of colonizing the solar system. After all, it took only 50 years to get from a glorified kite powered by a little putt-putt engine to jets that could fly around the world and rockets that could carry us into orbit. Extrapolating that sort of growth into the future, it was easy to imagine that Moon bases and the colonization of Mars were just around the corner.

But what wasn’t obvious in 1960 was that the early big technological gains in aviation weren’t sustainable. There were a number of hard real-world constraints on the design of airplanes and rockets (the limits of the strength of materials, the specific impulse of different fuels, and so on), and by 1960 a lot of the easy discoveries that yielded big jumps in performance had already been made. Aviation progress since then has been slower and more incremental. People are still coming up with new ways to make planes more efficient and rockets more powerful, but we’re still a long way from being able to colonize Mars.

Computers have had an amazing run over the last 50 years. We’ve gotten used to amazing levels of progress with systems getting faster and faster every year allowing the machines to do all sorts of new and amazing things. But it can’t go on forever. Just like with aviation, sooner or later we’ll hit some hard real-world limit that puts an end to the rapid explosion of information technology. Progress in computing will become incremental instead of revolutionary and we’ll look back on our current go-go era with the same quaint affection that we look back on the early barnstorming aviators.

The Hamster King: S-curves, man. S-curves.

Page 44 of ‘The Singularity is Near’

Hopefully people will assimilate this concept into their further criticisms. The point is that his argument isn’t that technologies never plateau, but that the time between evolutionary epochs is becoming shorter and shorter. He defines an evolutionary epoch as operating along an S-curve: rapid advancement, then a plateau.
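The S-curve shape he describes can be sketched with a logistic function: growth looks roughly exponential early on, then saturates at a plateau. A minimal illustration (the plateau L, growth rate k, and midpoint t0 are made-up parameters for demonstration, not Kurzweil’s figures):

```python
import math

def s_curve(t, L=1.0, k=1.0, t0=0.0):
    """Logistic S-curve: near-exponential growth early, plateau at L."""
    return L / (1.0 + math.exp(-k * (t - t0)))

# Early on, each step multiplies the value by roughly e (exponential-looking)...
early = [round(s_curve(t, L=100, k=1, t0=10), 2) for t in range(0, 5)]
# ...but well past the midpoint, the curve flattens out near the plateau L.
late = [round(s_curve(t, L=100, k=1, t0=10), 2) for t in range(18, 22)]
print(early)  # small, rapidly growing values
print(late)   # values saturating just below 100
```

Kurzweil’s claim is then that when one such curve flattens, a new technology starts its own S-curve, and the gaps between curves keep shrinking.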

You think his accomplishments are akin to the light bulb, the phonograph, alternating current, and the induction motor?

I think that, when placed within their correct temporal context, yes.

But to be fair, I will give credit where it’s due. Edison and Tesla were operating in a much less scientifically fertile environment than Kurzweil, so I’ll give them props that he doesn’t get as a result. But the point is that Kurzweil is one of the top inventors of our time, along with people like Dean Kamen. So, before we get into the ‘which inventor had a bigger dick?’ game, I am just saying he’s at the top of his game, and isn’t a dorm-room pothead.

You can call ME a dorm-room pothead all you want. :wink:

Ever look at a historical timeline? We fill in lots of details for recent events because they seem so important to us. Someday they’ll look back and say that the age of computing began sometime in the mid 20th century and leave out details like stored programs, development of computing languages, moving from tubes to transistors, from transistors to integrated circuits, etc. It’s like talking with my mother. She’ll start out a story with what time she got up, what she had for breakfast, what she didn’t have for breakfast because she couldn’t get to the store last week with Trudy, whose daughter went to school with my older sister (or was it my brother), what she wore, what the weather was, how she got to the bank, how long the line at the teller was, how rude the teller was, and how she had to wait 10 minutes for the bus and had to pay in dimes because she forgot to get dollar bills at the bank. Someone with a little more perspective would say “she deposited the check”.

What the hoi polloi will say and think about it isn’t really relevant to actual historical study. The trivial-pursuit answer you are referring to is likely true, but none of this concerns what your mother does or knows about; she’s not working on molecular machines or hard AI.

Bolding mine. I do not believe that it is reasonable (or even rational, really) to presume that you can predict future discoveries based on the timing of past ones. The reason I think this is that there is an external limiting factor on discoveries: whether there is something there to be discovered. Just because we were able to discover and invent amazing new things in the past doesn’t imply that a future massive shift in the way we think about digital technology will occur. And it really doesn’t mean that we will keep doing so on a regular schedule, as is needed to sustain the S-curve.

Remember we are not actually talking about evolution here. The ‘system’ of digital technology isn’t developing or evolving on its own. There is not a constant process (like biological evolution) in place that can be relied upon to keep doing the same thing it was doing that instigated periods of rapid growth in times past. On the contrary, when we smack into limits like the laws of physics or the speed of light, we’ll need to discover a way to transcend these limits by discovering and implementing an entire new separate technology. And I seriously don’t think you can predict the invention of the tesseract based on a growth curve.

As long as market demand for advanced computing power exists, I do not see why we wouldn’t find a 6th technology to replace the 5th for computing power. Kurzweil talks about how we will reach a limit with transistors because they are becoming so small that it will be physically impossible to make them smaller (perhaps 20-30nm being the physical limit, which we are rapidly approaching).
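A back-of-envelope sketch of how close that limit is: if feature size shrinks by roughly 30% (a factor of 0.7) every two years, the classic Moore’s-law cadence, how long until a hypothetical ~20nm floor is reached? The 45nm starting point and the 0.7 shrink factor are illustrative assumptions, not Kurzweil’s own figures.

```python
def years_to_limit(start_nm, limit_nm, shrink=0.7, years_per_step=2):
    """Count how many 2-year shrink steps fit before crossing the floor."""
    steps, size = 0, start_nm
    while size * shrink >= limit_nm:  # stop before dropping below the limit
        size *= shrink
        steps += 1
    return steps * years_per_step, size

years, final_nm = years_to_limit(45.0, 20.0)
print(f"~{years} years of shrinking left, ending near {final_nm:.0f} nm")
```

Under those assumed numbers, only a couple of generations of conventional shrinking remain, which is exactly why Kurzweil argues a successor platform has to appear.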

But if there is market demand to keep the growth of exponential computing going, I think the trend will continue as long as we can find new ways that still follow the laws of physics.

And the concept of using a new platform (Kurzweil feels it will be 3-D computing chips) is not really unrealistic because he seems to be making that prediction based on preliminary research into those kinds of computing technologies.

I think market demand is something Kurzweil doesn’t really take into consideration. We have speech-to-text software, we have telemedicine, and we could put computers in rings, necklaces and earrings, as he predicted we would when he made his forecasts for life in 2009. But there is little to no market demand for these things.

Kurzweil said he started researching exponential trends because he wanted to find ways to market his inventions better (by timing release with points when hardware was capable of making use of them).

If there is no capital to invest in a new technology and no market for it, it will not happen. There is market demand for advanced computers right now (many computers are being used for graphics and medical/scientific research, and I’m sure many research labs would love to have a handheld console that could function at 1 teraflop). However, I honestly don’t know how far that trend is expected to continue, or whether we will reach a point where there is no market for faster computers, at least on the consumer level. I have no use for cutting-edge desktop processors. My laptop processor is crap, but that is partly because I bought an intro-model Acer with a Celeron. A newer processor would do the job for what I need; I do not need a quad core, since I do not do tons of intensive things at once.

With the PlayStation 3, a lot was made of the new graphics and how fast the Cell processor could work (100 billion calculations per second). But I heard many, many gamers respond, ‘Yeah, the PS3 has better graphics, but the Wii is more fun.’ So a computer with better graphics is nice, but if market demand is pushing for more interactive gaming rather than better graphics, that is what you will see. And you have: the PS3 and Xbox 360 are releasing new technologies that incorporate full-body controls to allow interactive gaming. The Wii isn’t following the PS3; the PS3 is following the Wii. Rather than the Wii adding better graphics, the PS3 is adding full-body controls.

I can’t predict the future; however, I do think we are reaching a point in consumer electronics where faster processors are losing their appeal. Intel is releasing a 12-core processor in 2010, but a dual core is good enough for most people, based on what they intend to use it for.

Naturally, new technologies can and do come up to use that new processing power. However, in my own internet life I had a 4Mbps cable internet connection in 1999 and was fine with it. And in 2009 I have a 3Mbps DSL connection and am fine with that. I had people try to talk me into a 20Mbps FiOS line, and my response was ‘Why?’ I do not need a 12-core processor or a 20Mbps FiOS line. Broadband and CPU power may follow exponential curves according to Kurzweil, but in the last 10 years I have not needed them for my personal use. And if millions like me do not create the market demand for them, they will not keep inventing them.

My first MP3 player was 128MB. That was limiting. Then I got a 1GB. That was better, but still limiting. Now I have a 6GB player and am fine with it. I could buy a 200GB player, but personally have no need for it. I can fit enough singles and albums on a 6GB player that I do not need more space. If they release a 500GB or 1TB player, I will not need them (at least not for music). I may enjoy them for video, but even those trends would reach an end at some point due to better compression technology and more storage.
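The arithmetic behind that is simple. Assuming roughly 4MB per song (about a 4-minute track encoded at 128kbps MP3; these are ballpark figures, not anything from the thread), each jump in capacity buys far more songs than most people own:

```python
def songs_that_fit(capacity_gb, mb_per_song=4):
    """Rough song count for a player, assuming ~4 MB per MP3 track."""
    return int(capacity_gb * 1000 / mb_per_song)

for gb in (0.128, 1, 6, 200):
    print(f"{gb} GB player: ~{songs_that_fit(gb)} songs")
```

A 6GB player already holds on the order of 1,500 songs at these assumptions, so a 200GB player adds capacity most listeners will never use, which is the point about demand saturating.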

That means he is apparently good at inventing stuff. It does not reveal any particular skill in other areas. He happens to be a bit of a crank and has made so many wrong predictions that it is difficult to tell whether the ones he got right reflect insight or just chance. It is hard to be as misinformed as he appears to be in so many areas and still be taken seriously.

The Singularity Institute, it should be noted, is not much more than a web presence and a few yearly events, largely self-financed. They have not, as yet, produced anything.

Here are some other examples:

DVDs hold 4.7-9.4GB of data, which is good for movies and films. A Blu-ray disc can hold 25-50GB. A DVD offers many advantages over VHS (portability, better picture quality, viewing options, subtitles, widescreen vs. fullscreen, better controls, trailers, director’s cuts, commentary, etc.). The advantage of Blu-ray over DVD seems to be just better picture quality, and an upconverting DVD player can make a DVD look almost as good as a Blu-ray. So asking people to spend hundreds on a Blu-ray player and hundreds more on a new library of movies isn’t really happening.

A new technology coming out soon is holographic discs, which hold 300GB-1TB of data. However, if all that means is slightly better picture quality, I don’t see those catching on either. The DVD was a massive advance over VHS; Blu-ray is a minor advance over DVD.

Another is camera megapixels. Ashton Kutcher is on TV promoting a new 12.1MP digital camera. But do most people need or use 12.1MP? A 12.1MP camera can produce high-quality 20"x30" prints, but is there consumer demand for those? Most people seem to either store their photos on a hard disk or print them at 5x7 or 8x10.

So my point is that I think Kurzweil needs to take market demands into consideration. If there is no market demand for personal computers that can perform 10^21 cps, then they will not be invented. That doesn’t mean the laws of physics and human ingenuity aren’t there, but without the capital and the market, they will not happen.

I do think we will see advances in biotechnology like Kurzweil says, but that is because there is a market for them. The world is rapidly becoming older and governments are going to struggle under the weight of health care and pension costs for the elderly. So there is a massive demand for advances in biotechnology and likely nanotechnology. I also think there is a large market for robotics.

However, as far as I can tell right now, there is not as much of a demand for advances in things like capacity for hard drives, RAM, CPU speed, broadband, camera MP, hard disk storage, etc.

I think he makes an interesting case. Some thoughts on reading his stuff, in no particular order:

    • He is trying to make predictions based on the outcome of thousands of very complicated and interconnected factors. This makes it pretty much impossible to make any realistic judgement of the likelihood of his specific predictions being even slightly accurate.
    • He is making a lot of assumptions.
    • I don’t think he is adequately addressing the social factor in his picture (although he does acknowledge it is a problem). There is no avoiding the point that the people who currently have problems with gay marriage and vaccines are going to go absolutely bugnuts about the prospect of connecting computers to people’s skulls.
    • I get the impression that his grasp of technology and development is pretty solid but his grasp of biology is rather more shaky (he has previously been involved with some real quackery on the life extension side). More specifically I don’t think he has taken on board that biological research slows down enormously as soon as actual people enter the frame due to regulatory concerns.
    • Related to this, we have seen some tremendous advances in neuroscience recently, and even if you discount some of the dodgier claims (IBM “cat” brain, I’m looking at you), his early predictions actually seem reasonably accurate, at least to this point. I’m not sure this will continue once people actually start wanting to involve living human brains in the picture.
    • This is way out of my area of expertise, but I do get the impression that some of the progress we are making on the brain-implant side of things is just as much related to the brain’s amazing capacity to adapt to accept and make sense of new input. We achieved some amazing results with things like cochlear implants, which are actually (relatively) crude in implementation. Further short-term success in this area might be heavily dependent on just how far we can push the “plug-and-play” capacity of the brain.
    • I do agree that a lot of his technological predictions, absent massive social intervention, are probably inevitable, mostly within the next century. I don’t think that they will necessarily cause a technological “singularity”, and I also don’t think they point to a future which is as rose-tinted as Mr Kurzweil seems to think.
    • All of that said, I’m still not willing to say I’m sure he’s wrong.
    • These issues deserve a lot more attention than they are currently getting.

enigmatic, why do you feel IBM’s claims are dodgy? I’m not saying you are wrong, I am just interested in hearing your reasons.

Put me in the camp of “nature imposes limits”.

The speed of light has already been cited.

Elsewhere in nature - A human begins as a single microscopic fertilized cell, then divides and divides and divides, eventually growing to a few pounds when it is born. Then the human grows and grows and then, at about 6 feet tall - doesn’t grow any more.

I believe this is as useful a model for predicting the future of development as any Mr. K comes up with.

Certainly strong artificial intelligence, merger of biological with nonbiological intelligence, etc., would lead to a “singularity” in human history if they happened. But, let us bear in mind that these things might or might not be possible. The laws of nature are what they are; they allow for some things and not for others. Just because researchers are working along some of these lines now does not mean they will ever get the hoped-for results.

They’ve already been attracting criticism from other researchers in the same field.

Link

I think the problem is as much with the reporting of the work, and the implication that it represents much more than it actually does. This is depressingly normal for scientific news that makes it into the mainstream press.

Thanks for the link. Interesting stuff.

I tend to the view that smarter-than-human AI is inevitable if work in that direction continues. I think that the very existence of the human brain pretty much has to confirm that there are no hard natural limits that would prevent us from creating one and improving upon it to at least a modest degree.

How long it will take us to get to that point and whether we can do it before self destructing as a species is another question :slight_smile:

I don’t necessarily think that a technological singularity is an inevitable outcome, however, because that would require:

a) that each progressive intelligence continue to choose to produce smarter intelligences.

b) that there is no future physical limit which would prevent the further massive increases in intelligence that singularity theory seems to be assuming.

Not necessarily - the artificial intelligence may find itself running into inherent limits as well. We have no actual proof that cognition is capable of getting smarter than the smartest human.
That said, I consider it certain that strong artificial intelligence is theoretically possible, on account of the fact that I’m a materialist. Just simulate every subatomic particle of a real human brain and you’re golden. Though in that case, you were pretty much just as well off when you just had the human.