Exponential Growth of Technology and the Singularity

First off, that quote is probably false and you didn’t even quote the fake quote right!

Second off, technology is not increasing exponentially. It certainly isn’t doing so today, and I’m hard pressed to think of anything that is. We simply get better at what we already have, but the improvement is always a decision made at the margins. We see things improving on the surface and looking ten times better there, but often miss that the hardware underneath isn’t changing so much. The heyday of computer improvement is probably over; the era of improved user interfaces is not.

Likewise, I’m hopeful about biology improving our lives, too… but Kurzweil seems to have fallen in love with his own sci-fi.

My bad, I probably should’ve sourced it.

The hardware underneath isn’t changing?

In the last 100 years we’ve gone from mechanical calculators to relays to vacuum tubes to transistors to integrated circuits. We’ve got a few more years left with the latter before moving on to the next paradigm.

I agree, new uses and new concepts are constantly coming up. However, I am using the same amount of broadband I was using 11 years ago (I’ve had different providers, since I’ve moved around), because I have never had any reason to increase my download speeds. It seems more and more of the advances in computing are being used to make better graphics, which really isn’t that important to a lot of people.

I can get more than enough podcasts, videos, documents and songs on a 16GB mp3 player. I have considered a 32GB model, but don’t really need one right now. If I do need one I will get the 32GB model. But right now 16GB is fine.

My first MP3 player was 128MB. That was too small. The next one was 1GB. That was ok, but still too small. My next one was 6GB. That was decent and functioned well. My current 16GB one is even better, but I don’t see why I’d need 120GB like some of the models have. The newest iPod Touch is supposed to go up to 128GB of flash storage. The next model might be 256GB, then 512GB. However, for many of us, using more than 30GB really isn’t necessary. I put 4 seasons of a TV show on someone’s MP3 player and it only took 10GB because I compressed them down. You can store a lot of data in 30GB: 1GB is about 250 songs, or 3-4 hours of video. I don’t need a doubling of the GB in my personal storage devices each year.
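As a sanity check on those figures, here is a quick back-of-the-envelope calculation. The bitrates are my own illustrative assumptions, not numbers from the post:

```python
# Back-of-the-envelope storage math. The 128 kbps song bitrate and
# 300 MB/hour video figure are assumed values for illustration.
SONG_KBPS = 128          # typical MP3 bitrate
SONG_MINUTES = 4         # typical song length
VIDEO_MB_PER_HOUR = 300  # heavily compressed video

song_mb = SONG_KBPS / 8 * SONG_MINUTES * 60 / 1024  # kB/s * seconds -> MB
songs_per_gb = int(1024 / song_mb)
video_hours_per_gb = 1024 / VIDEO_MB_PER_HOUR

print(f"~{song_mb:.2f} MB per song, ~{songs_per_gb} songs per GB")
print(f"~{video_hours_per_gb:.1f} hours of video per GB")
```

With those assumptions you get roughly 270 songs or about 3.4 hours of video per GB, which lines up with the 250-songs and 3-4-hours figures above.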

The point is consumer demand has to drive some of these exponential trends. And many consumers don’t need exponential storage space, processing power, broadband, etc. Lots of people get by fine with 5 year old computers and basic broadband.

Until someone comes up with uses for all these exponential powers of computing, there won’t be that big of a market to keep inventing them.

Is technology increasing exponentially? It is and it isn’t. Transistor sizes and such are still on a more or less exponential curve, moderated a bit by the fact that only a few companies can afford new fabs. However, the complexity and speed of your average microprocessor core hasn’t changed much for a while, with the extra transistors you get on a die going into cache and multiple cores. All the real advances are going into things like smart phones, because we’ve gotten stuff small enough to get good functionality into a reasonably sized phone. The processors inside these phones are nothing special, since power consumption is so important.

Software, which is what people see, doesn’t grow exponentially - though things like app markets can make it seem like it does. User interfaces will mature when they disappear - unless you consider speech a user interface.

We will never actually reach this “singularity”. The reason is that the singularity is always from the perspective of current technological advancement. As we make new discoveries, our predictions for the future will change and it will continue to recede away from us.

Earlier thread on the Singularity.

And from the TVTropes page:

Thanks for bringing that up. It’s a pretty fascinating machine.

I can see what you’re saying, but it’s essentially anecdotal.

To give you a couple examples, higher processing power has allowed us to build more powerful sensors like fMRI, CAT scan, etc… These sensors generate huge amounts of data, and the more sensitive they get the more data they output. Astronomers are now using supercomputers to simulate supernovae and other phenomena that require huge amounts of processing power and pretty advanced algorithms. Perhaps we’ll one day be able to simulate what goes on inside the event horizon of a black hole.

I think you certainly are correct that market forces are required to drive these exponential trends, but I see plenty of pressure from a variety of areas to continue to push the envelope.

Repeat after me: a machine will not reach a human level of intelligence any time soon (many, many decades). When in doubt, reiterate it by way of auto-hypnosis…

And even if it did reach some such point, that machine “intelligence” would quickly find out that there are 6 billion other “intelligences” running around. Lots of them are very busy solving important global problems to build a better future for, well, you know the drill. From the looks of it, the better future is not getting any nearer through the labor of the above-mentioned human intelligences, and the addition of a few machine intelligences will not change that. They don’t even have a resume, let alone glowing references, so who is going to approve funding for their projects anyway?

Further, note how Kurzweil et al. make this wonderful leap from “human level intelligence” to “designing a new generation of super machines”. There is no shortage of people capable of passing the Turing test but incapable of designing even an old-generation bicycle. Chances are that Kurzweil could not teach a typical smart college kid to be the next supercomputer design genius if his life depended on it - how is he going to teach similar skills to machines?

Thinking the internet contains the sum total of all human knowledge is just jaw-droppingly ignorant, almost to the point of being offensive.

What doesn’t it contain?

I don’t agree with several items brought up by the OP, but I have to say that there is a need to comment on some counterpoints mentioned.

Well, IMHO this is related to the human IQ controversy: the assumption that IQ is the ultimate or most important reason why one is successful or not. Modern researchers take into account that there are different intelligences in humans, and so it is with artificial intelligence; there will be many paths to it, and so different timelines for getting to machine intelligence.

IMHO this is not the case. Humans can still be great at research, but tools like Google are showing results that no human made; a machine did it for you, based on your input and even your preferences.

From here, it is clear that the next step is to have computers with AI to begin making connections that we humans have missed.

The reason I do see humans creating an intelligence that is superior to a human (in some aspects, and in our lifetime) is that many human organizations want and seek solutions to many problems that are still out there. As I think intelligence comes from prediction, not behavior, machines are bound to improve their predictions thanks to the added capacity to learn from feedback.

Actually there are humans who fail the Turing test but we do not automatically declare them subhuman. :slight_smile:

As I have read on the subject and followed the history, it is clear that many current researchers are now on the right track: not concentrating just on top-down teaching of machines, but letting the machines learn on their own and at their own rate. (A rate that can be as fast as lightning.)

Oh no! We aren’t falling for THAT one!:smiley:

I’m not sure about the “many, many decades” time frame, but it’s not just around the corner either. But while machines need some time to catch up to human-level creativity and conceptual thinking, they already do plenty of computationally intensive work that exceeds human capability. The combination of humans and machines may already have created a higher level of intelligence, in some respects.

To the OP though, I’m not sure what is being predicted. An incomprehensible level of complexity? There are plenty of simple things most people can’t comprehend, and incredibly complex things people already comprehend. So far, if a machine is doing it, we can comprehend it. The “singularity” concept sounds like BS: not because it’s false, but because it’s an attempt to stick a label on something to make it sound like an important concept. As an example, IMHO, nobody actually understands macroeconomics well enough to make consistent predictions, but even with a near-worldwide economic crisis, we are still managing to get by.

I’m sure we will one day reach the point where machines are as intelligent as, or more intelligent than, humans, if we can get humans to act at the level of intelligence they should have before they destroy the planet.

Quoth msmith537:

Exactly: It’s a horizon, not a singularity. Technology (as a whole) really is growing exponentially, and that’s precisely why there isn’t a singularity: From every point on an exponential curve, everything on the graph longer ago than a couple of timescales looks effectively zero, and everything within a few timescales in the future looks mind-bogglingly greater. We think we’re living in an exciting time because we’re only 30 years away from the singularity, but someone living at any other time in history would have thought exactly the same way. It’s like saying that I’m living in a special spot on Earth because the horizon is the same distance away in every direction.
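That self-similarity is easy to verify numerically: on an exponential curve, the ratio between “now” and any fixed distance into the past is identical no matter where you stand. A minimal sketch, assuming an illustrative 2-year doubling time (not a figure from the thread):

```python
# Self-similarity of an exponential curve: the view from every point
# looks the same. The 2-year doubling time is an assumed illustrative
# value, not a number anyone in the thread claimed.
DOUBLING_YEARS = 2.0

def level(year):
    """Technology 'level' on a pure exponential, anchored at 1900."""
    return 2 ** ((year - 1900) / DOUBLING_YEARS)

for today in (1950, 1980, 2010):
    ratio = level(today - 20) / level(today)
    print(f"From {today}: 20 years ago looks like {ratio:.4%} of now")
# Every vantage point prints the same ratio, 2**-10, about 0.098%.
```

Whatever year you pick as “today”, 20 years ago always looks like the same tiny fraction of the present, which is exactly the horizon effect.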

The feel of the wind in your face when you climb to the top of your first mountain.

The smell of the ocean.

The beautiful desolation of the boardwalk in winter.

How it feels when my fiancee smiles at me.

You know, the stuff that makes us human.

We aren’t going to see exponential growth in computer technology forever. It’s true that we’re currently in the exponential phase, but the likelihood is that we’ll eventually transition onto a sigmoid curve. Look at what has happened with processors: the processor wars, where faster CPUs appeared every month, are over. Nowadays the computer ads barely bother to mention the processor speed. This means a lot less churn, because now the desktop you bought 5 years ago is pretty much just as good as one you can buy today. It used to be that more RAM was a big deal, but nowadays, due to 32-bit architecture limitations, many machines can’t use more than 4 GB, so that’s topped out. Hard drives are getting smaller with more capacity, so storage is still improving tremendously. But how many GBs of storage do you need? If you’re storing tons and tons of video you can eat through TBs, but most people aren’t doing video editing.
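The exponential-to-sigmoid transition is easy to illustrate: a logistic curve is nearly indistinguishable from an exponential early on, then its growth ratio collapses toward 1. A small sketch, where the carrying capacity, growth rate, and midpoint are arbitrary illustrative parameters:

```python
import math

# A logistic (sigmoid) curve with carrying capacity K looks exponential
# early on, then flattens out. K, r, and t0 are arbitrary illustrative
# parameters, not data from the thread.
K, r, t0 = 1000.0, 0.5, 20.0

def logistic(t):
    return K / (1 + math.exp(-r * (t - t0)))

for t in (2, 6, 10, 20, 30, 38):
    growth = logistic(t + 1) / logistic(t)
    print(f"t={t:2d}: value={logistic(t):8.1f}, 1-step growth x{growth:.3f}")
# Early ratios approach e**r (about 1.65, pure exponential growth);
# late ratios approach 1.0 (saturation).
```

If you only ever sampled the early part of that curve, you would see textbook exponential growth and might extrapolate it forever, which is the trap the post above is describing.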

But all this doesn’t mean the age of computer improvement is over. It means that now people can concentrate on how to make all this stuff work better. We’ve finally reached the Model-T Ford era of computing, where all the parts of the package are in place, and now we just have to make them work better. There’s still a lot of work making computing cheaper, simpler, more bulletproof, smaller, and easier to use.

In a couple of years a typical computer will be a phone that has a small display and simple input, but can easily communicate with any nearby display or input device. You’ll carry your phone around in your pocket, and sit down at a workstation that is just a monitor and keyboard and mouse, and your phone will communicate with them. But note that what you do on your phone won’t be that different–you’ll websurf, read, and send/receive text, voice, or video communications. It’ll be the same experience, just less annoying, like switching from a Model-T that blows a tire every few hundred miles and has a top speed of 45 mph, to a modern Honda Accord. Sure, some people will have the equivalent of sportscars, but the real benefit of the Honda Accord over the Model T isn’t that it goes faster or holds more cargo, it’s that it’s an order of magnitude more reliable, an order of magnitude safer, and an order of magnitude more comfortable. But it still exists just to haul your body from point A to point B.

As for machine intelligence, well, there aren’t any prospects for strong AI any time soon, and we have no clear path toward making progress. If we do get a strong AI, it will come as a shock. So we could get something tomorrow, but we have no reason to suspect anything tomorrow. But the improvements in weak AI are not to be sneered at. Google by itself is a huge achievement.

Which new brain implant research stuff are you talking about?

“Both a person and and a computer can win a game of chess…
But only the person will enjoy it.”

(wisdom printed on a cheap paper placemat at a Chinese restaurant)