We are so far away from being able to build ‘strong’ AI that we don’t even know how to start going about it. We don’t even know what we would need to know in order to begin making plans to learn how to do it. It’s essentially a complete unknown.
In the last 30 years, the advances we’ve made in AI are fairly meager - we’re a bit better at recognizing written text, we’ve figured out how to walk on two legs and stay balanced (thanks to hardware like solid-state gyroscopes), we can do terrain recognition a little better, and that sort of thing. We’re no closer to knowing how to make a machine actually think complex thoughts than we were in 1975. And there’s no evidence that we’ll be any better at it in 2045, either.
While hardware has remained remarkably true to Moore’s law, software has never had the kind of growth that hardware has, and in some ways it has stagnated. Many of the improvements in software we see today come from improved memory and display performance, which allows us to render things in more detail or store more information. We’ve also seen some leveraging and acceleration of software due to standardized toolkits and simple APIs that take a lot of the drudgery away from programmers. But programming itself isn’t much different than it was 30 years ago.
In my opinion, if we see strong AI it will be ‘accidental’ or evolved - not designed. Rather than writing software to do all the things we think it needs to do to be ‘intelligent’, we might create the software equivalent of early life and simply let it evolve and see what happens. Maybe intelligence will pop out of that when we get really good at simulating the conditions for evolved intelligence. But if intelligence did pop out of that environment, it might not be of a kind that’s useful to us, or even comprehensible to us. So that’s a mixed bag.
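To make the ‘evolved, not designed’ idea concrete, here is a minimal toy sketch of the approach: instead of writing the behavior directly, you write a mutation rule and a selection pressure, then let candidate programs (here, just strings) compete. Everything in it - the target, the mutation rate, the population size - is an illustrative assumption, not any real artificial-life system.

```python
import random

# Toy 'evolved, not designed' sketch: a minimal genetic algorithm.
# The fixed TARGET is a stand-in for a measurable selection pressure;
# real open-ended evolution would have no predefined goal.
TARGET = "think"
LETTERS = "abcdefghijklmnopqrstuvwxyz"

def fitness(genome: str) -> int:
    # Count positions matching the target string.
    return sum(a == b for a, b in zip(genome, TARGET))

def mutate(genome: str, rate: float = 0.2) -> str:
    # Randomly replace each character with probability `rate`.
    return "".join(random.choice(LETTERS) if random.random() < rate else c
                   for c in genome)

def evolve(pop_size: int = 50, generations: int = 200) -> str:
    # Start from random noise - nobody 'designs' the solution.
    population = ["".join(random.choice(LETTERS) for _ in TARGET)
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 5]        # selection
        population = [mutate(random.choice(survivors)) # variation
                      for _ in range(pop_size)]
        population.extend(survivors)                   # elitism
    return max(population, key=fitness)

best = evolve()
```

The point of the sketch is that the programmer never writes the answer; matching behavior emerges from selection and mutation alone - which is also why, as above, whatever emerges may be nothing like what you wanted.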
In any event, people like Kurzweil annoy me - predicting the future is great in science fiction, but when people step outside of that and claim that they’re smart enough to have figured out what’s going to happen in the future, they go too far. The economy and society are complex adaptive processes, and by their nature they are not predictable. Stuff happens that you don’t expect, and it leads to changes that cause stuff to happen that you didn’t even know was possible. Society changes in unpredictable ways.
This reminds me of the ‘futurists’ who were predicting that we’d all be living in space and colonizing the solar system by now. Hey, from the vantage point of 1970 it seemed inevitable - in 25 years we went from propeller-driven planes to men on the moon, with plans for Mars missions and space stations. We had this big space shuttle we were building that was going to make access to space cheap. Every futurist worth his salt could point you to graphs showing mankind’s top speed attained, and how it was increasing exponentially. We were planning nuclear rockets and Bussard ramjets, and we were heading out to the stars.
I wonder what they’d say if they could have jumped forward 35 years to watch the last shuttle flight, read about the final flight of the only supersonic transport plane, and discover that man had not left low Earth orbit in 30 years.