Absolutely nothing in that Popular Science article disputes any of my points. Notice that in my definition of the Singularity I did not give a specific timeframe. I’m well aware that self-replicating factories and/or artificial intelligence are very difficult problems, and I’m not taking Vinge’s or Kurzweil’s view that this event is imminent. It might be 300 years away. It won’t be any less dramatic when it happens, even if it is 300 years hence.
Also, unlike the Popular Science article, I mention two causes of the Singularity that are quite different; Popular Science only mentions AGI. We might develop:
- Human uploads. See this essay for an accurate technical description of how you might do one. Note that every tool the author mentions exists in working prototype form as of 11/6/2014.
- Self-replicating factories. Leaving the nanotech out of it, all a self-replicating factory needs is to be completely unmanned, and the total factory (which may be a complex many square miles in extent) must contain all of the equipment needed to produce every machine in that factory; a toy check of this “closure” property is sketched below. All it takes is robots at about the tech level of this one, and the capital to build and develop all the firmware and production-line engineering for every stage of this factory (it would be pretty expensive, obviously).
This model of factory would be assisted self-replication (since you don’t have AGI/uploads yet): humans would be needed whenever the factory encounters a fault that the software does not know how to handle. I imagine millions of workers in India acting as the factory’s “help desk” or “Maytag repairman”, clicking a mouse to order the robots to resolve faults that the software cannot resolve on its own. This does mean that there’s a limit to how many factories you can build with this method alone.
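To make that bullet concrete, here is a minimal Python sketch of both ideas: the “closure” requirement (every machine on the floor can be fabricated by machines on the floor) and the help-desk bottleneck that caps how many factories a fixed pool of remote workers can support. Every machine name and number below is an illustrative assumption, not data.

```python
# Toy model of assisted self-replication. All machine names and numbers
# are illustrative assumptions, not measured values.

# Closure: a factory is self-replicating only if the machines it contains
# can, between them, fabricate every machine on the floor.
bill_of_materials = {
    # machine: machines required to fabricate it (hypothetical)
    "cnc_mill":  {"cnc_mill", "robot_arm"},
    "robot_arm": {"cnc_mill", "pcb_line"},
    "pcb_line":  {"cnc_mill", "robot_arm", "pcb_line"},
}

def is_closed(factory, bom):
    """True if every machine in the factory can be built using only
    machines that are themselves in the factory."""
    return all(bom[machine] <= factory for machine in factory)

print(is_closed(set(bill_of_materials), bill_of_materials))  # True

# Help-desk bottleneck: how many factories can N remote workers keep
# running, if the software escalates every fault it cannot resolve?
workers           = 1_000_000  # assumed help-desk headcount
faults_per_day    = 50_000     # assumed escalated faults per factory per day
minutes_per_fault = 5          # assumed human time to clear one fault

worker_minutes     = workers * 8 * 60  # one 8-hour shift per worker per day
demand_per_factory = faults_per_day * minutes_per_fault
print(f"max factories ~ {worker_minutes / demand_per_factory:,.0f}")  # ~1,920
```

Under these made-up numbers the fleet caps out around two thousand factories; the point is only that the limit scales linearly with help-desk headcount until AGI or uploads remove the human from the loop.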
So I have established the how. Now, regarding the *why*: this earth has competing nations and corporations. Any corporation or nation that develops AGI would have a gigantic competitive advantage, because it would be able to produce things without paying as many workers. Any corporation or nation that developed human uploads would be able to charge immense fees and/or command obedience from its populace, because it would have a legitimate treatment for biological death. Any corporation or nation that developed self-replicating factories would become the richest entity on earth (barring theft of the tech, of course).
Essentially, to disprove the Singularity, you need to show one of the following:
- **All** three of the technologies that could trigger the Singularity are highly unlikely, based upon peer-reviewed scientific knowledge.
OR
- Humans will never put in the investment needed to reach those technologies because they do not want a vast economic and/or military advantage over other humans.
There once was a human chipping rocks together trying to start a fire. I’m sure other humans like yourself told him he would never succeed and that fire was the realm of the gods. Everything I have mentioned (self-replication, sentience) is already performed by biological systems, in the same way that the caveman chipping the rocks together had seen fire and knew it was possible.
I’m of the view that once humans develop one of the three trigger techs, they will be able to rapidly develop the other two. Also, once all three techs are present, they each feed back on the others and allow for even faster growth.
If humans have self-replicating factories, they will have vast numbers of people who can be trained as AGI researchers and the ability to manufacture supercomputers the size of mountains. If AGI is possible at all, they would have the resources to do it.
If humans have self-replicating factories, they would be able to produce the scanning equipment to scan human brains (a multi-beam electron microscope like this one) and the gigantic computers with enough power to emulate them. They would also be able to produce robotic lab equipment to perform the millions of distinct experiments on in vitro synapses needed to determine the exact effect each protein in a human synapse has on thresholds and other measurable characteristics, and to work out the rules for learning.
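For a sense of the computer sizes involved, here is a rough Fermi estimate. Every constant is an assumption; published estimates for whole-brain emulation vary by several orders of magnitude depending on how much biological detail is retained.

```python
# Rough Fermi estimate of sustained compute for one whole-brain emulation.
# All three constants are assumptions, not established figures.
synapses           = 1e14  # assumed synapse count for one human brain
updates_per_second = 100   # assumed mean synaptic update rate (Hz)
flops_per_update   = 10    # assumed cost of one synaptic update

total_flops = synapses * updates_per_second * flops_per_update
print(f"~{total_flops:.0e} FLOPS sustained")  # ~1e+17 FLOPS

# For scale: a 1e18-FLOPS (exaflop) machine could run ~10 such emulations
# in real time under these assumptions.
```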
If humans have AGI, they could order the AGI to develop the tech for self-replicating factories and human uploads.
If humans have human uploads, they could run the uploads at high speed and use them as super-scientists and engineers to develop the self-replicating factories and AGIs.
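The “high speed” is where the leverage comes from; the arithmetic is trivial but worth making explicit. The 1,000x figure below is purely an assumed emulation speed, not a known capability.

```python
# Subjective research time gained by running an upload faster than biology.
speedup          = 1_000  # assumed emulation speed relative to a biological brain
wall_clock_years = 1
print(f"{wall_clock_years} wall-clock year ~ {speedup * wall_clock_years:,} "
      "subjective research-years")
```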
Once humans have all three, the human uploads provide the direction for their AGI assistants to solve problems that human neural architectures have difficulty with, and they use their self-replicating factories to make more computers to run themselves and their AGI cousins on. The self-replicating factories are improved by the AGI and uploads, and the extra computers made by the factories make the AGI/uploads better, who make the factories better, and so on.
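The simplest mathematical cartoon of this loop, offered purely as an illustration: if the growth rate of total capability is proportional to capability squared (because each unit of factories/AGI/uploads raises the productivity of every other unit), you get hyperbolic rather than exponential growth, which diverges in finite time; that finite-time blow-up is the literal sense of “singularity”. The constants are arbitrary assumptions.

```python
# Toy model: capability x obeys dx/dt = k * x**2, because each unit of
# capability (factories, AGI, uploads) raises the productivity of every
# other unit. Closed form: x(t) = x0 / (1 - k*x0*t), which diverges at
# the finite time t = 1/(k*x0) -- unlike ordinary exponential growth.
# k, x0, dt and the cutoff are arbitrary illustrative assumptions.
k, x0, dt = 0.01, 1.0, 0.1
x, t = x0, 0.0

while x < 1e12:              # stop once growth is effectively vertical
    x += k * x * x * dt      # forward-Euler step of dx/dt = k*x**2
    t += dt

print(f"capability passes 1e12 at t ~ {t:.1f} "
      f"(analytic blow-up at t = {1 / (k * x0):.0f})")
```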
Ultimately this kind of feedback loop can only rationally end in all of the material resources available locally in Sol being converted into machinery and computers.