Given that it could research improved versions of AI far faster than we could, yes it would. That sort of upload is in fact one of the common examples given of something that would likely trigger a Singularity.
Technological advance is exponential. Now, the thing with exponentials is that it looks like we’re currently advancing at breakneck speed, while every era before now was almost completely static and boring… But it would also have looked like that a thousand years ago, and will also look like it a thousand years hence. In other words, you could argue that the Singularity is 30 or 40 years away, but it’s always been 30 or 40 years away, and always will be.
People often like to talk about technological advancements which themselves increase the rate of technological advance, like AIs smarter than we are producing AIs that are smarter yet. But you get that sort of effect with all technology, which is what causes the exponential advance in the first place.
I’m fixated on the “imponderable” aspect. There have certainly been inventions that made future tech “imponderable”: the wheel, metalworking, the combustion engine, computers, etc. The OP is asking if we’re on the precipice of another game changer. I don’t think so. We’ve got too many current technologies that need improving on. IMO there are just too many dollars spent on improving existing technologies, and not enough on theoretical research, for another HUGE tech to be developed. Even if the science presents itself, it has to be engineered into something productive.
I was using telepathy in that instance to describe the tools we now have that make it possible to detect thoughts. The “output” I described above. I understand that we don’t have an input device, as I pointed out earlier in my post. I think when we do master direct brain input, the “singularity” will have arrived. We will be able to choose our reality.
I should have been more specific in my summing up. I thought I had made myself clear through context but obviously failed.
Well, that depends on what you consider “much of anything” - for instance, adding strong AI to our space probes would be a big win, without any other new tech. Adding strong AI to various other existing systems would also be quite useful. But the real win would be AI that could improve itself, faster and more efficiently than we could.
We don’t have any tools now that make it possible to detect thoughts. We have tools that can detect blood flow to groups of neurons and electrical fields, etc. but not thoughts.
Memory and CPU speeds are not the cause of the lack of progress in AI. When I took AI, the MIT AI Lab’s machine was a PDP-10, probably running at about 1/5000th the speed of your computer, and the memory stick you can pick up for $20 has more capacity than there was in all of MIT. I took an architecture class my senior year. Ed Fredkin came in to give us a talk, and he boldly predicted that memory would one day be a penny a bit. What a dreamer!
I remember that USA Today said, when Intel announced the 386, that a 16-bit computer would mean that AI would really work. Pretty funny, since we were on a VAX equivalent and had a Symbolics machine in our lab.
AI has attacked little pieces of the puzzle, quite successfully, but no faster machine is going to figure out the way we really think. We use the fastest current processors to design the next generation of processors, and we need them, but none of them has ever come up with a new architecture. SF writers, even Clarke, have assumed that when you build a computer big and complex enough it will become aware. Not true, alas. (Or luckily.)
Technological advance is in the form of an S-curve, which looks exponential at one point but then flattens out. Think about transportation: horse to railroad to car to airplane to rocket looks exponential, but we’ve been stalled there, and will be for quite some time. CPU clock speeds have flattened out (though not processing power). If you ignore the top part of the “S” and just extrapolate the exponential part, you get absurd things like a Singularity.
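That S-curve point can be made concrete with a toy calculation (all parameters below are arbitrary, chosen only for illustration): a logistic curve tracks a pure exponential almost exactly on the way up, then flattens at its ceiling, so extrapolating the early data can’t tell you which curve you’re on.

```python
import math

# Logistic curve: L / (1 + exp(-k*(t - t0))). For t well below the
# inflection point t0 it behaves like L * exp(k*(t - t0)); past t0 it
# flattens toward the ceiling L. Parameters here are arbitrary.
L, k, t0 = 1000.0, 1.0, 10.0

def logistic(t):
    return L / (1 + math.exp(-k * (t - t0)))

def exponential(t):
    # Pure exponential matched to the logistic's early-time behavior.
    return L * math.exp(k * (t - t0))

for t in [2, 6, 10, 14, 18]:
    print(t, round(logistic(t), 2), round(exponential(t), 2))
```

Early on (t = 2, 6) the two columns agree to a fraction of a percent; by t = 18 the exponential is thousands of times larger while the logistic has stalled just under its ceiling.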
I’m sure real AIs would help design better ones - but we need to get the first one, which seems no closer than it was when I was in college.
This is what I was thinking of
But as to your point, the scientist responsible for the breakthrough you mention had this to say about it:
But even tools like this are fairly close to what I am talking about. If this could be advanced to the point where you didn’t need the display and could just imagine the shapes, then I would have a hard time saying that wasn’t machine-assisted telepathy.
Both are great technologies (I assumed the first article was what you were referring to, because I remember the scientist using that phrase), but they are also very far from reading “thoughts” in any general sense of the word.
I have to disagree. If I can project my thoughts to another with purely mental effort (without spoken word or sign language, etc.) then I have achieved telepathy.
From Wikipedia,
True, this is only man => machine telepathy, but if I can sit and think at a machine and have it receive my thoughts, then telepathy has occurred.
But I have a hard time thinking that the aforementioned silent phone paired with a set of these is not, for all practical purposes, machine-assisted telepathy.
Nerve impulses would be detected and converted to language, broadcast via radio waves, then translated into inaudible vibrations that cause the inner ear to perceive sound. Mind-to-mind communication without resort to the traditional five senses, facilitated by technology.
Telepathy.
In the case of the voice article, it’s a mapping from nerves that are used to produce sounds, to a computer version of those sounds. Again, great technology, useful, and on the surface someone might consider the vocalization of a word the same as a thought. If so, here are some questions:
How then do we get at all of the thoughts that aren’t vocalized in this manner?
How do we extract meaning? If a person uses this technique to vocalize the word “two”, how do we know the person meant “two” and not “too”? If we don’t actually have meaning associated with the thought, did we really get a thought, or did we get an external representation of a non-unique label for the thought?
I personally wouldn’t call it telepathy (I wouldn’t call cell phones telepathy either), but you can if you want.
My primary point was that it’s not reading thoughts in a general sense of the word.
When it is a sentient machine, a human upload that can look at its own structure, I see no reason to believe that it couldn’t.
Again: claiming that strong AI is impossible is mysticism.
No, that doesn’t contradict the Singularity scenario at all. That nearly-vertical part of the curve will still be there, and the assumption is that when the curve flattens it will do so at a level far beyond human comprehension.
No, that’s not what I’m saying. What I’m saying is that the exponential growth is why there won’t be a Singularity. All points on an exponential look the same. It’s like the horizon: I can only see things that are a few miles away, but that doesn’t mean that I’ll fall off the edge of the Earth if I go further than that. As far as I can go, I’ll always see a horizon that’s just a few miles away.
Fair enough.
But assuming it… and a bunch of other stuff… is a given ain’t much better.
I disagree. We aren’t talking about a pure mathematical abstraction. To use your traveling analogy: if I walk along a slope that’s constantly steepening, eventually it’ll turn into a cliff. Or to use the example someone mentioned earlier: a situation where what was modern on Tuesday is obsolete on Wednesday is very different from one where something modern now is obsolete in a decade. We can, while remaining human, only adapt so fast; and it seems inevitable to me that there will be a top to the curve.
I would argue that the rise of the personal computer has been a mini-Singularity all by its lonesome, having a tremendous effect on how we all live, work and play. The increasing lure of the computer triggered my realization that I have to go to a gym if I want to spend as much time as I like to at a computer keyboard and remain healthy. Kids nowadays are uninterested in TV … back in the 50s and 60s they were worried that TV would turn everyone into passive zombies … but the interactivity of computers keeps pulling us in … mention was made earlier of Second Life … that stuff is incredibly compelling.
In fact, it’s much worse than reading thoughts, since converting speech for use by a computer is one of the most inefficient ways of using a computer. We don’t have Star Trek computers that can figure out what we mean no matter how we say it.
My predictions:
I think the big game changer is going to be the brain interface research that is currently advancing very quickly with surprisingly little in the way of comment and oversight.
We are now at the point of crude thought-to-speech technology and control input, and there is no indication of any immediate technological hurdles preventing further advances, especially as the technology becomes more commercially viable. We appear to be very close to responsive, point-and-click-level computer input, and as soon as that point is reached, expect the whole area of research to explode.
The bigger issue is the social inertia that will arise in response to the idea of brain implants that control computers, but I’m really not sure how big an obstacle that will be. There are plenty of teenagers today who would countenance implants that improve their MW2 response times, and likely a lot of overseas IT workers who would consider surgery that gave them an advantage in what is going to be a very competitive international labor market.
After that, if there are no unexpected technological hurdles, we are likely to see research into direct brain input, and that will be the really big game changer.
We are going to see a bit of a speed bump in computer speed soon, but I tend to think that this will only encourage people to be more creative in using software to realise the truly ludicrous untapped potential of today’s hardware. Augmented reality may take off in a big way, or people may rapidly get sick of seeing giant neon floating phalluses everywhere they go.
Another real wildcard technology is going to be 3D printing/prototyping: if the technology can find some kind of useful commercial niche at its present cost, then research will turn it into something big.
Expect intellectual property to be one of the biggest legal issues of the next few decades, though it is going to take an embarrassingly long time for our legislators to realise how important it will be. As the internet becomes more important in everyday life, cyber-crime will explode, and people will try to take a step backwards into AOL-style gated virtual communities, but this probably won’t work.
Haven’t any of you seen Biodome? Clearly the technological singularity has already arrived.