U.S. military funds research into mind-reading technology

Story here.

Meh. They’re thinking too small.

So, could this work? Could it lead to the development of real mechanically assisted telepathy?

No, unless your definition of telepathy is very, very loose.

Saw this article on CNN yesterday and got a good chuckle out of it. I don’t think this is anything like ‘mechanically assisted telepathy’…more like a machine-to-mind interface. I’ve seen stuff like this on Discovery and such (a paraplegic kid who can control a cursor using a mind-to-computer interface, for example). One of the hardest things they talked about was trying to translate or interpret the signals so the software can do what the user wants. I remember seeing a chimp hooked up to such a machine that seemed to be working pretty well…it was letting the chimp play a video game completely hands free, using just his mind.

Could it work? Sure…eventually. But it ain’t gonna be telepathy.

-XT

I know, but that would be a necessary first step, like inventing the Leyden jar before the electric motor or the telegraph.

I don’t think it will ever be telepathy or mind reading in the traditional sense…more like an interface, something like the skull-phone from Vatta’s War or the neural interface from March to the Stars.

-XT

The Soviets had this for their Firefox fighter jets more than 20 years ago. And we’re just now catching up?

Yeah, but that only worked because of Clint Eastwood’s remarkable command of the Russian language. You can’t expect everyone to be as good as him.

Actually I think it would qualify as telepathy, at least of a limited sort. No, you couldn’t transfer memories, but being able to receive speech and images (possibly sensations and movement commands with some kind of bodysuit interface) seems pretty telepathic to me. What’s sometimes called a “surface reader” telepath in science fiction.

I’m not sure about the receive part though…or if you’d get clear ‘pictures’ from the reader. It seems to me the technology is essentially used in a similar way to how you move your hand. You think a certain way, it forms a certain brain wave pattern and your hand moves. The only difference is that instead of your hand moving it’s a cursor on the screen (or maybe a weapon firing or some other mechanical action). I suppose if they could ever map what all the surface brain activity means you could get some kind of picture out of it, but putting something directly into the brain (like a picture or thought or whatever) would be a lot more difficult I imagine. Even mapping the surface brain waves is light years beyond what we can do now…at least according to that show I mentioned earlier.
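Just to show the kind of thing I mean (a totally made-up sketch…the sample rate, the thresholds, and the whole ‘calibration’ are invented for illustration), the software isn’t reading thoughts, it’s matching one learned wave pattern to one canned action:

```python
# Totally made-up sketch (not a real BCI pipeline): classify a one-second
# "EEG" window into a cursor command by comparing alpha-band (8-12 Hz)
# power against thresholds learned during an imagined calibration phase.
import numpy as np

FS = 256  # assumed sample rate in Hz

def alpha_power(window: np.ndarray) -> float:
    """Average spectral power in the 8-12 Hz alpha band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return float(spectrum[band].mean())

def classify(window: np.ndarray, low: float, high: float) -> str:
    """Map one learned wave pattern to one canned action -- nothing more."""
    p = alpha_power(window)
    if p < low:
        return "cursor_left"
    if p > high:
        return "cursor_right"
    return "rest"

# Simulated one-second window: a 10 Hz "alpha" sine plus noise.
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(FS)
print(classify(window, low=5.0, high=50.0))  # -> cursor_right
```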

-XT

So what exactly is telepathy then?

A machine that can read your brainwaves certainly seems like a telepathic machine to me.

A friend of mine was using an EEG helmet to control a MIDI sequencer for a while.
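Conceptually the mapping is about this simple (a toy sketch with invented numbers; a real rig would use something like the mido library to actually send the message to the sequencer):

```python
# Toy sketch of the EEG-to-MIDI idea: rescale a band-power estimate
# (however it was measured) into a 0-127 MIDI control-change value.
# The power range here is invented; a real rig would calibrate it.
def to_midi_cc(power: float, p_min: float, p_max: float) -> int:
    frac = (power - p_min) / (p_max - p_min)
    frac = min(max(frac, 0.0), 1.0)  # clamp to [0, 1]
    return round(127 * frac)

# e.g. an alpha-band power of 3200 units on an invented 0-5000 scale:
print(to_midi_cc(3200.0, p_min=0.0, p_max=5000.0))  # -> 81
```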

Yes; basically, as I see it, the person receiving the input is the “telepath”. Or to look at it another way, the scanner wearer would be a “projective surface telepath”.

From what I’ve read over the years, we CAN map the brain’s surface activity; it’s just real crude at present, and requires invasive techniques to get the better resolutions. A “he’s imagining a black circle” sort of thing. In theory you can get a lot of data, though; visual imagination is projected onto the brain’s surface, for example. And brain implants or a glass skull (the latter has been done with monkeys) are a bit farther than most people would be willing to go, short of for replacement limbs and such.
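That “he’s imagining a black circle” level of decoding is basically template matching. A crude, hypothetical sketch (the 8x8 “activity maps” and the noise level are invented):

```python
# Hypothetical sketch of crude visual-imagery decoding: compare a
# flattened "cortical activity map" against stored templates and
# report the closest match. The 8x8 maps and noise are invented.
import numpy as np

def make_circle(n=8):
    y, x = np.mgrid[0:n, 0:n]
    r = np.hypot(x - (n - 1) / 2, y - (n - 1) / 2)
    return (r < n / 3).astype(float)

def make_square(n=8):
    m = np.zeros((n, n))
    m[2:6, 2:6] = 1.0
    return m

templates = {"circle": make_circle(), "square": make_square()}

def decode(activity: np.ndarray) -> str:
    """Return the stored template closest to the measured map."""
    return min(templates, key=lambda k: np.linalg.norm(activity - templates[k]))

# "Scan" someone imagining a circle, with measurement noise added.
rng = np.random.default_rng(0)
scan = make_circle() + 0.3 * rng.standard_normal((8, 8))
print(decode(scan))  # -> circle (usually)
```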

Direct (mind-to-mind) communication. This wouldn’t be anything like that.

If I cut off your arm and put on a cuff that can detect the electromagnetic pulses from your muscles, is that the same as mind reading? Put another way, if I am reading your brain waves via a machine set up for certain patterns, do you really think it’s also going to detect your thoughts of ‘Nice rack on that nurse!’ with full visuals? Not a chance. It’s just looking for a certain pattern that it knows means something like ‘move arm’ or ‘wiggle ear’…the only things it’s going to be able to ‘read’ are patterns it has learned, and not all that complex of patterns at that. And I know of no way to make the communication two-way in any case…that would entail putting waves readable by someone’s brain back INTO a brain.
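Put in code terms (a hypothetical sketch…the ‘signatures’ are just made-up feature vectors), the logic is no smarter than this; anything that doesn’t closely match a trained pattern simply means nothing to the machine:

```python
# Hypothetical sketch: the machine only "reads" patterns it was trained
# on. An incoming feature vector either matches a stored signature
# within a tolerance, or it means nothing at all to the software.
import numpy as np

# Invented "signatures" learned during training.
LEARNED = {
    "move_arm": np.array([0.9, 0.1, 0.4]),
    "wiggle_ear": np.array([0.2, 0.8, 0.7]),
}
TOLERANCE = 0.3  # invented match threshold

def interpret(features: np.ndarray) -> str:
    best = min(LEARNED, key=lambda k: np.linalg.norm(features - LEARNED[k]))
    if np.linalg.norm(features - LEARNED[best]) <= TOLERANCE:
        return best
    return "unrecognized"  # e.g. thoughts about the nurse: invisible

print(interpret(np.array([0.85, 0.15, 0.45])))  # -> move_arm
print(interpret(np.array([0.1, 0.2, 0.1])))     # -> unrecognized
```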

Sure…you can do the same thing with artificial limbs these days. Have them detect electromagnetic patterns to move a limb or whatever. That’s a far cry from a machine that will read your mind though.

-XT

I don’t see how you could project ‘thoughts’ back to a person, but I’m certainly no expert here. I also think that anything more than basic patterns tied into motor control functions is going to be beyond any foreseeable future technology. I can see thinking to fire a weapon or move an actuator remotely or something like that…but looking into someone’s brain and detecting what they are thinking is, I think, beyond our capabilities. And projecting thoughts back into someone seems yet another huge leap.

From what I understand we can detect the patterns of waves pretty easily these days. It’s interpreting them that’s the show stopper. Even getting the software tuned to move a cursor on a computer screen via simple brain patterns consistently seems pretty challenging. I think the limb thing is a LOT easier and definitely well within our current means. I understand the Army has made some pretty amazing strides due to all the folks injured in the current fuckups overseas…maybe the only good thing to come out of all that stuff.
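Part of that tuning is just fighting noise…from what I gather, even something as dumb as this invented example (a sliding majority vote over the classifier’s recent outputs) is the kind of smoothing you need to keep the cursor from jittering:

```python
# Toy illustration of one reason consistent cursor control is hard:
# per-window classifications are noisy, so the output gets smoothed.
# Here, a sliding majority vote over the last few decisions.
from collections import Counter, deque

def smooth(decisions, window=5):
    """Yield the majority decision over a sliding window."""
    recent = deque(maxlen=window)
    for d in decisions:
        recent.append(d)
        yield Counter(recent).most_common(1)[0][0]

# Noisy raw classifier output: mostly "left" with spurious flips.
raw = ["left", "left", "right", "left", "rest", "left", "left"]
print(list(smooth(raw)))
# -> ['left', 'left', 'left', 'left', 'left', 'left', 'left']
```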

-XT

Abstractly speaking, why couldn’t a general-purpose “brain-speaker” be developed, to just rattle off stream-of-consciousness babble of all your thoughts and to show any mental imagery you cook up on a few TV screens? Presumably all the information is there inside the skull to be mined; the only problem (and not a trivial one, to be sure) would be organizing and filtering the data. Especially if you only wanted conscious thoughts. But difficult as it might be, one would think that it could theoretically be done.
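In the most hand-wavy terms possible, the filtering step might look like this (everything here is imaginary, including the notion that a decoder could attach a ‘salience’ score to each decoded fragment):

```python
# Purely imaginary sketch of the "brain-speaker" filtering problem:
# given a firehose of decoded fragments, each with a made-up salience
# score, keep only the ones loud enough to count as conscious thought.
stream = [
    ("coffee", 0.91), ("itchy sock", 0.12), ("that meeting at 3", 0.78),
    ("song stuck in head", 0.35), ("coffee", 0.88),
]

THRESHOLD = 0.5  # invented cutoff for "conscious" vs. background noise

def conscious_only(fragments, threshold=THRESHOLD):
    return [text for text, salience in fragments if salience >= threshold]

print(conscious_only(stream))
# -> ['coffee', 'that meeting at 3', 'coffee']
```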

I do agree that it wouldn’t be (mechanically-assisted) telepathy, though, unless we also devised a way to transmit the sounds/images/thoughts into another person’s brain, which is a whole separate problem from simply observing them.

Is this product along the lines of what the OP is describing?

Can’t remember the program, but the military was working on using brainwave interfaces to take some of the workload out of cockpits.

Found this on military research into binoculars.

The technology is an interesting thought, but I think a $4M grant is hardly enough to scratch the surface of the research. I’m more interested in improvements in current technology, like what the Navy is doing with their GPS systems: http://www.bizjournals.com/washington/stories/2008/08/25/daily17.html

The gap between discerning the content of thoughts and measuring some of their physical correlates (fMRI, EEG, PET, etc.) is enormous.

You can park yourself outside an accounting firm that occupies a 100-story office building all day and all night long. You might be able to figure out which workers are burning the midnight oil until 3:00 AM when it’s tax season for a certain corporation, and you might figure out which people work on other accounting projects, but just because you look at the patterns of which lights are on when doesn’t mean you know the GAAP rules for accounting.

But this is a lot more like picking up radio waves from unshielded computers and reconstructing the monitor images & keystrokes; not like looking at windows.

Not really. All of the “brain reading” technologies we have developed lend insight into the anatomical basis of psychology, but they don’t give us any idea of the content of the thoughts. I think the analogy to an office building where you’re looking to see which lights are on really is apt. All of our technologies describe the activity of anatomical components, like a certain voxel in an MRI volume or a cortical region, with varying levels of temporal resolution, but none of them reveal the specific content of a thought. We really are just looking in the windows to see who’s in at what time.

Please research the technologies I’ve mentioned previously if you still have questions on the point.

I won’t claim that interpreting the content of thoughts is an intractable problem in neuroscience, but I can assure you that we aren’t anywhere near that nor are we making significant progress towards that end.

Going from squiggly lines on a screen to mentally controlling robotic arms and being able to tell if someone is thinking of a (simple) shape is quite a lot of progress. And much more detailed than your ‘office windows’ analogy suggests.