I have a friend who has severe spastic cerebral palsy. He has a caregiver 24/7 and his mobility is extremely limited. He is unable to use a computer mouse and he uses his keyboard with a headstick with great difficulty.
His speech is almost entirely impaired (especially in the last few years) and is essentially unintelligible except to those who know him… most of his communication is done through facial gestures.
This man is highly intelligent, having read many of the cornerstone books on science and other fields as a child. He enjoys reading as much as he is able, writing poetry and watching plays.
I’ve communicated with him and his eyes brighten up tremendously when I talk about wanting to help him find a combination of hardware and software that will enable him to both use the internet and communicate more efficiently with other people. He would love to be a mentor to young people who share his very great physical and emotional challenges.
The cost of the hardware is not so important as the effectiveness and efficiency of design. If I can’t find anything currently available for sale, I already have plans to approach Google’s non-profit arm to see if I can get something happening through their Google Labs and grant programs.
This guy has a great mind, a good sense of humor, a big heart and I’m really fortunate to have a friend in him.
If anyone has any ideas or can point me in the right direction, this would mean a lot to me, my friend and possibly other people who could benefit by extension.
Incidentally, I’ve already bought him some word-prediction software through Aurora Systems. It does increase his typing speed, but even so, at about 4-5 WPM, communication is excruciatingly slow for him.
Amazing. So few views, and no replies… is there no technology available? What setup does Stephen Hawking use? My friend is essentially in the same position, though he can turn his head with effort.
There are free and open-source technologies; however, some of them will require quite a bit of training.
I suggest you first try (yourself) all the features of Universal Access in OS X. It really has a great design - figure out exactly what it can’t do that you need it to do, and then let me know what those features are.
Also, depending on how much work your friend is willing to put in, I can point you towards systems that will allow him to control his computer using “brain waves”.
(disclaimer: I study ‘Computational Cognitive Neuroscience’)
The boards seem to have fewer posters this week. I also think most people can’t help so haven’t read the thread. The interface that tracks the eyes sounds like it may be the most helpful. Maybe one of those blow tube devices will help? I hope you can find something that helps him interact in a better way. I seriously hope that.
Ah, thanks so much for your help, AlterEgo. I’ve been diligently searching on this topic for a while now, and I consider myself a pretty darn good Googler, but it’s been surprisingly tough to make progress in my research so far.
I will get back to you on his movement and muscular control abilities. We don’t live in the same city, so I’m going to have to contact his caregivers.
Out of the people he knows, I think I’m the most tech-savvy… so I’m set on at least introducing him to what is currently available, and what may be possible.
Your willingness to provide links and feedback is definitely appreciated!
I don’t know if you’ve seen this before or if there are better things out there, but Dasher seemed like a very interesting project when I last looked at it.
It combines eye-tracking with word/sentence completion… sort of like the T9 predictive typing on your cell phone, but for full sentences. For example, you might start typing “he…” and it might suggest “hello, how are you?” – that’s an example only; the full thing is not quite that sophisticated (I don’t think), but it IS predictive and DOES work with eyetrackers or any other sort of 2D pointing interface (head, arm, or foot tracking, for example). They say people have managed to reach 30 WPM with an eyetracker alone. I tried it with a mouse and it’s definitely usable…
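The basic idea behind this kind of prediction is simple: rank vocabulary words that match what's been typed so far by how often they occur. Here's a toy sketch of that prefix-based ranking (the corpus and counts are made up for illustration - this is not Dasher's actual language model, which is far more sophisticated):

```python
from collections import Counter

# Hypothetical mini-corpus; a real system would learn counts
# from large amounts of text, ideally the user's own writing.
corpus = "hello how are you hello there how is he".split()
counts = Counter(corpus)

def predict(prefix, k=3):
    """Return up to k vocabulary words starting with `prefix`,
    most frequent first (stable order breaks frequency ties)."""
    matches = [w for w in counts if w.startswith(prefix)]
    return sorted(matches, key=lambda w: -counts[w])[:k]

print(predict("h"))  # most likely completions of "h"
```

Every keystroke saved this way matters enormously at 4-5 WPM, which is why systems like Dasher put so much effort into the language model.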
There is also free, open-source eye tracking software like TrackEye that can perform eyetracking using a regular webcam. Unfortunately, I wasn’t able to test its effectiveness because it won’t work on my computer. You might have better luck; if not, there are definitely commercial solutions available.
I don’t know if they will be of any use to you, but the Spanish National Organisation for the Blind, www.once.es (in Spanish), does a lot of work for people with all kinds of disabilities; they might be a source of information.
Oh hell, I wish I could remember more details . . .
But a few months ago I saw a video (that I probably read about here) about a system that involved wearing a special cap with electrodes that could somehow communicate with certain areas of the brain. The person could then look at a screen that was flashing letters very quickly, simply THINK about the letter, and type that way, rather quickly. I believe it could also be used to control artificial body parts.
The communication function was demonstrated by both severely disabled people and the male reporter, who was amazed.
Toward the end of the video there was also a woman who had had basically an electrical socket installed in her head, and the control device could be literally plugged into it. At the time, they were letting her use it to practice controlling an electric wheelchair (I suppose by thinking “right,” “left,” “over by the door,” or whatever) – not yet one that she herself was sitting in for safety reasons, but they hoped that she could eventually use it to roll herself around. You could tell by the look on her face that she was thrilled and excited.
Wish I had more detail or a link, but maybe this will jog someone else’s memory.
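That flashing-letter setup sounds like what's usually called a "matrix speller": rows and columns of a letter grid flash in random order, and the brain produces a larger response to flashes that contain the letter the user is attending to. Averaging the response over many repetitions beats down the noise. A toy simulation of that averaging idea (the grid, noise level, and response model here are all made-up illustrations, not real EEG processing):

```python
import random

# 6x6 letter grid, as in typical matrix spellers.
GRID = ["ABCDEF", "GHIJKL", "MNOPQR", "STUVWX", "YZ1234", "56789_"]

def flash_response(is_target, rng, noise=0.5):
    """Synthetic single-flash 'brain response': a bump (+1.0) appears
    only when the flashed row/column contains the attended letter."""
    return (1.0 if is_target else 0.0) + rng.gauss(0.0, noise)

def spell_one_letter(target, n_repetitions=20, seed=0):
    """Flash every row and column n_repetitions times, sum the responses,
    and pick the row/column with the largest average response."""
    rng = random.Random(seed)
    tr = next(i for i, row in enumerate(GRID) if target in row)
    tc = GRID[tr].index(target)
    row_sum = [0.0] * 6
    col_sum = [0.0] * 6
    for _ in range(n_repetitions):
        for r in range(6):
            row_sum[r] += flash_response(r == tr, rng)
        for c in range(6):
            col_sum[c] += flash_response(c == tc, rng)
    best_r = max(range(6), key=lambda r: row_sum[r])
    best_c = max(range(6), key=lambda c: col_sum[c])
    return GRID[best_r][best_c]

print(spell_one_letter("H"))
```

The need for many repetitions per letter is exactly why these systems are slow and take training, as discussed below.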
These systems require a large amount of training. The reporter was amazed that he got it to do something, not that it made him an effective communicator. In all honesty, using facial muscles - or any muscles, for that matter - is preferable to using EEG signals. You really don’t want to start using EEG until you have to, for instance in advanced ALS (Lou Gehrig’s disease). However, since cerebral palsy is a developmental motor disorder, he may have an underdeveloped ability to perform motor control tasks. Thus, eye movements are the most promising avenue. While saccadic eye movements are a motor task, they are hard-wired at a very low level using very efficient cerebellar circuitry.
Some of the most impressive research systems that have come out recently allow you to control the endpoint of an artificial arm by reading out EEG signals. You can also do it with real-time fMRI. You can also embed electrodes into motor cortex and use that signal to control an artificial arm. You can also read out the motor goal from the supplementary motor area (a higher abstraction of motor cortex - a goal-oriented region) and use that. You can also provide a proprioceptive feedback signal for a motor-cortex-controlled artificial arm by attaching some electrodes to nerves in the chest near the shoulder. This allows the patient to embody the arm - they feel like they own it.
None of these avenues are as promising as eye movements. As long as you can perform “smooth pursuit” eye movements it is your best last chance - for now.
Thanks - the vocal cords aren’t really that important as long as he has good eye movements. I just remember a patient with severe ALS that they were only able to communicate with by putting a pH meter in his mouth and then having him think of lemon for yes and milk for no, which made a noticeable difference in the pH of his saliva. That is really a severe case.