The recent Time Magazine article Thought Control (subscription required) describes what is essentially another brain-computer interface. What’s novel about this device is that the EEG signal is monitored from dry electrodes on the arm or leg. The BodyWave® Brain Wave Monitoring (pdf) system developed by Freer Logic claims to allow measurement of brain wave activity away from the head:
BodyWave simply views brain energy as a field, collects the field energy as if the brain were a radio tower broadcasting from the brain and through the body.
For the purposes of teaching “stress control, increase attention, and facilitate peak mental performance”, this may well be an adequate method. Not having to wear the more traditional EEG head gear is certainly an advantage. Providing reliable control of computer interaction tasks via either “mind reading” method is not likely to happen any time soon (see Turning the Mind into a Joystick).
More “mind reading” hyperbole in today’s New York Times Magazine: The Cyborg in Us All.
I’ve talked about EEG-related technology many times in the past. Here are some quotes from the article:
This creates a pulse in his brain that travels through the wires into a computer. Thus, a thought becomes a software command.
We’re close to being able to reconstruct the actual music heard in the brain and play it.
… a “telepathy helmet” that would allow soldiers to beam thoughts to one another.
The NeuralPhone was meant to demonstrate that one day we might mind-control the contact lists on our phones.
The general public has two reactions when the lay press publishes this kind of stuff:
- I always knew this would come true. I.e. perpetuation of scientific fantasies.
- This is really scary stuff. I don’t want anybody reading my mind — or worse, controlling it.
If you know anything about the underlying techniques and algorithms, you also know that “mind reading” and useful brain-controlled interfaces are a long way off. Because the article fails to provide any sort of time-frame perspective, why wouldn't a reader think these capabilities exist now?
The real problem I have with these kinds of articles is that this is important work that could potentially improve the quality of life for many disabled individuals. Hyping it up to be something it’s not doesn’t help anyone.
One more quote:
“This is freaky.” And it was.
Huh? … I think the NYT needs to improve their editorial oversight.
There have been some interesting EEG related stories lately:
I’ve followed BCI: Brain Computer Interface and EEG work for a long time. There is still a long way to go on the “mind reading” front, but these types of developments are all encouraging.
The IntendiX (by g.tec) is a BCI device that uses visual evoked potentials to “type” messages on a keyboard.
The system is based on visually evoked EEG potentials (VEP/P300) and enables the user to sequentially select characters from a keyboard-like matrix on the screen just by paying attention to the target for several seconds.
P300 refers to the event-related potential deflection, averaged across trials, that occurs between 300 and 600 ms after a stimulus. This is a BCI research platform that has been made into a commercial reality. The system includes useful real-life features:
Besides writing a text the patient can also use the system to trigger an alarm, let the computer speak the written text, print out or copy the text into an e-mail or to send commands to external devices.
I’m usually skeptical of “mind reading” device claims (e.g. here), but P300-based technology has many years of solid research behind it. It may be pricey ($12,250) and typing 5 to 10 characters per minute may not sound very exciting, but this device would be a huge leap for disabled patients that have the cognitive ability but no other way of communicating.
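The principle behind P300 detection is worth sketching: the deflection is buried in noise on any single trial, but averaging many epochs time-locked to each stimulus shrinks the noise while the response to the attended target remains. A minimal illustration with synthetic data (the sampling rate, amplitudes, and noise levels here are made up for the example, not taken from the IntendiX system):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                                 # sampling rate in Hz (illustrative)
epoch = np.arange(int(0.8 * fs)) / fs    # 0..800 ms window after each stimulus

def p300_template(t):
    # Idealized positive deflection peaking ~400 ms post-stimulus
    return 4.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))

def simulate_epochs(n, attended):
    # Every epoch is noise; attended stimuli also carry the P300 deflection
    noise = rng.normal(0, 5.0, size=(n, epoch.size))
    return noise + (p300_template(epoch) if attended else 0.0)

# Averaging across trials: noise shrinks as 1/sqrt(n), the P300 remains
target_avg = simulate_epochs(200, attended=True).mean(axis=0)
nontarget_avg = simulate_epochs(200, attended=False).mean(axis=0)

# The attended row/column is the one whose averaged epoch peaks highest
window = (epoch >= 0.3) & (epoch <= 0.6)
print(target_avg[window].max() > nontarget_avg[window].max())  # expect True
```

This is also why the speller is slow: each character selection needs many repeated flashes of the rows and columns before the average is clean enough to pick a winner.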
(hat tip: medGadget)
UPDATE (3/24/10): Mind Speller Lets Users Communicate with Thought
As announced at a recent MIT workshop: The BCI X PRIZE: This Time It’s Inner Space:
The Brain-Computer Interface (BCI) X PRIZE will reward nothing less than a team that provides vision to the blind, new bodies to disabled people, and perhaps even a geographical “sixth sense” akin to a GPS iPhone app in the brain.
As I’ve discussed many times (e.g. BCI: Brain Computer Interface), “mind reading” with EEG is a huge challenge. Another hurdle they have to overcome:
The foundation must court donors to make the $10 million+ prize a reality. Once funding is secured,…
That will be the easy part.
The problem with the X Prize incentive approach is one of expectations. If people believe that Avatar-like advances (“new bodies”) are a realistic result, they will be sorely disappointed.
Even though I’m a certified “mind reading” skeptic I think great BCI strides will inevitably be made. The good news is that these innovations will provide numerous benefits for handicapped individuals.
UPDATE (2/5/10): Here’s a great example: Technology Behind Second Sight Retinal Prosthesis
In today’s New York Times business section there’s a piece called: Moving Mountains With the Brain, Not a Joystick. I’ve previously discussed both of the mentioned EEG-based headsets here. The article highlights some of the problems that this type of technology will face in the consumer marketplace:
“Not all people are able to display the mental activity necessary to move an object on a screen,” he said. “Some people may not be able to imagine movement in a way that EEG can detect.”
I agree. Even though Emotiv claims that “all 200 testers of the headset had indeed been able to move on-screen objects mentally,” it’s very doubtful that the device will have that level of success (100%!) with real gamers.

The article also discusses the use of facial muscle activity (EMG) in addition to the EEG signal. With proper electrode placement, I think EMG holds far more promise for enhancing the gaming experience. Even EOG could be used effectively as a feedback and control mechanism. Reliable EEG processing for this purpose is still a long way off.
UPDATE (6/17/08): More of the same: No Paralysis in Second Life
UPDATE (6/29/08): Here’s a pretty good description of how these devices are being used for control purposes: OCZ’s Neural Impulse Actuator (The flying car of control schemes).
UPDATE (7/21/08): An even more thorough evaluation: OCZ NIA Brain-Computer Interface. A generally positive and realistic assessment of the technology:
…the NIA isn’t a replacement for traditional input methods, it is merely a powerful supplement.
I guess I’m a sucker for EEG related technology (see all my HCI posts). So when I run across an article like A baseball cap that reads your mind I can’t help but comment on it.
Unlike other “mind reading” systems that make unrealistic claims, I can see this research and wireless technology leading to something quite useful. The ability to discern closed eyes and drowsiness by the presence of alpha waves (8-12 Hz) in human EEG is well known.
Developing an affordable product that provides a timely audible alert to a driver that’s about to fall asleep could have a huge impact. From (beware, this is a PowerPoint presentation) Fatigue and Automobile Accident Risk:
The US Department of Transportation estimates that 100,000 accidents reported are due to drowsiness and/or fatigue. These crashes result in 1550 deaths annually (4% of traffic fatalities) and $12.5 billion in monetary losses.
Even the annoyance of false alerts would be worth the lives saved. And of course it’s convenient that a lot of truck drivers already wear baseball caps.
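The alpha-wave check behind such a cap is conceptually simple: estimate how much of the signal's spectral power falls in the 8-12 Hz band, and alert when that fraction stays high. A toy sketch with synthetic signals (the sampling rate, amplitudes, and threshold are illustrative assumptions, not details of the actual product):

```python
import numpy as np

fs = 128                              # sampling rate in Hz (illustrative)
t = np.arange(0, 4, 1 / fs)          # 4-second analysis window
rng = np.random.default_rng(1)

def alpha_ratio(x):
    # Fraction of (non-DC) spectral power in the 8-12 Hz alpha band
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].sum() / spectrum[freqs > 0].sum()

# Eyes closed / drowsy: a strong ~10 Hz rhythm rides on top of noise
drowsy = 3.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1.0, t.size)
# Alert: broadband noise only
alert = rng.normal(0, 1.0, t.size)

THRESHOLD = 0.3                       # illustrative alert threshold
print(alpha_ratio(drowsy) > THRESHOLD, alpha_ratio(alert) > THRESHOLD)
```

A real system would of course need artifact rejection and per-user calibration, but the underlying physiology (alpha rising with eye closure and drowsiness) is, as noted above, well established.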
The New York Times had a couple of articles over the last few days that deal with short-term memory loss.
David Brooks’ 11-Apr-08 commentary called The Great Forgetting summarizes the Bad Memory Century best with:
In the era of an aging population, memory is the new sex.
In addition to taxes and death, aging is something that none of us can avoid. I was born in the later part of the baby boom, so over the last few years I have become acutely aware of these short-term memory challenges. Remembering to write down thoughts and lists has become essential. And I’m still young, relatively speaking anyway. Like the other inevitables, you never think it’s going to happen to you. But it does!
The 13-Apr-08 Sunday Magazine’s Idea Lab Total Recall speculates about embedding a computer chip in the brain in order to improve short-term memory. I know that short-term memory loss is a problem, but who would have guessed that “sky divers have been known to forget to pull their ripcords — accounting, by one estimate, for approximately 6 percent of sky-diving fatalities.” !!
Interestingly, the brain is a particularly effective associative memory system:
…, studies suggest that if you learn a word while you happen to be slouching, you’ll be better able to remember that word at a later time if you are slouching than if you happen to be standing upright.
Neural prosthetics are a long way off (see here), but the concept of embedding a Google search engine in your brain is certainly intriguing — and scary to most.
In the meantime, you can follow the suggestions in How to Cope With Short Term Memory Problems.
The human-computer interface (HCI) will continue to be a major challenge for the future. The iPoint Presenter is an approach that makes a lot of sense. It’s been depicted as the future of computer interaction in movies like Minority Report and could easily be imagined as the next generation Wii.
Unlike the EEG-based “mind reading” devices that I’ve discussed before, this technology could be made affordable and reliable, so it holds much more promise. Plus, it’s very cool.
This is HCI related anyway: University of Bremen’s Brain-Computer Interface: The future world is here. This is an interesting approach for helping the disabled. LEDs are flashed at specific frequencies, which causes the visual cortex to respond in a corresponding manner. When the person looks at one LED or another, the EEG response is detected and initiates the desired activity or makes the associated selection (e.g. letters or numbers). The communication rate is slow, but this is a realistic technique nevertheless.
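The detection step described above amounts to picking the candidate LED frequency with the strongest spectral response in the EEG. A minimal sketch with synthetic data (the flicker frequencies and signal parameters here are hypothetical, not the Bremen system's actual values):

```python
import numpy as np

fs = 256                                  # sampling rate in Hz (illustrative)
t = np.arange(0, 2, 1 / fs)               # 2-second EEG segment
led_freqs = [13.0, 14.0, 15.0, 16.0]      # hypothetical LED flicker rates
rng = np.random.default_rng(2)

def classify_ssvep(x):
    # Choose the LED whose flicker frequency carries the most spectral power
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, 1 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in led_freqs]
    return led_freqs[int(np.argmax(powers))]

# Simulate a user gazing at the 15 Hz LED: visual cortex entrains at 15 Hz
eeg = 2.0 * np.sin(2 * np.pi * 15.0 * t) + rng.normal(0, 1.0, t.size)
print(classify_ssvep(eeg))  # expect 15.0
```

Mapping each LED to a letter group or command then gives the slow-but-reliable selection scheme the article describes.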
Via Slashdot, Emotiv has a new Brain control headset for gamers scheduled to come out later this year:
- Sensors respond to the electrical impulses behind different thoughts; enabling a user’s brain to influence gameplay directly
- Conscious thoughts, facial expressions, and non-conscious emotions can all be detected
- Gyroscope enables a cursor or camera to be controlled by head movements
- The headset uses wi-fi to connect to a computer
I’ve discussed this type of Mind Reading Software before (and here).
Interpreting motion and/or motor signals (facial expressions) is one thing. I’d love to know what type of EEG processing will be used to detect conscious thoughts and non-conscious emotions. At the very least, this type of quantitative EEG analysis has to begin with high quality EEG signals. I don’t know how this can be done from an unprepared scalp and electrodes that are applied without gel.
Via Slashdot, here (and here) is another game controller from OCZ (the Neural Impulse Actuator (NIA) is not currently a listed product). The NIA works by “… reading biopotentials. These include activities of the brain, the autonomous nervous system and muscle.” You can’t help but be skeptical about the value of this technology for these purposes.
More information on the Emotiv device: ‘Mind Gaming’ Could Enter Market This Year.