Direct Thought Control of a Computer’s Cursor: Say Hello to Your Mind-Reading PC

It’s called electrocorticography, and it involves a surgical procedure that places groups of electrodes directly onto the surface of a patient’s brain.



A science team has, for the first time, used a brain-reading electrode array to let a living, thinking human brain issue cursor commands directly to a computer.

Brain-computer interfaces have long been the stuff of science fantasy, but over the last several years the technology has quickly evolved to the point that even kids’ electronic toys incorporate basic mind-reading sensor suites. There are a few problems in deducing accurate control commands from a thought, however, and that’s something the team from Washington University has now successfully tackled with a re-purposed medical system normally used to identify the regions of an epilepsy sufferer’s brain responsible for seizures.

The trick is called electrocorticography, and it involves an invasive surgical procedure (craniotomy) that places groups of electrodes directly onto the surface of a patient’s brain. Four epilepsy patients underwent the recent experiment, which monitored the electrical signals from their brains as they sat in front of a computer screen. Their objective was to try to move the cursor in a particular direction by thinking or saying out loud particular words that had been pre-chosen: thinking “ah” would, for example, move the cursor one way.
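The article doesn’t describe the team’s actual signal-processing pipeline, but the control scheme it outlines can be pictured as a simple lookup: a decoder classifies each burst of brain activity as one of a few pre-chosen sounds, and each sound maps to a cursor movement. Here is a minimal, purely illustrative Python sketch of that idea; the phoneme labels and direction mapping are hypothetical, not taken from the study.

```python
# Hypothetical mapping from a decoded phoneme label to a cursor
# displacement (dx, dy). The real experiment's labels and directions
# are not specified in the article.
PHONEME_TO_DELTA = {
    "ah": (1, 0),   # move right
    "ee": (-1, 0),  # move left
    "oo": (0, 1),   # move up
    "eh": (0, -1),  # move down
}

def move_cursor(position, decoded_phonemes):
    """Apply a sequence of decoded phoneme commands to a cursor position."""
    x, y = position
    for phoneme in decoded_phonemes:
        dx, dy = PHONEME_TO_DELTA.get(phoneme, (0, 0))  # ignore unknown labels
        x, y = x + dx, y + dy
    return (x, y)

# Example: start at the origin and apply three decoded commands.
print(move_cursor((0, 0), ["ah", "ah", "oo"]))  # (2, 1)
```

The hard part of the real system, of course, is the decoding step that this sketch assumes away: turning raw electrode voltages into a confident phoneme label.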

The results are astounding: The team found that the computer could be controlled with up to 90% accuracy even with no pre-training of the system or the patient. In other words, the first real, if crude, mind-reading computer interface was demonstrated. Better yet, as a separate result of the experiment, the researchers identified a specific region of the brain that was needed to make the interface work, and future trials will only need a one-centimeter hole in the patient’s skull instead of far more access.

For the time being the highly invasive nature of the procedure will limit its utility to people who really need it–patients who are paralyzed, or who have lost the ability to speak. But it’s extremely likely that as the technique is refined, less invasive surgery will be needed, and the team already imagines it will be able to detect not just simple commands, but eventually what a “pure idea” may look like. In addition to helping those with disabilities, these are steps along the road toward the science-fictional “braincap” computer interface that Arthur C. Clarke dreamed up years ago. Think of the gaming and entertainment possibilities!

To read more news like this follow Kit Eaton himself and Fast Company on Twitter.
