New research developed at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) lets a person control a robotic arm with brainwaves and subtle hand gestures. It’s the stuff of X-Men, manga robots, and joystick-piloted spaceships–except, in this case, it’s quite real.
The research, which will be presented at the Robotics: Science and Systems conference in Pittsburgh next week, aims to create a link between human and machine that feels transparent and natural. The idea is an interface that acts like an extension of a person’s will, without requiring training or memorized mental commands. CSAIL director and project supervisor Daniela Rus says in a statement that the goal is to “move away from a world where people have to adapt to the constraints of machines . . . to develop robotic systems that are a more natural and intuitive extension of us.”
This isn’t the first time scientists have tried to control robots with brainwaves, but older systems fell short on accuracy. Past approaches required the person to learn and practice each new command: you would have to mentally associate a graphic pattern with a robotic command like “go to the left” or “go to the right.” It was the equivalent of asking a person to learn a new language. And even once you had, you would have to concentrate hard on that graphic pattern to get the robot to execute the command, based on the EEG patterns generated by your brain and recorded via electrodes placed on your scalp. All that mental effort still yielded only about 70% accuracy. “Such approaches are difficult for people to handle reliably, especially if they work in fields like construction or navigation that already require intense concentration,” the authors point out.
The new work avoids those pitfalls by combining EEG with something called electromyography, or EMG–a technique that monitors the electrical activity of skeletal muscles, in this case on the arm. This hybrid method lets the person communicate with the robot both by thinking and by gesturing, making control much more efficient. According to co-lead author Joseph DelPreto, who developed the work alongside Rus, former CSAIL postdoctoral associate Andres F. Salazar-Gomez, former CSAIL research scientist Stephanie Gil, research scholar Ramin M. Hasani, and Boston University professor Frank H. Guenther, it’s “more like communicating with another person.”
The team demonstrated the system using Baxter, a humanoid robot manufactured by Boston-based Rethink Robotics. The person–the robot’s director–was connected to the computer controlling Baxter via scalp electrodes that monitored EEG activity and forearm electrodes that monitored EMG activity.
Baxter was equipped with a power drill and given a task: to reach three drilling targets on a mock-up of a plane fuselage. Without any training, Baxter would select a target and go for it. If the person noticed that it had selected the wrong drilling location, they would simply think about the mistake to stop Baxter, then gently gesture toward the correct hole. In the process, Baxter learned what kind of brain and muscle electrical activity corresponded to each target. In other words, as DelPreto explains, it’s the robot that learns from the way the user thinks, rather than the user having to learn the robot’s language.
The system also monitors something called error-related potentials–a type of brain signal that occurs naturally when we detect a mistake. If such a signal occurs, Baxter stops so its human pilot can correct the mistake with a quick gesture.
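For readers curious how such a loop might hang together, here is a minimal sketch (not the authors’ code) of the supervision cycle described above: the robot picks a target on its own, an EEG monitor watches for an error-related potential, and, if one fires, an EMG gesture redirects the arm. The signal shapes, thresholds, and target names are invented for illustration.

```python
# Toy sketch of the ErrP-plus-gesture supervision loop.
# All thresholds and signal formats are hypothetical.

def errp_detected(eeg_window, threshold=2.0):
    """Toy ErrP detector: flag a large negative deflection in the EEG window."""
    return min(eeg_window) < -threshold

def gesture_direction(emg_left, emg_right):
    """Toy EMG classifier: compare muscle activity on either side of the forearm."""
    return "left" if sum(emg_left) > sum(emg_right) else "right"

def supervise(selected_target, eeg_window, emg_left, emg_right, targets):
    """One supervision step: keep the robot's choice unless an ErrP fires,
    in which case the human's gesture shifts the arm to a neighboring target."""
    if not errp_detected(eeg_window):
        return selected_target  # no error signal: let the robot proceed
    idx = targets.index(selected_target)
    if gesture_direction(emg_left, emg_right) == "left":
        idx = max(0, idx - 1)
    else:
        idx = min(len(targets) - 1, idx + 1)
    return targets[idx]

targets = ["hole_A", "hole_B", "hole_C"]
# Calm EEG: the robot's own choice stands.
print(supervise("hole_B", [0.1, -0.3, 0.2], [1, 1], [5, 5], targets))  # hole_B
# Strong negative deflection (ErrP) plus a rightward gesture: move to hole_C.
print(supervise("hole_B", [0.1, -3.5, 0.2], [1, 1], [5, 5], targets))  # hole_C
```

The real system classifies far richer EEG and EMG streams, but the control flow is the same: the human only intervenes when something looks wrong, which is what keeps the interaction low-effort.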
The difference between the success rates of the old EEG-only systems and the new hybrid method was striking: accuracy jumped from 70% to 97%, near perfect and without training. That’s a giant step toward a future in which the user interface disappears in favor of simply thinking. The MIT team imagines that one day this type of technology could be “useful for the elderly, or workers with language disorders or limited mobility.” Also, giant manga robots.