Intel's Mind-Interface "Reads" 1,000 Words

Intel Human Brain Project

We've come a long way from the keyboard-and-mouse-only computer interface. Touchscreens, speech recognition, and even typing systems that track eye and muscle movements now aid interaction. But what if we could transmit our thoughts directly to a computer, without any sort of implanted chip? Intel's Human Brain project, a collaboration with Carnegie Mellon University and the University of Pittsburgh, is attempting to do just that.

The ambitious project (see our report from last year) uses EEG, fMRI, and magnetoencephalography to deduce what a subject is thinking about based on their pattern of neural activity. The process is still fairly primitive: it works only with concrete nouns within a 1,000-word vocabulary, and it can only distinguish between two nouns at a time. In other words, the algorithm can't yet deduce on its own that a user is thinking of the word "arm," but it can figure out whether a user is thinking of "arm" or "shirt" (or any other pair of nouns). Intel tells us the algorithm is accurate nine times out of ten.
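To make the "two nouns at a time" idea concrete, here's a minimal Python sketch of pairwise decoding on synthetic data. Everything in it is an illustrative assumption: the feature vectors, the noun "signatures," and the correlation rule are stand-ins, not Intel's or CMU's actual pipeline.

```python
# Conceptual sketch only: a pairwise "which noun?" decoder on synthetic data.
# The signatures and the nearest-correlation rule are hypothetical, chosen to
# illustrate two-way classification, not the researchers' actual method.
import numpy as np

rng = np.random.default_rng(0)
N_FEATURES = 500  # stand-in for voxel/sensor activity measurements

# Hypothetical "signature" patterns, as if learned from training scans.
signatures = {
    "arm": rng.normal(0.0, 1.0, N_FEATURES),
    "shirt": rng.normal(0.0, 1.0, N_FEATURES),
}

def decode_pairwise(scan, noun_a, noun_b):
    """Pick whichever of the two candidate nouns correlates better with the scan."""
    corr_a = np.corrcoef(scan, signatures[noun_a])[0, 1]
    corr_b = np.corrcoef(scan, signatures[noun_b])[0, 1]
    return noun_a if corr_a >= corr_b else noun_b

# Simulate a new scan of a subject thinking "arm": its signature plus noise.
scan = signatures["arm"] + rng.normal(0.0, 0.8, N_FEATURES)
print(decode_pairwise(scan, "arm", "shirt"))  # usually prints "arm"
```

The key limitation the article describes falls out of this structure: the decoder never answers "what noun is this?" in the open, only "is this scan more like noun A or noun B?"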

Eventually, Intel hopes that users will be able to slip on a headset like the one pictured above to control computing devices. So instead of manually typing "fastcompany.com" to reach this website, we could think to ourselves, "Fast Company Magazine." And instead of typing out articles, journalists could simply think about what they want to write.

Of course, the technology is nowhere near commercial availability, and Intel isn't the only company exploring mentally controlled electronics. Intel researchers estimate that, in the best-case scenario, their tech could be ready in a decade; if things don't go as planned, it may never arrive at all. But just think how much easier this could make life for the computer illiterate (and the Stephen Hawkings) among us.

Ariel Schwartz can be reached on Twitter or by email.
