How Wicab's BrainPort Technology Gives Sight To The Blind

Goal: Give sight to the blind


Project: Wicab's "BrainPort"


THESIS
Seeing happens in the brain, where visual information is processed. So if a person's eyes don't work, visual information can be channeled to the brain through another body part—like the tongue.

METHOD
A camera image must be transformed into something users can feel—and then "see" inside their heads. "It's like drawing a picture on your tongue," says Wicab CEO Robert Beckman.

RESULTS
The brain almost immediately processes electrical impulses as images. But the images are grainy; learning to use them takes practice. After 25 hours of training, test subjects have been able to differentiate objects, like a mug from a Magic Marker, with 80% accuracy; read individual words on flash cards; and navigate a 15-foot hallway.

REMAINING CHALLENGES

1. Improve hardware
The BrainPort has a handheld remote, so a user can adjust contrast and brightness. The remote needs fine-tuning—more streamlined control knobs and greater durability for the stresses of everyday use.

2. Cut that wire
Big problem: "You have this wire hanging out of your mouth," Beckman says. Doing away with it would make the device more comfortable.

3. Simplify what's seen
"If you display the exact information from the video camera, the information may be cluttered and difficult to use," Beckman says. Software must be refined so the camera highlights only what's most important.

The somatosensory cortex [1] in the brain processes information sensed using touch. When blind people use the BrainPort, both the somatosensory cortex and the visual cortex [2]—usually active when the eyes are used—process the information sent through the tongue.

FUTURE PLANS
The BrainPort is currently undergoing an FDA study, which should be finished in 2013. If the company gets clearance, the device could be released by 2014.

illustration by Crystal Chou

COMMENTS

  • seeingwithsound

    I'm looking forward to seeing the BrainPort Vision Device enter the market. Some diversity is good because different people have different needs and interests. In the meantime, another sensory substitution technology, based on seeing with your ears, is already globally available: http://www.seeingwithsound.com

    It offers higher resolution, is free for personal use, and does not need a cable running from the mouth. There is even an Android version that may someday run on Google Glasses if and when the market for augmented reality glasses takes off.

  • Jfpeacocke

    The vOICe is the program referred to above. Because there is a lot of promotional material for a talent show of the same name, search for "The vOICe sight substitution".
    A laptop will deliver an image that is low-resolution and left-to-right reversed (imagine that the special glasses are looking towards your face).
    Initially, darken the room and use screen glow to illuminate your face and hands.
    Bright lights and sunlight produce a lot of distraction; going outdoors comes later.
    Demonstrate, using pieces of white paper, how the height and brightness encoding is done (a rough sketch of this encoding follows the comments).
    Notice in particular how a slanted object is represented by a rising or falling tone.
    Try the Inverted Video ("funny face") setting, which makes white paper go dark (silent) and brings up black ink as loud.
    Try to recognise the capital letters V, W, T, I, L, C and O to begin with.
    This system works with either special spy glasses or any cheap plug-and-play webcam.
    It costs nothing for experimenters, and it delivers better results than expensive prosthetics.
    The downside is the need to carry around a bulky, expensive tablet (or a fiddly phone).
    The BrainPort has near-ideal ergonomics (dry side!). A future vOICe could be delivered in a similar shape.
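
The height-and-brightness encoding described in the second comment can be sketched roughly as follows. This is an illustration of the general idea only, not The vOICe's actual implementation; the sweep time, pitch range, and sample rate are assumptions. Columns are scanned left to right, each image row gets its own sine tone (higher rows sound higher), and pixel brightness sets the loudness of that tone, so a bright slanted line comes out as a pitch that rises or falls across the sweep.

    # Rough sketch of a vOICe-style encoding, not the program's actual code.
    import numpy as np

    SAMPLE_RATE = 22050
    SCAN_SECONDS = 1.0               # assumed duration of one left-to-right sweep
    F_LOW, F_HIGH = 500.0, 5000.0    # assumed pitch range (Hz), bottom to top row

    def image_to_sound(image):
        """Convert a 2-D grayscale image (0-255) into a mono audio waveform."""
        rows, cols = image.shape
        col_samples = int(SAMPLE_RATE * SCAN_SECONDS / cols)
        freqs = np.linspace(F_HIGH, F_LOW, rows)          # top rows sound higher
        t = np.arange(col_samples) / SAMPLE_RATE
        tones = np.sin(2 * np.pi * np.outer(freqs, t))    # one sine per image row
        chunks = []
        for c in range(cols):                             # scan left to right
            brightness = image[:, c].astype(float) / 255.0
            chunks.append(brightness @ tones / rows)      # brighter sounds louder
        return np.concatenate(chunks)

    # A bright diagonal stripe produces a tone whose pitch falls across the sweep.
    demo = np.zeros((64, 64), dtype=np.uint8)
    np.fill_diagonal(demo, 255)
    waveform = image_to_sound(demo)   # could be written out as a WAV to listen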