
MIT’s Breakthrough, Super-thin 3-D Gestural Display Captured on Video

A new display offers a 3-D gestural interface, without any of the clunky equipment that hobbles its forebears.



Most people are still getting used to the idea of touchscreens everywhere, but MIT is already way ahead: researchers there are demonstrating a super-thin LCD display that can read your hand movements in 3-D without your ever having to smudge fingerprints on glass. MIT has just released details on how the interface works, along with a video (below), ahead of a big presentation on the technology at next week's Siggraph Asia conference.

We recently wrote about gestural interfaces for Web 3.0, and mentioned the MIT project, which was developed by Media Lab Ph.D. candidate Matthew Hirsch. It relies on a standard LCD display backed with optical sensors, and that's where it gets interesting. The LCD is programmed to flicker an array of black-and-white pixels in among the images being displayed, a pattern so fast it's imperceptible. Those pixels act basically like pinholes, letting the light bouncing off objects in front of the screen pass through to the sensors behind. Clever algorithms and some processing power turn that sensor data into a 3-D readout of whatever is passing before the screen.
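To get a feel for why pinholes give you depth, here's a minimal sketch (not MIT's actual code; all names and numbers are illustrative). Two neighboring pinholes see the same object shifted by an amount that depends on its distance, so recovering depth is just stereo triangulation: z = baseline × sensor gap / (shift × pixel pitch).

```python
# Illustrative sketch: depth from two "pinhole views", the same
# triangulation an array of pinhole pixels makes possible.
# All geometry numbers below are made up for the demo.
import numpy as np

def disparity(view_a, view_b):
    """Pixel shift between two 1-D views, found via cross-correlation."""
    corr = np.correlate(view_a - view_a.mean(),
                        view_b - view_b.mean(), mode="full")
    return np.argmax(corr) - (len(view_b) - 1)

def depth_from_disparity(shift_px, baseline_mm, gap_mm, pitch_mm):
    """Stereo triangulation: z = baseline * gap / (shift * pitch)."""
    return baseline_mm * gap_mm / (abs(shift_px) * pitch_mm)

# Simulate a bright feature seen through two pinholes 10 mm apart,
# with the sensor 5 mm behind the screen and 0.1 mm pixels.
true_z = 250.0  # object 250 mm in front of the screen
baseline, gap, pitch = 10.0, 5.0, 0.1
true_shift = round(baseline * gap / (true_z * pitch))  # 2 px

view_a = np.zeros(64); view_a[30] = 1.0
view_b = np.zeros(64); view_b[30 + true_shift] = 1.0

z = depth_from_disparity(disparity(view_a, view_b), baseline, gap, pitch)
print(round(z))  # → 250, the depth we started with
```

A real screen does this for thousands of pinholes at once, over full 2-D image patches instead of toy 1-D arrays, which is where the "fancy algorithms and processing power" come in.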

So, if you’re waving your hands in front of the display, it has a good idea of where exactly your hands are. In turn, you can interact with 3-D objects on screen. Here’s a video that kinda-sorta explains:

Now, the genius is that this approach is far more elegant than other ways of building gesture-sensing screens. Usually, cameras are placed either at the edge of a screen or behind it. The former creates enormous blind spots, since the cameras have such a limited field of view. The latter makes for massively thick screens, since the camera has to sit so far behind the main display. The MIT innovation, by contrast, allows gestural displays as thin as an ordinary LCD screen. Pretty cool.


For now, the MIT system is only a prototype, since LCD screens with sensor layers behind them aren't yet being mass-produced. But there's no technical reason that shouldn't change sometime soon.

And then: gestural iPhone and tablet PC interfaces for everyone! (Give or take a few years, and several million dollars of research.)

About the author

Cliff is director of product innovation at Fast Company, founding editor of Co.Design, and former design editor at both Fast Company and Wired.
