Touchscreens let you interact in exactly one way: by poking your finger at them. It’s the equivalent of the Mac’s one-button mouse, which was probably exactly what Steve Jobs intended. But in practice, it would sometimes be convenient to have the touchscreen equivalent of a “right click”–to invoke contextual menus, for example, or even just to distinguish certain kinds of input from others. Chris Harrison, an interaction designer at Carnegie Mellon University whose work we regularly cover, came up with a solution called TapSense.
The TapSense prototype can distinguish between four modes of touch input: the pad of your finger, the tip, the knuckle, and the nail. That means you have four different ways of invoking touchscreen functionality with a single digit. This could be useful for specialized applications that need to cram a lot of functions into a limited interaction modality, like a touchscreen version of Photoshop with its zillions of nested options and brushes. TapSense could let a user draw with the tip of her finger, select with the pad, undo by tapping her nail on the screen, and…well, I don’t use Photoshop enough to imagine what function could be mapped to a knuckle tap, but I’m sure one exists.
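From an app developer’s point of view, that kind of input could be handled with a simple dispatch table. The sketch below is purely illustrative: the real TapSense prototype classifies taps upstream (it is not part of any public API), and every name and function mapping here is a hypothetical stand-in, not something from Harrison’s work.

```python
# Hypothetical sketch: assume an upstream classifier (like TapSense's) has
# already labeled a tap as "tip", "pad", "nail", or "knuckle". An editing
# app could then dispatch each label to a different function. All of these
# names and mappings are invented for illustration.

EDITOR_ACTIONS = {
    "tip": "draw",
    "pad": "select",
    "nail": "undo",
    "knuckle": "context_menu",  # the touchscreen "right click", perhaps
}

def handle_tap(touch_type: str) -> str:
    """Map a classified touch type to a (hypothetical) editor action."""
    try:
        return EDITOR_ACTIONS[touch_type]
    except KeyError:
        raise ValueError(f"unknown touch type: {touch_type!r}")

print(handle_tap("nail"))  # -> undo
```

The point is just that one finger yields four distinct input channels; the hard part, classifying the tap in the first place, is what TapSense actually contributes.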
TapSense can also sense different materials used on the touchscreen–like the difference between a wooden stylus and a plastic one. There are all kinds of possible applications for this kind of “extended” touchscreen interface. It’s still just pictures under glass (to lift Bret Victor’s damning description), without the rich haptic feedback that our bodies and minds expect from everyday objects, but even incremental advances in touchscreen interactivity like TapSense can only improve matters until the next big UI paradigm shift–whatever that will be.