With the iPhone 6s, Apple added a new dimension to the multitouch screen. Its 3D Touch display senses not only touch but also the amount of pressure applied to the screen, letting developers give users a whole new way to interact with their apps through Peek and Pop gestures.
Now it seems that Apple is working on the exact opposite of the 3D Touch display: letting the iPhone detect non-contact hover gestures, the movements of a finger or palm over the screen without actually touching it. The revelation of Apple's hover gesture display comes from a recent patent filing dug up by AppleInsider.
The patent, titled "Proximity and multi-touch sensor detection and demodulation," was filed in March 2015 and granted this week. It describes a radical new way for a device to detect non-contact gestures. Non-contact sensing has existed in devices for a while. The most obvious example is found in Apple's own iPhone: the proximity sensor, which detects a large nearby surface, such as your face during a call, and tells the iPhone to disable the screen. More refined versions of the technology exist, but they rely on cameras, such as the tech Apple gained in its acquisition of PrimeSense, the company behind the first generation of Microsoft's Kinect hardware.
What's so radical about the non-contact gesture detection in Apple's patent is that the company has found a way to embed proximity sensors directly in the iPhone's display, one next to each pixel. That means the iPhone could detect fine motor movements from objects, such as fingers or styli, that never touch the display. In effect, Apple could add another layer of input to the iPhone, the exact opposite of its new 3D Touch pressure-sensing input: hover gestures.
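The patent describes hardware, but the basic idea of a per-pixel proximity grid is easy to sketch in software. As a rough illustration only (the sensor grid, threshold, and centroid math below are all assumptions for the sketch, not Apple's actual design), a device could estimate where a finger is hovering by taking the intensity-weighted centroid of whichever sensor readings cross a proximity threshold:

```python
# Illustrative sketch: models a display whose pixels each carry a
# proximity sensor, as the patent describes. Names and values here
# are hypothetical, not taken from Apple's patent.

def hover_position(readings, threshold=0.2):
    """Estimate the (row, col) position of a hovering object from a
    2D grid of proximity readings in [0, 1], where higher means
    closer. Returns None if nothing crosses the threshold."""
    total = 0.0
    row_acc = 0.0
    col_acc = 0.0
    for r, row in enumerate(readings):
        for c, value in enumerate(row):
            if value >= threshold:
                total += value
                row_acc += r * value
                col_acc += c * value
    if total == 0.0:
        return None  # no object close enough to the display
    # Intensity-weighted centroid of the above-threshold readings
    return (row_acc / total, col_acc / total)

# A fingertip hovering near the center of a 5x5 sensor patch:
grid = [
    [0.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.1, 0.3, 0.1, 0.0],
    [0.0, 0.3, 0.9, 0.3, 0.0],
    [0.0, 0.1, 0.3, 0.1, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.0],
]
print(hover_position(grid))  # centroid near (2.0, 2.0)
```

Tracking that centroid frame over frame is what would turn raw proximity readings into the hover gestures, swipes, and fine movements described below.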
In practice, hover gestures could be handy for calling up more advanced sub-menu options or commands from a button. They could also let you control your iPhone or iPad when your fingers are sticky or dirty, such as when you are baking in the kitchen and need to tap a button in a cooking app to see the next part of the recipe.
From an accessibility standpoint, it's feasible that hover gesture detection could allow someone who uses American Sign Language to sign over the display and have the iPhone turn the signs into text or even speech. Or Apple's new display could enable simple gestures, like a hand swipe over the screen to quickly erase a full page of text.
Of course, Apple patents many things that never make it into finished products, but given the wide-ranging uses of hover gestures, this is one patented technology that could find its way into iPhones, iPads, and Macs over the next several years.