Today, phones are big—so big that they’re unwieldy to use unless you have two hands free. But a new kind of technology could make it far easier to navigate your phone without ever touching its screen.
Right now, those of us with overly large smartphones have to contort our hands to grip the phone while also swiping and tapping across its surface. But with this technology, which uses ultrasonic waves to detect the position of your fingers, you could scroll through menus simply by rubbing your finger where it usually sits naturally: along the side of the phone. When taking a picture, you could press lightly on the side of the phone to focus your picture, and then press harder to snap the image, with no button needed—just like a point-and-shoot camera. Then, you could slide your finger along the back of the phone to flip through your photos. And when gaming, imagine having virtual buttons on the sides of your phone that you can tap to shoot and jump, transforming your phone into more of a classic gaming controller. Basically, it’s like having a touchscreen everywhere on your entire device, turning every available surface into buttons you can interact with.
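The two-stage camera press described above can be modeled as a pair of force thresholds. To be clear, this is a purely illustrative sketch: Sentons has not published an API, and the function names and threshold values here are made up.

```python
# Hypothetical model of a two-stage edge press: a light touch focuses,
# a firm press fires the shutter. Thresholds are invented for illustration.

FOCUS_THRESHOLD = 0.2    # light press (normalized force, 0.0-1.0)
SHUTTER_THRESHOLD = 0.7  # firm press

def camera_action(force):
    """Map a normalized force reading to a camera action."""
    if force >= SHUTTER_THRESHOLD:
        return "capture"
    if force >= FOCUS_THRESHOLD:
        return "focus"
    return "idle"

print(camera_action(0.3))  # → focus
print(camera_action(0.9))  # → capture
```

The point of the two thresholds is exactly the point-and-shoot analogy: half-pressing and full-pressing the same physical spot trigger different actions.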
“You’re already holding and touching the edges of your phone. You were already using buttons on the side,” says Jess Lee, the CEO of Sentons, which is making this technology, called SurfaceWave, available to all smartphone makers. Sentons’ technology lets phone makers take controls and “make them virtual and remappable and actually extends them to the entire edge of the phone.” But they remain familiar enough that people don’t have to relearn how to use devices they’re used to.
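The "virtual and remappable" idea Lee describes can be sketched as a mapping from positions along the edge to actions. None of these names or measurements come from Sentons' SDK; they are assumptions chosen to illustrate the concept.

```python
# Illustrative sketch of remappable edge controls. The zone layout and
# the 140 mm edge length are invented for this example.

class VirtualEdge:
    """Maps touch positions along a phone's edge to named actions."""

    def __init__(self):
        self.zones = []  # list of (start_mm, end_mm, action)

    def map_zone(self, start, end, action):
        self.zones.append((start, end, action))

    def clear(self):
        self.zones = []

    def on_touch(self, position_mm):
        """Return the action for the zone containing this touch, if any."""
        for start, end, action in self.zones:
            if start <= position_mm <= end:
                return action
        return None

edge = VirtualEdge()
edge.map_zone(0, 40, "volume")      # top of the edge acts as a volume strip
edge.map_zone(100, 140, "shutter")  # bottom acts as a camera button
print(edge.on_touch(120))           # → shutter

# Remapping for a game: the whole edge becomes a trigger.
edge.clear()
edge.map_zone(0, 140, "fire")
print(edge.on_touch(120))           # → fire
```

Because the "buttons" are just software zones, the same strip of metal can behave one way in the camera app and another way in a game.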
When I tried SurfaceWave out on a demo phone, it certainly felt like the future of interaction design. The desire for more real estate on a phone is, after all, the reason why phone makers are trying out wacky new forms: foldable phones, dual-screen phones, or even phones with a screen that wraps all the way around. But if you’re happy with the general form of your phone already, adding the ability to make the back and sides interactive is an obvious next step.
Sentons isn’t the only company looking for new ways to interact with smartphones beyond the touchscreen. This week, Google announced that the Pixel 4 will be able to detect broad gestures that can do things like skip to the next song. Google’s tech, called Soli, uses radar rather than ultrasonics to sense motion. Early reviews have been skeptical, however, as it often takes a few tries for Soli to register your hand gestures. The Pixel also has another similar feature: you can squeeze the sides of the phone to launch the Google Assistant. That one uses strain gauges inside the phone to detect flexing, but it is nowhere near as precise as SurfaceWave.

Why has it taken so long to make our entire phones interactive? Capacitive sensing, the technique phone makers use for touchscreens, doesn’t work through the metal that encases most phones. But the engineers at Sentons, a company founded in 2011 that previously focused on large interactive screens, realized that by lining a phone’s body with small components that oscillate at ultrasonic frequencies, an algorithm could detect any disturbance to the sound waves—like your finger—even through metal.
“That’s because your finger has water, and it absorbs the sound waves as it makes contact with the metal,” Lee explains. “[The waves are] modulated in a certain fashion to give us really high accuracy and super low power . . . that’s how we see how you’re holding the phone.”
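Sentons hasn’t published its algorithm, but the principle Lee describes can be shown with a toy model: because a finger absorbs acoustic energy, a touch appears as an amplitude drop relative to an untouched baseline. Everything here—the sensor positions, readings, and threshold—is invented for illustration.

```python
# Toy model of touch detection via acoustic absorption: a finger on the
# metal absorbs energy, so touched positions show a drop in measured
# amplitude compared to a no-touch calibration baseline.

def detect_touch(baseline, reading, threshold=0.3):
    """Return indices where amplitude dropped noticeably below baseline."""
    touched = []
    for i, (b, r) in enumerate(zip(baseline, reading)):
        if b > 0 and (b - r) / b > threshold:  # fractional energy loss
            touched.append(i)
    return touched

# Amplitudes at 8 points along the phone's edge (made-up numbers).
baseline = [1.0] * 8  # calibration: nobody touching the phone
reading = [1.0, 0.98, 0.5, 0.45, 0.97, 1.0, 0.99, 1.0]

print(detect_touch(baseline, reading))  # → [2, 3]
```

A real implementation would have to separate a deliberate press from an ordinary grip, which is presumably where the modulation and signal processing Lee alludes to come in.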
The company says it is already in talks with all the largest phone makers in North America, and its technology has been deployed on several phones already, including a gaming phone produced by Asus and Tencent that’s available in Europe and China. I was able to try out the new Call of Duty game on this phone, and its virtual controls provide a nice zap of haptic feedback so you know you’re hitting the buttons. “Apparently, you’re half a second faster to acquire target and shoot a target here because your thumbs aren’t in the way,” says Lee, who joined Sentons in April 2019 from Apple, which acquired his previous company.
Sentons was founded by a group of radio engineers who initially tried to sense touch through metal using radio waves. But detecting the tiny anomaly of a finger that way required extremely high frequencies, which proved difficult to achieve. Instead, the engineers settled on ultrasonic waves, which require very little power—slightly less, Lee says, than the natural leakage of the battery.
When I tried out SurfaceWave, its precision was impressive, and it felt natural to scroll using the side of my phone rather than the screen. Not every demo the Sentons team showed me worked particularly well—squeezing the phone for a selfie was a bit awkward, as was swiping between images using the back of the phone. But the point is that Sentons is now shipping very basic phones outfitted with its tech to developers and designers, who’ll be able to dream up a million ways to use the new virtual buttons.
The tech isn’t limited to smartphones, either. Sentons is also looking into applications in wearables and cars. When you’re in the driver’s seat, finding the right place to tap on your console’s touchscreen requires you to take your eyes off the road. But Lee imagines layering SurfaceWave inside the steering wheel itself, turning the entire thing into an interactive surface complete with virtual buttons.
The one limitation? The tech doesn’t work on anything remotely soft or squishy, where disturbances in the ultrasonic waves become too hard to predict.
Much of SurfaceWave’s success will hinge on adoption by large smartphone makers like Apple and Samsung, and it may be challenging to convince the industry giants to try something this new at scale. But since these companies are locked in a race to offer something new on phones that increasingly look and act pretty much the same, there’s a chance that this revolutionary new way to interact with your phone is on the horizon.