
The iPhone X’s Real Legacy: Making Your Face The Interface

Wink once to open this article, and then ugly cry to close it.

[Image: Rich Oglesby/YouTube]

The iPhone X is a horribly flawed device that maybe no human alive should actually buy. But its TrueDepth camera–which projects a grid of infrared dots to map and track your face in 3D–may end up having a much greater legacy than mere Animoji. Developers have begun to play with this new feature in all sorts of creative ways, letting you not just augment your face but use your face to control apps. The consequence of this new camera technology may be a phone that controls you as much as you control it.


Nose Zone (great name!) is a game that challenges you to shoot down balls across the screen with lasers. How do you aim? You guessed it: with your nose. Pew pew–achoo. Another title, Rainbrow, tasks you with dodging emoji cars as you move up and down the colorful lanes of a rainbow. How do you dodge? By raising and furrowing your eyebrows.
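Games like these lean on ARKit’s face tracking, which turns the TrueDepth camera’s data into a stream of “blend shape” coefficients describing how far each part of your face has moved. Here is a minimal sketch of how an app might read an eyebrow raise or a blink as input; the class, the thresholds, and the game-specific helper methods are illustrative assumptions, not code from either game.

```swift
import UIKit
import SceneKit
import ARKit

// A minimal sketch of reading facial gestures as game input via ARKit
// face tracking. Thresholds and helper methods are illustrative only.
final class FaceControlViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates the tracked face. Each blend shape is
    // a 0.0–1.0 coefficient describing one facial movement.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }

        let browRaise = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        let blink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0

        // Treat a strong eyebrow raise as "move," a hard blink as "fire."
        if browRaise > 0.6 { moveUp() }
        if blink > 0.8 { fire() }
    }

    private func moveUp() { /* game-specific: shift the player up a lane */ }
    private func fire() { /* game-specific: shoot */ }
}
```

ARKit exposes roughly 50 of these coefficients, covering everything from jaw movement to cheek puffs, so in principle any facial movement the TrueDepth camera can measure can become a button.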

Let me be clear: All of these are novelties, largely akin to last year’s VR apps which, for lack of a better idea, had you steer your way around games with your head and neck. Beyond the sheer silliness of using your neck or face as a controller, how many of those gestures can you really comfortably repeat hundreds or thousands of times a day? Not many. Compare the act of typing with your fingers to raising your eyebrows. Do it once, and you don’t feel much. Raise your eyebrows five times fast, and you realize there’s probably a good reason Apple didn’t make these face controls a stock feature of iOS.

But these silly face-controlled apps do tease an avalanche more to come. Consider that in the right developer’s hands, the iPhone X can capture a full-color 3D scan of your face–an early proof point that the camera is capable of even more than you might think. That’s probably why Apple has acquired two companies in this area in the past two years: Emotient, an emotion-analysis company, and SensoMotoric Instruments, which can track your gaze 120 times per second. Add those two capabilities to the next iPhone’s front-facing camera, and UX as we know it may change forever.
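To make the 3D-scan point concrete, here is a small sketch, assuming a face-tracking session like the one above is already running, of the raw material a single ARKit frame hands a developer: a dense triangle mesh of your face plus the matching color image from the front camera. The function name is an illustrative assumption.

```swift
import ARKit
import CoreVideo

// A sketch of the per-frame face data ARKit exposes to an app running a
// face-tracking session. Call this from an ARSessionDelegate's
// session(_:didUpdate:) as new frames arrive.
func inspectFaceData(in frame: ARFrame) {
    guard let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

    // A detailed triangle mesh of the user's face, refreshed every frame.
    let mesh: ARFaceGeometry = face.geometry
    print("Face mesh: \(mesh.vertices.count) vertices, \(mesh.triangleCount) triangles")

    // The matching color image; combined with the mesh's texture
    // coordinates, it's enough to build a full-color scan of the face.
    let image: CVPixelBuffer = frame.capturedImage
    print("Color frame: \(CVPixelBufferGetWidth(image)) x \(CVPixelBufferGetHeight(image))")
}
```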

Apps might see how they make you feel and change content to alter your emotions. Ads might see whether you’re watching them and tweak their placement or color to snag your attention. (Netflix already does this aggressively on its service, based on your viewing behavior.) Might publishers on Apple News get access to analytics on whether or not you clicked on their articles–or whether you loathed an article but read it and shared it anyway?

It’s Nose Zone taking over the world! But if you remove the humor, and bury consumer consent under a mountain of legalese, it’s easy to see how invasive such tech could become. Make the consequences of handing over face data invisible to the user, all while leveraging that personal data to influence what they see and do–all to keep them looking into that Mirror of Erised with an annoying top notch. Will you be able to opt out? Sure, maybe, just like you can opt out of sharing your location with Uber. You can, but you’ll just break the entire experience if you do.

At the end of the day, consumers have very little autonomy as soon as they decide to buy a phone or use an app. The rules are laid out, and we agree to whatever they say. And if our smartphones truly learn to see and understand our faces, we may lose any lingering semblance of control. The UI will no longer work for us; the UI will control us.


About the author

Mark Wilson is a senior writer at Fast Company. He started Philanthroper.com, a simple way to give back every day.
