The smile: It’s a universal, biological expression of happiness and joy. But what if your smile, or frown, were the basis for more than just how you interact with your peers? What if certain elements of your environment required you to communicate through your facial expressions?
University of London master’s student Freddie Hong is exploring how our faces, and specifically our smiles, could serve as an interface. For a class on physical computing at Goldsmiths College, Hong built a connected door that only opens when you’re smiling, thanks to facial recognition software. “The focus of this project was to provide the audience an experience of what it feels like to have your emotions read by the computer,” he tells Co.Design in an email.
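The article doesn’t detail how Hong’s door decides when to open, but the general pattern is simple: an emotion-recognition service scores each video frame for emotions, and the door unlocks only when the “happiness” score clears a threshold. The sketch below illustrates that decision logic; the score dictionary, threshold, and function names are hypothetical, not taken from Hong’s project.

```python
# Hypothetical sketch of a smile-gated lock. The per-emotion confidence
# scores are assumed to come from an emotion-recognition API (services
# like these typically return values between 0.0 and 1.0 per emotion).

SMILE_THRESHOLD = 0.8  # illustrative cutoff, not from the real project


def should_unlock(emotion_scores: dict) -> bool:
    """Unlock only when the detected 'happiness' confidence is high enough."""
    return emotion_scores.get("happiness", 0.0) >= SMILE_THRESHOLD


# A frame where the face reads as mostly happy opens the door...
print(should_unlock({"happiness": 0.93, "neutral": 0.05}))  # True
# ...while a neutral or sad face keeps it shut.
print(should_unlock({"neutral": 0.70, "sadness": 0.20}))    # False
```

The threshold is the interesting design choice here: set it too low and a polite half-smile opens the door; too high and visitors must grin theatrically at the camera, which is arguably the point of Hong’s piece.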
The Smiling Door points to the next wave of emotive technology, which will use facial recognition software to analyze the difference between a smile, a grimace, and a frown. That’s something the startup Affectiva, an offshoot of the MIT Media Lab that uses advanced facial recognition to track and analyze emotions, is already doing. Microsoft, too, has its own Emotion API that developers can use to add emotion-based context to their products. Embedding emotional awareness into our products seems like the logical next step for UX design, regardless of how invasive or manipulative it may be.
But it would also mean allowing our emotions to be read, and manipulated, by unthinking algorithms that were likely developed to make someone else money. Either way, smiling for the camera would take on an entirely new meaning.