Smart devices continue to infiltrate our homes, but they’re often dependent on slow, clunky smartphone apps. Manually pulling up a different app just to turn on a light, turn up the AC, or reboot your Wi-Fi isn’t just annoying–it’s bad design. While the smart home market is projected to grow from $46.97 billion in 2015 to $121.73 billion by 2022, actually living in a smart home can be incredibly frustrating–an example of how poor UX could have serious business implications as the industry continues to grow.
Presented this week at ACM CHI, the largest human-computer interaction conference of the year, the prototype phone could change not only how we interact with our smartphones, but how smart home appliances are designed and manufactured.
Any product that contains electronics or electromechanical parts emits an electromagnetic signature. CMU’s EM-Sensing phone is equipped with a sensor that can detect these electromagnetic signals, and a chip that uses machine learning to determine the likeliest match for each signal. In short, the smartphone “listens” to any appliance’s frequency signature to identify it.
“Everything acts like its own tiny radio station–only it’s broadcasting centimeters,” says assistant professor of human-computer interaction Chris Harrison, whose collaboration with researchers Robert Xiao, Gierad Laput, and Yang Zhang led to the EM-Sensing phone. By detecting and cataloging the differences between, say, a laptop’s signature versus a vacuum’s, Harrison and his team were able to build a device that actually knows what’s around it, with greater than 98% accuracy.
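The idea of matching a measured signature against a catalog of known devices can be sketched in a few lines. This is an illustrative toy, not CMU's actual pipeline: the band-energy profiles below are invented numbers, and the real system uses a trained machine-learning classifier rather than this simple nearest-profile comparison.

```python
import math

# Hypothetical reference signatures: relative EM energy in a few
# frequency bands, one profile per known device (made-up values).
PROFILES = {
    "laptop":     [0.70, 0.20, 0.05, 0.05],
    "thermostat": [0.10, 0.60, 0.25, 0.05],
    "vacuum":     [0.05, 0.10, 0.25, 0.60],
}

def identify(signature):
    """Return the device whose profile is closest (Euclidean) to the reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROFILES, key=lambda name: dist(PROFILES[name], signature))

# A noisy reading that should still match the laptop's profile.
print(identify([0.68, 0.22, 0.06, 0.04]))  # laptop
```

In practice the hard part is the one this toy skips: extracting stable frequency-domain features from a noisy sensor and training a classifier robust enough to hit the accuracy the team reports.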
For instance, tapping the phone to a thermostat or a television instantly brings up its corresponding app, allowing you to adjust the temperature or change the channel without having to search through your phone and remember which particular app enabled you to do that. Instead of logging in to your router’s controls on a desktop, tapping your phone to the device could present the same interface. It could also provide more context-dependent uses: Tap the phone to your laptop, and it can send files from the phone to the computer’s desktop.
“Instead of having 50 touch screens in your house for every appliance, you could use the smartphone as this gateway,” Harrison says. “That’s a really powerful notion.”
The EM-Sensing phone is an extension of a project that began in 2015, when Harrison and his team worked with researchers from Disney to create EM-Sense, a smartwatch that could detect what you were holding by using the wearer’s arm as an antenna to read electromagnetic signals. My colleague Mark Wilson named it one of the best user interfaces of 2016.
The EM-Sensing phone is basically version 2.0, since the functionality of a smartphone is superior to that of a smartwatch. It’s similar to radio-frequency identification (RFID) technology, where objects can be tagged and their location tracked using radio waves, with some major benefits: The technology doesn’t require any external tagging, coordination between manufacturers, or third-party apps in order to manage a host of smart appliances.
“It’s bootstrapping a smart environment,” Harrison says.
Still, he believes that there’s more research to be done to make the user experience more seamless. For the technology to work, manufacturers would also have to get on board and start offering smartphone apps even for products that aren’t yet smart. And then there’s the barrier of consumer adoption.
“It’s not a magic bullet, but it gets us closer to the smartphone knowing the context around me,” Harrison says. “That’s a much more magical, powerful experience than what we’re seeing right now.”