Researchers at the Massachusetts Institute of Technology say they’ve developed the first known system able to read people’s emotions by bouncing wireless signals off a person’s body.
Potential applications include more adaptive user interfaces as discussed in Co.Design. And while the team from MIT’s Computer Science and Artificial Intelligence Lab is taking measures to make it difficult to scan people’s emotions without their consent, the experiment still raises questions about privacy that some experts say current legal frameworks may be ill-equipped to handle.
“The whole thing started by trying to understand how we can extract information about people’s emotions and health in general using something that’s completely passive—does not require people to wear anything on their body or have to express things themselves actively,” says Prof. Dina Katabi, who conducted the research along with graduate students Mingmin Zhao and Fadel Adib.
The system, called EQ-Radio, works by generating a low-power wireless signal and measuring the time it takes the signal to reflect off various objects in its vicinity. Since the reflection time from people’s bodies varies as they inhale and exhale, and as their hearts beat, it can distinguish humans from other objects that generate static reflections, according to a paper the team plans to present next month at the Association for Computing Machinery’s International Conference on Mobile Computing and Networking.
Then, the system learns to distinguish heartbeats, which cause faster but smaller changes in reflections, from breathing, which leads to slower but larger differences. It’s roughly as accurate at measuring the timing of heartbeats as a traditional electrocardiogram, say the MIT scientists, who are also working with researchers at Massachusetts General Hospital to study potential medical applications.
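The basic idea of that separation can be illustrated in a few lines of code. This is only a toy sketch, not the MIT team's actual signal processing: it models a chest-reflection signal as a slow, large breathing motion plus a fast, small heartbeat motion (rates and amplitudes are assumed values), then pulls the two apart by where their energy sits in the frequency spectrum.

```python
import numpy as np

# Simulate 40 seconds of a reflection signal: slow/large breathing
# plus fast/small heartbeat (illustrative parameters, not real data).
fs = 50.0                                         # sample rate in Hz (assumed)
t = np.arange(0, 40, 1 / fs)
breathing = 1.00 * np.sin(2 * np.pi * 0.25 * t)   # ~15 breaths/min, large swing
heartbeat = 0.05 * np.sin(2 * np.pi * 1.20 * t)   # ~72 beats/min, small swing
signal = breathing + heartbeat

# Find the dominant frequency below and above a 0.7 Hz split point.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
low = freqs < 0.7
breath_hz = freqs[low][np.argmax(spectrum[low])]
heart_hz = freqs[~low][np.argmax(spectrum[~low])]

print(round(breath_hz * 60), "breaths/min;", round(heart_hz * 60), "beats/min")
# → 15 breaths/min; 72 beats/min
```

In this toy version, a single frequency split is enough because the two rhythms are well separated; the real system has to cope with noise, movement, and overlapping reflections from multiple people.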
“We are able to extract breathing and heart rate in a very passive way without asking the user to do anything except for what he does naturally,” says Katabi, who in 2013 was awarded a MacArthur Foundation “genius grant” for her work on wireless networks.
Both sets of measurements are then fed into a machine-learning process that observes people in emotional states including anger, joy, and sadness, along with their heart and breathing rates. Once it’s trained, EQ-Radio is about 87% accurate in recognizing emotions in people it observed during training and more than 70% accurate in others, the researchers say.
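A stripped-down version of that train-then-classify step might look like the following. This is purely illustrative and not EQ-Radio's actual features or classifier: it labels synthetic (heart rate, breathing rate) pairs by emotion using assumed per-emotion averages, then classifies a new sample by its nearest class centroid.

```python
import numpy as np

rng = np.random.default_rng(0)
emotions = ["anger", "joy", "sadness"]
# Assumed (heart rate bpm, breaths/min) averages per emotion — made up
# for illustration, not measured values from the study.
centers = {"anger": (95, 20), "joy": (80, 16), "sadness": (65, 12)}

# "Training": generate 50 noisy samples per emotion, keep each class mean.
train = {e: rng.normal(centers[e], 2.0, size=(50, 2)) for e in emotions}
centroids = {e: train[e].mean(axis=0) for e in emotions}

def classify(sample):
    """Return the emotion whose training centroid is closest to the sample."""
    return min(emotions, key=lambda e: np.linalg.norm(sample - centroids[e]))

print(classify(np.array([93.0, 19.0])))  # elevated heart and breathing rates
# → anger
```

The reported accuracy gap (87% on people seen in training versus 70% on strangers) is typical of such models: physiological baselines differ from person to person, so a classifier tuned to familiar subjects generalizes less well to new ones.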
And while they emphasize that commercializing the system won’t happen overnight, they do envision some potential applications. Similar systems might be able to help diagnose mental health conditions like depression or bipolar disorder, they say. They could also help movie studios and advertisers track people’s emotional reactions to their work. Katabi says the technology is likely to find its way into tools made by Emerald, a company she founded that’s also developing wireless tools to sound an alert if elderly people fall within their homes.
EQ-Radio is far from the first attempt to scientifically determine people’s emotions: the medieval Persian physician Avicenna wrote around the year 1000 of diagnosing melancholia, similar to what would now be called depression, by measuring a patient’s pulse, wrote Manuel Garcia-Garcia, a senior vice president for Research and Innovation at the Advertising Research Foundation, in an email to Fast Company.
More recently, companies ranging from fledgling startups to giants like Microsoft and IBM have offered software to infer emotional state from facial expressions, spoken words, and written language. While these tools can be useful to companies looking to understand their customers’ emotional states or even to consumers looking to track their own feelings over time, there’s also plenty of potential for abuse, especially if people’s emotions, or simply their heart rate and breathing, are tracked without their consent or even knowledge.
“I think that any kind of nonconsensual monitoring of people’s metabolism is a pretty serious invasion of privacy,” says Jay Stanley, a senior policy analyst at the American Civil Liberties Union. “I wouldn’t be surprised if we see security applications for that, maybe even commercial applications, where people aren’t even aware that they’re being monitored, let alone having given permission for it.”
Security officials could treat elevated heart rates as evidence of lying or suspicious behavior, or employers might shy away from hiring job candidates whose vital signs suggest potential health issues, he suggests.
The MIT researchers say their current prototypes are designed so they can only be used consensually: The existing version of the device prompts users to make certain distinctive motions that it can wirelessly detect, in order to effectively authorize it to begin tracking, says Katabi. And, she says, they’ve already developed ways for people to block such a system from taking measurements where it’s known to be in use, essentially by transmitting interference at similar frequencies.
“You want to block the information this wireless signal has by countering it with another wireless signal,” she says.
But in some cases, like in employer-employee relationships, people might still find themselves coerced into allowing such technology to be used, potentially with little recourse under current laws, says Stanley.
“If an employer informed its employees that it was doing it, and was very up front about it and made it a condition of employment, I’m not sure whether it would be illegal,” he says. “Even so, it’s an extremely intrusive thing to do to your workers.”