Expert humans or face-reading machines could have saved thousands of lives on 9/11 by detecting the hijackers’ emotional states and triggering detainments, says San Francisco-based psychologist Paul Ekman. But neither was in use.
In the years since 9/11, Ekman, the expert who inspired the fib-hunting character played by Tim Roth in the Fox series Lie To Me, has worked with the Central Intelligence Agency (CIA), the Department of Defense (DOD), the Department of Homeland Security (DHS), and others to train people, and help develop machines, that read faces for emotions and help avert disasters at every level. New applications are assisting Transportation Security Administration (TSA) agents in screening for potential terrorists at airports and teaching U.S. Army Special Forces in Afghanistan how to gauge an enemy combatant’s veracity or intent to kill. Ekman has also trained guards at Abu Ghraib prison in how to extract information and truth without torture. “They used my [facial analysis] work, and it was very successful,” Ekman tells Fast Company.
He’s helped pioneer a field called facial emotion measurement, which shares some ties with both face-recognition and neuromarketing. The CIA and Cars studio Pixar alike are users of facial emotion analysis services. Apple and Microsoft have been building their own capabilities while Google has just recruited a facial expressions expert from a company Ekman advises. But the category goes way beyond the mere branding or advertising uses and delves into a realm that would wow Philip K. Dick. Ekman has written a book, Emotional Awareness, with the Dalai Lama. Classrooms are putting his methodology to work. And others’ applications are helping those with cognitive disabilities be better understood or simply enabling people to be more in tune with their emotions.
There are two ways to go about facial emotion measurement: a human method that analyzes facial “microexpressions” and emotions, and a technological, automated one. In life-or-death matters, the human approach, which relies on specially trained facial observers to gauge a perpetrator’s deception, is significantly more accurate than the automated method, according to Ekman. He is universally credited with developing the Facial Action Coding System (FACS), a comprehensive dictionary of facial expression measurements that has become the scientific underpinning for both human-observer and automated facial analysis around the world, across varied academic and commercial fields. (Ekman went against the grain, and against celebrated anthropologist Margaret Mead, to prove naturalist Charles Darwin’s hypothesis that facial expressions of emotions–anger, disgust, contempt, fear, surprise, sadness, happiness–are innate and universal in humans, not culturally determined.) And while he spends a lot of time these days training intelligence, military, and law enforcement officers to uncover a subject’s deception or concealment in a split second (when there is no time for laborious, frame-by-frame video analysis or FACS catalog consultation), he also acknowledges the need for a “backup” system whereby computers automate the FACS work. Humans get tired, after all, and their levels of observation and interpretation vary.
Ekman, in fact, currently sits on the board of a nascent company, Machine Perception Technologies (MPT), which specializes in automating facial expression analysis on his FACS foundation. MPT was spawned from the DOD’s innovation arm DARPA and DHS in the aftermath of 9/11. The CIA has been studying the efficacy of MPT’s neural-network method of automating microexpression analysis versus an alternative based on “artificial intelligence.”
About half of MPT’s work is in the “security” arena (including DHS Project Hostile Intent); the other half is in marketing, COO Stanley Kim tells Fast Company. MPT wants to advance machine learning and create smarter, more natural human-machine interactions and interfaces. The current MPT clients Kim would disclose include Procter & Gamble, Intel, and Sony (MPT built the engine for Sony’s Smile Shutter camera feature). Sony executives say games that can read players’ faces and lies will be available before 2020.
Then there’s Affectiva, which bills itself as an emotion measurement technology company and was recently spun out of the MIT Media Lab. Affectiva’s methodology is based on FACS but updates it, links it to a range of emotional states, and automates it with a proprietary algorithm. Cofounder and CTO Rana El Kaliouby tells Fast Company that Affectiva’s initial focus has been health care: the startup has developed tools that help those on the autism spectrum communicate, and applications that let people self-monitor their anxiety level or heart rate via skin and facial sensors. Recently Affectiva has been busy consulting with marketing and media clients on advertising and consumer engagement (for example, measuring the extent to which various ads and elements evoke facial joy, interest, laughs, etc.–as shown in the Affectiva dashboard below).
While critics of Ekman’s field assail it for Big Brother intrusion, sinister mind-reading, junk science, low accuracy, or high expense, El Kaliouby’s critique is narrower: she says Ekman’s FACS encyclopedia on its own is incomplete, and that it does not correlate with, explain, or predict the most common and relevant human emotions.
“Our system first detects the FACS units and then combines those codes in space and time to infer what emotion the person is communicating/expressing,” she says. Affectiva tries to build algorithms that capture nuances of facial expressions (beyond the basic emotions) and focus on relevant emotions, such as enjoyment–or interest and confusion–which aren’t part of Ekman’s basic taxonomy but are critically important in applications such as advertising, an online tutoring system, or an online game that tracks whether users are engaged, bored, or confused, Kaliouby says.
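The pipeline El Kaliouby describes, detecting FACS units frame by frame and then combining those codes over time, can be sketched roughly as below. The sliding-window smoothing and the AU-to-emotion mapping here are my illustrative assumptions, not Affectiva’s proprietary algorithm (the “confusion” combination in particular is hypothetical).

```python
# Hedged sketch of a detect-then-combine pipeline: per-frame FACS
# action-unit (AU) detections are aggregated over a sliding time
# window, and an emotion is reported only when its prototype AUs
# persist across the whole window (filtering one-frame noise).

# Simplified AU combinations for two "relevant" states (illustrative):
PROTOTYPES = {
    "enjoyment": {6, 12},  # cheek raiser + lip corner puller
    "confusion": {4, 7},   # brow lowerer + lid tightener (hypothetical mapping)
}

def infer_emotions(frames, window=3):
    """frames: list of per-frame AU sets.

    Returns (start_frame, emotion) pairs for every window in which
    all of an emotion's prototype AUs appear in every frame."""
    hits = []
    for i in range(len(frames) - window + 1):
        for emotion, aus in PROTOTYPES.items():
            if all(aus <= frame for frame in frames[i:i + window]):
                hits.append((i, emotion))
    return hits

# Three consecutive frames showing AU6 + AU12 -> enjoyment at frame 0.
frames = [{6, 12}, {6, 12, 25}, {6, 12}, {4}]
print(infer_emotions(frames))  # [(0, 'enjoyment')]
```

Requiring persistence across frames is one simple way to realize “combining codes in space and time”; a production system would use statistical models rather than fixed rules.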
Ekman, in turn, thinks the Affectiva methodology is less comprehensive and precise than his own or MPT’s model. “They take some shortcuts. I’m sure it works to some extent. But when the stakes are very high (in security or criminal matters) … Over 90% accuracy (of FACS) is required. Seventy percent–which you can get published in Science–has no practical use.” Unlike some of his counterparts in the field, Ekman will not apply his work to partisan politics or judicial cases (e.g. supplying witness testimony or advising on jury selection) and decries the ethics of doing so.
Affectiva is aiming to make neuroscientific and emotion measurement tools accessible and transparent to consumers (only as they opt in) and do good in the world with the broadest applications (beyond ad testing). Kaliouby is uncomfortable having the company associated with “neuromarketing” and says Affectiva will never focus on security and deception.
Ultimately, Affectiva says it wants people to be more aware of their emotions and empowered to better manage and improve their moods, health, work, socializing, and life. “Our big dream is to power and be the emotional context of everybody’s interactions, the ‘Affectiva Inside,’” says El Kaliouby. Cofounder and chief scientist Rosalind Picard says their cloud-based technology will be used to crowdsource and report facial reactions to stimuli, including political candidates and events, business presentations (Affectiva is advising the Bill & Melinda Gates Foundation on public speaking), movies, news stories, and, yes, ads.
Another maker of an automated emotion measurement system, London-based Realeyes, provides eye-tracking and facial emotion analytics. CEO Mihkel Jaatma says that while Realeyes builds off Ekman’s FACS “methodological backbone,” it is less focused on the full body, biology, and health care than Affectiva, or Philips, which recently introduced an iPad camera app that reports your vital signs by reading color changes in your face. While Realeyes also does a lot of facial measurement work in marketing, the company is seeing growth in games and education. Jaatma tells Fast Company that Realeyes is now working with Kaplan Education, Fox International Channels, the University of Cambridge, and Hungarian artificial intelligence scientists on Project Picaro to develop video games that teach English to Hungarian students aged 4-9 through a continual, optimizing process of analyzing and responding to their faces. Picaro’s facially enabled learning products are being used in classrooms in Szeged, Hungary today.
Jaatma, like Affectiva’s Kaliouby and MPT’s Kim, says there’s been a recent surge in inquiries from potential clients spanning the globe and a range of industries. “Facial expressions and emotional states will soon become an essential part of interfacing users with all sorts of software,” Jaatma says. “Less of keyboards, less of using [a] mouse, more of users just being themselves. More devices and more applications will understand the face of the person and be able to adjust themselves according to how the user is feeling and what the next feelings are predicted to be. The expressions recognition technology will also be used beyond the face for the whole body.” Full-body gesture tracking will enable more understanding of emotions and allow us to categorize behavior in places such as shopping malls, casinos, and transportation hubs for example, Jaatma says.
For Kim, “The next generation [for MPT] will be about the interface, you, and the machine in Minority Report asking, ‘How are you feeling?’ It’s going to watch you and learn from you and vice versa.”
Ekman thinks FACS is more advanced, useful, and accurate than other hot neuroscientific areas such as neuromarketing, which focuses on the brain; he says the study of the face continues to reveal more clear-cut and precise connections between known physiological and emotional states. The overriding strategic objective for the psychologist, who advises the Secret Service, remains “preventing assassinations of political leaders and stopping terrorism in our country.” But not all of Ekman’s work involves detecting liars and catching perpetrators. While he turns down marketing assignments almost daily, Ekman devotes a lot of time and energy to softer, emotionally “uplifting” and “fun” projects too. He continues to consult with Pixar director Pete Docter on animated characters’ expressions through his dictionary updates. (Toy Story was the first cinematic use of FACS.) Ekman worked on Rise of the Planet of the Apes and Avatar; Disney, DreamWorks, and Industrial Light & Magic are clients. For the hit Fox TV series Lie To Me, which is based on him and his work, Ekman coached cast members on performing scripted facial expressions.
As different as their methods, services, and views may be, all of the facial emotion measurement experts agree that new applications in the space will emerge that cannot be imagined now; several mentioned that tools are coming to detect drunkenness, driver fatigue, disease, or depression.
For Ekman, one goal–“improving emotional life”–hasn’t changed since 1948, when his mother committed suicide. Ekman was 14. Wishing he could have seen the warning signs of bipolar disorder on her face before it was too late, he became driven by a determination to “help mental patients” and “save lives.”
In the process, he’s built a new science.
Follow Kevin Randall on Twitter: @KevinBrandall