
Technology is quickly gaining the ability to understand–and anticipate–human emotions. What happens when AI knows what you need before you do?

By Richard Yonck

Charlotte strolled through Central Park, the filtered sunlight dancing along the path before her. Taking her time, occasionally stopping to admire a newly blooming tree, she spoke evenly, authoritatively. “… include the department heads and copy the primary stakeholders,” she said, stretching to smell a Magnolia blossom. “Format it with a firm but non-adversarial tone.”

“Got it,” a young man’s voice chimed on her wireless earbud.

“Oh, and Adam,” Charlotte added, “Janice just had a baby girl–Emily, I think is her name, but you can look that up. Add a suitable line of warm congratulations near the end of the email, but obviously keep it professional.”

“Obviously,” Adam chimed back. “Should I send her flowers as well?”

“She’s a bit of a minimalist, not really the bouquet type.”

Without a pause, Adam said, “That’s why I thought to send a single Dendrobium orchid. Pink, of course. Analysis of her emotional profile indicates a greater than 90% likelihood she will love it.”

“Perfect! Adam, you’re a gem. And thank you for suggesting this walk. It’s really lifted my spirits.”

“You’re welcome.”

“I really like what they’ve done with your last upgrade,” Charlotte said, subconsciously touching her hand to her ear. “Your emotional depth is much more nuanced.”

“Thank you, Charlotte,” the personal assistant AI known as Adam responded. “I’ll be sure to pass that along to the developers. Now,” the AI continued. “Would you be interested in some hot chocolate? There’s a new café two blocks away that’s said to have some of the best in the city.”

The young executive laughed lightly. “Adam, it’s as if you could read my mind!”

Welcome to the age of artificial emotional intelligence, an era in which our technologies are increasingly able to read, interpret, predict, and even influence our emotions. From the early stage we find ourselves at today, these programs, robots, and other devices will grow to become our friends, confidants, and possibly much more, like the film Her brought to life. In the process, they will transform nearly every aspect of our relationship with technology, beginning with the world of commerce, the driver for all of this change.

Why would we do this? Because emotions are at the heart of the human experience. Before there was technology, before there was even language, our emotions played crucial roles in communication, social bonding, and even our decision making. Today, emotion remains at the core of who we are and how we communicate. As a result, it is our most natural means of interacting with the world. This has led us to the stage where we are now building ways to emotionally interface with our machines as well.

These technologies come to us from a relatively new branch of computer science called affective computing, which stems from work first done at the MIT Media Lab in the mid-1990s. From there it has advanced to where it is today, rapidly becoming a commercial force. But this is only the beginning. One market research firm recently forecast that the global affective computing market will grow from $9.3 billion in 2015 to $42.5 billion by 2020. This growth is being driven by applications in brand and product market testing, assistive and companion robots for seniors, alertness detection systems in vehicles, immersive gaming, education and tutoring, PTSD treatment for soldiers, and much more. As a result, we are seeing the beginning of what is coming to be called the “emotion economy,” a range of interconnected software and services that seek to fulfill the promise of emotionally aware machines.


In many ways, this is similar to the early days of the PC and the Internet, the first digital revolution. The early systems may seem a little clumsy compared with the ease with which we’re used to interacting with other people, but over time these services and devices will become increasingly sophisticated and nuanced. Companies such as Massachusetts-based Affectiva, which develops facial expression recognition software, and Tel Aviv-based Beyond Verbal, with its voice emotion analytics, provide access to their services via application programming interfaces (APIs). This allows other companies to develop new services without starting from the ground up.

“We really want to take the barriers away and just make our technology accessible to a wide range of developers that are building interesting things,” says Affectiva’s vice president of marketing, Gabi Zijderveld. “We want to be the emotion AI platform. We want to be the technology that powers that.”
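To make the idea of an emotion AI platform a bit more concrete, here is a minimal sketch of how a developer might call an emotion-analysis web API from their own application. The endpoint URL, API key, and response fields are hypothetical placeholders for illustration only, not Affectiva’s or Beyond Verbal’s actual interfaces.

```python
# Minimal sketch of calling a hypothetical emotion-analysis API.
# The endpoint URL, credential, and response fields below are
# illustrative placeholders, not any vendor's actual interface.
import requests

API_URL = "https://api.example-emotion-ai.com/v1/analyze"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"  # hypothetical credential

def analyze_face_image(image_path: str) -> dict:
    """Send a face image to the service and return its emotion estimates."""
    with open(image_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            timeout=10,
        )
    response.raise_for_status()
    return response.json()  # e.g. {"joy": 0.82, "surprise": 0.11, ...}

if __name__ == "__main__":
    scores = analyze_face_image("customer_frame.jpg")
    # Pick the strongest emotion estimate and act on it.
    top_emotion = max(scores, key=scores.get)
    print(f"Dominant emotion: {top_emotion} ({scores[top_emotion]:.0%})")
```

The point of such an interface is that the hard part, training models to recognize emotional cues, stays with the platform provider, while an app developer only handles a simple request and response.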

Other startups will take a different path and be acquired, as Emotient, another facial expression recognition company, was by Apple at the beginning of 2016. There has been considerable speculation that the acquisition was part of a strategy to augment the capabilities of Siri, Apple’s intelligent personal assistant.

Over time, all of these technologies will become increasingly interconnected and interdependent, giving rise to an ecosystem of emotionally aware services and devices. Software-as-a-service offerings and a range of affective apps will deliver these powerful capabilities to our smartphones and other devices. Drawing on the developing sensor-filled world known as the Internet of Things, all manner of inputs will eventually act as additional eyes and ears, allowing these services to connect with and engage us almost anywhere.

This is not a dream of some distant future. Billboards are already being developed that can recognize and categorize individuals, then deliver demographically targeted, personalized messaging. Researchers are building systems that can detect early signs of autism, as well as ways to better engage students in schools. Therapy robots that support and engage seniors, people with Alzheimer’s, and other patients are attracting billions of dollars in research funding as a means of addressing the burgeoning elderly populations of many developed economies.

Adam, the AI assistant in the opening scenario, could well be the descendant of today’s Siri, Alexa, Cortana, and other increasingly capable intelligent personal assistants. While these programs are amazing when compared to the digital assistants of only a decade ago, they are nowhere near as functional as they will be in just a few more years. Improvements in deep learning algorithms and the computing power of servers and smartphones will make these programs vastly more responsive to our unspoken needs. Nowhere will this be so evident as in those devices that are imbued with emotional awareness. Such developments will permanently alter our relationship with technology, forever transforming it from mute servant into a far more intelligent and collaborative partner.

Richard Yonck is a futurist, author, and speaker with Intelligent Future Consulting in Seattle. His new book, Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence, explores the next giant step in the relationship between humans and technology: computers that can recognize, respond to, and even influence human emotions.

