
“Ambient computing” and other sensor-driven technologies are promising to make cars of the very near future much safer.

What happens when cars get emotional?

[Photo: Ruvim Noga/Unsplash]

By Mike Elgan

Self-driving cars are nice—but the real revolution is automotive empathy.

While we’re all waiting for Total Recall’s “Johnny Cab” to arrive, a more profound change will hit the road: Cars are learning to get in touch with our thoughts in dramatic ways.

Emotion and activity detection will take many forms and employ myriad new technologies, especially artificial intelligence (AI) that processes real-time data from cameras, microphones, biosensors, and even radar.

Companies motorists have never heard of (like Affectiva, Guardian Optical, Eyeris, Smart Eye, Vayyar Imaging, Seeing Machines, B-Secur, eyeSight, Nuance Automotive, BeyondVerbal, and Sensay) are working on dashboard technology that watches over us like a digital fairy godmother—or a high-tech Big Brother, depending on your perspective.

The biggest reason for this tech shift is safety—because the most dangerous thing about a car is the driver.

The U.S. Department of Transportation says that “human choices . . . are linked to 94% of serious crashes.”

Emotion and activity detection is being designed to alter or even override risky choices to save lives. These systems will detect not only driver drunkenness, stress, confusion, distraction, or sleepiness; they'll also identify and pay attention to the interactions between people in the car, then use artificial intelligence to "understand" the context of human behavior in order to respond in a way that supports human life, health, and happiness.

Surprising things can happen inside cars that suddenly increase the risk of an accident. Drivers can see something upsetting—a car accident by the side of the road, or an animal injured by a car—or do something distracting, such as spilling coffee or dropping a mobile phone. Emotion and activity detection systems can recognize when this happens and take safety-related actions, such as going into autonomous mode briefly and slowing down until the driver can recover.
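To make the idea concrete, here is a minimal sketch, in Python, of the kind of rule such a system might apply. The state fields, thresholds, and action names are invented for illustration and don't reflect any vendor's actual software.

```python
# Hypothetical rule: if the driver appears distracted or startled, briefly
# hand control to a driver-assist mode and slow down until attention returns.

from dataclasses import dataclass

@dataclass
class DriverState:
    attention: float        # 0.0 (fully distracted) to 1.0 (fully attentive)
    startle_detected: bool  # e.g., a sudden upsetting sight or a dropped phone

def choose_action(state: DriverState) -> str:
    """Return a coarse safety action for the current driver state."""
    if state.startle_detected or state.attention < 0.3:
        return "engage_assist_and_slow_down"   # take over briefly, reduce speed
    if state.attention < 0.6:
        return "issue_attention_alert"         # chime or seat vibration
    return "no_action"

if __name__ == "__main__":
    print(choose_action(DriverState(attention=0.2, startle_detected=False)))
    # -> engage_assist_and_slow_down
```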

In emergencies, even if the driver is unconscious or incapacitated, the car should be able to call 911 or even drive its occupants autonomously to the hospital.

But safety isn't the only objective. The consensus among experts is that emotion and activity detection will begin with human-driven cars, then continue to exist and evolve in the future self-driving car era, where these systems will be deployed to "serve and delight" people in the car, "providing everything from functional safety to pleasure island," according to Tal Krzypow, vice president of product management for eyeSight Technologies.

Some new features enabled by this shift will combine both safety and satisfaction. Nils Lenke, PhD, senior director of innovation management at Nuance Automotive, says that by using "gaze detection," you'll be able to ask the car questions like "What is that building?" or "Is that restaurant open?" and get the answer without fumbling with a smartphone.
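Here is a rough sketch of how a gaze-plus-map lookup like that could work under the hood: combine the car's position and heading with the driver's gaze angle to get a line of sight, then find the nearest point of interest along it. The coordinates, points of interest, and tolerance are all made up for the example.

```python
# Toy gaze-to-landmark lookup: project the driver's line of sight onto a local
# map and return the nearest point of interest within a small angular tolerance.

import math

POIS = [  # (name, x, y) in meters on a local map grid -- fictional entries
    ("Blue Note Cafe", 120.0, 40.0),
    ("City Museum", 300.0, -15.0),
]

def bearing_to(px, py, cx, cy):
    return math.atan2(py - cy, px - cx)

def what_is_that(car_xy, car_heading_rad, gaze_offset_rad, tolerance_rad=0.15):
    """Return the POI closest to the driver's line of sight, if any."""
    cx, cy = car_xy
    look_dir = car_heading_rad + gaze_offset_rad
    best = None
    for name, px, py in POIS:
        bearing = bearing_to(px, py, cx, cy)
        # wrap the angle difference into [-pi, pi] before comparing
        angle_error = abs(math.atan2(math.sin(bearing - look_dir),
                                     math.cos(bearing - look_dir)))
        dist = math.hypot(px - cx, py - cy)
        if angle_error < tolerance_rad and (best is None or dist < best[1]):
            best = (name, dist)
    return best[0] if best else "nothing recognized in that direction"

print(what_is_that(car_xy=(0.0, 0.0), car_heading_rad=0.0, gaze_offset_rad=0.32))
# -> Blue Note Cafe
```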

Making sense of the sensor-based car

Ever since Henry Ford began mass-producing automobiles, the car has functioned as little more than a machine that converts gasoline or electricity into motion for transporting our bodies from one place to another.

The car of the future is still a conveyance, but it is, more profoundly, an artificially intelligent supercomputer on wheels that rapidly processes terabytes of sensor data, makes sense of it, then issues microsecond-by-microsecond commands to the wheels, engine, and brakes to enable semi-autonomous and, eventually, fully autonomous driving.

While exterior sensors point outward in all directions to detect the edges of the road, changing traffic signals, the speed and location of bicyclists, and myriad other factors, powerful sensors inside the car will point inward toward drivers and passengers to detect factors even more complex: human emotion, attention, and action.

But which sensors? The leading candidates are cameras, for watching activity and reading facial expressions; microphones, for detecting emotion through voice intonation; direct sensors on the steering wheel or seatbelt; and even radar!
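As a rough illustration of what the in-cabin AI might consume, here is one hypothetical way to bundle a single frame of readings from those four sensor classes. The structure and field names are assumptions for the sketch, not any carmaker's actual data format.

```python
# Hypothetical per-frame bundle of in-cabin sensor data, one field per sensor
# class named in the text: camera, microphone, contact biosensor, and radar.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CabinSnapshot:
    camera_frame: bytes                        # image of occupants (expressions, activity)
    audio_chunk: bytes                         # microphone buffer (intonation, laughter, anger)
    steering_heart_rate_bpm: Optional[float]   # contact sensor on the wheel, if gripped
    radar_breathing_rate_bpm: Optional[float]  # contactless ultra-wideband estimate

def available_signals(snap: CabinSnapshot) -> List[str]:
    """List which emotional/physiological channels are usable this frame."""
    signals = ["face", "voice"]
    if snap.steering_heart_rate_bpm is not None:
        signals.append("heart_rate_contact")
    if snap.radar_breathing_rate_bpm is not None:
        signals.append("breathing_radar")
    return signals

snap = CabinSnapshot(camera_frame=b"", audio_chunk=b"",
                     steering_heart_rate_bpm=72.0, radar_breathing_rate_bpm=None)
print(available_signals(snap))   # -> ['face', 'voice', 'heart_rate_contact']
```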


Each of these sensor types offers benefits and limitations. Gawain Morrison, CEO and cofounder of Sensum, summarizes the challenge: “Emotional signals like facial expressions and voice patterns can be sensed with cameras and microphones, which are common tech and are often already installed in vehicles. But we don’t always show much facial expression, especially when engaged in a complex activity like driving. Likewise, we don’t always speak, or the environment can be so noisy that it’s impossible to understand what is being said. On the other hand, internal physiological processes such as heart rate, respiration, and skin conductance can provide a constant emotional signal that is nonconscious and hard to fake. But to measure these signals typically requires more intrusive sensors such as wearables.”

“The technology is evolving very fast,” Morrison says. “For instance, there are now contactless ways to measure signals such as heart rate and breathing. One leading example is ultra-wideband, also called biometric radar. In time, we may see minimally intrusive ways to measure ever more subtle and fundamental psychophysiological signals, such as brain-waves or changes in the body’s chemistry.”
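A toy calculation helps show why redundant physiological channels matter: when one signal drops out (the driver lets go of the wheel, say), another can still carry a continuous estimate. The resting baselines and scaling in this sketch are invented.

```python
# Crude 0..1 arousal score from whichever physiological signals are present.
# Baselines and divisors are placeholders, not clinically meaningful values.

def arousal_estimate(heart_rate_bpm=None, breathing_rate_bpm=None,
                     resting_hr=65.0, resting_br=14.0):
    scores = []
    if heart_rate_bpm is not None:
        scores.append(min(1.0, max(0.0, (heart_rate_bpm - resting_hr) / 50.0)))
    if breathing_rate_bpm is not None:
        scores.append(min(1.0, max(0.0, (breathing_rate_bpm - resting_br) / 15.0)))
    return sum(scores) / len(scores) if scores else None  # None: no signal at all

print(arousal_estimate(heart_rate_bpm=95))       # contact sensor only -> 0.6
print(arousal_estimate(breathing_rate_bpm=24))   # radar only -> ~0.67
```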

It seems likely, given the low cost of sensors and the high value of good sensing, that over time some cars will deploy all of these sensors: cameras, microphones, contact biometrics, and radar.

That sounds like a lot of sensor technology just for a car. But in fact, this AI-processed sensor technology is the future of all computing.

Ambient computing is in the air (and soon the car)

One of the tectonic tech transformations in the human/computer user interface is an idea called “ambient computing.” Often associated with the Internet of Things (IoT), “ambient computing” is the result of “computers” (microprocessor-based devices of every description) and sensors embedded in our living and working spaces in a way that enables us to “use” computers and the internet without being fully aware that we are doing so.

A simple analogy for understanding “ambient computing” is the automatic door at a grocery store. With a regular door, the “user” walks up to it, identifies the location and functionality of the door knob, manipulates it as a mechanical device, then uses the knob to push or pull the door open before walking through it. But with motion-activated automatic doors, the “user” doesn’t “use” anything; the door opens as they approach, and they just walk through.

With enough sensors and better AI, many of the tasks we do today with PCs and smartphones might be accomplished like that automatic door.

Telling the microphones embedded somewhere in your office to “book a meeting next week with Jerry” might trigger a series of advanced compute tasks like figuring out who Jerry is, contacting Jerry’s AI calendar software, negotiating a time, using the preferences of both parties and known locations to identify a spot for the meeting, making a reservation, and placing the meeting on the calendar, which will trigger an audible reminder in time to prepare for the meeting.
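Sketched as code, that chain of tasks might look something like the following. Every function here is a stub standing in for a real service (contacts, calendars, reservations); none of it refers to an actual product API.

```python
# Stubbed-out chain behind "book a meeting next week with Jerry."

def resolve_contact(name):            # who is "Jerry"? -> contacts lookup
    return {"name": "Jerry Ortiz", "free": {"Tue 10:00", "Thu 14:00"}}

def my_free_slots():                  # my own calendar openings
    return {"Tue 10:00", "Wed 09:00"}

def negotiate_time(mine, theirs):     # earliest mutual opening
    return sorted(mine & theirs)[0]

def pick_location():                  # from both parties' learned preferences
    return "Cafe on 5th"

def book_meeting():
    contact = resolve_contact("Jerry")
    slot = negotiate_time(my_free_slots(), contact["free"])
    place = pick_location()
    # a real system would now reserve the spot, add the calendar entry,
    # and schedule an audible reminder ahead of the meeting
    return f"Meeting with {contact['name']}: {slot} at {place}"

print(book_meeting())  # -> Meeting with Jerry Ortiz: Tue 10:00 at Cafe on 5th
```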

In our homes and workplaces, sensors could detect that we're a little cold by scanning our skin, then turn up the heat based on learned past preferences.

"Ambient computing" involves embedded, connected computers and sensors that vanish into our living and working spaces and attach to our bodies, feeding AI that acts on our behalf, in an individually tailored way, for our comfort, convenience, and safety.

For most consumers, our first immersive experience with full-fledged “ambient computing” will almost certainly be our cars.

There are three reasons why cars get “ambient computing” before most homes and offices. The first is that it’s easier to achieve in the cramped quarters of a car. Inside the car, we’re a captive audience, strapped into a chair facing a control panel. Bodies and faces are in a fixed and predictable location and orientation. We’re also physically touching the “device” (the car seat, seatbelts, and steering wheel) so there are additional opportunities to detect biometrics and physical movement.

The second reason is that, unlike in a home, the range of possible actions and desires is limited. It’s easier to tease out the context of what people are doing. For starters, the person in the driver’s seat is driving—that’s their fundamental context. So AI agency can be focused on helping the driver safely and comfortably drive.

And third, cars are dangerous, both for their occupants and for the people around them. More than 100 people die in car accidents in the United States every day, on average, according to the U.S. National Highway Traffic Safety Administration. So car buyers, carmakers, and society at large are all strongly motivated to deploy "ambient computing" technologies to save lives.

In fact, the danger factor is "driving" regulation that requires cars to ship with "ambient computing" technology, and European law is leading the pack. The EU's General Safety Regulation, approved in April by the European Parliament, will require new cars sold in Europe starting in 2022 to include several safety features, among them alerts for driver drowsiness and distraction. Those alerts will necessitate technology in cars that can detect these mental states and activities (such as being able to tell when a driver is looking at a smartphone).


The EU law isn't requiring anything that carmakers haven't already come up with; these features already exist in high-end luxury cars. The law simply mandates a more even distribution of these technologies across car types.

Because run-of-the-mill cars will be forced by law to include basic features, luxury brands are newly incentivized to take their "ambient computing" features to the next level.

“We expect emotion sensing to be mainstream in the next two to three years,” says Nuance Auto’s Lenke.

That’s why some carmakers are turning to a company called Affectiva (an MIT Media Lab spinoff).

Affectiva’s software has previously been used to gauge emotional reaction to advertising and political debates. Recently, however, Affectiva raised $26 million to plow into the automotive space. Affectiva expects its technology to show up in cars on the road within three years.

The first iteration of Affectiva Automotive AI uses cameras and microphones already built into cars, plus proprietary deep learning algorithms. The cameras enable the AI to detect three emotions in both drivers and passengers: joy, anger, and surprise. Each occupant's face is also constantly monitored for mood, classified as either positive or negative.

The algorithms also watch for signs of sleepiness, including blink rate, yawning, and prolonged eye closure.

Automotive AI also listens through the car's built-in microphones for signs of laughter or anger, as well as cues to how alert the driver is.
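To illustrate how cues like these could be folded into a single alert level, here is an invented scoring sketch. It is not Affectiva's actual model; the weights and thresholds are placeholders.

```python
# Invented drowsiness score combining the cues named above: blink rate,
# yawning, and prolonged eye closure. Weights/thresholds are illustrative only.

def drowsiness_level(blinks_per_min: float, yawns_last_5_min: int,
                     longest_eye_closure_s: float) -> str:
    score = 0.0
    score += 0.4 * min(1.0, blinks_per_min / 30.0)        # frequent blinking
    score += 0.3 * min(1.0, yawns_last_5_min / 3.0)       # repeated yawning
    score += 0.3 * min(1.0, longest_eye_closure_s / 2.0)  # prolonged eye closure
    if score > 0.7:
        return "alert_driver_and_suggest_break"
    if score > 0.4:
        return "gentle_warning"
    return "ok"

print(drowsiness_level(blinks_per_min=28, yawns_last_5_min=2,
                       longest_eye_closure_s=1.5))
# -> alert_driver_and_suggest_break
```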

Why emotion detection is hard

In the early years of emotion detection research, it was common to assume that people experience six emotions: anger, fear, disgust, sadness, happiness, and surprise.

In this “low-resolution” emotional landscape, researchers also based their work on the now-discredited idea that people experience and express those emotions in the same way.

It turns out that people have more than six emotions (experts disagree about how many individual emotions people can feel), and that the expression of emotions is not universal. Krzypow says that "one challenge in emotion detection is that everyone is different. Both between and within cultures, people use different facial expressions, vocal pitch, and even heart rates while expressing the same theoretical emotions."

“Nuance Auto’s partner for sensing technology is Affectiva, which trained its emotion detection by using about 6 million faces analyzed in 87 countries to account for differences in how different cultures and different individuals express emotions,” according to Lenke. Affectiva’s Automotive AI system is being integrated into Nuance’s automotive virtual assistant, which is called Dragon Drive. (Nuance says its Dragon Drive is used in more than 200 million cars in 40 languages. It’s the technology behind existing voice-interaction systems customized for individual car brands like Audi, BMW, Daimler, Fiat, Ford, GM, Hyundai, SAIC, Toyota, and others.)

Rana el Kaliouby, PhD, Affectiva's cofounder and CEO, told me that in order to avoid algorithmic bias, "You need to train your algorithms with lots of diverse data, so it can accurately detect people regardless of age, gender, ethnicity. Collecting this data is time consuming and expensive, but required for the technology we build. Not only is it the ethical thing to do, but frankly it's table stakes to ensuring the technology will work."
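The kind of check el Kaliouby is describing can be illustrated with a simple per-group evaluation that flags large accuracy gaps. The groups and numbers below are fabricated for the example.

```python
# Toy fairness check: compute accuracy per demographic group and flag gaps
# above a chosen threshold. Data here is fabricated for illustration.

def accuracy_by_group(predictions):
    """predictions: list of (group_label, was_prediction_correct)."""
    totals, hits = {}, {}
    for group, correct in predictions:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(correct)
    return {g: hits[g] / totals[g] for g in totals}

def flag_bias(acc_by_group, max_gap=0.05):
    gap = max(acc_by_group.values()) - min(acc_by_group.values())
    return gap > max_gap, gap

fake_eval = ([("group_a", True)] * 90 + [("group_a", False)] * 10
             + [("group_b", True)] * 78 + [("group_b", False)] * 22)
acc = accuracy_by_group(fake_eval)
print(acc, flag_bias(acc))   # a 12-point gap gets flagged: collect more diverse data
```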

And one more possible roadblock . . .

The monitoring of emotional and mental states could turn out to be controversial: Consumers may feel that emotion and activity detection in cars is an invasion of privacy or a trespass upon freedom.

Some people have resigned themselves to being tracked by technology, to a persistent invasion of digital privacy. But when technology takes away control, consumers may rebel.

In cars, emotion and activity tracking systems could enforce laws such as speed limits, restrict a student driver, prevent someone whose license has been revoked from driving, or prevent drunk driving. Some European politicians are talking about requiring future cars to administer a breathalyzer test before the car will start.

Studies conducted at Nuance Auto found that “not everyone will like every reaction, so it will be important to make these features personalized and configurable,” according to Lenke.

Where are emotion-aware cars taking us?

The venerable automobile will be transformed over the next decade or two by autonomous driving technology, and we’re all expecting this transformation.

What most consumers are not expecting is a whole new relationship with their cars.

While emotion and activity detection, combined with AI, will enable the car to understand, predict, help, and protect you (even from yourself), cars will also become more communicative.

For example, what can a car do about a driver consumed by road rage?

Various proposals include playing soothing music, suggesting a place to stop, changing the cabin temperature, or even taking control from the driver and pulling over.

But one suggestion is that the voice of the car's virtual assistant could talk the driver through deep breathing exercises and what is essentially on-the-spot anger management. The car becomes a coach, doing what a human friend might do: help calm you down.
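One way to picture such a system is as an escalation ladder that reaches for softer interventions first and hands over control only as a last resort. The anger score, thresholds, and action names in this sketch are hypothetical.

```python
# Hypothetical escalation ladder for an angry driver: mildest intervention
# first, assisted takeover last. Thresholds are invented for the sketch.

INTERVENTIONS = [
    (0.4, "play_calming_music"),
    (0.6, "start_guided_breathing"),       # the virtual assistant coaches the driver
    (0.8, "suggest_place_to_stop"),
    (1.01, "assist_takeover_and_pull_over"),
]

def respond_to_anger(anger_score: float) -> str:
    """Pick the mildest intervention whose threshold exceeds the current anger level."""
    for threshold, action in INTERVENTIONS:
        if anger_score < threshold:
            return action
    return INTERVENTIONS[-1][1]

print(respond_to_anger(0.55))   # -> start_guided_breathing
print(respond_to_anger(0.95))   # -> assist_takeover_and_pull_over
```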

Karl Iagnemma, PhD, president of autonomous mobility at Aptiv, spoke at Affectiva's Emotion AI Summit last year about the importance, for both safety and driver satisfaction, of drivers trusting their cars. One way to build that trust is to use AI to develop what he calls a "shared mental model," which can result from the car sharing what it's "thinking" and what it's going to do next.
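A toy rendering of that idea: the car narrates what it has perceived and what it intends to do next, so the rider isn't surprised. The events and phrasing here are invented.

```python
# Minimal "shared mental model" narration: announce perception + intended action.

def narrate(perception: str, plan: str) -> str:
    return f"I see {perception}. I'm going to {plan}."

print(narrate("a cyclist merging ahead", "slow down and give them room"))
print(narrate("our exit in half a mile", "move to the right lane"))
```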

Eventually, self-driving cars will be nice. But even before we get to universal autonomous vehicles, we'll enter into a whole new relationship with cars. Getting in the car will mean immersing ourselves in an empathy pod, where sensors and AI work together for our safety and happiness.
