
CO.DESIGN

There’s a new AI that can guess how you feel just by watching you walk

Are you a happy, sad, or angry walker? Scientists are teaching algorithms to perceive emotions based on gait alone.


[Source Image: Nosyrevy/iStock]

BY Jesus Diaz · 1 minute read

Our emotions can influence everything from our appetites to the way we perceive the world—and even how we walk (just ask Charlie Brown).

So is it possible to interpret how someone is feeling based on their gait alone? That’s exactly what scientists at the University of North Carolina at Chapel Hill and the University of Maryland at College Park have taught a computer to do. Using deep learning, their software can analyze a video of someone walking, turn it into a 3D model, and extract their gait. A neural network then determines the dominant motion and how it matches up to a particular feeling, based on the data on which it’s trained. According to their research paper, published in June on arXiv, their deep learning model can guess four different emotions—happy, sad, angry, and neutral—with 80% accuracy.
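The pipeline described above (video → 3D model → gait features → classifier) can be sketched in miniature. The toy below is an illustration, not the authors' model: the feature names, the prototype values per emotion, and the nearest-centroid classifier standing in for their neural network are all assumptions made for the example.

```python
# Toy sketch of gait-based perceived-emotion classification.
# Assumed stand-in for the paper's deep learning model: hand-picked
# prototype gait features per emotion and a nearest-centroid match.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

# Assumed prototypes: (walking speed m/s, stride span m, posture),
# where posture > 0 means an upright head and posture < 0 a slump.
CENTROIDS = {
    "happy":   (1.4, 0.75,  0.6),
    "sad":     (0.9, 0.55, -0.7),
    "angry":   (1.6, 0.80, -0.2),
    "neutral": (1.2, 0.65,  0.0),
}

def extract_features(pose_sequence):
    """Reduce a sequence of (hip_x, hip_y, head_y) frames, as might come
    from a 3D pose model, to three crude gait features.
    """
    # Average per-frame hip displacement, scaled by an assumed 30 fps.
    steps = [abs(b[0] - a[0]) for a, b in zip(pose_sequence, pose_sequence[1:])]
    speed = sum(steps) / len(steps) * 30
    # Horizontal span of hip motion as a stride proxy.
    xs = [p[0] for p in pose_sequence]
    span = max(xs) - min(xs)
    # Head height relative to a nominal upright value (1.7 m, assumed).
    posture = sum(p[2] for p in pose_sequence) / len(pose_sequence) - 1.7
    return (speed, span, posture)

def perceived_emotion(features):
    """Return the emotion whose prototype is nearest in feature space."""
    def dist(emotion):
        return sum((f - g) ** 2 for f, g in zip(features, CENTROIDS[emotion]))
    return min(EMOTIONS, key=dist)
```

A fast, long-strided, upright walk lands near the "happy" prototype; a slow, slumped one near "sad". The real system learns such associations from labeled gait data rather than from hand-coded prototypes.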

While we’ve seen AI trained to predict how people feel based on their facial expressions or even their voices, this is the first time an algorithm has been trained to make an accurate guess by simply watching someone walk.

[Image: University of North Carolina at Chapel Hill/University of Maryland at College Park]

Aniket Bera, who supervised the research and is a professor of computer science at the University of North Carolina at Chapel Hill, told the science and technology blog Techxplore that their research doesn’t try to detect true emotion—rather, it predicts perceived emotion, just as people do about one another every day. According to Bera, the work could teach robots to anticipate how people around them may be feeling and adjust their behavior accordingly. Or the inverse: It could help engineers design robots that innately communicate through their gait and body movements.


Bera adds that the underlying research could eventually make its way into surveillance applications, or even help make mixed-reality experiences more engaging, since the 3D models of people walking in particular ways could help design more lifelike characters.

But as the team concludes in their paper, the next step will be looking beyond walking to everything from running to gesturing—understanding the subtle emotions that we express when we move.


ABOUT THE AUTHOR

Jesus Diaz is a screenwriter and producer whose latest work includes the mini-documentary series Control Z: The Future to Undo, the futurist daily Novaceno, and the book The Secrets of Lego House.

