The robots are here. They’re scanning shelves at your local supermarket, delivering food, and even assisting nurses in the hospital. As robots begin to infiltrate human spaces, questions remain for the designers and engineers tasked with convincing people to view them as approachable and friendly, rather than ignoring or avoiding them.
Addressing the complexity of human-robot interaction was the goal of the designers behind a new robot at the central Oodi library in Helsinki, Finland. The library brought in the digital consultancy Futurice to help it transform some of its existing robots—which help move books between floors—into bots that could help librarians with other tasks. After interviewing several librarians, the Futurice team decided to reprogram the robots to perform a task that the humans universally hated: showing customers where the fiction section is, or pointing them toward the bathrooms. It’s a simple task, but time-consuming. Perfect to hand off to a robot.
But there was a problem.
“As we were testing [the robot], people weren’t relating to it as a social object. Some children were jumping on top of it, impeding it doing its job,” says Minja Axelsson, a roboticist at Futurice who designed, coded, and tested the robot. People weren’t necessarily put off by the robot, but they certainly didn’t connect with it. The bot’s form—basically, a box on wheels—was simply too abstract for people to make sense of what it was and how they were supposed to interact with it.
To help people see the robot as a friendly helper, Futurice came up with a simple interface: Googly eyes. Inspired by one of Disney’s 12 basic principles of animation—a list of rules published in 1981 that the company’s animators use to create the illusion of life in their illustrations, including the idea of “exaggeration”—Axelsson decided to use googly eyes combined with sound and movement to both show the robot’s intent and express its state of being. Most importantly, the eyes are programmed to indicate the robot’s direction to customers, so they’re not caught unaware when it’s moving around.
But Axelsson also created a matrix of behaviors—like spinning around and beeping—to make the robot seem more dynamic based on what happened to it. If it successfully led a person to the section they were searching for, its “emotions” would become more positive with high arousal, leading to the robot chirping happily. If it failed, its state would become more negative, and if it wasn’t being used, its emotional state would tend toward low arousal—which would result in it moving around and trying to get people’s attention using its eyes. For instance, if customers haven’t plugged a request into the robot’s tablet in a while, it starts to get “bored” while it sits in its position at the top of the entrance stairs. “If it hasn’t had a mission for a long time, it can start to move around a little bit, to indicate, ‘Hey! I’m here!'” Axelsson says.
If someone then comes over to interact with it and puts a location into the tablet, the robot’s state would change again toward more happy beeping, with its googly eyes leading the customer in the direction they both want to go.
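The behavior matrix described above can be sketched as a simple valence-arousal state machine. This is a minimal illustration, not Futurice’s actual code; all function and event names here are hypothetical.

```python
# Hypothetical sketch of a valence-arousal behavior matrix: events nudge the
# robot's emotional state, and the state selects an expressive behavior.

def update_state(valence, arousal, event):
    """Nudge a (valence, arousal) pair, each kept in [-1, 1], based on an event."""
    if event == "guided_successfully":
        valence, arousal = valence + 0.5, arousal + 0.5   # happy, energetic
    elif event == "guidance_failed":
        valence, arousal = valence - 0.5, arousal + 0.2   # unhappy
    elif event == "idle":
        arousal -= 0.3                                    # drifting toward "bored"
    clamp = lambda x: max(-1.0, min(1.0, x))
    return clamp(valence), clamp(arousal)

def choose_behavior(valence, arousal):
    """Map the current emotional state to an expressive behavior."""
    if valence > 0.3 and arousal > 0.3:
        return "chirp_happily"
    if valence < -0.3:
        return "beep_sadly"
    if arousal < -0.3:
        return "wander_and_make_eye_contact"   # the "Hey! I'm here!" behavior
    return "wait"

# A long stretch without a mission drives arousal down until the robot
# starts wandering to attract attention.
v, a = 0.0, 0.0
for event in ["idle", "idle", "idle"]:
    v, a = update_state(v, a, event)
print(choose_behavior(v, a))  # → wander_and_make_eye_contact
```

The key design choice is that events never set a behavior directly; they only shift an internal state, so the robot’s reactions feel continuous rather than scripted.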
According to Romeo Pulli, who works in information technology at the library, these movements combined with the googly eyes are designed to entice customers into interacting with it. The small change shifted the robot from a utilitarian automated book cart on wheels to a robot with personality. That was a big bonus: “It’s more approachable with the eyes,” Pulli says.
Once the googly eyes were in place, Axelsson says, the robot abuse stopped. Instead, people had positive reactions to it, and groups would follow it around the library like a little flock. (Not everyone was a fan: one woman exclaimed, “I won’t try any robot!” and stormed off when she saw it.)
Rather than trying to mimic human expressions entirely, the eyes help people make the mental switch toward treating the robot differently. The low-fi approach, embodied by the big, cartoonish eyes, also helps clue visitors in to the fact that the robot’s not as smart as a person.
“If you make it too humanoid, people also assume that it has more sophisticated skills than it actually would have,” Axelsson says. “With the eyes and the emotions, the goal is definitely not to build a robot that’s like a human. It’s more to make it more usable, to clearly communicate what the robot’s abilities are, to make it fit within its environment in a way that users feel comfortable.”
The designers of the Oodi library’s googly-eyed guidance robot aren’t the only ones to think of putting adorable animated eyes on a machine. In 2018, the car company Jaguar Land Rover added cartoon-y eyes to an autonomous vehicle it was testing, with the aim of encouraging people to trust the car. Even though the car can’t technically “see” pedestrians when they cross in front of it, the googly eyes indicated to the person that the autonomous vehicle’s cameras and sensors had detected them and it wouldn’t run them over. Earlier this year, the supermarket chain Giant Food Stores similarly added googly eyes to customer-facing robots, which it rolled out to 172 stores on the East Coast. The company reports that people like to take pictures with the bots.
According to research on the subject, adding googly eyes to other objects has an impact on the way we think about them, too. In a study from 2012, researchers placed cartoon eyes on one donation bucket in a supermarket and left another without any eyes. Over the course of 11 weeks, 48% more people donated to the bucket with the googly eyes compared to the one without. Another 2018 study that similarly focused on donations confirmed this effect, adding evidence to the idea that images of eyes encourage people to engage in behaviors that help others—like donating or volunteering. Scientists think this happens in part because images of eyes make us feel like we’re being watched, which means we’re more likely to feel pressure to volunteer for the good of the group.
Why do humans have such a strong reaction to such a simplistic design? It may be because we tend to turn anything into a face, something that robot designers have exploited to encourage people to interact with robots like they’re intelligent, living beings. As Axelsson’s library robot shows, all it takes is a pair of little plastic disks with a smaller black disk floating around inside.