Last month, unwitting burghers in the suburbs near Virginia Tech got a taste of the future: a self-driving Ford minivan motoring around town with no one inside. The car behaved the way self-driving cars usually do. Which is to say, it obeyed the letter of the law. It never went so much as a mile over the speed limit. It came to a full two-second halt at every stop sign. One driver, stuck behind the infuriating machine, honked and peeled around so he could yell at the driver. And then he gaped in wonder when he saw there was no driver at all. Except there was a driver. He was hiding in plain sight, wearing a suit that blended seamlessly into the driver’s seat. When a reporter spotted the minivan, his video went viral. But what went unreported until now was that it was all in the name of science.
— Adam Tuss (@AdamTuss) August 7, 2017
On YouTube you can find dozens of videos of people wearing simpler seat suits to order fast food at drive-thrus and scare the bejesus out of employees. But engineers at Ford and Virginia Tech designed this one to see how real-life pedestrians behave around a driverless car. “In a car with no one in it, you’re talking not just about how the customer reacts, but how other people behave around it,” says John Shutko, a human factors specialist at Ford. In particular, they wanted to test a set of signals designed to communicate what a driverless car is doing. But it’s still illegal to have a driverless car on the road without any supervision, so the suit was a necessary hack. “You can’t stage situations like these,” explains Andy Schaudt, the project director from the Virginia Tech Transportation Institute, which specializes in collecting real-world driving data. “You have to get out and make it realistic.”
Today, cars have only one interface for sharing a driver’s intent with the world around her: turn signals, which hardly ever get used properly. But there is another interface that we don’t appreciate, because it’s second nature. Before you walk across a busy intersection, you probably try to make eye contact with the drivers who are stopped, to make sure they see you and are paying attention. Often, the driver will wave you past in turn. Meanwhile, if you’re driving and pull up to a four-way stop, you look around at all the other drivers to see whether they understand who’s supposed to go next. Point being, a subtle interplay of gestures and body language passes between drivers and bystanders on the road. How do you replicate that dance with a self-driving car?
The Ford team came up with a simple solution: a light on the windshield, with three signals that correspond to the main actions of the car. A solid white light means the car is driving itself; two white lights moving side to side mean the car is yielding; and a rapidly blinking white light means the car is about to start accelerating. To you and me, those signals might not seem particularly obvious. But that’s the point: until people learn a set of signals, they won’t be. The researchers were probing how well people could understand signals they’d never seen before. In all, the researchers collected 1,800 miles of road data, spanning 150 hours.
Getting naturalistic reactions required perfecting the actual suit—people had to look in and really believe the car was driving itself. The ingenuity lay in the attention to detail: the controls of the car were retrofitted so that the driver would never have to lift his hands into view. And both seats were redesigned, so you couldn’t see that something was strange about the driver’s seat in particular. Black accents were added all around the seats, so you couldn’t tell they’d been made fatter to accommodate a person inside. “We tried to use misdirection,” Schaudt says. It worked. When Shutko gave an engineer at Ford a curbside look at the “autonomous car,” the engineer’s only complaint was that the seats weren’t original to Ford. Even up close, he didn’t realize someone was inside.
Then came the matter of scientifically testing people’s responses to the signals. The researchers had the car drive about half the time using the signals and half the time without them. The car itself was loaded with six high-resolution cameras covering a 360-degree view, as well as microphones to record 3D audio. Currently, analysts are poring over the video, logging whether people were surprised or hesitant or oblivious; whether they saw the signals, and for how long; and dozens of other cues covering posture and facial expressions. Once the data is tabulated, they’ll try to glean whether the signals actually worked—whether, after seeing them, people crossed faster or with more confidence. They’ll also be looking to see whether people learned the signals over time. The researchers made a point of driving to the same intersections at the same times of day, to see if people’s behavior changed as the car became more familiar.
Many of Ford’s competitors, including Volvo and Audi, are in the midst of testing their own external signaling systems, but none has conducted such a long study with unsuspecting pedestrians. Ford plans to make its findings public. The company hopes that all car manufacturers can come to an agreement on a universal signaling system. “We feel this should be standardized,” says Shutko. “If carmakers come out with different solutions, that’s inviting failure. Vehicle signs will help with acceptance, and to get that, we’ll need a common implementation.” The technology for autonomous cars is already here, but they won’t get far unless we trust them—and we won’t, if we can’t understand what they’re trying to do.