
Tesla’s ‘Autopilot’ mode shows how branding can become a UX risk

It’s vital that we use the right language to describe autonomous tech, suggests a new study on the controversial Tesla feature.

[Photo: Tesla]

In an automotive industry that’s been remarkably slow to embrace electric vehicles, Tesla has proven that you can build a greener all-electric car that outperforms gasoline-powered cars on a race track, too.


In addition to bringing EVs to more consumers, Tesla is also putting emerging autonomous tech into their hands. These self-driving capabilities are still limited, which makes Tesla’s decision to brand its driver-assist mode as “Autopilot” questionable. Teslas can cruise down a highway using computer vision and AI, but not so reliably that a driver is ever meant to take their eyes off the road or hands off the wheel, let alone fall asleep in the driver’s seat. While Elon Musk has argued the feature is named after an airplane’s autopilot (which still requires pilot attention), common sense has long implied otherwise. To many of us, Autopilot just sounds like the future, implying Teslas can magically drive themselves, and potentially leading drivers to treat the feature with a dangerous dependency as a result. It’s a debate that foregrounds the way branding and user experience intersect, for better or worse.

Now, for the first time, a survey by the Insurance Institute for Highway Safety (IIHS) has quantified how the general public interprets the word “Autopilot.” The survey asked 2,000 people what they’d expect they could do in a car while “Autopiloting.” It also asked people what they thought they could do in a car while using equivalent technology from other brands (technically known as “Level 2” driving automation), like Audi’s “Traffic Jam Assist,” BMW’s “Driving Assistant Plus,” Cadillac’s “Super Cruise,” and Nissan’s “ProPilot Assist.”

[Image: courtesy IIHS]

Despite all of these systems having similar capabilities, in every single case, people believed Autopilot allowed them to be less attentive than they would be using any of the less hyperbolically branded driving-assistance platforms. Nearly 50% of people who heard “Autopilot” assumed they could take their hands off the wheel, 33% believed they could talk on a cellphone, 15% thought they could text, 8% thought they could watch a movie, and 6% believed they could sleep (twice as many as thought they could sleep while using any of the other systems).

As the IIHS points out, three different Tesla crashes—which led to multiple fatalities—involved an Autopiloting driver having removed their hands from the wheel. The IIHS cannot say why the drivers’ hands were off the wheel in those cases, and it’s worth noting that Autopilot is designed to alert the driver and disengage if hands are removed from the wheel. In any case, the IIHS concludes that Autopilot’s name creates user experience issues by coaxing us to trust the machine more than we should—and to be less attentive drivers as a result. “One name in particular—Autopilot—signals to drivers that they can turn their thoughts and their eyes elsewhere,” the IIHS writes. “Manufacturers should consider what message the names of their systems send to people,” adds IIHS president David Harkey.

When we reached out to Tesla about the news, the company provided the following statement:

“This survey is not representative of the perceptions of Tesla owners or people who have experience using Autopilot, and it would be inaccurate to suggest as much. If IIHS is opposed to the name ‘Autopilot,’ presumably they are equally opposed to the name ‘Automobile.’

Tesla provides owners with clear guidance on how to properly use Autopilot, as well as in-car instructions before they use the system and while the feature is in use. If a Tesla vehicle detects that a driver is not engaged while Autopilot is in use, the driver is prohibited from using it for that drive.”

In fairness to Tesla, the company does provide in-person training to new drivers who buy its vehicles, and the study specifically didn’t poll Tesla owners on their understanding of Autopilot after receiving such training, or after owning a vehicle for six months or a year. But that doesn’t mean we don’t still have a natural propensity to trust “Autopilot” more than we should simply because of the word itself, especially if we’re stuck in traffic, and have an email on our phone, and children screaming in the backseat, and…


In any case, this is just the sort of research we could have used years ago, when legislators might have done something about it. Now, self-driving systems are rapidly reaching a level of autonomy that actually will allow us to fall asleep at the wheel. Going forward, nearly every part of our lives may be touched by AI and automation—machines making important decisions on our behalf. So it’s vital that we use the right language to describe these technologies, because it’s not just a company’s bottom line but our well-being that’s at stake.


About the author

Mark Wilson is a senior writer at Fast Company who has written about design, technology, and culture for almost 15 years. His work has appeared in Gizmodo, Kotaku, PopMech, PopSci, Esquire, American Photo, and Lucky Peach.
