A Tesla Car Crashes While In Autopilot Mode, Killing Its Driver

The fatal crash, which happened while the self-driving feature was still in "beta release," could spur a debate over autonomous vehicle safety.


The world may have seen its first road death caused by a self-driving car. On May 7, in Williston, Florida, a Tesla Model S drove under the wheels of an 18-wheel semi trailer when the truck took a left across U.S. Highway 27A, and the car, coming in the other direction, failed to stop. The Tesla’s top was stripped off as it passed under the trailer, then it left the road, smashed through two fences, bounced off a power pole, and spun to a final rest. The driver, Joshua Brown, died at the scene.


Tesla has confirmed in a blog post that the car was running on Autopilot, Tesla’s self-driving mode:

What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

This, combined with the bad luck of having the car low enough to pass under the trailer instead of hitting it with the nose of the car, made the crash way worse than it could have been.

Tesla’s main defense shifts responsibility onto the driver for engaging Autopilot and taking his attention off the road. It says Autopilot was still in testing mode:

“It is important to note that Tesla disables Autopilot by default,” says Tesla, “and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.”


Right there we see the problem of tech companies getting into the car business. Tesla has done a lot of great things for the car industry, from making electric cars desirable to own to challenging the crooked old world of traditional car dealerships, but palming off a fatality as the result of "beta" software is weak sauce. It's one thing to release an unpolished map app with the promise of improving it later, but on a car, a beta release seems like a bad idea. And remember, that car can kill pedestrians and other road users if things go wrong, and none of them gave "explicit acknowledgement" to switch on beta mode.

The “beta” label is also catnip to tech nerds, with its promise of new features. And Joshua Brown, the dead owner of the Tesla, was not only a tech nerd, but a tech nerd with a YouTube channel full of videos shot in a Tesla running on Autopilot.


Tesla says that, when in Autopilot mode, drivers are advised to keep their hands on the wheel at all times, and they receive regular alerts if they don't do so; the car slows down if their hands stray from the wheel.

Tesla also offers safety figures for its Autopilot mode: so far, this is the first fatality in 130 million miles of Autopilot use. "Among all vehicles in the U.S., there is a fatality every 94 million miles," says the blog post.

This is the first death that may have been caused by a self-driving car, and as such, it will likely spur both legal and moral debate as responsibility for the crash is determined, and as the wider public weighs the risks of self-driving cars, which until now had a spotless safety record compared with deadly human-steered ones. This might be the crash that determines the future of self-driving cars. Or it might be the point at which we realize that nobody cares whether autonomous cars are sometimes dangerous, as long as they're convenient.




About the author

Previously found writing at Cult of Mac and Straight No Filter.