
Tesla’s Autopilot was engaged prior to fatal Florida crash: NTSB report

[Photo: Torbjorn Sandbakk/Unsplash]

A new report from the National Transportation Safety Board (NTSB) sheds light on a fatal crash involving a Tesla. Data recovered from the damaged Tesla shows that the car’s Autopilot system had been engaged 10 seconds prior to the collision, and that the car did not detect the driver’s hands on the wheel for the final eight seconds before impact.


The accident took place in Delray Beach, Florida, on March 1. The car was traveling at 68 miles per hour when it hit a semi-trailer. “Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers,” the report stated, referring to the car’s advanced driver assistance system. The crash remains under investigation.

This is just the latest accident involving Tesla’s not-quite-self-driving cars. A similar incident happened in 2016, when a Tesla Model S, also in Autopilot mode, drove underneath a semi-trailer, killing its driver. In that case, the NTSB found that the driver was partly responsible for the accident. Tesla vehicles have also been involved in two other fatal accidents in which Autopilot was engaged: one in Mountain View, California, in March of last year, and one in China in 2016.

The new report comes amid increasing questions about the safety of self-driving and semi-autonomous vehicles. Last year, one of Uber’s self-driving cars was involved in an accident that killed a pedestrian in Arizona. Since then, other companies working on autonomous vehicles have scaled back expectations about when the technology might actually come to roadways in a meaningful way.

Companies have largely been allowed to experiment, putting self-driving cars on American roads with few stipulations. The U.S. Department of Transportation has been welcoming of self-driving innovation, as have state governments. That began to change last year following these incidents. Arizona’s governor ended Uber’s self-driving program in the state following the accident. And the mayor of Pittsburgh, where Uber and others have self-driving operations, added new restrictions for Uber, including a 25-mile-per-hour speed cap for self-driving cars.

Because the technology is so new, regulators have been hesitant to impede innovation with strict rules. Still, there are efforts underway to create a test for self-driving cars that would vet their technology before they hit roadways. Some car experts hope the International Organization for Standardization can adapt its existing rules for automotive electrical-system software in a way that sets guidelines for self-driving cars. Two other groups, Underwriters Laboratories and Edge Case Research, are currently writing their own set of standards, which they hope the wider industry will adopt. As with any new set of rules, this process is likely to take time.

Reached for comment, a Tesla spokesperson sent the following statement:

Shortly following the accident, we informed the National Highway Traffic Safety Administration and the National Transportation Safety Board that the vehicle’s logs showed that Autopilot was first engaged by the driver just 10 seconds prior to the accident, and then the driver immediately removed his hands from the wheel. Autopilot had not been used at any other time during that drive. We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy.

Tesla drivers have logged more than 1 billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance. For the past three quarters we have released quarterly safety data directly from our vehicles, which demonstrates that.
