In a major boost to Google’s self-driving car program, the National Highway Traffic Safety Administration (NHTSA) on Tuesday released a letter it sent to Google confirming that it agrees with the company’s interpretation of federal law: the artificial intelligence software behind Google’s self-driving cars can legally be considered the "driver" of the vehicle, Reuters reports.
Google submitted a proposed design for one of its self-driving cars to the NHTSA in November, stating that the car had "no need for a human driver" because its AI systems could control the journey from start to finish. On that basis, Google argued, the car itself, rather than any human occupant, is technically the driver.
In the letter, the NHTSA said, "We agree with Google its [self-driving car] will not have a 'driver' in the traditional sense that vehicles have had drivers during the last more than one hundred years." The agency also wrote that it "will interpret 'driver' in the context of Google's described motor vehicle design as referring to the (self-driving system), and not to any of the vehicle occupants," reasoning that "If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the driver as whatever (as opposed to whoever) is doing the driving. In this instance, an item of motor vehicle equipment, the SDS, is actually driving the vehicle."
Many major technology companies and traditional automakers are working on driverless cars, which are expected to be commercially available within four years. One of the biggest hindrances for autonomous car companies like Google, however, is navigating the sometimes vague state and federal safety rules covering self-driving vehicles.
The NHTSA’s ruling goes a long way toward settling the legal question of what counts as the driver of a self-driving car on the road. However, many legal and regulatory issues remain to be clarified and resolved. For example, who is responsible if a self-driving car gets into an accident: the human occupant, the manufacturer, or the developer who wrote the software? It is still unclear where liability would rest. Human drivers must also pass driving tests in order to be licensed. How do you license an individual car to drive? Would each car need to be road-tested at a local DMV after the owner bought it, or could a license be issued to an entire fleet of self-driving cars as they roll off the assembly line?
"The next question is whether and how Google could certify that the (self-driving system) meets a standard developed and designed to apply to a vehicle with a human driver," the NHTSA wrote.
For its part, Google said it is "still evaluating" the NHTSA's lengthy response, Reuters reports.