When it comes to self-driving cars, the legal discussion usually focuses on what happens when robocars mow down soft humans at confusing intersections. Thankfully that hasn’t happened yet, but that doesn’t mean that driverless vehicles aren’t already having brushes with the law.
Exhibit A: One of Google’s self-driving cars got pulled over by a cop recently, and of course, hilarity ensued. The cop can be seen at the driver’s window, with nobody to ticket. It’s pretty funny, especially as the cop stopped the car for driving too slowly, but it raises questions about how autonomous cars will be viewed by the law in the future.
Right now, Google’s cars still have a human on board, whose job is to monitor the car’s progress and take charge if things go wrong. That means there’s still a human responsible for the vehicle, and that human might have a hard time arguing that “it was all the car’s fault.” In this case, the officer flagged down the car to ask why it was driving so slowly.
“We’ve capped the speed of our prototype vehicles at 25 mph for safety reasons,” says the Google Self-Driving Car Project’s blog. “We want them to feel friendly and approachable, rather than zooming scarily through neighborhood streets.” The bug-like Google cars are classified as NEVs (Neighborhood Electric Vehicles), a low-speed category restricted to roads with lower posted limits of up to 45 mph.
But what about the future? The extreme example would be a car ferrying home its drunken owner, perhaps passed out in the back. If that situation is legal, then that passenger can hardly be held responsible as the car’s driver. Or consider people who never learned to drive, but nevertheless own a self-driving vehicle. Unlike our drunk, who could at least take the wheel after sobering up, these non-drivers would be helpless to control the vehicle under any circumstances.
“Current laws never envisioned a vehicle that can drive itself,” AAA managing director John Nielsen told Co.Exist when asked about a driverless future. “There are numerous liability issues that need to be ironed out. If an autonomous vehicle gets in a collision, who is responsible—the ‘driver,’ their insurance company, the automaker that built the vehicle, or the third-party supplier that provided the autonomous control systems?”
The legal aspects of self-driving cars will almost certainly be the biggest drag on widespread adoption of the technology. It certainly won’t be the technology itself, which is already good enough to deploy. Google’s cars still haven’t caused an accident and have all but solved safe, driverless travel on known suburban routes. Both Switzerland and the U.S. are deploying self-driving buses. The robocars are out there driving already, but the law doesn’t even seem to have started thinking about the consequences.