11.04.15

Should Self-Driving Cars Be Tested And Licensed Like Humans?

Machines don’t learn the same way people do, but we still need some way to inspect a robot’s skills before we trust it on the road.

[Top Photo: Oregon DOT Flickr]

Should self-driving cars have to take a test and get a license? Michael Sivak and Brandon Schoettle of the University of Michigan Transportation Research Institute say yes. And no. We probably need some kind of licensing system for autonomous cars, but it won’t look anything like the one we use to test people.


To see why, we should look at how and why human drivers are tested. Today’s system of graduated driver licensing (first a learner’s permit, then a junior license) ramps up the difficulty as it goes, removing restrictions (on highway driving, night driving) along the way. The idea is to let the novice driver practice in easy situations and build experience that carries over to harder ones.

[Photo: Flickr user Daniel Lobo]

Driverless cars, on the other hand, are either good at something or they’re not. A new car won’t get distracted by a cellphone the way a new driver will, for example. Likewise, getting good at driving in clear weather won’t help a car in rain or snow. The authors give another example:

For self-driving vehicles, experience with daytime driving does not improve nighttime performance. Instead, good nighttime performance requires everything that good daytime performance does, plus sensors that provide the necessary information even at low levels of illumination.

This leads Schoettle and Sivak to conclude that the system for licensing humans is pointless when applied to machines. Robots just don’t learn the way we do. But that’s not to say licensing isn’t needed; it just has to be tailored to driverless vehicles.

Some human tests would carry over. The eyesight test is analogous to a sensor test, and making sure a driver keeps their lens prescription up to date is similar to making sure a car’s sensors are kept in working order.

Otherwise, licensing could rest on a car’s abilities. A car might be certified only for daytime driving, for instance, or for driving in sun and rain but not in snow. It might also be assessed on pattern recognition. Humans are so good at spotting oddities on the road that we don’t need to be tested for it; computers have all kinds of trouble with it. One can imagine a car’s cameras being fooled by a large, mirror-like puddle, for example.

[Photo: Oregon DOT Flickr]

Most of these tests could be done at the design stage, just like crash-safety tests. Others could be done annually, alongside emissions and other assessments. But there’s one test that might prove sticky: ethics. This is something built into human drivers, but it will doubtless be troublesome for robot-piloted vehicles.


Humans break the rules all the time, say Schoettle and Sivak, but cars are programmed to follow the laws of the road. “Indeed, the problem here might not be disobeying appropriate laws and regulations,” they write. “Instead, the problem might be just the opposite: Self-driving vehicles may follow the letter of the law too strictly, compared to what people typically do.”

To illustrate this, they borrow an example from Forbes’ Bill Visnic. “Merging at the speed limit onto a highway of cars zipping past at well over the speed limit is just plain dangerous,” he writes. A human driver wouldn’t think twice about stomping on the gas to merge safely. A robot, on the other hand, might get a little confused.

“Should manufacturers be allowed to program a vehicle to willfully break applicable laws?” ask the authors. “If so, which laws and to what extent?”

At some point a driverless car will kill a human, whether by accident (a kid runs out from behind a parked car) or by design (the car is programmed to avoid plowing into a crowd and instead takes down a single pedestrian). When people kill other people with cars, the courts assess the circumstances of each case after the fact. A driverless car, though, has to be programmed ahead of time to make the right call, and that behavior would need to be figured into the licensing tests. This, say Schoettle and Sivak, raises an uncomfortable question.

“If, eventually, there would be testing procedures for self-driving vehicles, would one pass by obeying the laws or by disobeying the laws?”

About the author

Previously found writing at Wired.com, Cult of Mac, and Straight No Filter.
