When they crash, self-driving Mercedes will be programmed to save the driver, and not the person or people they hit. That’s the design decision behind Mercedes-Benz’s future Level 4 and Level 5 autonomous cars, according to the company’s manager of driverless car safety, Christoph von Hugo. Instead of worrying about troublesome details like ethics, Mercedes will just program its cars to save the driver and the car’s occupants, in every situation.
One of the biggest debates about driverless cars concerns the moral choices made when programming a car’s algorithms. Say the car is spinning out of control, and on course to hit a crowd queuing at a bus stop. It can correct its course, but in doing so, it’ll kill a cyclist for sure. What does it do? Mercedes’s answer to this version of the classic Trolley Problem is to take whichever option is least likely to hurt the people inside its cars. If that means taking out a crowd of kids waiting for the bus, then so be it.
That’s a callous example, but it illustrates the gap between how we think and how cars, and by extension the engineers who program them, think. “If you know you can save at least one person, at least save that one. Save the one in the car,” von Hugo told Car and Driver in an interview. “If all you know for sure is that one death can be prevented, then that’s your first priority.”
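Stripped of the ethics, the rule von Hugo describes is simple to state: among the maneuvers available, pick the one with the lowest risk to the car’s occupants, and ignore everyone else. A minimal sketch of that occupant-first logic might look like the following. Everything here is invented for illustration, including the `Outcome` fields and the risk numbers; it does not describe any real Mercedes system.

```python
# Purely illustrative sketch of an occupant-first decision rule.
# All names and numbers are hypothetical, not from any real vehicle software.
from dataclasses import dataclass

@dataclass
class Outcome:
    name: str
    occupant_risk: float   # estimated probability of harm to people in the car
    bystander_risk: float  # estimated probability of harm to people outside

def choose_outcome(outcomes):
    """Occupant-first rule: pick the maneuver with the lowest risk to the
    car's occupants. Bystander risk is never consulted."""
    return min(outcomes, key=lambda o: o.occupant_risk)

options = [
    Outcome("swerve into barrier", occupant_risk=0.6, bystander_risk=0.0),
    Outcome("brake straight ahead", occupant_risk=0.1, bystander_risk=0.7),
]
print(choose_outcome(options).name)  # prints "brake straight ahead"
```

The point of the sketch is what the function leaves out: `bystander_risk` is computed but never read, which is exactly the trade-off critics of the policy object to.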
So far, the highest-profile death in a self-driving car was when a Tesla crashed on May 7, 2016, while in Autopilot mode. The car didn’t see a semi truck pull out across the road ahead, and drove under it, killing the driver, Joshua Brown. That was a straight-up error, but future crashes will involve the car choosing to point itself at humans, and in all likelihood killing them.
The moral confusion is deepened when we consider that autonomous cars may save millions of lives that would otherwise have been snuffed out by careless human drivers. That’s no consolation if a Mercedes chooses to use you as an airbag to save its owner, but maybe you’d already have been killed a few years before if a particular human driver hadn’t been replaced by a driverless car.
Mercedes’s von Hugo, then, thinks that the ethical problems will be outweighed by the fact that cars will be better drivers overall. “There are situations that today’s driver can’t handle, that . . . we can’t prevent today and automated vehicles can’t prevent, either. The self-driving car will just be far better than the average human driver,” he told Car and Driver.
He also points out that, even if the car were to sacrifice its occupants, it may not help anyway. The car may end up hitting the crowd of school kids regardless. “You could sacrifice the car. You could, but then the people you’ve saved initially, you don’t know what happens to them after that in situations that are often very complex, so you save the ones you know you can save.”
As we dig deeper, it seems that the problems faced by driverless cars and by human drivers are much the same. We try to avoid crashes and collisions, and when we can’t, we make split-second decisions governed by our programming and experience. The differences are that computers can think a lot faster, and that they can avoid many crashes a human driver couldn’t have. Those advantages don’t resolve the ethical question, but they do change its stakes.