
TECH

Do Self-Driving Cars Dream Of Safe Streets?

One of the key questions pondered by automakers is how many deaths per year in self-driving cars will be acceptable.

[Photo: Flickr user Jörg Schubert]

By Marcus Baram

Earlier this year at the SXSW Conference in Austin, Texas, Bill Ford said out loud what a lot of people in the auto industry were thinking–or, more precisely, worrying about more than they care to admit. The Ford executive chairman was talking about the advent of driverless vehicles, a topic that’s getting a lot of ink these days as every automaker and some of the biggest players in Silicon Valley pour billions of dollars into the development of “naked” robotic cars (so-called Level 5 autonomous vehicles, or AVs, which have no steering wheels or pedals).

Engineering the autos will be the easy part, Ford said, because the technology is ramping up quickly. More daunting, though, will be deciding how to program autonomous cars to make life-and-death decisions. “If a vehicle has to choose who does it hit (if it is about to be in an accident), does it save the occupant or 10 pedestrians? Those are all decisions that you and I as drivers don’t have time to make, we just react. But these vehicles will have the processing capability to actually choose the outcome. And if the outcome is that it chooses to crash you, the occupant, are you going to get into that vehicle?”

Although Ford tweaked the auto industry by posing this problem publicly, it echoed a line of thinking already common among American car buyers. In a survey conducted earlier this year, AAA found that three-quarters of motorists said they fear riding in a self-driving auto, and 54% of respondents said they would feel less safe sharing the road with an AV. And once Ford let the cat out of the bag, a few others in the AV world began to take the issue to its logical conclusion: “If we have to teach a car whether to kill a nun or a Boy Scout, there’s not going to be a self-driving car industry,” says Amnon Shashua, a senior vice president at Intel and cofounder of Mobileye, the autonomous car technology company that Intel purchased for $15.3 billion in March.

But to address this issue, Shashua believes, it needs to be framed more practically, distinct from impossibly complex philosophical or ethical conundrums. Simply put, he says, whether autonomous vehicles are widely embraced by consumers comes down to one weighty question: How many deaths per year in self-driving cars will be acceptable to the public?

At this point, the answer is probably none. Every accident involving an autonomous vehicle, whether the fatal crash in Florida last year in which a Tesla running on Autopilot drove into a crossing truck or the new self-driving shuttle in Las Vegas that was hit by a delivery van two hours after its debut, makes national headlines and is talked about for months. By contrast, 20-car pileups occur just about every day, and more than 100 people are killed daily in crashes of traditional vehicles, meriting little more than a shrug even in the local press.

This misplaced attention and alarm directed at the safety of autonomous vehicles is all the more striking given that regulators around the world are itching to put these cars on the road in order to reduce the relentless scourge of traffic fatalities. Last year, the U.S. National Highway Traffic Safety Administration (NHTSA) endorsed self-driving cars as a way to “dramatically decrease the number of crashes tied to human choices and behavior.”

[Image: courtesy of Mobileye]
Given this set of circumstances, Shashua and a team of programmers at Mobileye have come up with a creative solution to overcome the fear of AVs. Based in part on NHTSA data covering 6 million incidents, Mobileye produced a set of nearly 40 scenarios that cover the situational permutations of all vehicle accidents. From this, the company designed algorithmic formulae (a model it calls Responsibility-Sensitive Safety, or RSS) that codify perfect human driver judgment vis-à-vis traffic laws and road conditions. By using these rules as the basis for programming AV decision-making in every possible circumstance, Shashua claims, automakers can guarantee car buyers that a self-driving vehicle will never be at fault in an accident.
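
The article doesn’t reproduce the formulae themselves, but Mobileye’s published RSS work includes, for instance, a minimum safe following distance that a trailing car must keep so that it can never cause a rear-end collision, even in the worst case. The Python sketch below shows roughly the form of that calculation; the parameter defaults are illustrative assumptions, not Mobileye’s calibrated values.

```python
def rss_safe_following_distance(
    v_rear: float,              # speed of the trailing car (m/s)
    v_front: float,             # speed of the leading car (m/s)
    rho: float = 0.5,           # trailing car's response time (s); illustrative
    a_max_accel: float = 3.0,   # worst-case acceleration during the response time (m/s^2)
    a_min_brake: float = 4.0,   # braking the trailing car guarantees afterward (m/s^2)
    a_max_brake: float = 8.0,   # hardest braking the lead car might apply (m/s^2)
) -> float:
    """Minimum gap such that the trailing car can always stop in time, even if
    the lead car brakes as hard as physically possible while the trailing car
    accelerates throughout its response time before braking only moderately."""
    # Distance covered while reacting (worst case: still accelerating).
    d_response = v_rear * rho + 0.5 * a_max_accel * rho ** 2
    # Distance to stop from the worst-case speed at the guaranteed braking rate.
    v_worst = v_rear + rho * a_max_accel
    d_rear_stop = v_worst ** 2 / (2 * a_min_brake)
    # Distance the lead car covers if it slams on its brakes.
    d_front_stop = v_front ** 2 / (2 * a_max_brake)
    return max(0.0, d_response + d_rear_stop - d_front_stop)

# Two cars at 25 m/s (~56 mph): the trailing car must keep roughly 62 meters back.
print(f"{rss_safe_following_distance(25.0, 25.0):.1f} m")
```

If the trailing car always preserves at least this gap, then by construction any rear-end collision is the fault of a car that violated the rule–which is exactly the blame-assignment logic the article describes.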

At the heart of Mobileye’s safety system is a concept known as Cautious Command, which describes the options available to an AV to continually maintain its safe zone. For example, the illustration below shows a common situation in which the blue car is trying to exit a parking lot but is blocked by a building from seeing whether another vehicle is coming down the street. A conscientious human driver would edge out into the road, slowly expanding his field of view until he feels comfortable pulling out. In actuality, he will never have full vision and will ultimately make a decision based on gut instinct at the last moment. Even if the driver does his best, an accident that he would be blamed for is still possible. And if the driver were not diligent–distracted, say, by setting the GPS or calling ahead to his next stop–a crash becomes more likely.

By contrast, an AV in Cautious Command would calculate the highest reasonable velocity of the red car (based on the speed limit) and determine how far it can tiptoe out into the road while still giving the red car the opportunity to brake, gradually gaining total visibility. If the red car were to go faster than the speed limit, it would be at fault if an accident occurred. Moreover, if the red car were the AV, it would drive more guardedly, perhaps leaning toward the center lane, anticipating that a car outside its range of sight might be exiting the parking lot.
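
In code, that worst-case reasoning might look like the sketch below. None of the names, thresholds, or parameter values come from Mobileye; this is just an illustration of the logic, assuming a hypothetical sensor that reports how far down the street the AV can currently see.

```python
def worst_case_stopping_distance(
    v_limit: float,            # highest "reasonable" speed of an unseen car (m/s)
    rho: float = 1.0,          # assumed response time of that car (s)
    b_reasonable: float = 4.0, # braking it can reasonably be expected to apply (m/s^2)
) -> float:
    """Distance a hypothetical occluded car would need in order to stop,
    assuming it travels at the speed limit and brakes after a response time."""
    return v_limit * rho + v_limit ** 2 / (2 * b_reasonable)

def may_enter_lane(d_visible: float, v_limit: float) -> bool:
    """The AV lets its body encroach into the lane only once it can see farther
    up the road than a worst-case vehicle would need in order to stop."""
    return d_visible >= worst_case_stopping_distance(v_limit)

# Hypothetical control loop: tiptoe forward, widening the field of view,
# until the visibility condition holds, then commit to the turn.
#
#   while not may_enter_lane(sensors.visible_range(), road.speed_limit()):
#       creep_forward()   # slow enough to stop before actually blocking the lane
#   complete_turn()
```

With these illustrative numbers, at a 50 km/h speed limit (about 13.9 m/s) the condition works out to roughly 38 meters of clear sight line before the AV commits; until then it keeps creeping–the tiptoeing behavior described above.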

RSS encompasses situations where objects (vehicles, pedestrians, et al.) are occluded (hidden) by other objects. [Image: courtesy of Mobileye]
The promise of Cautious Command would seem to make moot Bill Ford’s concerns about the intractable choices a self-driving car must make, primarily because ethical dilemmas about who lives and who dies in an impending accident are replaced by an unyielding, if dispassionate, rule: If a crash is imminent (one that, by definition, will be the non-AV’s fault), the self-driving car will try to avoid that accident but not mitigate it by causing another. In other words, absent other safe options, it will take the hit even if its passenger may be killed in the process. “Coming at this issue any other way takes us down a slippery slope,” says Shashua. “Perhaps what looks like a car with no one in the back seat–so it could take a rear hit from an AV about to be hit itself–has babies in the back. There were hidden parameters, and in hindsight the AV did something worse.”
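
As a decision procedure, the rule is strikingly simple to express. The sketch below is a paraphrase of it, not Mobileye’s code; the Maneuver fields and the fallback action are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    avoids_incoming_crash: bool  # escapes the collision the other party is causing
    causes_new_collision: bool   # would make the AV at fault in a *new* accident

def select_maneuver(candidates: list[Maneuver], brake_in_lane: Maneuver) -> Maneuver:
    """Avoid the imminent crash only via maneuvers that don't create a new
    at-fault collision; if none exist, take the hit rather than transfer it."""
    safe_escapes = [m for m in candidates
                    if m.avoids_incoming_crash and not m.causes_new_collision]
    # No weighing of who might be harmed: if every escape route endangers
    # someone else, the AV absorbs the impact.
    return safe_escapes[0] if safe_escapes else brake_in_lane
```

Note what is absent: there is no attempt to identify occupants, nuns, or Boy Scouts, which is precisely how the rule sidesteps Shashua’s hidden-parameters objection.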

Based on thousands of accident simulations, Mobileye believes that if every automaker adopted its formulae, which the company is open-sourcing to give the industry free rein to use them, the rate of vehicle deaths in the U.S. could improve by three orders of magnitude–that is, from one traffic fatality for every 1 million hours of driving currently to one for every 1 billion hours, or about 40 deaths a year. (This assumes, of course, that all cars are AVs; the remaining fatalities would come from equipment malfunctions and poor driving conditions.)
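
The arithmetic behind that claim is easy to check from the article’s own figures, together with an assumed baseline of roughly 40,000 annual U.S. traffic deaths:

```python
# Rough sanity check of Mobileye's three-orders-of-magnitude claim.
hours_per_fatality_today = 1e6     # ~1 death per million driving hours (per the article)
hours_per_fatality_goal = 1e9      # RSS target: 1 death per billion driving hours
us_deaths_per_year_today = 40_000  # assumed approximate annual U.S. traffic deaths

improvement = hours_per_fatality_goal / hours_per_fatality_today
print(improvement)                             # 1000.0 -> three orders of magnitude
print(us_deaths_per_year_today / improvement)  # 40.0 deaths per year
```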

Some robotics experts have taken issue with Mobileye’s safety claims. For instance, Missy Cummings, director of Duke University’s Humans and Autonomy Lab, told EETimes that Mobileye may have underplayed the number of potential software bugs likely to plague an AV. She pointed to the 2016 Stout Risius Ross Automotive Warranty and Recall Report, which found that since 2012, software issues have been increasingly responsible for recall activity. The study found that between 2005 and 2012, 32 unique software-related recalls affected about 3.6 million vehicles; from the end of 2012 through June 2015, those numbers nearly doubled. However, all of the academics and engineers interviewed by EETimes praised Mobileye for its effort in pushing the AV industry toward a legitimate safety solution.

Thus far, the response in the auto industry to Mobileye’s algorithms has been muted, largely because there is so much jockeying for position in the AV landscape right now, with everyone from Google, Uber, and Lyft to dozens of startups and every major automaker promising fully autonomous cars by early in the next decade. At this point, no one wants to risk losing potential market leadership by signing on too early to standards that turn out to be also-rans. (Remember Betamax? Probably not.) “Mobileye is right that there should be some standards for safety, but with so much riding on autonomous vehicles for automakers, there isn’t too much appetite for anything but a Wild West approach to technology development,” says Doug Newcomb, president and cofounder of automotive technology analysts C3 Group.

Which is precisely the attitude that Shashua has categorized as a suicide pact for the industry. Without agreed-upon safety criteria, automakers are each coming up with their own features, potentially over-engineering AVs based on tens of thousands of redundant simulations and piling on more sensors and computing power than necessary–all so they can claim to have safety-proofed their vehicles, even though they cannot categorically say that their vehicles will never cause an accident. The outcome could be tremendously high legal exposure for automakers after deadly accidents and exorbitant production costs for cars bedecked with expensive parts, which would leave AVs out of reach for many vehicle buyers. At the same time, some automakers may choose to sacrifice safety for a lower price point, confusing consumers about how to choose a safe self-driving car and exacerbating the perception that AVs are dangerous.

Under pressure from autonomous vehicle developers to give them room to work, the federal government seems to have agreed to take a laissez-faire stance. A bill passed by the House of Representatives and now before the Senate would replace the patchwork of state rules that generally permit limited testing of AVs on certain roads (although some states have still not approved the use of these vehicles). In its place would be omnibus legislation giving NHTSA the power to regulate AV design, construction, and performance, while states handle only vehicle registration and licensing. And perhaps most importantly, the bill would allow NHTSA to grant AV makers exemptions to put on U.S. roads as many as 100,000 vehicles annually (phased in over three years) that do not meet current federal motor vehicle safety standards.

With such freewheeling language in the pending law, Mobileye may face an uphill climb in convincing the AV industry to tackle safety and accident blame in any concerted way. Which means that we may find out, sooner than we would like, whether the developers playing God by programming self-driving vehicles’ decision-making are as infallible as they believe they can make their machines.
