The death of Elaine Herzberg, who was struck by a Volvo XC90 Uber car in self-driving mode, has launched frenzied theorizing throughout the tech and automotive industries. Intel took its turn this week through Mobileye, the car-sensor subsidiary it acquired in 2017. In a blog post, Mobileye's CEO and CTO, Amnon Shashua, says that the self-driving industry relies too heavily on emerging technologies:
“Recent developments in artificial intelligence, like deep neural networks, have led many to believe that it is now easy to develop a highly accurate object detection system and that the decade-plus experience of incumbent computer vision experts should be discounted.”
Computer vision, an older form of image recognition, requires painstaking annotation of video streams. It has powered advanced driver assistance systems (ADAS) such as automatic emergency braking and lane-keeping support. Shashua argues that computer vision should serve as a backstop for the newer deep-learning technologies, in which the AI learns to identify objects by studying many, many examples.
Moreover, it’s far from clear whether the crash happened because the deep-learning AI wasn’t up to the task or simply because engineers botched the implementation. The car’s lidar (laser scanner) and radar should have easily spotted Herzberg as some kind of obstacle long before she was visible on video.
Avoiding this accident wasn’t a “hard problem,” the term engineers use for big technical challenges. The incident could have been a celebrated example of self-driving technology exceeding human ability to avoid a crash. Instead, it’s a dumbfounding example of failure.