In the wake of Uber's fatal self-driving car crash in Tempe, Arizona, suppliers of core autonomous technology are struggling to address a cloud of doubt surrounding sensors and software that might have seemed ready for public introduction.
Their response has been both a frank acknowledgment that many systems under development haven't been refined enough for wider deployment, and rising concern that similar test failures could derail the progress that has been achieved and undermine faith in the technology among consumers and regulators.
More Uber-like incidents "could do further harm to already fragile consumer trust and spur reactive regulation that could stifle this important work," wrote Amnon Shashua, a senior vice president at Intel and chief technology officer of Mobileye, its self-driving technology subsidiary, in a blog post discussing the crash.
Nvidia, the chipmaker whose computing platform underlies several companies' self-driving systems, paused testing in the days following the crash. "Safety is the single most important thing," CEO Jensen Huang said at the company's developer conference last week. "We're also working on the hardest computing problem."
Huang emphasized that Uber doesn't use Nvidia's Drive PX computer platform for self-driving cars.
Experts say the shortcomings of autonomous hardware such as radar, lidar and cameras go beyond failures specific to Uber's system. And without a set of standards companies can adhere to, they say, it's unclear how regulators will verify that self-driving tests are safe for public roads in the future.
Video footage taken from the vehicle indicates that Uber's sensor suite of cameras, radar and lidar, a laser-based sensor, failed to detect and react to 49-year-old Elaine Herzberg, who was crossing the road and died of her injuries after being struck by the modified Volvo XC90. Some experts have faulted Uber for reducing the number of sensors on its current Volvo test vehicles.
But even before the crash, engineers were expressing caution regarding the performance of commercially available sensor sets.
"The sensor suite is not satisfactory," Jin Woo Lee, vice president of the Intelligent Safety and Technology Center at Hyundai Motor Group, said in a January interview, citing the limited field of view and range of off-the-shelf sensors such as lidar and radar.
Kobi Marenko, CEO of Arbe Robotics, an Israeli sensor startup, said current sensors installed on vehicles for testing were designed for more limited uses, such as adaptive cruise control. That hardware, he said, "is not mature for prime time."
While the components are useful on the test vehicles for gathering data or improving autonomous guidance and object detection algorithms, Marenko said, they shouldn't be relied on for full self-driving capabilities. Among the shortcomings he cited:
Radar resolution and field of view that limit how many objects a vehicle can detect and make it difficult to distinguish smaller objects positioned between larger ones.
The range and refresh rate of lidar sensors, which can create gaps in the images the laser captures.
Persistent issues with rain and fog that limit sensor performance.
Budding supply chain
Those limitations are exacerbated by the lack of standard specifications from automakers for hardware suppliers to adhere to during research and development.
"Everybody is doing what everybody wants," Marenko said. "Everybody has their own view of what is the minimum tech for driving."