The chief technology officer of the supplier whose vision technology underpins Tesla's semi-autonomous Autopilot system believes the carmaker is pushing the safety envelope too far.
"It is not designed to cover all possible crash situations in a safe manner," Amnon Shashua, CTO and executive chairman at Israel-based Mobileye NV, told Reuters Wednesday.
Shashua's comments came the same day a second fatal accident was revealed through a lawsuit filed against Tesla by the father of a man who was allegedly driving a Model S with Autopilot engaged.
The father of the 23-year-old Tesla driver who died in the crash filed the lawsuit in July in a Beijing court. The suit alleges that the accident, which occurred in January in the northern province of Hebei, was the fault of Tesla's advanced driver assistance system (ADAS), according to Chinese state broadcaster CCTV.
Tesla confirmed this week it is investigating the fatal crash, but said it has no way of knowing if Autopilot was engaged due to the amount of damage the vehicle sustained. The driver, Gao Yaning, had borrowed his father's Model S.
China's CCTV published dashcam footage from the vehicle showing it slamming into a street-sweeping truck in the far left lane of a highway.
Mobileye makes machine-vision chips and software that process images received from vehicle cameras. The technology is critical in enabling systems such as Autopilot.
In July, Mobileye announced it was ending its relationship with Tesla after a high-profile Model S fatality that occurred while Autopilot was engaged. The vehicle slammed into the side of an 18-wheel semi-tractor trailer truck that turned left in front of it on a divided highway.
The National Transportation Safety Board (NTSB), which is conducting an investigation of the May 7 accident, released a preliminary report that revealed the Model S was speeding and that Autopilot's Autosteer lane-keeping assistance and traffic-aware cruise control were engaged at the time of the crash.
Tesla, which has been adamant that drivers using Autopilot must keep their hands on the steering wheel and eyes on the road, said that neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brakes were not applied.
Autopilot uses both radar and camera vision systems to discern objects in the path of a moving vehicle. Since Autopilot rolled out in 2014, however, Tesla drivers have published many videos of themselves fully disengaged from driving, leaving the ADAS technology in complete control.
Yesterday, Shashua clarified why the company ended its relationship with Tesla.
"No matter how you spin it, (Autopilot) is not designed for that. It is a driver assistance system and not a driverless system," Shashua said.
On Sunday, Tesla announced refinements to its Autopilot software. Among "dozens of small refinements," Autopilot Version 8 includes a more advanced signal processing algorithm "to create a picture of the world using the onboard radar."
In a blog post, Tesla described its radar, which ships on all Tesla vehicles as part of the Autopilot hardware suite, as a supplementary sensor to the primary camera and image processing system.
"After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition," the company stated. "The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar."
Alex Lidow, founder and CEO of Efficient Power Conversion (EPC), said that while radar may be useful in semi-autonomous and fully autonomous vehicles, LiDAR (Light Detection and Ranging) technology is far more accurate. LiDAR uses multiple lasers to create 3D scans of objects around a vehicle.
EPC makes gallium nitride (GaN) semiconductors that the company claims switch hundreds of times faster than traditional silicon chips. LiDAR manufacturers use EPC's chips because that speed lets LiDAR systems paint a much clearer picture of objects in the road.
Radar, Lidow said, produces too low-resolution a picture for ADAS to accurately discern all the objects in the path of a moving car, so cameras are also needed as part of a meshed "vision system."
"Radar penetrates flesh, so human beings, small kids and dogs appear semi-transparent to it. So in order to understand what's around with radar, you also need cameras," Lidow said. "Also, remember it's not a direct view, but an interpretation of what's around you. And, that can be interpreted incorrectly."
The interpretation being performed by radar and cameras also takes time to be processed by a vehicle's computers, which means there's latency between when an object appears before a vehicle and when the vehicle recognizes it.
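The trade-off Lidow describes can be illustrated with a toy decision function. This is purely an illustrative sketch, not Tesla's actual Autopilot logic; the function name, parameters, and gating rule are all invented to show why requiring camera confirmation of a radar return changes what the car acts on:

```python
# Toy sketch of sensor-fusion gating -- illustrative only, not Tesla's
# actual Autopilot logic. All names and rules here are hypothetical.

def should_brake(radar_detects: bool,
                 camera_confirms: bool,
                 camera_required: bool) -> bool:
    """Decide whether to apply emergency braking.

    camera_required=True models a design in which a radar return must
    be visually confirmed before the car brakes; camera_required=False
    models radar acting as a primary control sensor.
    """
    if not radar_detects:
        return False
    if camera_required:
        return camera_confirms  # radar alone is not trusted to act
    return True  # radar return is acted on directly

# A radar hit on an obstacle the camera misses (e.g. a white trailer
# against a bright sky) is ignored when confirmation is required:
print(should_brake(True, False, camera_required=True))   # False
print(should_brake(True, False, camera_required=False))  # True
```

The sketch also shows the flip side of Tesla's change: acting on radar alone removes the camera-confirmation delay but puts more weight on correctly interpreting a sensor to which wood and painted plastic look nearly transparent.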
Lidow said there's simply no contest when it comes to comparing radar and lasers for autonomous driving systems.
LiDAR uses up to 64 lasers, each sending out pulses of 30,000 photons; as few as two photons returning within 10 nanoseconds (a nanosecond is a billionth of a second) are enough to determine whether an object is in the path of a moving vehicle.
"For example, if a pulse returns in 10 nanoseconds, that means the object is 5 feet in front of you," Lidow said. "Pulses are 4 nanoseconds wide, so each laser beam can send 250 million pulses a second, and with 64 lasers that means the car is receiving billions of pixels a second to form an image. That high resolution 3D image is unambiguous and not open to interpretation."
Until recently, LiDAR systems have been too expensive for mass production, costing as much as $75,000, or roughly twice the cost of the vehicle itself.
Tesla CEO Elon Musk has gone on record saying he doesn't think his all-electric vehicles need LiDAR, and argues that passive forward-looking radar can accomplish ADAS functions.
"I think that completely solves it without the use of LIDAR. I'm not a big fan of LIDAR, I don't think it makes sense in this context," Musk said last year.
Lidow, however, believes LiDAR on semi-autonomous and autonomous vehicles will eventually be as common as anti-lock brakes or airbags in vehicles today.
This month, MIT and the Defense Advanced Research Projects Agency (DARPA) were able to pack the complex array of LiDAR sensors onto a single silicon chip, meaning the most expensive part of the technology could be fabricated in common commercial CMOS foundries; that would reduce the cost of a LiDAR system by an order of magnitude.
Quanergy Systems, Velodyne LiDAR and Israeli startup Innoviz have already promised that by 2018 they will have sub-$250 LiDAR systems based on a single solid-state chip. Startup Scanse recently ran a Kickstarter campaign and raised more than $272,000 toward bringing a $250 LiDAR unit to market.
Ford recently announced it is using LiDAR on Fusion sedan pilot vehicles making use of semi-autonomous driving technology. Those pilot vehicles are now part of a partnership with Uber that recently launched in Pittsburgh and has autonomous vehicles picking up passengers throughout the city.
"I think we're probably five to eight years away from a highly integrated LiDAR system," Lidow said. "It's a Catch-22. Integration makes sense when you have high volume, and high volume happens when you have low cost. And only high integration leads to low cost."