A non-profit consumer advocacy group yesterday called on the California Department of Motor Vehicles (DMV) to keep robotic cars off the road until the federal government enacts "enforceable" safety standards.
The Santa Monica-based, non-partisan consumer rights group said Tesla has been irresponsible in deploying its current semi-autonomous Autopilot vehicle technology.
"Tesla used humans as guinea pigs, killing at least two, while hyping its 'autopilot' technology. The proposed regulation would stop such abuses," John Simpson, Consumer Watchdog's Privacy Project director, said in a statement.
In January, a 23-year-old driver in China died after borrowing his father's Tesla Model S; the sedan slammed into a street-sweeping truck in the far left lane of a highway. The father of the victim is suing Tesla, saying Autopilot was engaged at the time of the crash.
Tesla said it is investigating the fatal crash, but added that, because of the extent of the damage the vehicle sustained, it has no way of knowing whether Autopilot was engaged.
In July, the National Transportation Safety Board (NTSB) released a preliminary report that detailed the circumstances of a fatal accident involving a Tesla Model S driving with its Autopilot engaged.
The accident, which took place May 7 in Williston, Fla., was the first known fatal crash involving a vehicle using an advanced driver assistance system (ADAS) based on computer software, sensors, cameras and radar. The all-electric Tesla hit the side of an 18-wheeler that turned left in front of it. The impact sheared away the roof of the Model S and killed Joshua Brown, 40, of Canton, Ohio.
The NTSB report stated that Tesla system performance data downloaded from the car indicated that the Model S' speed just before impact was 74 mph, nine miles an hour over the speed limit on the four-lane, divided highway. The Autopilot's Autosteer lane-keeping assistance and traffic-aware cruise control system was engaged at the time of the crash.
Tesla has repeatedly warned drivers that they must keep their hands on the steering wheel and remain alert at all times, even when the current iteration of Autopilot is engaged.
The new hardware announced for upcoming vehicle models, however, would enable fully autonomous vehicles without steering wheels to hit the roads within two years, Musk said.
Musk has been adamant that self-driving cars will be far more capable of avoiding accidents than human drivers because of vastly faster reaction capabilities; yesterday, he called on the news media to stop creating an atmosphere of fear around the technology.
"Because -- and you need to think carefully about this -- because if, in writing some article that's negative, you effectively dissuade people from using an autonomous vehicle, you're killing people," Musk said during a press and analyst call yesterday.
In 2013, the National Highway Traffic Safety Administration (NHTSA) released guidance for automakers developing vehicles with self-driving capabilities.
The NHTSA guidelines don't carry the force of law, but suggest autonomous vehicle testing data be recorded and shared with the appropriate agencies, "especially when things go wrong." They also state that a vehicle's process of switching from automated driving to human-controlled driving should be smooth and safe, and that none of the autonomous technologies used on self-driving cars interfere with or override federally mandated equipment.
Carmen Balber, executive director of Consumer Watchdog, said in a statement that current proposed DMV rules offer "absolutely no safety performance standards."
"The proposed DMV rules would let robot cars without a driver operate on our roads if the manufacturer simply answers Yes, No or Maybe to each point on NHTSA's 15-point safety checklist," Balber said. "We need more than a safety checklist written on toilet paper before we are sure driverless vehicles are safe to operate on public roads in California. That's why we're calling on the DMV to hold off until federal regulators enact enforceable safety standards for driverless cars."
Last year, California's DMV proposed new rules that would require a licensed driver to be in a self-driving vehicle and for that vehicle to have a steering wheel.
Earlier this month, however, the DMV issued a draft revision to its regulations that would allow self-driving cars without human drivers as long as "federal officials deem them safe enough."
Another key requirement of that regulation is that manufacturers report all crashes involving their robot cars. They must submit an annual "disengagement" report detailing all the times that the autonomous technology failed.
"Google, for example, reported its technology failed 341 times. There were 272 times that the software turned over control to the driver and 69 times when the driver felt compelled to override the robot system," Consumer Watchdog stated.
Consumer Watchdog said the regulations should be tweaked to require disengagement reports on a quarterly basis, to make public the video and technical data associated with any crash, and to have police investigate all crashes.
Last month, NHTSA issued its 112-page "Federal Automated Vehicles Policy" that includes a 15-point safety assessment carmakers are asked to complete. The assessment has no standards, but merely asks manufacturers to describe how they deal with such issues as where the robot car is supposed to operate, perception and response functionality of the robot technology, privacy, cybersecurity and ethical issues.
Responding to the 15-point safety assessment is completely voluntary, Consumer Watchdog said, though NHTSA said it plans a rulemaking to require a response.
NHTSA said it is relying on carmakers and tech companies such as Apple and Google to be honest when submitting their safety assessments.
"Typically, we would say a car has to meet X standard a certain way," said U.S. Secretary of Transportation Anthony Foxx. "We recognize that there are going to be different types of innovation that will come to us, and we intend to evaluate each of those on its own merits."