
NHTSA Wrong To Say Computers Count As Drivers Of Robot Cars



SANTA MONICA, Calif., Feb. 10, 2016 -- The National Highway Traffic Safety Administration is wrong to say the artificial intelligence guiding an autonomous robot car counts as the driver, Consumer Watchdog said today, adding that Google's own test data demonstrates the need for a human driver who can take control when necessary.

"Google says its robot technology failed and handed over control to a human test driver 272 times and the driver was scared enough to take control 69 times," said John M. Simpson, Consumer Watchdog's Privacy Project Director. "The robot cars simply cannot reliably deal with everyday real traffic situations. Without a driver, who do you call when the robots fail?"

Consumer Watchdog reiterated its support for regulations proposed by the California Department of Motor Vehicles covering the general deployment of autonomous robot cars on the state's highways.

"The DMV would require a licensed driver behind the wheel," Simpson noted. "If you really care about the public's safety, that's the only way to go."

Commenting on NHTSA's interpretation that the robot technology can count as a driver, Secretary of Transportation Anthony Foxx said, "We are taking great care to embrace innovations that can boost safety and improve efficiency on our roadways. Our interpretation that the self-driving computer system of a car could, in fact, be a driver is significant. But the burden remains on self-driving car manufacturers to prove that their vehicles meet rigorous federal safety standards."

Consumer Watchdog said it will press NHTSA and the DOT to ensure that robot car manufacturers prove their cars are safe. The group also called on NHTSA to learn from California's experience with self-driving robot cars.

The companies' own data in reports filed with the California DMV makes clear that a human driver able to take control of the vehicle is necessary to ensure the safety of both robot vehicles and other vehicles on the road, Consumer Watchdog said.

Google, which logged 424,331 "self-driving" miles over the 15-month reporting period, said a human driver had to take over 341 times, an average of 22.7 times a month. The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times, according to its "disengagement report" filed with the DMV.

Other testing companies, driving far fewer autonomous miles than Google, also reported substantial numbers of disengagements to the DMV. Bosch reported 625 disengagements over 934.4 miles driven; Nissan, 106 over 1,485 miles; Mercedes-Benz, 1,031 over 1,738 miles; Delphi, 405 over 16,662 miles; and Volkswagen, 260 over 10,416 miles. Tesla claimed it had none, but did not say how many miles it drove.
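The comparison implied by these figures can be made explicit. As a rough illustration only (a minimal sketch using nothing but the numbers cited in this release, not the underlying DMV filings), the following Python snippet computes each company's disengagements per 1,000 autonomous miles, along with Google's monthly average over the 15-month reporting period:

    # Figures as cited above: (autonomous miles driven, total disengagements).
    # Tesla is omitted because it reported no mileage.
    reports = {
        "Google":        (424_331, 341),
        "Bosch":         (934.4, 625),
        "Nissan":        (1_485, 106),
        "Mercedes-Benz": (1_738, 1_031),
        "Delphi":        (16_662, 405),
        "Volkswagen":    (10_416, 260),
    }

    for company, (miles, disengagements) in reports.items():
        # Normalize to disengagements per 1,000 autonomous miles.
        rate = disengagements / miles * 1_000
        print(f"{company}: {rate:,.1f} disengagements per 1,000 miles")

    # Google's monthly average over the 15-month period (the 22.7 figure cited above).
    print(f"Google monthly average: {341 / 15:.1f}")

On that per-mile basis, Google's rate works out to well under one disengagement per 1,000 miles, while the companies with much smaller test programs reported rates ranging from roughly 24 to several hundred per 1,000 miles.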

It's important to understand that these "disengagements" were prompted by real situations that drivers routinely encounter on the road, Consumer Watchdog said. Among the reasons cited by Bosch were failures to detect traffic lights and heavy pedestrian traffic.

Google's robot technology quit 13 times because it could not handle the weather conditions. The driver took control 23 times because of reckless behavior by another driver, cyclist or pedestrian. The report said the robot car technology disengaged 119 times for a "perception discrepancy," which Google defines as occurring when the car's sensors do not correctly perceive an object, for instance overhanging branches. The robot technology was disengaged 55 times for "an unwanted maneuver of the vehicle," such as coming too close to a parked car. The human took over from Google's robot car three times because of road construction.

"What the disengagement reports show is that there are many everyday routine traffic situations with which the self-driving robot cars simply can't cope," said Simpson. "Self-driving vehicles simply aren't ready to safely manage many routine traffic situations without human intervention."

Read NHTSA's interpretation here: http://isearch.nhtsa.gov/files/Google%20--%20compiled%20response%20to%2012%20Nov%20%2015%20interp%20request%20--%204%20Feb%2016%20final.htm

Read Google's disengagement report here: http://www.consumerwatchdog.org/resources/cadmvdisengagereport-dec.2015.pdf

Visit our website at www.consumerwatchdog.org