Consumer watchdog warns: Self-driving cars are not safe to be deployed on public roads
Consumer Watchdog, the nonprofit organization dedicated to providing an effective voice for taxpayers and consumers, today warned the U.S. Senate that autonomous cars are not safe to be deployed on public roads. The watchdog based its warning on recent analysis of required reports from companies testing robot cars in California and called on senators to halt a bill that would allow robot cars on public roads. The safety issue is not limited to the self-driving cars themselves. According to Scientific American, even when self-driving cars are doing everything they’re supposed to, the drivers of nearby cars and trucks are still flawed, error-prone humans.
Back in September 2017, the House passed a bill called the SELF DRIVE Act. Now the Senate is considering its own robot car bill, the AV START Act, S. 1885, which was approved by the Commerce, Science, and Transportation Committee last year. However, Senator Dianne Feinstein, D-CA, placed a hold on the bill because she is concerned about the safety of robot cars and whether the technology is ready for public roads.
In an open letter, Privacy and Technology Project Director John M. Simpson and Consumer Advocate Sahiba Sindhu warned senators that the technology is not ready for safe deployment on public roads. “It would be a great threat to the public for the Senate to authorize the deployment of robot cars without protections requiring certification of the vehicles when testing shows the state of the technology imperils the public if a human driver cannot take over the car,” they wrote. The California reports revealed that the robot cars tested could not cope with some decisions humans make every day when they drive. Among the failures that required the human test driver to take control:
- GPS signal failure
- shorter-than-average yellow lights
- rapid fluctuations in street traffic
- sudden lane blockages
- cars parked incorrectly nearby
- hardware failure
- software failure
“We need to verify that self-driving cars can actually drive themselves before we put them on public roads. What makes a car self-driving other than an opinion of a car manufacturer interested in selling their product? Legislation must protect the public by designating standards that guarantee that new vehicles on the road can meet their purported capabilities,” said Simpson and Sindhu in their letter to the Senate.
Twenty companies released the only publicly available data about the state of robot car technology to the California Department of Motor Vehicles. The required “disengagement reports” released last week show so-called self-driving cars cannot go more than 5,596 miles in the best-case scenario without a human test driver taking over at the wheel. In most cases, the vehicles cannot travel more than a few hundred miles without needing human intervention, Consumer Watchdog noted.
Based on its analysis of the disengagement reports, the nonprofit, nonpartisan public interest group called on the Senate to halt the AV START Act:
“Consumer Watchdog calls on you to act to protect highway safety and halt the AV START Act, S. 1885, unless it is amended to require enforceable safety standards that apply specifically to autonomous technology. For now, given the state of the technology as indicated by developers themselves, any AV legislation should require a human driver behind a steering wheel capable of taking control.”
To that end, Consumer Watchdog called for “carefully crafted regulations, designated performance metrics, and a system of certification that guarantees the technology will not imperil the public if a human driver cannot take over the so-called ‘self-driving’ vehicle.” You can read Consumer Watchdog’s full letter to the Senate here.
The twenty companies with permits to test robot cars in California were required to file “disengagement reports” covering 2017, listing miles driven in autonomous mode and the number of times the robot technology failed. The reports were released last week. Nine of those companies, including Waymo (a subsidiary of Google’s parent company) and GM Cruise, offered specific data showing the reasons their robot technology failed.
Read their 2017 disengagement reports here.
The other companies that released specific data detailing reasons for the disengagements, including Nissan and Drive.ai, a technology startup partnered with Lyft, reported experiences consistent with Waymo’s and GM Cruise’s. Nissan said it tested five vehicles, logged 5,007 miles and had 24 disengagements. Meanwhile, Drive.ai had 151 disengagements in the 6,572 miles the company logged.