Self-driving vehicles under development today may not be capable of avoiding some of the common driver errors that lead to car accidents, a recent study says. Self-driving vehicles may eliminate only about a third of car crashes, far short of the goal of making car accidents a thing of the past.
The study from the Insurance Institute for Highway Safety (IIHS) found that driver error is the leading cause in more than 9 out of every 10 crashes. While automated cars can be programmed to identify traffic hazards, other types of decision-making and performance errors are harder to design out. “For self-driving vehicles to live up to their promise of eliminating most crashes, they will have to be designed to focus on safety rather than rider preference when those two are at odds,” the IIHS study says. In other words, to eliminate most car accidents, autonomous vehicles would need to be specifically programmed to prioritize safety over riders’ preferences for speed and convenience.
Avoiding Common Causes of Car Accidents Through Automation
The IIHS study provides a close look at the types of human error behind about 94 percent of all car crashes, the same kinds of crashes our car accident attorneys deal with every day.
IIHS researchers examined more than 5,000 police-reported crashes from the National Motor Vehicle Crash Causation Survey and separated the driver errors that contributed to the crashes into five categories:
- “Sensing and perceiving” errors, including driver distraction, impeded visibility, and failing to recognize hazards until it was too late.
- “Predicting” errors, including misjudging a gap in traffic, incorrectly estimating how fast another vehicle was approaching or making an incorrect assumption about what another driver was going to do.
- “Planning and deciding” errors, including driving too fast or too slow for road conditions, driving aggressively, or leaving too little following distance from the vehicle ahead (tailgating).
- “Execution and performance” errors, including inadequate or incorrect evasive maneuvers, overcompensation, and other mistakes in controlling the vehicle.
- “Incapacitation,” including impairment due to alcohol or drug use, medical problems, or falling asleep at the wheel.
Incapacitation obviously would not be a problem for self-driving cars. Autonomous vehicles could also prevent crashes caused exclusively by perception errors, because the cameras and sensors of a fully autonomous vehicle should identify potential hazards better than a human driver can. Incapacitation was a factor in about 10 percent of the crashes studied and sensing and perceiving errors in about 24 percent, so those two categories, roughly a third of crashes, could be eliminated if all vehicles on the road were self-driving, so long as sensors worked perfectly and other automotive systems never malfunctioned.
The report refers to a much-publicized crash involving an automated Uber test vehicle in Tempe, Arizona, in 2018. In that fatal accident, the automated car detected 49-year-old Elaine Herzberg beside the road but could not predict that she would attempt to cross the road in front of the oncoming vehicle. When she did, it failed to execute the correct evasive maneuver to avoid striking her.
Jessica Cicchino, the IIHS vice president of research and co-author of the study, suggested another scenario in a discussion of the study with Autoblog. If a cyclist or another vehicle suddenly veers into the path of an autonomous vehicle, the automated car may not be able to stop fast enough or steer out of the way in time, Cicchino said.
“Autonomous vehicles need to not only perceive the world around them perfectly, but they also need to respond to what’s around them as well,” she said.
Self-driving vehicles could be programmed to obey all traffic laws, such as speed limits. But if artificial intelligence allows them to drive and react more like human drivers do, then fewer crashes will be prevented, Cicchino said.
“Building self-driving cars that drive as well as people do is a big challenge in itself,” Alexandra Mueller, IIHS research scientist and lead author of the study, said in the report. “But they’d actually need to be better than that to deliver on the promises we’ve all heard.”
When Will Automated Cars Be Mainstream?
Most car companies are betting that self-driving cars are inevitable and are investing billions of dollars in self-driving vehicle initiatives, according to a March 2020 report from Emerj, an artificial intelligence research and advisory company.
SAE International, a recognized engineering standards arbiter, defines six levels of driving automation, ranging from no driving automation (level 0) to full driving automation (level 5) in the context of motor vehicles and their operation on roadways.
What technology experts and car manufacturers are promising is either cars that drive themselves for a large part of a person’s highway commute (level 3) or cars that can drive themselves almost entirely within a covered metropolitan area (level 4), Emerj says. At level 3, the human must remain alert and ready to assume control of the vehicle; at level 4, the vehicle is designed to handle all driving within its defined operating area.
After reviewing what 10 U.S. automakers are doing to develop self-driving vehicles, Emerj concludes that manufacturers are walking back predictions of having autonomous vehicles available and on the road by the early 2020s.
In most industries, the use of artificial intelligence (AI) and machine learning (ML) has not advanced much beyond digitizing paper documents and searching digital databases.
In addition to the technological problems to be conquered, autonomous vehicles can only advance in a welcoming regulatory environment. There are serious product liability concerns, the Emerj report says, pointing out that self-driving cars have already been involved in the deaths of five people since 2016.
A 2014 Brookings Institution paper cited by Emerj argues that it would be a mistake to let liability concerns slow or prevent consumer access to advanced autonomous vehicle technology. It says:
The legal precedents established over the last half a century of product liability litigation will provide manufacturers of autonomous vehicle technology with a very strong set of incentives to make their products as safe as possible. In the overwhelming majority of cases, they will succeed. However, despite these efforts, there will inevitably be some accidents attributable in whole or in part to defects in future vehicle automation systems. While this will raise complex new liability questions, there is no reason to expect that the legal system will be unable to resolve them.
In short, the liability concerns raised by vehicle automation are legitimate and important. But they can be addressed without delaying consumer access to the many benefits that autonomous vehicles will provide.
If you have been injured in a car collision caused by another driver in Raleigh or elsewhere in the Triangle, contact an experienced car accident lawyer at Younce, Vtipil, Baznik & Banks to evaluate your legal options. The consultation is free.