Are self-driving cars safe enough to hit the road on their own? Not by a long shot, according to a recent study.
In fact, researchers say that to prove such vehicles are truly safe enough to share the road with human drivers, they would need to be driven "hundreds of millions of miles" and "sometimes hundreds of billions of miles."
The study was conducted by the Rand Corporation. Using traditional safety testing models, in which vehicles are driven on actual roads or closed safety courses, it would take manufacturers of self-driving cars years of testing to determine if such vehicles are as safe or safer than human drivers.
Specifically, human drivers currently cause about one fatality for every 100 million miles driven, the Rand Corporation reports. As a result, to demonstrate with 95 percent confidence that self-driving cars are just as safe, the Rand Corporation says such vehicles would have to operate on the road for 275 million miles without a fatality.
To achieve such results, 100 autonomous vehicles would need to be on the road 24 hours a day, 7 days a week, for 12 years. That timeline could be cut to roughly six weeks if 10,000 vehicles conducted similar tests. Yet even a 100-vehicle fleet would far exceed the number of autonomous vehicles currently conducting such tests, according to the Rand Corporation report.
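The arithmetic behind these figures can be sketched with a simple Poisson model: if fatalities occur at a fixed rate per mile, the chance of seeing zero fatalities in n miles is e^(-rate × n), so driving enough failure-free miles to push that probability below 5 percent demonstrates the claimed rate with 95 percent confidence. A minimal sketch follows; the 1.09-fatalities-per-100-million-miles rate (the article rounds this to 1) and the 25 mph average fleet speed are assumptions drawn from the published Rand study, not figures stated in this article.

```python
import math

# Assumed human-driver fatality rate (the Rand study's figure;
# the article rounds it to 1 per 100 million miles).
FATALITIES_PER_MILE = 1.09 / 100_000_000

def failure_free_miles(rate, confidence=0.95):
    """Miles that must be driven with zero fatalities to show, at the
    given confidence level, that the true fatality rate is no worse
    than `rate`. Poisson model: P(0 events in n miles) = e^(-rate*n),
    so we solve e^(-rate*n) <= 1 - confidence for n."""
    return -math.log(1 - confidence) / rate

def fleet_years(miles, vehicles, avg_speed_mph=25):
    """Calendar years for a fleet driving 24/7 at an assumed average
    speed (25 mph is an illustrative assumption)."""
    miles_per_year = vehicles * avg_speed_mph * 24 * 365
    return miles / miles_per_year

miles = failure_free_miles(FATALITIES_PER_MILE)
print(f"Miles required: {miles / 1e6:.0f} million")            # ~275 million
print(f"100-vehicle fleet: {fleet_years(miles, 100):.1f} years")
print(f"10,000-vehicle fleet: {fleet_years(miles, 10_000) * 52:.1f} weeks")
```

With these assumptions the model reproduces the article's numbers: about 275 million failure-free miles, roughly 12 years for a 100-vehicle fleet, and about six and a half weeks for 10,000 vehicles.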
Do autonomous vehicles cause accidents?
Under current testing models, it will be "nearly impossible for autonomous vehicles to log enough test-driving miles on the road to statistically demonstrate their safety when compared to the rate at which injuries and fatalities occur in human-controlled cars and trucks," Rand senior scientist Nidhi Kalra, a co-author of the study, said in an article published by CNBC News.
"Our results show that developers of this technology and third-party testers cannot drive their way to safety," Kalra added in her CNBC News interview.
An accident earlier this year, reportedly caused by a self-driving car, suggests that such vehicles might not be perfect. On Feb. 14, a Google self-driving car crashed into a bus in California. According to numerous news outlets, including The Guardian, it was the first accident in which a self-driving car caused a crash involving another vehicle. No one was injured in the accident.
Who's responsible for self-driving car accidents?
This is the question many people likely have on their minds. If a driverless car crashes into another vehicle and injures someone, who would be responsible for the crash? The person or company that owns the vehicle? The vehicle's manufacturer? Someone else?
The most logical answer would seem to be the vehicle's manufacturer, according to auto accident attorney Robert S. Marcus of the Marcus & Mack law firm. If that's the case, the person injured in such an accident would likely have a legitimate product liability claim.
But what if the self-driving car wasn't properly maintained by the owner of the vehicle? What if the tires were bald or necessary repairs weren't made? What if road conditions (such as potholes) or bad weather (snow, ice, etc.) were contributing factors? Suddenly, who's at fault can be extremely confusing.
More testing clearly needs to be done before self-driving vehicles can hit the road on their own. There's simply no room for error when people's lives are at stake.