California Asks: How Should We Regulate Driverless Cars?
Google Inc. has tested its driverless cars on California roads for a few years now and expects to make them available to consumers by 2020. With more than 400,000 miles clocked, there has been only one accident. Yet even at a rate of one accident per 400,000 miles, if every California driver rode in an autonomous vehicle, there could be more than 800,000 car accidents every year.*
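The 800,000 figure follows from simple arithmetic using the numbers cited in the footnote; a quick sketch of the calculation (the variable names are illustrative, the figures are the article's):

```python
# Figures cited in the article's footnote:
miles_per_driver = 13_476        # average annual miles per driver (FHWA)
california_drivers = 23_900_000  # licensed drivers in California (DMV)
miles_per_accident = 400_000     # observed rate: one accident per 400,000 miles

total_miles = miles_per_driver * california_drivers
accidents_per_year = total_miles / miles_per_accident
print(round(accidents_per_year))  # prints 805191, i.e. "more than 800,000"
```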
While that may sound like a lot, it is worth comparing it to human error. Consulting firm McKinsey & Company studied how the technology's rate of error would compare with the human rate of error over the next 50 years. Its finding: driverless cars could reduce car accidents by 90% every year, saving 28,000 lives annually. They could cut the number of accidents on our roads by eliminating drunk driving, drowsy driving, distracted driving, and other forms of negligence.
So how should the DMV regulate autonomous cars, and who should be held accountable when they cause accidents? Proposed regulations would make the licensed driver of an autonomous car liable for any accident caused by the technology. That still leaves open the question of who is responsible when an accident stems from poor weather or poor highway conditions rather than from the driver or the system.
Reasonable courts would still be able to hold technology companies accountable for faults in their systems, but driverless cars raise the question of what degree of error is "permissible." Future technology could save lives and avert car accidents, yet California must also consider the safety of drivers today.
Recording Devices On Self-Driving Cars
Estimates vary greatly on when, or if, we can expect self-driving automobiles to be a common sight on American roads. If they do become a legitimate option, there is some debate about how these vehicles should record what they do, or whether they should record it at all. Piecing together an accurate picture of the circumstances surrounding serious car accidents has always been difficult.
In fatal accidents where there are no impartial witnesses to provide testimony, the work required to re-create the accident is substantial. Accidents involving self-driving vehicles might be even harder to reconstruct, because any passengers in the vehicle might not know what took place leading up to the crash. The chairman of the National Transportation Safety Board is on record supporting data recording capabilities in self-driving vehicles.
Some critics are concerned about the privacy risks of expanded data recording. The Alliance of Automobile Manufacturers conducted a poll in 2013 indicating that nearly 75% of respondents were concerned about driverless car makers using the devices to record personal data. An additional 70% expressed concern that the data collected would be accessible by the government.
Driverless cars must be able to gather data about the conditions around them. So, in addition to providing data about their own accidents, they might be used to supply information about accidents within range of their data collection devices. Such concerns led the Consumer Watchdog group to seek a total ban on data collection by autonomous cars when Google initially requested permission to test them on California roads. The issue will only grow as the functionality of autonomous vehicles improves.
UPDATED JANUARY 2016:
As of December 2015, the California Department of Motor Vehicles has solidified its policy regarding driverless cars. It is proposing regulations that would require every driverless car to have a licensed driver at the ready to take control should the system encounter a situation it cannot handle. While that sounds like a reasonable request, there is reason to believe it is an essentially meaningless requirement, one that defeats much of the advantage driverless cars were supposed to offer in the first place.
Google, the Wall Street Journal, Forbes, and other tech industry voices are protesting the DMV's proposed regulations as unconstructive, meaningless, and cautious to the point of impeding the technology. Of course, this has long been the tension between innovators and state regulators: innovators are expected to be daring and bold, just as regulators are expected to be slow-moving and cautious.
The critics may have a point: if a driverless car is going to crash, having a licensed driver at the ready may not matter. Donald Norman, a noted usability design expert, has asserted that "decades of research" make it clear that human beings are not capable of sustained alertness and monitoring. More importantly, studies have shown that operators are unable to avert emergencies after long periods of monotony. Consider how distracted drivers are already; humans are not an effective backup.
Bloomberg View notes that even if a driver were to remain alert and capable of averting an emergency, long periods of not driving would erode the skills and reaction time they would need. The regulations also limit the value driverless cars might offer the elderly and the disabled, who may be incapable of traditional driving anyway.
* Drivers are on the road an average of 13,476 miles a year (Federal Highway Administration, "Average Annual Miles per Driver by Age Group"), and there are more than 23.9 million drivers in California (Department of Motor Vehicles, "Driver License and Identification Card Statistics").