The riskiest thing about self-driving vehicles may turn out to be the human drivers of other vehicles.
Four of the nearly 50 self-driving cars undergoing tests on California roads since September, when the state began issuing permits to auto companies, have been involved in crashes.
But the cars, three owned by Google and one by Delphi, were in collisions caused by human error.
Driver inattention was behind the collisions involving the Google cars, said Katelin Jabbari, a spokeswoman for the tech giant, which is developing a fleet of autonomous vehicles.
The Delphi car was involved in its crash in October while waiting to turn left at a light. Another car crossed a median and struck it, company officials said.
Despite the mishaps, self-piloted vehicles hold the promise of improved safety, said Xavier Mosquet, who heads Boston Consulting Group's automotive practice in North America.
"These cars are prototypes and experiments. You can't yet derive long-term conclusions," Mosquet said.
But so-called active safety systems, which serve as the building blocks for automated driving, are already being built into cars and are making roads safer, he said.
Some of the systems, such as sensors that alert drivers to a potential crash and slam on the brakes, are reducing injury and property insurance claims, according to the Insurance Institute for Highway Safety.
"If you can help the driver make the right decision, it is helpful," Mosquet said. "I think the improvement will be real."
It's a mistake to draw conclusions about self-driving vehicles from the recent crashes, said Bryant Walker Smith, who is both a law and engineering professor at the University of South Carolina.
"I am not surprised that autonomous vehicles were hit," Smith said. "Any vehicle out on the road long enough will be in a crash."
Google said its automated cars have driven nearly 1 million miles on autopilot and are now averaging around 10,000 self-driven miles a week, mostly on city streets. The cars have traveled another 700,000 miles with humans at the helm.
"Over the 6 years since we started the project, we've been involved in 11 minor accidents during those 1.7 million miles of autonomous and manual driving with our safety drivers behind the wheel, and not once was the self-driving car the cause of the accident," Google said in its blog Monday.
Smith said it all sounds routine, given the state of human driving.
"These were not catastrophic, high-profile crashes that would be of particular alarm, for example when that vehicle does something that a human would not do such as speeding up and not stopping," Smith said.
With humans at the wheel, more than 30,000 people die annually in auto collisions in the U.S.
"That's a staggering number," Smith said. "People will accept an unacceptable status quo and be concerned about the things that are new."
Google and Delphi are two of seven companies in the state that have obtained permits for self-driving cars. There are 48 vehicles approved for testing and 269 people permitted to control them, said Jessica Gonzalez, a spokeswoman with the Department of Motor Vehicles.
Still, the involvement of autonomous vehicles in collisions, and the lack of public reporting of the incidents, raises red flags, consumer groups said.
"It is important that the public know what happened," John M. Simpson, the privacy project director for Santa Monica-based Consumer Watchdog, wrote in a letter to Google. "You are testing driverless vehicles on public highways, quite possibly putting other drivers at risk."
Consumer Watchdog wants Google to release current and future collision reports involving its driverless cars, he said. His organization learned of the crashes involving Google's robotic cars after it filed a Public Records Act request with the DMV.
The DMV told Consumer Watchdog that self-driving vehicle accident reports are confidential and declined to release them. However, the Associated Press reported details about the four crashes Monday.
Rosemary Shahan, president of Consumers for Auto Reliability and Safety in Sacramento, also expressed concern about driverless cars. But she said news of four incidents didn't mean much.
"It is a little early to conclude anything based on such tiny numbers," Shahan said, adding that the technology still faces challenges.
She noted that even the developers of autonomous cars have said they aren't yet ready for driving in inclement weather, including snow, fog and heavy rain – situations California drivers confront.
Moreover, the idea that a passenger can suddenly take control of the vehicle if needed is questionable, Shahan said. "You can't make an emergency maneuver if you are watching stock quotes or a ball game."
Regardless of these issues, traffic safety officials and automakers see self-driving cars as the future.
By 2025, as many as 250,000 self-driving vehicles could be sold each year globally, and that number could swell to 11.8 million a decade later, according to a January study by IHS Automotive, an industry research firm.
"It is no longer question of if – but when – autonomous vehicles will hit the road," Mosquet said.
Already, vehicles with varying levels of self-driving capability – including single-lane highway driving, valet parking and maneuvering in traffic jams – are starting to reach dealer showrooms this year and next, he said.
Tesla Motors plans to update the software on many of its cars to enable the luxury electric sedans to steer, brake and even find parking places all by themselves.
One new model planned for Cadillac will have a "Super Cruise" setting that lets drivers switch into a semi-automated mode in which the car automatically stays in its lane, making necessary steering adjustments, and autonomously brakes and controls speed to maintain a safe distance from other vehicles.
These functions, and maybe even full self-driving vehicles, are legal in much of the country, Smith said.
For now, just four states – California, Florida, Michigan and Nevada – as well as Washington, D.C., regulate the research and development testing of automated vehicles, Smith said.
But Smith said the legal presumption is that unless something has been specifically prohibited by law, it is permitted.
Ironically, he said, that means that Michigan – one of the states that has passed regulations on self-driving vehicles – is actually more restrictive than states that have not considered the issue.