Monday, May 18, 2015

You Probably Drive Worse Than Google’s Self-Driving Cars

Google’s self-driving cars have been involved in a few accidents since the company started testing them on public roads. But according to Google, not one of those accidents was actually caused by the driverless car. Instead, they were all caused by the biggest hazard on the road: the errors human drivers make when they’re distracted, impatient, or inattentive, or when they simply misjudge distance and speed. And it’s precisely that propensity for human error that will make the road to better safety — even with the help of driverless cars — a long and difficult one to navigate.
Chris Urmson, director of Google’s self-driving car program, writes on Backchannel that because about 33,000 people die on America’s roads every year, much of the enthusiasm for driverless cars has focused on their potential to reduce accident rates. Consequently, Google is “thinking a lot about how to measure our progress and our impact on road safety” and finding that it’s not so easy to quantify the safety performance of its self-driving cars.
In order for Google to figure out how its cars are performing, Urmson writes, it needs to establish a baseline of typical “accident activity” on suburban streets. That baseline is difficult to pin down because many incidents never make it into official statistics. The most common accidents involve light damage and no injuries, and while they often aren’t reported to the police, the National Highway Traffic Safety Administration (NHTSA) estimates that they account for 55% of all crashes.
That makes it difficult for Google to determine “how often we can expect to get hit by other drivers.” Urmson writes that with its fleet of more than 20 self-driving cars, which have logged 1.7 million miles, Google has learned that “Even when our software and sensors can detect a sticky situation and take action earlier and faster than an alert human driver, sometimes we won’t be able to overcome the realities of speed and distance; sometimes we’ll get hit just waiting for a light to change.”
Urmson writes that if you spend enough time on the road, accidents will happen whether you’re in a regular car or a self-driving car. In the six years since Google began its driverless car project, its cars have been involved in 11 minor accidents, and Urmson writes that “not once was the self-driving car the cause of the accident.” Rear-end crashes — the most frequent accident type in America — accounted for seven of Google’s accidents, and its cars have also been side-swiped and hit by a car rolling through a stop sign.
Urmson cites NHTSA statistics that driver error causes 94% of crashes, and writes that Google has identified patterns of driver behavior that it considers “leading indicators of significant collisions,” even though these behaviors don’t show up in official statistics. Distracted driving, lane drifting, running red lights, and impatient or just plain “crazy” turns are a few of the behaviors Google observed. And while a driverless car — with its 360-degree visibility, full attention, and array of sensors — can anticipate and detect these behaviors, it can’t unilaterally eliminate the safety hazards posed by other drivers.
Re/Code’s Mark Bergen reports that Urmson’s Backchannel post is part of Google’s defense against an Associated Press report that three of its driverless cars have been involved in accidents since September, when California first allowed self-driving cars on public roads. The report suggests that Google’s driverless cars were involved in property damage incidents at rates higher than the national average.
“The national rate for reported ‘property-damage-only crashes’ is about 0.3 per 100,000 miles driven, according to the National Highway Traffic Safety Administration,” AP’s Justin Pritchard writes. “Google’s 11 accidents over 1.7 million miles would work out to 0.6 per 100,000, but as company officials noted, as many as 5 million minor accidents are not reported to authorities each year — so it is hard to gauge how typical this is.”
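Pritchard’s comparison is simple rate arithmetic: normalize both accident counts to a per-100,000-mile figure. A minimal Python sketch of that calculation (the variable names are illustrative, not drawn from AP or NHTSA):

```python
# A minimal sketch of the rate comparison described above; variable
# names are illustrative, not from AP or NHTSA.

google_accidents = 11        # minor accidents over the life of the project
google_miles = 1_700_000     # miles logged by Google's self-driving fleet

# Normalize to accidents per 100,000 miles driven.
google_rate = google_accidents / google_miles * 100_000
print(f"Google fleet: {google_rate:.1f} per 100,000 miles")  # -> 0.6

# NHTSA's national figure counts only *reported* property-damage-only
# crashes; with millions of minor accidents going unreported each year,
# the true national rate is almost certainly higher than this floor.
national_reported_rate = 0.3
print(f"National (reported): {national_reported_rate} per 100,000 miles")
```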
Urmson’s post highlights Google’s belief that comparing its accident figures with existing statistics provides an inaccurate picture of how well the driverless cars’ extensive safety features are performing. “Many minor accidents — the dings and fender benders — go unreported when only human-driven cars are involved,” Bergen explains. “Google wants them reported, in part, to prove the need for its autonomous systems. It may be a tricky balancing act for Google to weigh transparency with its self-driving project versus a public image of risky robotics.”
Pritchard notes that a major selling point of driverless cars being developed by Google and others is their safety. “Their cameras, radar and laser sensors provide a far more detailed understanding of their surroundings than humans have. Reaction times should be faster. Cars could be programmed to adjust if they sense a crash coming — move a few feet, tighten seat belts, honk the horn or flash lights at a distracted driver.”
But the top priority at this point in the development of self-driving cars is not to avoid the minor accidents caused by other drivers’ errors or inattention — factors that would be impossible to eliminate unless driverless cars were the only ones on the road. It is to avoid causing a serious accident that would damage consumers’ confidence in the technology for years to come. Consumers still need to be convinced that self-driving cars will actually keep them safer, and transparency about how the technology is performing along the way is key.
