
Why Google’s Self-Driving Cars Are Considered ‘Too Polite’

Human motorists can't handle their do-goodery driving.

Before self-driving cars take over the transportation industry, they’ll need to live alongside human drivers for a while and earn our trust. Google is learning that, for all its cars’ efficient qualities, humans don’t drive well, and they drive even worse alongside autonomous cars. To prevent accidents, Google will have to make its robotic cars drive more like humans.

Self-driving cars don’t drive the way we expect them to: they don’t cut corners or edge forward slowly into junctions. They give corners a wide berth and stop abruptly, indirectly causing accidents. Chris Urmson, who heads Google’s self-driving car project, explained to The Wall Street Journal that the company’s cars are “a little more cautious than they need to be. We are trying to make them drive more humanistically.”

No Google car has been the direct cause of an accident in the 16 incidents its vehicles have been involved in over a 2-million-mile testing tour along the streets of Palo Alto, California; every one of those collisions was caused by a human driver.

Here’s a visualization of one of the incidents, generated from the self-driving car’s sensor data:

[Visualization not available]

Had the other car been a self-driving vehicle, the accident wouldn’t have occurred. Herein lies the issue: so long as human drivers are on the road, self-driving cars will need to conform to the etiquette of human driving. They’ll have to drive a little less efficiently, but doing so will win the acceptance of human motorists and pave the way for a future of autonomous vehicles.

Brad Templeton explains why a world of self-driving cars would be good for us, and why we shouldn’t be afraid.

 —

Photo Credit: KAREN BLEIER / Getty Staff

