It’s No Use Honking. The Robot at the Wheel Can’t Hear You

Written by Ryan Beene. This article first appeared in Bloomberg News.

As auto accidents go, it wasn't much: twelve minutes before noon on a cool June day, a Chevrolet Bolt was rear-ended as it crawled from a stop light in downtown San Francisco.

What made this fender bender noteworthy was the Bolt’s driver: a computer.

In California, where companies like Cruise Automation Inc. and Waymo LLC are ramping up testing of self-driving cars, human drivers keep running into them in low-speed fender benders. The run-ins highlight an emerging culture clash between humans who often treat traffic laws as guidelines and autonomous cars that refuse to roll through a stop sign or exceed the speed limit.

“They don’t drive like people. They drive like robots,” said Mike Ramsey, an analyst at Gartner Inc. who specializes in advanced automotive technologies. “They’re odd and that’s why they get hit.”

[Photo: A Chevrolet Bolt with AV technology displayed outside the General Motors Co. Orion Assembly Plant in June. Photographer: Jeff Kowalsky/Bloomberg]

Companies are now testing autonomous vehicles from Phoenix to Pittsburgh, and developers are closely watching how the cars interact with their human-driven counterparts as they prepare for a future in which the two share the road.

What they’ve found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law — unlike humans.

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy Inc.

“If the cars drive in a way that’s really distinct from the way that every other motorist on the road is driving, there will be in the worst case accidents and in the best case frustration,” he said. “What that’s going to lead to is a lower likelihood that the public is going to accept the technology.”

Sensors embedded in autonomous cars allow them to “see” the world with far more precision than humans, but the cars struggle to translate visual cues on the road into predictions about what might happen next, Iagnemma said. They also struggle to handle new scenarios they haven’t encountered before.

California is the only state that specifically requires reports when an autonomous vehicle is involved in an accident. The records show vehicles in autonomous mode have been rear-ended 13 times in the state since the beginning of 2016, out of 31 collisions involving self-driving cars in total, according to the California Department of Motor Vehicles.

The collisions also almost always occur at intersections rather than in free-flowing traffic. A Cruise autonomous vehicle was rear-ended last month, for example, while braking to avoid a vehicle drifting into its lane from the right as traffic advanced from a green light.

Waymo's now-retired "Firefly" autonomous vehicle prototypes were rear-ended twice at the same intersection in Mountain View, California, in separate instances less than a month apart in 2016. In both cases, the Waymos had stopped to yield to oncoming traffic while preparing to make a right-hand turn and were hit from behind.

In another incident, a self-driving vehicle was rear-ended by a cyclist after it braked to avoid another car. And a truck racing to pass a slow-moving self-driving vehicle before a stop sign clipped the car as the truck scooted back to the right.

The state’s crash reports don’t assign blame and provide only terse summaries of the incidents, but a few themes are common. They’re almost always low-speed fender benders with no injuries. The Bolt, for example, was traveling at less than one mile per hour when it was rear-ended. While they represent a minuscule share of crashes in the state, autonomous vehicles are also a very small share of the vehicles on the road.

“You put a car on the road which may be driving by the letter of the law, but compared to the surrounding road users, it’s acting very conservatively,” Iagnemma said. “This can lead to situations where the autonomous car is a bit of a fish out of water.”

A spokeswoman for Cruise, which was acquired by General Motors Co. last year, said the crash reports speak for themselves.

The company's chief executive officer, Kyle Vogt, said in a September blog post that the company's third-generation autonomous Chevrolet Bolts are "designed to emulate human driving behavior but with the human mistakes omitted."

More Natural

San Francisco's streets are chaotic, but that chaos is helping Cruise program its cars to react to such challenges, Vogt said in a separate blog post.

“People put junk in the street. They park everywhere. People don’t obey crosswalks,” Vogt wrote. “Our vehicles must be assertive, nimble, and sometimes a bit creative.”

Waymo, Alphabet Inc.'s self-driving car unit, has tried to refine how its vehicles act so that they drive more naturally. For example, the developer altered the software dictating how the cars handle turns to make them more comfortable for passengers, says Duke University robotics professor Missy Cummings.

“They were cutting the corners really close, closer than humans would,” she said. “We typically take wider turns.”

Waymo is also using simulations to try to teach its cars to inch forward at flashing yellow lights. Dmitri Dolgov, Waymo's technology chief, wrote in a December 2016 blog post that the company's cars were getting better at navigating the nonverbal dance of interacting with others on the road.


Ford Motor Co. went so far as to put a vehicle on the road along with a driver masked to resemble the car’s seat. The experiment, conducted in cooperation with the Virginia Tech Transportation Institute, was designed to assess how driverless cars could communicate with other roadway users, using light signals to replace the eye contact and other signals that humans use to navigate city streets.

“Humans violate the rules in a safe and principled way, and the reality is that autonomous vehicles in the future may have to do the same thing if they don’t want to be the source of bottlenecks,” Iagnemma said.

A warm, clear climate and a hands-off approach to regulation have recently made Phoenix, Arizona, a hotbed of testing. Waymo began offering rides in a fleet of self-driving Chrysler Pacifica minivans to the public there in April.

Sergeant Alan Pfohl, a spokesman for the Phoenix Police Department, says the testing is going smoothly thus far.

The only crash he's aware of came last March, when an Uber Technologies Inc. self-driving Volvo SUV was toppled after being hit by another vehicle that failed to yield. No injuries were reported.

“Technology can always fail, but so can humans,” Pfohl said.
