Waymo blames human for its self-driving car accident

Ryan is an editor at TechForge Media with over a decade of experience covering the latest technology and interviewing leading industry figures. He can often be spotted at tech conferences with a strong coffee in one hand and a laptop in the other. If it's geeky, he's probably into it. Find him on Twitter: @Gadget_Ry

Waymo has detailed how its self-driving vehicle in Mountain View came to hit a motorbike, and it’s all to do with a meddling human.

Driverless cars are expected to improve road safety because they're not prone to the human failings that often cause accidents, such as distraction, intoxication, lack of skill, or medical emergencies at the wheel like a heart attack.

Of course, it's going to take some time before driverless car AIs are robust enough for public use. Until then, on-road tests require a backup driver who can take control in potentially dangerous situations.

Waymo’s backup driver took control after noticing a car on the left moving into their lane. In response, the driver moved the car into the right-hand lane. Unfortunately, a motorbike was moving from behind to overtake at the same time.

The collision resulted in minor damage to Waymo’s vehicle but sent the motorcyclist to hospital.

According to Waymo, this is exactly the kind of situation a self-driving car would prevent. A simulation of what its vehicle would have done without the human intervention showed it would have reduced the car's speed and avoided the collision.

In a blog post, Waymo CEO John Krafcik wrote:

“Our review of this incident confirmed that our technology would have avoided the collision by taking a safer course of action.

While our test driver’s focus was on the car ahead, our self-driving system was simultaneously tracking the position, direction and speed of every object around it.

Crucially, our technology correctly anticipated and predicted the future behaviour of both the merging vehicle and the motorcyclist.

Our simulation shows the self-driving system would have responded to the passenger car by reducing our vehicle’s speed, and nudging slightly in our own lane, avoiding a collision.”

Surrendering control to a self-driving car is going to feel wrong for some time, especially for long-term drivers. And during current tests, backup drivers are still sometimes required to take control.

When an Uber self-driving car hit and killed a pedestrian in Arizona, the subsequent investigation found the backup driver was distracted by their phone. If the human driver had taken over, it's quite likely the collision would have been avoided.

That situation is probably on every backup driver’s mind. Unfortunately, on this occasion, the driver’s attempt to be cautious appears to have caused the accident.

“People are often called upon to make split-second decisions with insufficient context,” explains Krafcik. “In this case, our test driver reacted quickly to avoid what he thought would be a collision, but his response contributed to another.”

Interested in hearing industry leaders discuss subjects like this? Attend the IoT Tech Expo World Series events with upcoming shows in Silicon Valley, London, and Amsterdam.
