# Waymo’s self-driving car crashed because its human driver fell asleep at the wheel



## jocker12 (May 11, 2017)

In June, one of Waymo's self-driving Chrysler Pacifica minivans crashed on the freeway outside of the company's office in Mountain View, California, after its lone safety driver fell asleep at the wheel.

Tech news site The Information, which first reported the crash (paywall), said the human driver manning the vehicle "appeared to doze off" after about an hour on the road, according to two people familiar with the matter. The safety driver unwittingly turned off the car's self-driving software by touching the gas pedal. He failed to assume control of the steering wheel, and the Pacifica crashed into the highway median.

The dozing driver didn't respond to any of the vehicle's warnings, including a bell signaling the car was in manual mode and another audio alert, The Information reported. He regained alertness once the car crashed, then turned around and headed back to the Mountain View office. He no longer works for Waymo.

Waymo got lucky with the accident. The safety driver wasn't hurt and no other vehicles were involved. Waymo reported the vehicle sustained "moderate damage to its tire and bumper." The company told The Information in a statement that it is "constantly improving our best practices, including those for driver attentiveness, because the safe and responsible testing of our technology is integral to everything we do."

Improvements in this case meant altering night-shift protocol to have two safety drivers instead of one, to guard against someone nodding off at the wheel. At a company meeting to discuss the incident, one attendee reportedly asked whether safety drivers were on the road too long, and was told that drivers can take a break whenever they need to.

Waymo is pursuing fully self-driving software that wouldn't require any intervention from humans, in contrast to automakers like Tesla and General Motors, which have started with selectively automated features to assist human drivers. As Waymo has gotten closer to true autonomy, it has also tried to reduce its reliance on human safety drivers by, for example, cutting the number of safety drivers in a test vehicle to one from two. Waymo plans to launch a commercial ride-hail service with driverless cars in the Phoenix area this year.

After a self-driving Uber struck and killed a pedestrian in Tempe, Arizona in March, one point of focus was Uber's safety-driver policies. Jalopnik pointed out that "almost everyone" (Toyota, Nissan, Ford's Argo AI) uses two people to test self-driving cars. In the Uber Volvo that crashed, on the other hand, Rafaela Vasquez was a lone safety driver, at night. She was later found by police to have been streaming _The Voice_ on her phone at the time of impact.

One thing everyone working on driverless cars agrees on is that humans are bad drivers. People from Waymo CEO John Krafcik to disgraced former Uber engineer Anthony Levandowski (try finding a more diametrically opposed pair) like to talk about how driverless cars will save lives by eliminating thousands of preventable highway fatalities a year.

It is baffling, then, that these companies trust the very humans they seek to unseat to watch over their adolescent technology, alone and for hours on end. An autonomous safety driver once described to me working 10- to 11-hour shifts unaccompanied, including nights that began in the early evening and ended well past midnight. Drivers could take breaks whenever they wanted, this person said, but it was still a challenge to stay focused for that long without anyone to talk to, or much to do beyond watching the road.

A few months after the Tempe accident, Uber laid off most of its self-driving car operators in Pittsburgh and San Francisco. Uber said it would replace these people with "mission specialists" trained to monitor its cars on roads and on specialized test tracks. These mission specialists are supposed to be more involved in the actual development of the cars, tasked with tracking, documenting, and triaging any issues that might crop up. Per a current job listing, they should have "the ability to operate independently with little or no supervision."

There is a great essay by reporter Tim Harford about how our quest to automate all things may be setting us up for disaster. The more we let computers fly planes, drive cars, operate machinery, and so on, the less time the people we've put in place for backup (pilots, safety drivers, and other operators) are able to practice their skills, and the greater the odds they'll be unprepared in a true emergency. This problem is known as the paradox of automation, and it applies to benign problems as well, like how we struggle to remember phone numbers that are stored in our mobile devices, or to do mental math that we can punch into a calculator. Like any skill, these need to be practiced to be maintained, and become rusty with disuse. Instead of designing technology for humans to babysit, Harford wonders, why aren't we making technology that babysits humans?

https://qz.com/1410928/waymos-self-driving-car-crashed-because-its-human-driver-fell-asleep/


----------



## goneubering (Aug 17, 2017)

So now these miracle “self-driving” cars actually need TWO humans to make them safe??!!


----------



## jocker12 (May 11, 2017)

goneubering said:


> So now these miracle "self-driving" cars actually need TWO humans to make them safe??!!


This is another unexpected problem.

You need pedals and a steering wheel in the car if you want to drive it when the system malfunctions, or to move it around tight parking lots and other places where the software would potentially have problems maneuvering the car. Right?

Well, if they completely remove the monitors and put a group of children in the car, and those children start playing and accidentally drop an object (a toy or personal belongings) on the pedals, or touch the steering wheel and disengage the self-driving system, they are trapped inside a 2-ton, out-of-control moving vehicle. Brilliant.

At this point, if Waymo wants to improve safety, they need to ADD more humans to monitor the car and the road, and/or to monitor each other (as long as the company remains stubborn and prohibits them from actively driving the car 100% of the time). It seems like only a human monitoring/driving presence could guarantee safety for any passengers.

Then again, why are they spending billions to remove the driver?


----------



## Stevie The magic Unicorn (Apr 3, 2018)

This is proof that Americans (and probably the rest of the world) are too stupid to have partially self-driving cars.

They need to be either 100% or 0% (ok, sure, cruise control would be ok, but that’s it).

That’s 2 notable incidents caused by idiots not operating these things properly. One of them fatal.


----------



## goneubering (Aug 17, 2017)

You know what? This story did not get nearly enough press coverage.


----------



## tohunt4me (Nov 23, 2015)

jocker12 said:


> In June, one of Waymo's self-driving Chrysler Pacifica minivans crashed on the freeway outside of the company's office in Mountain View, California, after its lone safety driver fell asleep at the wheel.
> 
> Tech news site The Information, which first reported the crash (paywall), said the human driver manning the vehicle "appeared to doze off" after about an hour on the road, according to two people familiar with the matter. The safety driver unwittingly turned off the car's self-driving software by touching the gas pedal. He failed to assume control of the steering wheel, and the Pacifica crashed into the highway median.
> 
> ...


Who ALLOWS these "SCIENCE EXPERIMENTS" on OUR ROADS!

Let WAYMO build their OWN road!

Don't endanger the public on taxpayer-owned property!


----------



## jocker12 (May 11, 2017)

goneubering said:


> You know what? This story did not get nearly enough press coverage.


I agree, but this is a time when most technology journalists no longer know what is right and what is wrong - safety or dangerous dreaming. Ultimately, the consumers will clarify this for the dreamers.


----------



## goneubering (Aug 17, 2017)

jocker12 said:


> I agree, but this is a time when most technology journalists no longer know what is right and what is wrong - safety or dangerous dreaming. Ultimately, the consumers will clarify this for the dreamers.


Here's what a former insider says.

https://www.dailymail.co.uk/news/ar...s-end-delusion-making-world-better-place.html

Former Google boss launches scathing Silicon Valley attack urging tech giants to end the delusion that it's making the world a better place

'I want Silicon Valley to end the self-delusion and either fess up to the reality we are creating, or live up to the vision we market to the world each day. Because if you're going to tell people you're their saviour, you better be ready to be held to a higher standard.'

Miss Powell, 40, launched her searing criticism in an essay and satirical novel, both published this week, which together paint a damning portrait of the Silicon Valley culture.

She said: 'We go about saying that we're building these amazing things and doing great things for the world but we're also causing a lot of serious problems.'


----------



## jocker12 (May 11, 2017)

goneubering said:


> Here's what a former insider says.
> 
> https://www.dailymail.co.uk/news/ar...s-end-delusion-making-world-better-place.html
> 
> ...


Yup

Here is her novel "The Big Disruption" (348 min. read) published and free on medium.com - https://medium.com/s/the-big-disrup...bed0268cf?_branch_match_id=577153626433753295


----------



## uberdriverfornow (Jan 10, 2016)

The first lie is that the driver fell asleep at the wheel. The second is that the driver somehow decided to touch the pedal, which somehow turned off the self-driving. The entire story is likely a lie. There was no public investigation, so we will never know the truth.

It sucks that the drivers are forced to sign confidentiality agreements. No reporter or anyone else will ever get to ask them what the real story was and how bad these things are. I am dying to see one of these drivers get out of their car here in Mountain View so I can try to have a talk with them.


----------

