# Ford's Self-Driving Test Car Severely Damaged In Crash



## just_me (Feb 20, 2017)

It's never good news when an autonomous automobile is involved in an accident. Recently, Alphabet's Waymo crashed an autonomous bus in Las Vegas, and Uber managed to flip a self-driving Volvo in Arizona.

In the latest installment of autonomous car accidents across America, a self-driving test car from the Ford-backed startup Argo AI was severely damaged Wednesday in a collision that sent two people to the hospital.










According to the Pittsburgh Post-Gazette, a box truck ran a red light at about 10 a.m. at 16th and Progress streets on Pittsburgh's North Side and smashed into an Argo AI self-driving car with four people inside. Two of the four passengers in the Argo AI car were injured and taken to the hospital in stable condition.









Alan Hall, a communications manager for Ford who handles public relations on behalf of Argo AI, stated, "We're aware that an Argo AI test vehicle was involved in an accident. We're gathering all the information. Our initial focus is on making sure that everyone involved is safe."

Hall offered limited information on whether the car was in self-driving mode during the accident, or on whether the Argo AI fleet has been suspended.

This is the second autonomous car crash in Pittsburgh in recent months. In September, Uber grounded its fleet of self-driving cars for half a day after one of its autonomous cars crashed. After an investigation, the company determined the car's autonomous systems were not at fault.

In early 2017, Ford invested $1 billion in Argo AI, an artificial intelligence startup Ford has tasked with building the brains of its next generation of self-driving vehicles. The startup anticipates deploying a fully driverless car, without a steering wheel or pedals, by 2021.

Back in November, we noted that just because it's legal to test autonomous cars on public streets doesn't necessarily mean they've been optimized for safety…

Waymo published a report for California's Department of Motor Vehicles on how frequently its driverless cars "disengaged" because of a system failure or safety risk, forcing a human driver to take over. In the report, Waymo said this happened once every 5,000 miles driven in 2016, compared with once every 1,250 miles in 2015. While that's certainly an improvement, these types of incidents are hardly rare.
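The jump from 1,250 to 5,000 miles between disengagements works out to a fourfold drop in the disengagement rate; a quick sanity check of the figures as reported:

```python
# Waymo's reported miles between disengagements, per the DMV reports cited above.
miles_2015 = 1_250
miles_2016 = 5_000

# Express each as a rate: disengagements per mile driven.
rate_2015 = 1 / miles_2015
rate_2016 = 1 / miles_2016

# Fourfold reduction in disengagements per mile, year over year.
improvement = rate_2015 / rate_2016  # 4.0
```

Whether a 4x year-over-year reduction counts as "hardly rare" or as rapid progress is exactly what the thread below argues about.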

https://www.zerohedge.com/news/2018-01-12/fords-self-driving-test-car-severely-damaged-crash


----------



## tohunt4me (Nov 23, 2015)

Death Traps !!!


----------



## just_me (Feb 20, 2017)

I would expect both Uber and Lyft to fire me if I was getting in an accident every 5,000 miles.


----------



## everythingsuber (Sep 29, 2015)

just_me said:


> I would expect both Uber and Lyft to fire me if I was getting in an accident every 5,000 miles.


Let's face it. If you were crashing every 5,000 miles you would be walking away long before they fired you. Most human drivers I know don't even crash once in a million miles. That is how far the robots have to go to break even.


----------



## Sydney Uber (Apr 15, 2014)

tohunt4me said:


> Death Traps !!!


The truck ran a red light.

Unfortunately all this accident proves is the SOONER all vehicles are Autonomous, the fewer accidents and fewer injuries we'll be seeing.


----------



## heynow321 (Sep 3, 2015)

Sydney Uber said:


> The truck ran a red light.
> 
> Unfortunately all this accident proves is the SOONER all vehicles are Autonomous, the fewer accidents and fewer injuries we'll be seeing.


that's not true at all.


----------



## dogemuffins (Mar 16, 2017)

The truck ran a red light, but given the opportunity a human driver might have been able to think, "hey, that truck isn't stopping, let's get out of the way."


----------



## Sydney Uber (Apr 15, 2014)

heynow321 said:


> that's not true at all.


Read the story

The Truck was witnessed running a red light

Perhaps a human driven car could've avoided the truck.

But it will be practically impossible for an SD car to blast through a red light in the future. Thirty years ago I watched a friend of mine get collected by a driver running a red light. There were mitigating circumstances: it was the end of the day, the sun was low behind the lights, and the driver was somewhat blinded.

I still have the slo-mo vision imprinted in my mind, of the airborne cartwheel my friend did, luckily landing on a soft grass verge.

Traffic lights here already transmit a control command when changing phases. That control command would be a second input that SD cars could act on; Teslas currently act only on sighting a red light via their cameras. When all systems are fully in place, SD cars will never run a light, saving a lot of road trauma.
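The two-input idea described above can be sketched as a fail-safe OR over the broadcast phase and the camera reading. This is purely illustrative: the function name, inputs, and logic are assumptions for the sketch, not any vendor's actual stack.

```python
def light_is_red(v2i_phase, camera_sees_red):
    """Treat the light as red if EITHER input says so (fail-safe OR).

    v2i_phase: phase string broadcast by the intersection ("red", "green", ...),
               or None if no broadcast was received.
    camera_sees_red: whether the vision system classified the light as red.
    """
    if v2i_phase == "red":
        return True          # the broadcast catches a missed or blinded camera read
    return camera_sees_red   # fall back to vision when no broadcast arrives
```

In the sun-blinded scenario described above, `light_is_red("red", False)` still returns `True`: the infrastructure broadcast stops the car even when the camera misses the light.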


----------



## just_me (Feb 20, 2017)

Sydney Uber said:


> The Truck was witnessed running a red light
> 
> Perhaps a human driven car could've avoided the truck.


No insurance company that I know of will insure a driver who gets into an accident every 5,000 miles driven. And there are hundreds of millions of drivers out there. So obviously, human drivers do things to avoid accidents, yes?



Sydney Uber said:


> Traffic lights here already transmit a control command when changing phases. That control command will be a second input that SD cars would act on, Teslas only act on sighting a red light via its cameras. When all systems are fully in place SD cars will never run a light - saving lots of road trauma


Q: Who is going to pay for every single traffic light in the country to get upgraded with radio signals that transmit control commands?

A: Not the taxpayer. (Edit - not in this political environment).


----------



## jocker12 (May 11, 2017)

Sydney Uber said:


> Read the story
> 
> The Truck was witnessed running a red light
> 
> ...


You need to study what is reported more thoroughly - "According to _The Incline: _Pittsburgh Department of Public Safety spokesperson Alicia George said the crash occurred at about 10 a.m. at the intersection of 16th and Progress streets between North Shore and Troy Hill near the northern side of the 16th Street Bridge (if you turn the StreetView a little to the right, you will recognize the red building from the first picture in the article, which is the same one in the background of the second picture, with a "One Progress Court" grey sign above the white lidar sensor on the Argo car's roof). A box truck *apparently ran a red* light and T-boned a self-driving vehicle with four occupants inside, according to George. Two people from the vehicle were injured and were stable when they were transported to a nearby hospital. The crash site is about a mile from Argo AI's garage."
and
"George told _The Verge _that there was no word on charges at this time. "Not sure if there will be any charges," she said in an email. "*The accident report has not been completed. That could take time.*" (from - https://www.theverge.com/2018/1/10/16875066/argo-ai-self-driving-car-crash-pittsburgh-ford)

Now I am asking you: why is the accident report not completed, when, if called to a crash, the police would normally have a report completed by the end of the day? Also, why is the DMV not asking Argo to submit a report with the data recorded by all the functioning instruments on board? And why are Argo and the passengers not taking legal action for the damage or for medical expenses? ("It is not yet known whether charges will be filed against the driver of the box van, or whether the Argo car was in self-driving mode at the time of the accident")

I am telling you that in this case there is a lot more information that is not being released by the victims and by Argo, the owner of the damaged car. I want to see the police report as well as statements from the passengers and the truck driver involved in the accident.


----------



## Sydney Uber (Apr 15, 2014)

jocker12 said:


> You need to study what is reported more thoroughly - "According to _The Incline: _Pittsburgh Department of Public Safety spokesperson Alicia George said the crash occurred at about 10 a.m. at the intersection of 16th and Progress streets between North Shore and Troy Hill near the northern side of the 16th Street Bridge (if you turn the StreetView a little to the right, you will recognize the red building from the first picture in the article, which is the same one in the background of the second picture, with a "One Progress Court" grey sign above the white lidar sensor on the Argo car's roof). A box truck *apparently ran a red* light and T-boned a self-driving vehicle with four occupants inside, according to George. Two people from the vehicle were injured and were stable when they were transported to a nearby hospital. The crash site is about a mile from Argo AI's garage."
> and
> "George told _The Verge _that there was no word on charges at this time. "Not sure if there will be any charges," she said in an email. "*The accident report has not been completed. That could take time.*" (from - https://www.theverge.com/2018/1/10/16875066/argo-ai-self-driving-car-crash-pittsburgh-ford)
> 
> ...


You have raised fair and reasonable questions. I guess we just wait to hear the outcome


----------



## Billys Bones (Oct 2, 2016)

So the other night I was driving down a one-way street, with passengers in the vehicle, following the computer-designated route using Waze. Another driver was going the wrong way, coming toward us head-on and not slowing down. I made a sharp right turn down a side street to avoid a collision. Easy for a human. Would an autonomous vehicle continue to follow the computer-designated route, or make that sharp right turn?


----------



## RamzFanz (Jan 31, 2015)

So, all human caused crashes?

We know.



just_me said:


> Waymo published a report for California's Department of Motor Vehicles about how frequently its driverless cars "disengaged" because of a system failure or safety risk and forcing a human driver to take over. In the report, Waymo said this happened once every 5,000 miles the cars drove in 2016, compared with once every 1,250 miles in 2015. While that's certainly an improvement, these types of incidents are hardly rare.


Completely false.



just_me said:


> I would expect both Uber and Lyft to fire me if I was getting in an accident every 5,000 miles.


They aren't. It's false.



everythingsuber said:


> Let's face it. If you were crashing every 5000 miles you would walking away long before they fired you. Most human drivers I know don't crash every million miles. That is how far the robots have to go to brake even.


They aren't. It's false.



just_me said:


> No insurance company that I know of will insure a driver who gets into an accident every 5,000 miles driven. And, there are 100s of millions of drivers out there. So obviously, the human driver does do things to avoid accidents, yes?


They aren't. It's false.



just_me said:


> Q: Who is going to pay for every single traffic light in the country to get upgraded with radio signals that transmit control commands?
> 
> A: Not the taxpayer. (Edit - not in this political environment).


Unneeded.



jocker12 said:


> Now I am asking you, why the report is not completed because if called, the police will need to have it completed by the end of the day?


You watch too many police TV shows.



jocker12 said:


> Also, why is DMV not asking Argo to submit a report with the data they have recorded with all the functioning instruments on board?


How do you know they haven't? Or are you, once again, making things up?



jocker12 said:


> And why are ARGO and the passengers not taking legal action for the damage or for medical expenses? ("It is not yet known whether charges will be filed against the driver of the box van, or whether the Argo car was in self-driving mode at the time of the accident")


How do you know they haven't? Or are you, once again, making things up?


----------



## observer (Dec 11, 2014)

https://www.google.com/amp/s/www.en...bet-is-better-than-uber-at-self-driving-cars/


----------



## everythingsuber (Sep 29, 2015)

By my reckoning, human intervention means taking action to stop minor issues escalating into a major accident. The butterfly effect. Every 5,000 kilometres between human interventions needs to become every 1 million.


----------



## just_me (Feb 20, 2017)

RamzFanz said:


> So, all human caused crashes?
> 
> We know.
> 
> ...


Proof?


----------



## Sydney Uber (Apr 15, 2014)

Billys Bones said:


> So the other night I was driving down a one way street with passengers in the vehicle following the computer designated route using waze. Another driver was going the wrong way proceeding towards us for a head on collision and not slowing down. I made a sharp right turn down a side street to avoid a collision. Easy for a human. Would an autonomous vehicle continue to follow the computer designated route or make that sharp right turn?


At this point in human motoring history, 99.99999% of all car accidents had humans in "control" of vehicles at the time when the car "lost" orderly safe control, impacting on other cars, pedestrians or stationary objects.

Methinks Robots will NEVER catch up to that terrible record.


----------



## jocker12 (May 11, 2017)

RamzFanz said:


> So, all human caused crashes?
> 
> We know.
> 
> ...


I know you smoke some crazy crap (because we read your deviant comments) and I know you neglect your wife and daughter, refusing to teach them how to be better drivers because, by your own statement, they are bad drivers, but..... how do you qualify questions (the question marks at the end, bud) as "making things up"? Awwww, you're high again.... aren't you?


----------



## RamzFanz (Jan 31, 2015)

everythingsuber said:


> By my reckoning human intervention equals taking action to avoid minor issues escalating to a major accident. The butterfly effect. Every 5000 kilometres for human intervention needs to become every 1 million.


Human intervention could be for a thousand reasons based on the human's perception and judgment.

Rerunning the scenario to its outcome had they not intervened, using their simulator, shows almost none are actually required anymore.

Waymo went 7 straight months without a necessary intervention... in 2015.

The 5,000-mile nonsense is a myth perpetrated by you know who on here.



jocker12 said:


> I know you smoke some crazy crap (because we read your deviant comments) and I know you neglect your wife and daughter, ignoring to teach them how to be better divers because, your statement, they are bad drivers, but..... how do you qualify questions (the question marks at the end, bud) as "making things up"? Awwww, you're high again.... aren't you?


Oh, look who it is.



just_me said:


> Proof?


Proof?

Uh, they are already in service and not a single company is asking for any infrastructure changes. None. Zip. Nada.


----------



## Mista T (Aug 16, 2017)

Billys Bones said:


> Would an autonomous vehicle continue to follow the computer designated route or make that sharp right turn?


A SDC would have stopped and waited to be hit.


----------



## RamzFanz (Jan 31, 2015)

Billys Bones said:


> So the other night I was driving down a one way street with passengers in the vehicle following the computer designated route using waze. Another driver was going the wrong way proceeding towards us for a head on collision and not slowing down. I made a sharp right turn down a side street to avoid a collision. Easy for a human. Would an autonomous vehicle continue to follow the computer designated route or make that sharp right turn?


The actual answer is that they will avoid the danger whenever possible. They don't blindly follow a route. They examine trillions of data points a second. They could have a plan in place faster than the light information in your retina makes it to your brain, much less you making a decision.

You made a decision that worked. They could come up with dozens of options and alter them in fractions of a second based on information they gather in 360 degrees. We simply can't compete with their ability.
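The "dozens of options" idea can be sketched as scoring candidate maneuvers and taking the lowest-risk one. Every name and number here is made up for illustration; a real planner scores far richer trajectory sets, and far faster.

```python
def pick_maneuver(candidates):
    """Return the candidate maneuver with the lowest estimated collision risk."""
    return min(candidates, key=candidates.get)

# Hypothetical risk estimates for the wrong-way-driver scenario above.
options = {
    "continue_route": 0.95,          # staying on the planned route risks a head-on
    "hard_brake": 0.40,              # cuts closing speed but stays in the lane
    "turn_right_side_street": 0.05,  # the maneuver the human driver chose
}
```

Here `pick_maneuver(options)` returns `"turn_right_side_street"`, the same escape the human found; the claimed advantage is only in how quickly and how often such a comparison can be rerun.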


----------



## just_me (Feb 20, 2017)

RamzFanz said:


> Proof?
> 
> Uh, they are already in service and not a single company is asking for any infrastructure changes. None. Zip. Nada.


Uh, I wasn't asking for proof about infrastructure changes. But since you failed to back up your two claims of "completely false" data from the article with your own data, I think it's safe to say that the article's claim of self-driving cars getting into accidents every 5,000 miles is an accurate statistic - even by your standards, yes?


----------



## RamzFanz (Jan 31, 2015)

just_me said:


> Uh, I wasn't asking for proof about infrastructure changes. But since you failed to back up your two claims of 'completely false data' from the article with your own data, then I think it's safe to say that the article's claim of self driving cars getting into accidents every 5,000 miles is an accurate statistic - even by your standards, yes?


The data in the article comes from two-year-old reports by Waymo that are widely available on the web. Even the article didn't exaggerate takeovers into accidents like you are. However, they did exaggerate takeovers into takeovers required to avoid an accident, which is false.

Do you seriously think Waymo just launched their cornerstone product with no driver, knowing it needed a human takeover every 5,000 miles? Seriously? On a scale of 1 to 10, how stupid and reckless would that be?


----------



## just_me (Feb 20, 2017)

RamzFanz said:


> The data from the article is from two year old reports by Waymo widely available on the web.


Correction: one-year-old data. 2018 isn't over yet; 2017 - 2016 = 1 year. But since you seem to be asking for more up-to-date data, here is Waymo's last 12 months of data: "Table 1: All Reportable Disengagements" on California roads, from 12/2016 through 11/2017 - Waymo reported 63 disengagements in 352,544.6 miles driven, which equals 5,595.9 miles between disengagements. The article talked about 5,000 miles between disengagements. That's about an 11% improvement - not a statistically significant difference, so there really wasn't much improvement over 2016's data, yes?
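The arithmetic in the post above checks out against the cited figures; measuring the improvement as a reduction in the disengagement rate gives the roughly 11% number:

```python
# Figures from Waymo's 12/2016-11/2017 California DMV report, as cited above.
disengagements = 63
miles = 352_544.6

miles_per_disengagement = miles / disengagements      # ≈ 5,595.9 miles
# Reduction in disengagements per mile versus the earlier
# 5,000-miles-between-disengagements figure: ≈ 0.11, i.e. about 11%.
rate_reduction = 1 - 5_000 / miles_per_disengagement
```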



RamzFanz said:


> Even the article didn't exaggerate take overs into accidents like you are. However, they did exaggerate take overs into required take overs to avoid an accident, which is false.


(Definition of a disengagement: ''*California DMV definition, which means "a deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle*." Section 227.46 of Article 3.7 (Autonomous Vehicles) of Title 13, Division 1, Chapter 1, California Code of Regulations.'')

Nope. Looks like you got that wrong too. It doesn't matter why the autonomous system was deactivated (failure of the technology versus safe operation of the vehicle); it only matters that it was taken over by the human. -- Note: Waymo views all of its miles driven as planned miles, and thus all disengagements as planned disengagements, because only testing is allowed on California roads right now. (See table 2 of Waymo's report, below.) The article didn't exaggerate anything.

Yes, I may have been too quick to call a disengagement an accident. My bad. But I'm not too quick in saying that if the human hadn't been there, there would have been more accidents - or that if I went only 5,595.9 miles between disengagements (possible accidents), I would lose my driver's license.

P.S. Infrastructure changes will be needed to accommodate autonomous cars: from how roads get repaired, to possibly radio-transmitting traffic lights and road signs, etc. That will cost hundreds of billions to trillions of dollars in future infrastructure spending. But first, autonomous cars are going to have to do better than 5,595.9 miles per disengagement (possible accident), yes? Especially in this political environment, yes?

Source: https://www.dmv.ca.gov/portal/wcm/c...5-a72a-97f6f24b23cc/Waymofull.pdf?MOD=AJPERES


----------

