# Tempe police release photographs from Uber’s self-driving crash



## Trump Economics (Jul 29, 2015)

Tempe police released photographs from the fatal pedestrian crash involving an Uber self-driving car. A 49-year-old woman was hit and killed by a self-driving Volvo operated by Uber while crossing a street in Tempe.


----------



## Trump Economics (Jul 29, 2015)

https://www.azcentral.com/picture-g...hs-from-uber-self-driving-car-crash/36266819/


----------



## Fuzzyelvis (Dec 7, 2014)

Seems strange they have the OLD uber trade dress on the windshield. I guess even they don't like the new one.


----------



## Trump Economics (Jul 29, 2015)

Fuzzyelvis said:


> Seems strange they have the OLD uber trade dress on the windshield. I guess even they don't like the new one.


Watch out for bikes.


----------



## Trump Economics (Jul 29, 2015)

SAN FRANCISCO (Reuters) - Police in Tempe, Arizona said evidence showed the "safety" driver behind the wheel of a self-driving Uber was distracted and streaming a television show on her phone right up until about the time of a fatal accident in March, deeming the crash that rocked the nascent industry "entirely avoidable."

A 318-page report from the Tempe Police Department, released late on Thursday in response to a public records request, said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up just a half second before the car hit 49-year-old Elaine Herzberg, who was crossing the street at night.

According to the report, Vasquez could face charges of vehicle manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.

Police obtained records from Hulu, an online service for streaming television shows and movies, which showed Vasquez's account was playing the television talent show "The Voice" the night of the crash for about 42 minutes, ending at 9:59 p.m., which "coincides with the approximate time of the collision," the report says.

It is not clear if Vasquez will be charged, and police submitted their findings to county prosecutors, who will make the determination. The Maricopa County Attorney's Office referred the case to the Yavapai County Attorney's office because of a conflict and that office could not be reached late Thursday.

Vasquez could not immediately be reached for comment and Reuters could not locate her attorney.

The Uber car was in autonomous mode at the time of the crash, but Uber, like other self-driving car developers, requires a back-up driver in the car to intervene when the autonomous system fails or a tricky driving situation occurs.

Vasquez looked up just 0.5 seconds before the crash, after keeping her head down for 5.3 seconds, the Tempe Police report said. Uber's self-driving Volvo SUV was traveling at just under 44 miles-per-hour.

Uber declined to comment.

Last month, an Uber spokeswoman said the company was undergoing a "top-to-bottom safety review," and had brought on a former federal transportation official to help improve the company's safety culture. The company prohibits the use of any mobile device by safety drivers while the self-driving cars are on a public road, and drivers are told they can be fired for violating this rule.

Police said a review of video from inside the car showed Vasquez was looking down during the trip, and her face "appears to react and show a smirk or laugh at various points during the times that she is looking down." The report found that Vasquez "was distracted and looking down" for close to seven of the nearly 22 minutes prior to the collision.

Tempe Police Detective Michael McCormick asked Hulu for help in the investigation, writing in a May 10 email to the company that "this is a very serious case where the charges of vehicle manslaughter may be charged, so correctly interpreting the information provided to us is crucial." Hulu turned over the records on May 31.

According to a report last month by the National Transportation Safety Board, which is also investigating the crash, Vasquez told federal investigators she had been monitoring the self-driving interface in the car and that neither her personal nor business phones were in use until after the crash. That report showed Uber had disabled the emergency braking system in the Volvo, and Vasquez began braking less than a second after hitting Herzberg.

Herzberg, who was homeless, was walking her bicycle across the street, outside of a crosswalk on a four-lane road, the night of March 18 when she was struck by the front right side of the Volvo.

The police report faulted Herzberg for "unlawfully crossing the road at a location other than a marked crosswalk."

In addition to the report, police released on Thursday a slew of audio files of 911 calls made by Vasquez, who waited at the scene for police, and bystanders the night of the crash; photographs of Herzberg's damaged bicycle and the Uber car; and videos from police officers' body cameras that capture the minutes after the crash, including harrowing screams in the background.

The crash dealt Uber a major setback in its efforts to develop self-driving cars, and the company shuttered its autonomous car testing program in Arizona after the incident. It says it plans to begin testing elsewhere this summer, although in some cities it will have to first win over increasingly wary regulators.

https://www.reuters.com/article/us-...driving-car-crash-police-report-idUSKBN1JI0LB


----------



## doyousensehumor (Apr 13, 2015)

I doubt Uber told the "safety driver" it is okay to sit in the driver seat and watch Hulu all night.

When Arizona allowed the testing to go on here, the car isn't the one with a driver's license. The person in the left front seat is. In my view the status of the "autonomous mode" is the same as cruise control. The driver has the personal responsibility to take over as necessary, just as he would in any car with only cruise engaged.

About testing on a closed track... there is only so much that can be tested at a proving ground. Eventually they have to test out on public roads--with a driver ready to take over at any moment. There are just too many real-world driving conditions that can't be recreated in a controlled environment.

We all like to criticize Uber. Heck, I'm sure I'm not the only one here that doesn't mind the setback to self-driving cars. I would encourage us to look at the facts objectively, and not just jump to bagging on Uber by default.

Like many tragedies, there were several failures that led to it. Avoiding any one of them would have prevented it. The pedestrian failed to yield when crossing the road. The autostop failed because it was disabled. Why was it disabled? Well, that IS a good question, and it needs to be answered! But the final factor, the most important one, is that there was a driver in that car who was supposed to take over in any hazard. Uber was paying someone to fill that role, and she failed. Failed because she was watching Hulu.


----------



## jocker12 (May 11, 2017)

Trump Economics said:


> According to the report, Vasquez could face charges of vehicle manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.


People need to understand those MONITORS are not there to actively drive the cars, but to intervene in case the so-called SELF driving system FAILS. Doing her job for Uber, of course Rafaela failed, because the software failed and killed a woman, but as a driver, you need to remember - the monitor is NOT an active driver. It is Uber and all the other culprits who are *not clarifying this for everybody to understand*. If that person is hired to drive, there is NO self-driving testing, only driving.



Trump Economics said:


> The Uber car was in autonomous mode at the time of the crash, but Uber, like other self-driving car developers, requires a back-up driver in the car to intervene when the autonomous system fails or a tricky driving situation occurs.


There is no "tricky situation". It is ONLY the system FAILING to navigate the roads as the developers promised it would.



doyousensehumor said:


> I doubt Uber told the "safety driver" it is okay to sit in the driver seat and watch Hulu all night.


Uber told that MONITOR to intervene and COVER UP any system failures during testing, not to actively drive that vehicle.



doyousensehumor said:


> When Arizona allowed the testing to go on here, the car isn't the one with a driver's license.


You are completely WRONG.
"The Self-Driving Oversight Committee will advise the Department of Transportation, the Department of Public Safety, universities and other public agencies on how best to advance the testing *operation of self-driving vehicles on public roads*."
They refer to the system, not to the human inside. The word SELF is simply self-explanatory.



doyousensehumor said:


> The person in the left front seat is. In my view the status of the "autonomous mode" is the same as cruise control


Your view is wrong again. When a vehicle is in Cruise Control, the driver actively maintains the car on the correct path (the road) while in self driving mode, the monitor is there JUST IN CASE.



doyousensehumor said:


> About testing on a closed track... there is only so much that can be tested at a proving ground. Eventually they have to test out on public roads--with a driver ready to take over at any moment. There are just too many real-world driving conditions that can't be recreated in a controlled environment.


You are wrong again. Testing should not interfere with the general public. When pharmaceutical companies are testing their drugs, they DO NOT sell them to the general public, but get volunteers to take them. During the third phase of testing, the advanced clinical trial, the drug is tested by patients who previously AGREED to be part of testing.



doyousensehumor said:


> The autostop failed because it was disabled. Why was it disabled? Well, that IS a good question, and it needs to be answered!


You really show you never paid attention to this tragedy to understand the details, because you are wrong again.

Volvo's factory-installed Active Driver Assist was/is NOT part of the self-driving system additionally installed by Uber on that SUV. Because these 2 different systems have different ways to detect obstacles in the car's path, they could also generate conflicting decisions, and that is the reason the Volvo FACTORY system was disabled.

Self-driving car developers' goal is to ELIMINATE any conflicting readings in their systems, because those could generate errors. You need to learn what a false positive is and why developers want to eliminate all of them from their sensor readings.
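The conflict described above can be sketched in a few lines of Python (hypothetical thresholds and function names, not Uber's or Volvo's actual logic): two independent detectors tuned differently will sometimes disagree about the same reading, and whatever arbitrates between them then faces contradictory commands.

```python
# Hypothetical illustration: two independent obstacle detectors with
# different confidence thresholds can return conflicting brake decisions.

def factory_detector(obstacle_confidence: float) -> bool:
    """Factory driver-assist style: brakes on moderate confidence."""
    return obstacle_confidence >= 0.4

def sdc_detector(obstacle_confidence: float) -> bool:
    """Self-driving stack style: tuned higher to reject false positives."""
    return obstacle_confidence >= 0.7

def arbitrate(factory: bool, sdc: bool) -> str:
    if factory and sdc:
        return "brake"
    if factory != sdc:
        # Conflicting readings: one system sees an obstacle, the other
        # does not. Running both invites contradictory commands, which
        # is one reason a developer may disable one of them outright.
        return "conflict"
    return "continue"

# Something ambiguous (say, a bag blowing across the road) might register
# at 0.5 confidence: factory says brake, the SDC stack says ignore.
print(arbitrate(factory_detector(0.5), sdc_detector(0.5)))  # -> conflict
print(arbitrate(factory_detector(0.9), sdc_detector(0.9)))  # -> brake
```

The 0.4/0.7 thresholds are made up purely to show the disagreement zone; the point is that any two systems with different sensing models will have one.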


----------



## doyousensehumor (Apr 13, 2015)

Whoa, whoa, honestly we agree on more points than you realize. Except I think they were testing a system, and you thought they were done.


jocker12 said:


> People need to understand those MONITORS are not there to actively drive the cars, but to _intervene in case the so-called SELF driving system FAILS_.


So if the SDC that is being tested is good enough that the (monitor or driver) doesn't have to intervene for days, he's more of a monitor? Is an airplane pilot a monitor while the autopilot is on? Nope, he's still a pilot. Should the autopilot go wonky, it's his responsibility to intervene.
_Uber's SDC was under development._
_Testing._
_Read: it's not ready yet!_
I watched these Volvos in person! They literally drive around in circles on the same route between Scottsdale and Tempe day and night. You don't seem to understand they were testing them with a driver inside who intervenes as necessary.



jocker12 said:


> There is no "tricky situation". It is ONLY the system FAILING to navigate the roads as the developers promised it would.


"The developers" were not finished developing it! That's why there was a driver in every one of the tester mules, ready to take over.



jocker12 said:


> Uber told that MONITOR to intervene and COVER UP any system failures during testing, not to actively drive that vehicle.


Cover up, that's not good. To my knowledge they were looking for any shortcomings so they could tweak the software code and make adjustments.



jocker12 said:


> You are completely WRONG.
> "The Self-Driving Oversight Committee will advise the Department of Transportation, the Department of Public Safety, universities and other public agencies on how best to advance the testing *operation of self-driving vehicles on public roads*."
> They refer to the system, not to the human inside. The word SELF is simply self-explanatory.


The word TESTING is self-explanatory.
Ducey set up a committee of a few guys that *talk* to other guys in the DOT and the state police and the universities. What do they talk about? Maybe they meet up at Hooters once a month and talk about how to roll out the red carpet for Uber, since Uber donated a lot of $$ for his election campaign! Nothing there gave SDC software any legal driving license.



jocker12 said:


> Your view is wrong again. When a vehicle is in Cruise Control, the driver actively maintains the car on the correct path (the road) while in self driving mode, the monitor is there JUST IN CASE.


Legally the driver is the one responsible for the car. Cruise control is one-dimensional compared to an SDC, but you can buy aftermarket cruise and install it on a car that doesn't have one. I could try to test it in a parking lot, I suppose, but the only way to know for sure if it is road worthy is to take it on a public road. I may find out that it needs adjustment. Or I might have to intervene if it gets stuck. Either way it would ultimately be my responsibility as a driver. I can't just relax and watch Hulu!



jocker12 said:


> You are wrong again. Testing should not interfere with the general public. When pharmaceutical companies are testing their drugs, they DO NOT sell them to the general public, but get volunteers to take them. During the third phase of testing, the advanced clinical trial, the drug is tested by patients who previously AGREED to be part of testing.


It was rumored Uber took pax in some of these tester mules. I thought that was a liability for them.
When General Motors, VW, Ford, and Toyota design a car from scratch, they first test it at their proving grounds. After a while there is only so much that can be done at the proving grounds! So then they test on real-world public roads. The Prius, for example, introduced electric steering, electric throttle, and electric brakes all in one car. 2 electric motors and 1 gas. It is constantly doing a dance between 3 motors. All software. The public didn't "agree" to share the roads with that vehicle, but the drivers testing those vehicles assumed personal responsibility.



jocker12 said:


> You really show you never paid attention to this tragedy to understand the details, because you are wrong again.


Wrong! I have been. I find it interesting.



jocker12 said:


> Volvo's factory-installed Active Driver Assist was/is NOT part of the self-driving system additionally installed by Uber on that SUV. Because these 2 different systems have different ways to detect obstacles in the car's path, they could also generate conflicting decisions, and that is the reason the Volvo FACTORY system was disabled.


Of course.


----------



## Gung-Ho (Jun 2, 2015)

Funny they blame the human driver for being inattentive and say she could have prevented the accident.... True.

But the car was in autonomous mode and ran the bicyclist down like rodent road kill. 

The car F’ed up.


----------



## jocker12 (May 11, 2017)

doyousensehumor said:


> Whoa, whoa, honestly we agree on more points than you realize


Yes, I agree, and those "wrongs" could look and sound harsh, but that was not my intention. Sorry if it felt that way.

The TESTING (term you focus on) is done for a SELF (term you seem to ignore) driving car. Because none of those cars are privately owned, the permit is given to the company that owns them. The company hires or has the MONITORS, who are required to have a driver's license in order to take over/cover up IN CASE of a failure.

A self-driving car, as its name defines, is driving by itself. Usually, testing is not supposed to be done on public roads, and you very well mentioned:


doyousensehumor said:


> When General Motors, VW, Ford, Toyota, design a car from scratch, they first test it at their proving grounds.


There is NO car manufacturer that tests their projects on public roads, because there is NO reason to do that. The only system that deals with *active driving variables* is the self-driving software. Avoiding pedestrians, stopping for red lights, making right and left turns, or avoiding obstacles doesn't require testing on public roads if the car is designed to have a driver, because drivers take tests every 4 years in order to renew their driver's licenses. On the other hand, only a self-driving car system lacks experience, mileage, and adjustment on public roads, so the SDC developers pushed for testing on public roads, which is TOTALLY unnecessary and highly dangerous for *the general public that never voluntarily agreed to be part of it*. Do you agree, or should I just assume you already agreed and ignore your opinion, like the local authorities did by allowing companies to use people as guinea pigs?

Your comparison with an airplane's autopilot is quite exaggerated because (and I am going to use your words from your previous comment) "_there are just too many real world driving conditions_" that make driving a lot more complex and difficult than flying.



doyousensehumor said:


> "The developers" were not finished developing it! That's why there was a driver in every one of the tester mules, ready to take over.


You are making very dangerous statements, and if you can prove the SDCs are on the roads as NOT well-developed and finished products, I will nicely ask you to post your information right here. I WANT TO SEE YOUR SOURCE ON THIS, PLEASE! If you only assume that, that is another story. Your perception could be wrong. That individual sitting in the car is hired to watch the systems, monitor them, take over IF necessary, and report back.

"Vasquez had previously told investigators from the NTSB that she had been "monitoring the self-driving system interface," which is displayed on an iPad mounted on the vehicle's center console, at the time of the crash." So they have an iPad in the car displaying system parameters in real time.

In May this year, a Mobileye self-driving car went through a red light while a TV crew was inside, and they recorded it. "Nobody was hurt, and Channel 10's video seems to show *a Mobileye safety driver monitoring the vehicle, but allowing the car to proceed without trying to stop it*."

Also "Uber expects that a driver *may sometimes* need to take control of the vehicle, but the specific circumstances in which that's the case are somewhat unclear." "The navigation and self-driving tech in Uber's vehicles is also *a lot more advanced* than Tesla's semi-autonomous Autopilot mode, which isn't meant to completely replace the need for a driver, and is still in beta according to the company. Uber uses LIDAR, a system that creates a 3D map of the areas surrounding the car using lasers, as well as a typical radar system, and cameras to detect objects before collisions."
As you can easily understand, Uber doesn't want to specifically explain what the monitor's duties are, particularly because in case of an accident the company wants to get away with it and ignore its clear responsibility for making the decision to hire that person (suddenly not fit after a tragedy) and for putting the robots (unfinished, as you say) on the roads, intentionally endangering people's lives.



doyousensehumor said:


> Ducey set up a committee of a few guys that *talk* to other guys in the DOT and the state police and the universities. What do they talk about? Maybe they meet up at Hooters once a month and talk about how to roll out the red carpet for Uber, since Uber donated a lot of $$ for his election campaign! Nothing there gave SDC software any legal driving license.


What do they talk about? It's the first sentence under that link - "Governor Doug Ducey has announced the members of the Arizona Self-Driving Vehicle Oversight Committee - a team of transportation, public safety and policy experts who will support the state in the research and development of new "self-driving technology" that will allow vehicles to drive *without direct or active human operation*." So NO driving license whatsoever, only permits to operate given to the companies doing the research and development.



doyousensehumor said:


> Wrong! I have been. I find it interesting.


Again, I know it looks and sounds harsh. It was not intentional and I am sorry for making it look and sound that way. I am sure if you had had more information about the "autonomous cars" oxymoron, you would have been able to immediately understand the scam.

Politicians maybe going to Hooters once a month to enjoy Uber's golden shower and roll out the red carpet for them could be more realistic than you think. I think you're onto something big here....


----------



## uberdriverfornow (Jan 10, 2016)

The driver needs to do some time for this. Unfortunately an example needs to be made so all of these joke sdc "safety drivers" know that these things will never work and to never take your eyes off the road for even a second.


----------



## jocker12 (May 11, 2017)

uberdriverfornow said:


> The driver needs to do some time for this. Unfortunately an example needs to be made so all of these joke sdc "safety drivers" know that these things will never work and to never take your eyes off the road for even a second.


IMO it is the self-driving car software developers who are responsible for their system's failure. This tragedy proves that if the driver is removed/not present/not paying attention, SDC software developed by incompetents kills people on the roads.


----------



## uberdriverfornow (Jan 10, 2016)

people in the vehicle are still responsible for the actions of the car, whether it's the driver or the rider, if there is no driver


----------



## RamzFanz (Jan 31, 2015)

Gung-Ho said:


> Funny they blame the human driver for being inattentive and could have prevented the accident....True.
> 
> But the car was in autonomous mode and ran the bicyclist down like rodent road kill.
> 
> The car F'ed up.


False.

The car detected the pedestrian and, if it had been allowed to, could have avoided the accident. UBER F'ed up by disabling the programming that allowed the car to react. The driver KNEW they were 100% responsible in a situation like this.

You can blame Uber, you can blame the driver (the police have), but you can't blame a SDC that was prevented from doing its job.



uberdriverfornow said:


> The driver needs to do some time for this. Unfortunately an example needs to be made so all of these joke sdc "safety drivers" know that these things will never work and to never take your eyes off the road for even a second.


Except, of course, as you well know, the car could have avoided this if it had been allowed to and thus, did work as intended.


----------



## jocker12 (May 11, 2017)

RamzFanz said:


> The car detected the pedestrian and, if it had been allowed to, could have avoided the accident. UBER F'ed up by disabling the programming that allowed the car to react. The driver KNEW they were 100% responsible in a situation like this.
> You can blame Uber, you can blame the driver (the police have), but you can't blame a SDC that was prevented from doing its job.


This guy here has no clue whatsoever and clearly shows he doesn't know what he is talking about.

"_The car detected the pedestrian and, *if it had been allowed to*_"

"_UBER F'ed up by *disabling the programming* that *allowed the car to react*_"

"_but you can't blame *a SDC that was prevented from doing its job*_"



uberdriverfornow said:


> people in the vehicle are still responsible for the actions of the car


So if something goes wrong with this rope railway, is it the fault of the people inside?


----------



## uberdriverfornow (Jan 10, 2016)

RamzFanz said:


> False.
> 
> The car detected the pedestrian and, if it had been allowed to, could have avoided the accident. UBER F'ed up by disabling the programming that allowed the car to react. The driver KNEW they were 100% responsible in a situation like this.
> 
> ...


lol the only way it worked as intended is if it was specially designed to mow down people on the road

and this article only seems to confirm what I was one of the few people to state when this originally happened, and that's that this driver is liable for manslaughter, for those paying attention

while Elaine contributed to the accident, the driver had a duty to avoid hitting her


----------



## uberdriverfornow (Jan 10, 2016)

jocker12 said:


> This guy here has no clue whatsoever and clearly shows he doesn't know what he is talking about.
> 
> "_The car detected the pedestrian and, *if it had been allowed to*_" -
> 
> ...


nope, just as it relates to sdc's

it already happened somewhere, there was an article about it

a rider got a ticket when the sdc he/she was riding in, committed an infraction


----------



## jocker12 (May 11, 2017)

uberdriverfornow said:


> nope, just as it relates to sdc's
> 
> it already happened somewhere, there was an article about it
> 
> a rider got a ticket when the sdc he/she was riding in, committed an infraction


Yes, that person (identified by the TV station reporting on that story as "the driver") got a ticket.
And no, getting a ticket doesn't mean the person inside is responsible for the software's behavior (sensor readings and actuator actions upon the environment).

There are 2 stories:
Cruise Self-driving car passenger slapped with ticket in San Francisco, police say
British Tesla driver banned after caught in the passenger seat while Autopilot was engaged

You probably referred to the first one, where "The ticketing officer believed that the car was in self-driving mode, however the person inside was cited for failing to yield to a pedestrian, Linnane said. That individual, whether they were driving or not, "is still responsible for the vehicle," she added."

I will go through this step by step.

1. The motorcycle officer is following traffic on a street in SF.
2. A car fails to yield to a pedestrian that is crossing the street.
3. The officer turns the lights showing intention to stop that vehicle.
4. The monitor inside sees the cop following, disengages the self-driving mode, takes control and pulls over. _The monitor is the only person who knows that car was in self-driving mode_.
5. The officer approaches the car asking for driver's license and proof of insurance/registration. While the monitor grabs the license and proof of insurance, the cop continues, asking if the monitor knows why he got pulled over.
6. The monitor mentions the vehicle is in testing, has an operating permit from the city of SF, and _it was in self-driving mode_.
7. The cop takes the driver's license and proof of insurance, going to his motorcycle to verify the individual in the car against the police records. _He doesn't know if the car was in self-driving mode as the monitor told him_, but needs to make a decision according to the law. - "California law requires the vehicle to yield the right of way to pedestrians, allowing them to proceed undisturbed and unhurried without fear of interference of their safe passage through an intersection,"
8. The cop issues a traffic violation citation/ticket in the monitor's name, for not yielding to a pedestrian, and lets him go.

Now, that issued citation doesn't mean the individual in the car did something wrong. The citation states that the traffic officer witnessed a traffic violation and proceeded as required.

*There is no law in the US or in the world to penalize an infraction done by a robot*. And *there was no way to check the system's engaging-disengaging logs on the spot* to verify if the monitor was telling the truth or not.

*The cops are not establishing fault*. That is the court's job, but people, by not going to court for their tickets, are subsequently admitting fault and choosing to pay rather than make their case.

*By issuing that citation, the officer gave that monitor the opportunity to go to court*, see a judge, make his case, and prove he was not actively driving that car, so he didn't do anything wrong. _The judge is the one to analyse the information presented by both parties and decide._

The monitor's only wrongdoing was regarding the job Cruise hired him to do - monitor the robot, take over/cover up in case something goes wrong, and report back. He failed to cover up/take over.

In the second article, that idiot already admitted fault. Tesla is not testing; there are well-defined software limitations described by the company in the legal documentation given to its customers, and it is almost impossible, once you have signed acknowledging that, to come out and say you never knew about it. "Patel has pled guilty to the offense, and has been banned from driving for 18 months, and will be required to pay a £1,800 fine, carry out 10 days rehabilitation, and to perform 100 hours of community service."


----------



## RamzFanz (Jan 31, 2015)

uberdriverfornow said:


> lol the only way it worked as intended is if it was specially designed to mow down people on the road
> 
> and this article only seems to confirm what I was one of the few people to state when this originally happened, and that's that this driver is liable for manslaughter, for those paying attention
> 
> while elaine contributed to the accident, the driver had a duty to avoid hitting her


The car was NOT in autonomous mode. Its avoidance features were disabled. It detected the woman 6 seconds before impact and would have reacted according to data retrieved from the system. Blaming the car is wholly inaccurate. The driver was 100% in charge of avoidance. But you knew that, didn't you?
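The 6-second figure can be sanity-checked with back-of-the-envelope physics (the 7 m/s² deceleration below is an assumed typical hard-braking value on dry pavement, not a number from the report):

```python
# Back-of-the-envelope check: could a car at ~44 mph stop within a
# 6-second detection window? (Deceleration value is an assumption.)

MPH_TO_MS = 0.44704            # metres per second per mile per hour
speed = 44 * MPH_TO_MS         # ~19.7 m/s
detect_ahead_s = 6.0           # detection reported ~6 s before impact
decel = 7.0                    # m/s^2, assumed hard braking on dry asphalt

distance_to_impact = speed * detect_ahead_s      # distance covered in 6 s
stopping_distance = speed ** 2 / (2 * decel)     # v^2 / (2a)

print(f"covered in 6 s at constant speed: {distance_to_impact:.0f} m")
print(f"full-braking stopping distance:   {stopping_distance:.0f} m")
```

Under these assumptions the car covers roughly 118 m in those 6 seconds but needs only about 28 m to stop, which is the poster's point: with braking enabled, there was room to avoid the impact.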


----------



## uberdriverfornow (Jan 10, 2016)

RamzFanz said:


> The car was NOT in autonomous mode.


Yet again, you lie. I mean, nothing comes out of your mouth that isn't a lie.

https://www.google.com/search?q=ube...rome..69i57.6183j0j9&sourceid=chrome&ie=UTF-8



> The vehicle was traveling in autonomous mode at the time of the crash.





> company's sensing system, was in autonomous mode with a human





> was in autonomous mode when it struck Elaine Herzberg around 10 p.


I can go on and on.



jocker12 said:


> Yes, that person (identified by the TV station reporting on that story as "the driver") got a ticket.
> And No, getting a ticket doesn't mean the person inside is responsible for software behavior (sensors readings and actuators acts upon the environment).
> 
> There are 2 stories
> ...


When the cop issues an infraction, he's obviously issuing it to the person in the car. You don't issue citations to robots. Lol

I mean, come on, man. Get real.


----------



## RamzFanz (Jan 31, 2015)

uberdriverfornow said:


> Yet again, you lie. I mean, nothing comes out of your mouth that isn't a lie.
> 
> https://www.google.com/search?q=ube...rome..69i57.6183j0j9&sourceid=chrome&ie=UTF-8
> 
> ...


Clickbait headlines? That's all you have?

Let's address the facts you continue to ignore:

A) The car's avoidance system was disabled. That alone means it was not in autonomous mode. Autonomous, by its very definition, requires the ability to avoid an accident. This car could not, by intent, and thus was not a self-driving car. It was a data-gathering test vehicle in manual mode.

B) The driver full well knew she was 100% responsible for avoiding any impact. If the human is 100% responsible, that is by definition, manual mode.

C) The driver, who lied about what she was doing, has been found at fault and was distracted driving.

Human driver responsible = manual mode

No human driver responsible = autonomous mode.

Also, despite my use of your terms, it's not autonomous in any mode; it's self-driving. Autonomy hasn't been reached and won't be for a long time.


----------



## uberdriverfornow (Jan 10, 2016)

RamzFanz said:


> Clickbait headlines? That's all you have?
> 
> Let's address the facts you continue to ignore:
> 
> ...


lol all you do is lie, it's really comical


----------



## RamzFanz (Jan 31, 2015)

uberdriverfornow said:


> lol all you do is lie, it's really comical


Once again, I challenge you to point out the lie.


----------



## uberdriverfornow (Jan 10, 2016)

RamzFanz said:


> Once again, I challenge you to point out the lie.


lol

can't help but feel bad for people like you


----------



## jocker12 (May 11, 2017)

RamzFanz said:


> *The car was NOT in autonomous mode*. Its avoidance features were disabled. It detected the woman 6 seconds before impact and would have reacted according to data retrieved from the system. Blaming the car is wholly inaccurate. The driver was 100% in charge of avoidance. But you knew that, didn't you?


(attachment image)

From the NTSB preliminary report

Somebody... anybody... please call an ambulance for Jason Buchanan @RamzFanz from St. Louis. Fast!



uberdriverfornow said:


> When the cop issues an infraction, he's obviously issuing it to the person in the car. You don't issue citations to robots


"*The cops are not establishing fault*. That is the court's job; people who don't go to court over their tickets are effectively admitting fault, choosing to pay rather than make their case."

"*By issuing that citation, the officer gave that monitor the opportunity to go to court*, see a judge, make his case, and prove he was not actively driving that car and so did nothing wrong. _The judge is the one to analyse the information presented by both parties and decide._"


----------



## RamzFanz (Jan 31, 2015)

uberdriverfornow said:


> lol
> 
> can't help but feel bad for people like you


Still waiting.

The next time you call me a liar and refuse to address the facts, you go on ignore.


----------



## uberdriverfornow (Jan 10, 2016)

RamzFanz said:


> Still waiting.
> 
> The next time you call me a liar and refuse to address the facts, you go on ignore.


lol

It's like talking to Satan himself, the father of all lies.


----------



## uberdriverfornow (Jan 10, 2016)

jocker12 said:


> View attachment 238818
> 
> 
> From the NTSB preliminary report
> ...


as I said, the officer is issuing the citation to the person in the vehicle

exactly which part of that statement is wrong?


----------



## jocker12 (May 11, 2017)

You are describing a procedure, but that is irrelevant. Apparently you believe that is enough to prove that person was responsible for a traffic violation, but it is not.

The fact that a cop follows a procedure, giving the traffic violation citation/traffic ticket to the person in the car, doesn't mean that person is automatically at fault.

"the ticket constitutes only a citation and summons to appear at traffic court, with a determination of guilt to be made only in court." - https://en.m.wikipedia.org/wiki/Traffic_ticket


----------



## uberdriverfornow (Jan 10, 2016)

jocker12 said:


> You are describing a procedure but that is irrelevant. Apparently you believe that is enough to prove that person was responsible for a traffic violation, but is not.
> 
> The fact that a cop follows a procedure, giving the traffic violation citation/traffic ticket to the person in the car, doesn't mean that person is automatically at fault.
> 
> "the ticket constitutes only a citation and summons to appear at traffic court, with a determination of guilt to be made only in court." - https://en.m.wikipedia.org/wiki/Traffic_ticket


i never said issuing a citation is an automatic guilty verdict


----------



## doyousensehumor (Apr 13, 2015)

Here's another (imperfect) analogy:

It's like a house under construction, with a new fire sprinkler system. There is a guy paid to be there with a bucket of water (a monitor), just in case. The house burns down.

NTSB investigates and it turns out the sprinkler system did indeed detect a fire, but, some dumbass turned off the water supply to the sprinklers. Also they discover the water boy was watching Hulu before the incident and didn't notice the fire until it was too late.

To conclude from that example that automatic sprinklers are homicidal-fire-killing-machines would be a stretch. I would reason it neither proves nor disproves that that particular sprinkler design was effective. I was focused on why the monitor didn't do his job. Jocker12 would probably say the sprinkler failed. And he is right. It matters how you look at it. All I am suggesting is to try to be objective. I'll admit, though, SDCs just might turn out to be homicidal machines!

I would agree that the Uber Volvo was in autonomous mode. It was driving itself. One setting (whatever you call it) of the autonomous mode was disabled. Still officially autonomous.


----------



## jocker12 (May 11, 2017)

doyousensehumor said:


> Here's another (imperfect) simile:
> 
> It's like a house under construction, with a new fire sprinkler system. There is a guy paid to be there with a bucket of water, (a monitor) just in case. The house burns down.
> 
> ...


One detail - the disabled system was a Volvo factory-installed driver-assist feature, not part of the autonomous hardware and software developed by Uber. From comment #7 above: "Volvo's factory-installed Active Driver Assist was/is NOT part of the self-driving system additionally installed by Uber on that SUV. Because these 2 different systems have different ways to detect obstacles in the car's path, they could also generate conflicting decisions, and that is the reason the Volvo FACTORY system was disabled.

Self-driving car developers' goal is to ELIMINATE any conflicting readings in their systems, because those could generate errors."

More details here - https://uberpeople.net/threads/fata...positives-all-the-competitors-havetoo.258965/
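The conflicting-decisions point can be illustrated with a toy arbiter. Everything here (`arbitrate`, the command names) is a made-up sketch, not Volvo's or Uber's software:

```python
# Illustrative sketch: why running two independent obstacle-detection systems
# at the same time can produce conflicting commands - the stated reason the
# factory driver-assist was disabled on the test vehicle.

def arbitrate(volvo_cmd: str, uber_cmd: str, factory_system_enabled: bool) -> str:
    """Naive arbitration between a factory ADAS and an add-on SDC stack."""
    if not factory_system_enabled:
        return uber_cmd       # one system disabled: single source of truth
    if volvo_cmd != uber_cmd:
        return "conflict"     # e.g. hard brake vs. steer around - which wins?
    return uber_cmd

# Different sensors, different readings, different decisions:
print(arbitrate("hard_brake", "steer_left", factory_system_enabled=True))   # conflict
print(arbitrate("hard_brake", "steer_left", factory_system_enabled=False))  # steer_left
```

Disabling one system sidesteps the arbitration problem entirely, which is exactly the trade-off described in the quoted comment.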


----------



## doyousensehumor (Apr 13, 2015)

Referring to Tesla Autopilot:


jocker12 said:


> In the second article, that idiot already admitted fault. *Tesla is not testing, there are well defined software limitations described by the company in their legal documentation given to their customers* and it is almost impossible once you signed you acknowledge that, to come out and say you never knew about that. "Patel has pled guilty to the offense, and has been banned from driving for 18 months, and will be required to pay a £1,800 fine, carry out 10 days rehabilitation, and to perform 100 hours of community service."


Please correct me if I am wrong,

You see Tesla Autopilot as a _driver aid_: if the car F'ed up and the driver doesn't intervene, it is the driver's fault.

VS

Uber's autonomous system is _not a driver aid but a driver replacement_. The monitor is just there to monitor. If the car F'ed up, it is the car's (Uber's) fault.

I think I understand where you stand. It makes sense now.

Earlier, when you were focusing on "testing self-driving cars," I was reading that as "cars that are self-driving in the future." Meaning, I was assuming that, _for now_, they have the "*status*" of a driving aid, like Tesla Autopilot. Hence my train of thought that the driver has the ultimate responsibility. The governor's "testing self-driving cars" wording is vague.


----------



## doyousensehumor (Apr 13, 2015)

jocker12 said:


> One detail - the disabled system was a Volvo factory installed driver assist feature, not part of the autonomous hardware and software developed by Uber, From comment #7 above - "Volvo factory installed Active Driver Assist was/is NOT part of the self driving system additionally installed by Uber on that SUV. Because these 2 different systems have different ways to detect obstacles in the car's path, they also could generate conflicting decisions, and that is the reason the Volvo FACTORY system was disabled.
> 
> Self driving cars developers goal is to ELIMINATE any conflicting readings in their systems, because those could generate errors."
> 
> More details here - https://uberpeople.net/threads/fata...positives-all-the-competitors-havetoo.258965/


Wholeheartedly agree. It's a given; even more so, OEM blind-spot monitors would be woefully inadequate to wire into a self-driving system.

About false positives: the amount of effort that would have to be put into programming for that is insane. And it can be subjective. A paper bag blows into the lane of travel - probably a false positive; ignore it.
A rabbit - maybe light braking. A dog bolts onto the road - hard brake. A little girl on a tricycle - evade at all costs, even if you have to hit the curb.

I was once on the freeway in my truck, in traffic, going 65, rounding a curve, and suddenly there was a full-size mattress in my lane, 3 seconds ahead. With no time to stop or to check whether there were cars on either side, the split-second decision was that it was safest to stay in lane. Bounced over it and it felt like a speed hump. How would an SDC prioritise something like that?
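The class-dependent responses described above can be sketched as a simple lookup plus a time check. All class names, thresholds, and responses here are illustrative assumptions, not any real SDC's policy:

```python
# Hedged sketch of class-dependent obstacle response: the *class* of a detected
# object, not just its presence, drives the reaction.

RESPONSE_BY_CLASS = {
    "paper_bag": "ignore",       # likely false positive: drive through
    "rabbit":    "light_brake",
    "dog":       "hard_brake",
    "child":     "evade",        # avoid at all costs, even hitting the curb
}

def respond(obstacle_class: str, seconds_ahead: float) -> str:
    """Pick a response; with too little time, staying in lane may beat swerving."""
    action = RESPONSE_BY_CLASS.get(obstacle_class, "hard_brake")  # unknown: be safe
    # The mattress case: an unknown, non-living obstacle with no time to check
    # adjacent lanes - staying in lane can be the least-bad option.
    if obstacle_class == "mattress" and seconds_ahead < 4.0:
        return "stay_in_lane"
    return action

print(respond("paper_bag", 5.0))  # ignore
print(respond("mattress", 3.0))   # stay_in_lane
```

Even this toy version shows why the problem is subjective: someone has to decide, in advance, that a bag is ignorable and a mattress is drivable-over.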


----------



## jocker12 (May 11, 2017)

doyousensehumor said:


> Referring to Tesla Autopilot:
> 
> Please correct me if I am wrong,
> 
> ...


Absolutely correct!



doyousensehumor said:


> Wholeheartedly agree. Its a given even more so OEM blind spot monitors would be woefully inadequate to wire into self driving.
> 
> About false positives, the amount of effort that would have to be put in programming for that is insane. And it can be subjective. A paper bag blows into lane of travel probably false positive. Ignore it.
> Rabbit, maybe light braking. Dog bolts on to road- hard brake. Little girl on tricycle, evade at all costs even if you have to hit curb.
> ...


Here - https://www.adeccousa.com/jobs/self...te-chandler-arizona/?ID=US_EN_2_011897_396393

is Waymo's requirements list for a self-driving vehicle operator, and they don't say the person will drive the car, but OPERATE it.

Of course it's ambiguous because in case something bad happens, the corporation wants to get away with it, while channeling the blame towards the operator.


----------



## Bernice Jenkins (Dec 4, 2016)

Trump Economics said:


> Tempe police released photographs from the pedestrian death involving an Uber self-driving car. A 49-year-old woman was hit and killed by a self-driving Volvo operated by Uber while crossing a street in Tempe
> 
> View attachment 238281
> View attachment 238282
> ...


I didn't notice until now, but all the pics the police released show the road to be as dark as the original dash-cam video, which people swore up and down was not accurate because the road is supposedly brightly lit. Sure doesn't look brightly lit to me in any of those pics.


----------



## heynow321 (Sep 3, 2015)

Bernice Jenkins said:


> I didn't notice until now but all the pics the police released show the road to be as dark as the original dash cam video that was released that people swore up and down was not accurate because the road is brightly lit. Sure doesn't look brightly lit to me in any of those pics.


doesn't matter. lidar and radar don't need visible light to "see". that's the whole ****ing point. SDC's are supposed to be "safer" b/c they can "see" better in crappy conditions like low light or fog.


----------



## Stevie The magic Unicorn (Apr 3, 2018)

Gung-Ho said:


> Funny they blame the human driver for being inattentive and could have prevented the accident....True.
> 
> But the car was in autonomous mode and ran the bicyclist down like rodent road kill.
> 
> The car F'ed up.


The very presence of a safety driver is Uber admitting their car is _*NOT*_ good enough to operate on its own.

Uber isn't in the wrong, the "not" driver was.


----------

