# Prosecute driverless car developers for software snafus, say cyclists



## jocker12 (May 11, 2017)

https://www.theregister.co.uk/2017/11/16/prosecute_driverlesscar_devs_cycling_uk/

A cyclists' association wants software developers to be "criminally prosecuted" for any "errors" in driverless car software.

Cycling UK's submission to the Parliamentary committee considering the Automated and Electric Vehicles Bill (AEV Bill) also suggested that owners of driverless cars should be liable for criminal prosecution if they "switch to the autonomous modes in inappropriate or unsafe locations".

However, the association also made some well-thought-through points, referring to how the current Road Traffic Act only allows human drivers or owners of a vehicle to be prosecuted if someone is killed or injured by a car: "If an AV system designed purely for motorway use offered and were allowed to take control of a vehicle on a busy urban street and that vehicle then overtook a cyclist too closely, hitting and injuring the cyclist, an offence of careless or dangerous driving would be impossible as the legislation is currently written."


----------



## tohunt4me (Nov 23, 2015)

jocker12 said:


> https://www.theregister.co.uk/2017/11/16/prosecute_driverlesscar_devs_cycling_uk/
> 
> A cyclists' association wants software developers for any "errors" in driverless car software to be "criminally prosecuted".
> 
> ...


PROSECUTE THEM ALL !

Before the Judges are Robots too !


----------



## WeirdBob (Jan 2, 2016)

tohunt4me said:


> PROSECUTE THEM ALL !
> 
> Before the Judges are Robots too !


A preview:


----------



## mikes424 (May 22, 2016)

It probably is different in the UK than in the US, but cyclists need to get their own act in order before accusing anyone else.

In Chicago, especially downtown, cyclists run red lights, make illegal turns, and weave back and forth in traffic, among other dangerous moves, all of which puts them at risk. None of it receives any type of citation from the police. When they start obeying the laws, I will listen to their complaints.


----------



## The Gift of Fish (Mar 17, 2017)

mikes424 said:


> It probably is different in the UK than in the US, but cyclists need to get their own act in order before accusing anyone else.
> 
> In Chicago, especially downtown, cyclists run red lights, make illegal turns, and weave back and forth in traffic, among other dangerous moves, all of which puts them at risk. None of it receives any type of citation from the police. When they start obeying the laws, I will listen to their complaints.


No, in the UK (especially in the cities) cyclists also tend to ride like complete tools.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> No, in the UK (especially in the cities) cyclists also tend to ride like complete tools.


Blaming the bicycle riders, or the drivers, or the passengers, or the pedestrians, or any other traffic participant won't change THE FACT that, if people keep dying, the self-driving cars are obviously not changing anything.

The developers are not even pretending there will be no fatalities anymore. Lately, they have started admitting that there will still be collisions and possibly deaths. If that is the case, and they are so relaxed about it, why not prosecute them when a death occurs?


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> The developers are not even pretending there will be no fatalities anymore. Lately, they have started admitting that there will still be collisions and possibly deaths. If that is the case, and they are so relaxed about it, why not prosecute them when a death occurs?


For the same reason that gun manufacturers are not prosecuted when their products kill people. Responsibility lies with the person wielding the weapon, according to current law. In this case, though, who is the person wielding the autonomous car? I'd say that it is Uber.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> For the same reason that gun manufacturers are not prosecuted when their products kill people. Responsibility lies with the person wielding the weapon, according to current law. In this case, though, who is the person wielding the autonomous car? I'd say that it is Uber.


More precisely - the software and hardware developers.


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> More precisely - the software and hardware developers.


Don't know... if lawmakers did that, they would effectively kill self-driving cars. As a developer I would not want to be accountable/jailable for deaths 10, 20, 30 years after I did the design work. Talk about a sword of Damocles....

As far as I know, no individual hardware developer at GM was personally prosecuted as a result of the ignition lock-related deaths. That would be the closest comparison. Designers at GM designed a faulty component that was responsible for causing deaths. In the end, GM paid fines and compensation and that was the end of it. The same is likely to happen when SDCs start to cause deaths attributable to similar design faults.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> Don't know... if lawmakers did that then they would effectively kill self driving cars. As a developer I would not want to be accountable/jailable for deaths 10, 20, 30 years after I did the design work. Talk about a sword of Damocles....
> 
> As far as I know, no individual hardware developer at GM was personally prosecuted as a result of the ignition lock - related deaths. That would be the closest comparison. Designers at GM designed a faulty component that was responsible for causing deaths. In the end, GM paid fines and compensation and that was the end of it. The same is likely to happen when SDCs start to cause death attributable to similar design faults.


In your very good example, GM owned the design and used the part as a component of a product that was 100% theirs.

In our case, Tesla is responsible for their 100% product. In all other cases at this point, software and hardware developers are teaming up with car manufacturers in order to use their vehicles, so you can make a clear distinction between the vehicle itself and the integrated autonomous systems used in self-driving mode (if they ever have the nerve to sell such unpredictable monsters).

When I say developers, I am referring to them as teams or corporate entities directly responsible for loss of human life.


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> When I say developers, I am referring to them as teams or corporate entities directly responsible for loss of human life.


Maybe both manufacturer and operator should be responsible. There have been airliner crashes due to mechanical faults (United 232) after which the operating airline (which had nothing to do with building the plane) was successfully sued along with the airplane manufacturer and the manufacturer of the individual faulty engine. It looks like precedent says that everyone involved in the provision of the service, from component supply through manufacture through service provision, is responsible?


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> Maybe both manufacturer and operator should be responsible. There have been airliner crashes due to mechanical faults (United 232) after which the operating airline (which had nothing to do with building the plane) was successfully sued along with the airplane manufacturer and the manufacturer of the individual faulty engine. It looks like precedent says that everyone involved in the provision of the service, from component supply through manufacture through service provision, is responsible?


I think with the airlines it also involves whoever provides the maintenance on the aircraft. In case of a disaster, you know they will try to pin the fault even on the plane operators (the pilots; if I remember correctly, they tried that in 2009, after Chesley Sullenberger "landed" US Airways flight 1549 in the Hudson River off Manhattan, saving everybody on board).

In my opinion, if you are going to have 1-ton unpredictable moving objects being tested on the road, you'll have to have strict legislation to make sure the idiots building those monsters are held responsible in case something goes wrong.


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> I think with the airlines it also involves whoever provides the maintenance on the aircraft.


True, it seems that United should have detected the faulty engine component during maintenance before flight 232. This leads to the question of to what extent Uber/Lyft will be responsible for ensuring that the products they hire out to pax are safe to ride in. When the first pax death in a fully autonomous Uber/Lyft is caused by the failure of a component, I don't think their assumed argument of "we didn't know; we didn't build it" would fly in court.


> In my opinion, if you are going to have 1-ton unpredictable moving objects being tested on the road, you'll have to have strict legislation to make sure the idiots building those monsters are held responsible in case something goes wrong.


Yes. There is a driving test for human drivers and there should be a government driving test for autonomous vehicles, covering all conditions the vehicle could encounter on the road - wind, rain, snow, ice, fog, construction zones, fallen tree/large object blocking the road, emergency stops, hazard perception, police officer/fireman/construction worker directing traffic, lane closures, cones in the road etc etc. Producers of autonomous cars claim that their cars will drive to a much higher standard than human drivers; we should require them to demonstrate that before allowing full autonomy on the road. What we have now, though, from government on this is, "You car guys say your cars are ready for full auto? That's good enough for us!".

Trouble is, how do we know that it is good enough? Only a fool would believe any claims made by Uber about its products. Google, I don't know.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> Trouble is, how do we know that it is good enough?


First of all, I don't think any corporation will get to the point of deploying large numbers of autonomous vehicles on the road, simply because the public will reject the concept long before regulators even face debating laws and regulations for self-driving robots. It is only artificially created hype, meant to shift public perception about an oxymoron. The developers need the hype because they want to sell their products - software and hardware. As we can see, there is nobody else willing to build a new vehicle other than Tesla, and Tesla will win with their electric car, not with their "AutoPilot"/Windows Vista faulty software or their ridiculous set of cameras around the car. All the partnerships floating around exist because nobody from Silicon Valley wants to risk entering a well-established car manufacturing industry and failing miserably after promising to change the world's transportation with dangerous, tormenting dinosaurs on wheels. Before this point in time, Silicon Valley played a well-orchestrated mind game with the actual car manufacturers, planting the idea that self-driving cars could be the next big thing for transportation. The nerds from Silicon Valley only needed to scare Detroit about the competition and into thinking they would finally be able to get revenge after being humiliated by the Japanese car manufacturers in the '60s.

Second - in case they are asked to provide proof of their safe driving operating software, for sure the nerds of Silicon Valley will point to future updates, upgrades and improvements, using the model we have for any piece of software running any small, medium or big computer today. *The reality, though, is that any driving operating system should be perfect from day one*, even better than any military software, because it will potentially be the set of instructions the dumb dinosaurs/autonomous cars use to move people from point A to point B while protecting everything inside and outside of them. If that level is not achieved from the beginning, then whichever corporation chooses to take the risk and put self-driving cars on the road commercially should be forced to place clear signs in plain view, just as the tobacco industry was forced to print "_smoking kills_" on its products - THIS VEHICLE IS NOT GUARANTEED TO KEEP YOU SAFE AND COULD BE DANGEROUS FOR ALL TRAFFIC PARTICIPANTS.

And third - I really want to see autonomous car enthusiasts' and advocates' faces when this insanity starts fading away. All that will be left of this self-driving-robot delusion will successfully scrub Walmart floors in the middle of the night, with nobody in the stores to get accidentally harmed by the stupid machines - Walmart is 'secretly' testing self-driving floor scrubbers, signaling that more robots are coming. Millennials will tell their grandchildren how Uber, Waymo, GM and all the other culprits promised to fly high and ended up in a stinky ditch.

The answer to your question - "How do we know that it is good enough?" - is that we will know when corporations decide enough is enough and shut down this ridiculous circus of buffoons pretending to be the world's great transportation fortunetellers and visionaries.


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> *The reality, though, is that any driving operating system should be perfect from day one*, even better than any military software, because it will potentially be the set of instructions the dumb dinosaurs/autonomous cars use to move people from point A to point B while protecting everything inside and outside of them. If that level is not achieved from the beginning, then whichever corporation chooses to take the risk and put self-driving cars on the road commercially should be forced to place clear signs in plain view, just as the tobacco industry was forced to print "_smoking kills_" on its products - THIS VEHICLE IS NOT GUARANTEED TO KEEP YOU SAFE AND COULD BE DANGEROUS FOR ALL TRAFFIC PARTICIPANTS.


Conventional, human-driven cars today are not guaranteed to keep occupants safe, and could be dangerous for all occupants and anyone else the car may hit, but they are not required to have warning labels on them. Nor do we expect the vehicles we ride in to be perfect.

The concept of acceptable losses applies here. Governments and societies accept that there will inevitably be accidental deaths in transportation, be it cars, planes, trains, ships etc. This is why such industries are permitted to churn out cars, planes etc even though we know that people will be killed and injured in some of them. This is acceptable, as long as the numbers of deaths and injuries are themselves acceptable. In other industries, such as children's toys, the acceptable death and injury rate is very much closer to zero. Which is why big, sharp, heavy, pointy darts for kids get banned and cars don't, even though cars kill many times more children than big pointy darts do.

Because autonomous cars are in the transportation industry, deaths and injuries will not force these cars off the roads, at least not permanently. But I think it's likely that the government will reconsider its decision to let the autonomous car industry self-regulate. Self-regulation is not a feature of any other transportation sector, or indeed of any industry where public safety is a factor, and it's going to prove not to be the way forward. However, governments are reactive, not proactive, only putting in controls and protocols after they are needed:

- Driving licences required _after_ cars were introduced
- SEC founded _after_ the 1929 stock market crash
- United Nations formed _after_ WW2

It'll be the same with autonomous cars - something will go wrong first, then governments will put the necessary controls and protocols in place.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> Conventional, human-driven cars today are not guaranteed to keep occupants safe,


Go into any dealership in the nation, tell the customers that line, and watch what happens next.



The Gift of Fish said:


> could be dangerous for all occupants and anyone else the car may hit, but they are not required to have warning labels on them


If there is no driver, they should have labels to make sure people can make an informed decision about using any self-driving car. In regular cars today, the driver (that is, the human element) makes the difference when it comes to operating the car and taking responsibility for driving errors. With no driver, self-driving cars whose operating systems are known to be faulty need to be labeled accordingly, just as the tobacco industry was forced to do.

In the '60s, the tobacco industry was advertising smoking as HEALTHY, the same way self-driving car developers advertise their products as MUCH SAFER today. Do you see any similarities here?

Edit - allow me to add an image here for visual effect



The Gift of Fish said:


> acceptable losses applies here


I will invite you to think of your ENTIRE family as "acceptable losses" and then come back to explain how "acceptable" they will be to you as "losses" in any industry process or corporate experiment.



The Gift of Fish said:


> deaths and injuries will not force these cars off the roads, at least not permanently


It will simply stop the public from using them, and as a consequence, it will force them into museums and/or junkyards.

If you need perspective in order to understand what I am telling you, I will give you two different examples - Segways and Concorde.


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> I will invite you to think of your ENTIRE family as "acceptable losses" and then come back to explain how "acceptable" they will be to you as "losses" in any industry process or corporate experiment.


Take parents, for example. As rational adults, they know that every time they take their children out in the family car/minivan/SUV etc, there is a chance that one, or all, of them could be injured or killed during the journey. This is an acceptable risk to the vast majority of parents; they perceive the risk as minute compared with the benefits of driving their children over alternatives such as walking. So, they take the risk that their children may be killed or injured every day. Even walking in the street has an inherent risk; you could be hit by a bus. Eating in a restaurant has a risk - you could catch botulism and die an agonizing death. We don't require every activity in life to be risk-free, because we accept that some death and injury is not practically avoidable. Otherwise, people would never leave their houses, and would live in a plastic bubble like Michael Jackson did for a while. Acceptance of risk is what is meant by acceptable losses; it is unrelated to the sadness that individuals feel after such a loss.


> If you need perspective in order to understand what I am telling you, I will give you two different examples - Segways and Concorde.


Segways were not successful because people did not want to look like a complete tit riding one down the street. Concorde was axed not because of the Paris crash - other planes such as 747s, 727s, 737s, L1011s etc had many more crashes than one and were not withdrawn - Concorde was axed because it was not economically viable, due to spiralling maintenance costs on 35-year-old aircraft, no permission to fly supersonic over the continental US, noise and contaminant pollution, massive fuel consumption and other factors.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> As rational adults, they know that every time they take their children out in the family car/minivan/SUV etc, there is a chance that one, or all of them, could be injured or killed during their journey.


I beg to differ. As rational adults and parents, they know they are probably the only ones who can protect their children from any harm or danger, especially when one of the parents is driving a vehicle with all of them inside. Everything changes when they use other forms of transportation in which they cannot be in control.

So if all of your family were declared "acceptable losses" in an industry process or a corporate experiment, you'd be OK with it? No problem, you were waiting for it to happen, sort of thing. Right?



The Gift of Fish said:


> Segways were not successful because people did not want to look like a complete tit riding one. Concorde was not axed because of the Paris crash - other planes such as 747s, 727s, 737s, L1011s etc had many more crashes than one and were not withdrawn - Concorde was axed because it was not economically viable, due to spiralling maintenance costs on 35-year-old aircraft, no permission to fly supersonic over the continental US, noise and contaminant pollution, massive fuel consumption and other factors


A Lesson in Innovation - Why did the Segway Fail?

The Real Reason Why the Supersonic Passenger Jet Concorde Failed

My point is - Where are they today? Still here or in the museums/junkyards?


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> I beg to differ. As rational adults and parents, they know they are probably the only ones who can protect their children from any harm or danger, especially when one of the parents is driving a vehicle with all of them inside. Everything changes when they use other forms of transportation in which they cannot be in control.


Rational drivers know that they cannot prevent all accidents. The proof of this is that if people could prevent all accidents, there would be no accidents. No sane person crashes on purpose.


> So if all your family will be declared "acceptable losses" in an industry process or a corporate experiment you'll be OK with it?


Individuals are not personally identified/selected as the ones to be in accidents, so that could not happen. Not sure what you mean by "industry process", but I would not allow my family to participate in a "corporate experiment". Actually, I'd probably allow my sister-in-law, but that'd be about it.


> A Lesson in Innovation - Why did the Segway Fail?
> 
> The Real Reason Why the Supersonic Passenger Jet Concorde Failed
> 
> My point is - Where are they today? Still here or in the museums/junkyards?


These articles are correct - these products ultimately failed due to mismatches between the product offered and customer demands, and additionally due to operating conditions, costs and environment in the case of Concorde, not because of injuries or fatalities.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> Rational drivers know that they cannot prevent all accidents. The proof of this is that if people could prevent all accidents, there would be no accidents. No sane person crashes on purpose.
> Individuals are not personally identified/selected as the ones to be in accidents, so that could not happen. Not sure what you mean by "industry process", but I would not allow my family to participate in a "corporate experiment". Actually, I'd probably allow my sister-in-law, but that'd be about it.
> These articles are correct - these products ultimately failed due to mismatches between the product offered and customer demands, and additionally due to operating conditions, costs and environment in the case of Concorde, not because of injuries or fatalities.


I will invite you to read my post - https://uberpeople.net/threads/repl...l-take-30-years-or-more-aurora-ceo-ch.200691/

I know it is long, but you will understand how self-driving cars are NOT SAFER. It's only the corporations pitching what potential customers would like to hear. There is NO imminent danger for an individual to die in a car accident in 5 to 10 thousand years of driving, based on a fluctuating yearly average.

You can ask your sister-in-law to go to Mars. Convince her she will be the one to change humankind's destiny, and if she agrees, you have a win-win situation.

Those two articles speak about two very different kinds of transportation (similarity), which had great budgets spent on them (similarity), promised to be convenient (similarity), to revolutionize the industry (similarity), to change the way people live (similarity), and failed. The autonomous car industry is following the same steps, making the same claims and also making the same mistakes.


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> There is NO imminent danger for an individual to die in a car accident in 5 to 10 thousand years of driving, based on a fluctuating yearly average.


Ok, we'll have to disagree on that. I believe that there is an (albeit very small) risk of an accident every time I get in a car or board a plane, no matter who/what is driving.


> You can ask your sister in law to go to Mars.


The technology's not ready yet. Too long a wait for my liking.


> Convince her she will be the one to change human kind destiny and if she agrees, you have a win win situation.


Convincing her to move to New Jersey would be a win. But anyway, enough about her.


> Those two articles speak about two very different kinds of transportation (similarity), which had great budgets spent on them (similarity), promised to be convenient (similarity), to revolutionize the industry (similarity), to change the way people live (similarity), and failed.


Yes, lots of new products fail to meet their producers' sales expectations for lots of different reasons. But that generality doesn't really help us predict which future products will succeed or fail; we'd need to look at each of the specific reasons individually and see if they transfer. Segway and Concorde failed for very different reasons, and it's unclear whether any of the reasons for either failure are directly relevant to autonomous cars.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> Ok, we'll have to disagree on that. I believe that there is an (albeit very small) risk of an accident every time I get in a car or board a plane, no matter who/what is driving.


Would you like to see the statistics and understand the mathematical probabilities? Chris Urmson mentions the same statistics during the interview I invited you to read. According to the National Highway Traffic Safety Administration's Fatality Analysis Reporting System (FARS) Encyclopedia, at the bottom, under National Rates: Fatalities, you'll see how many deaths occur for every 100 million miles driven. That spreadsheet contains data for the last 22 years, from 1994 to 2015.

For 2015, the first column on the left, the number is 1.13 deaths per 100 million miles driven. So if a person drives 10,000 miles per year on average (let's say), that individual will potentially be a victim in a fatal crash (as a driver, car passenger or pedestrian) once in roughly 8,850 years - 1.13 people die for every 10,000 years of driving, or 1 death every 8,849.56 years.
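As a quick sketch, that arithmetic looks like this (the 1.13 rate is NHTSA's 2015 figure; the 10,000 miles per year is just the assumed average, not an NHTSA number):

```python
# Back-of-the-envelope: NHTSA FARS 2015 rate of 1.13 fatalities per
# 100 million vehicle-miles, with an ASSUMED 10,000 miles driven per year.
FATALITIES_PER_100M_MILES = 1.13
MILES_PER_YEAR = 10_000  # assumption, not an NHTSA figure

miles_per_fatality = 100_000_000 / FATALITIES_PER_100M_MILES  # ~88.5 million miles
years_per_fatality = miles_per_fatality / MILES_PER_YEAR

print(round(years_per_fatality, 2))  # 8849.56
```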

Would you agree NHTSA has accurate data about this?


----------



## The Gift of Fish (Mar 17, 2017)

jocker12 said:


> Would you agree NHTSA has accurate data about this?


Vehicle miles travelled are estimates, so any calculation which uses them as an input will therefore be an estimate. However, as long as the estimation methodology remains the same through time, these figures are useful for identifying trends over a number of years.


> if people drive 10,000 miles per year on average (let's say), that means 1.13 people die every 10,000 years of driving, or 1 death every 8,849 years.


This interpretation of the data is flawed. Where you are going wrong is in the interpretation of the figure that 1.13 people die for every 100 million miles driven. This is not 1.13 deaths for every 100 million miles that each person drives; it is 1.13 deaths for every 100 million miles that all 218 million licensed drivers in the US drive, combined. The figures also include pedestrians, who may drive 0 miles per year.

Another reason this matters is that very few miles are driven in total isolation, with no other cars or pedestrians around. An example: 5 people each live one mile from a freeway on-ramp. Each gets in his car and drives to the ramp, a journey that takes 3 minutes. They all arrive at the ramp at the same time, get in a pile-up on the ramp, and one of them is killed. Each person involved in the fatal accident only travelled one mile (1 fatality per driven mile), yet there were 5 vehicle-miles travelled in total by all involved in the accident (1 fatality per 5 driven miles) in the same three minutes leading up to the crash. Clearly, 1 fatality per driven mile is not the same as 1 fatality per 5 driven miles. So, in working out the average number of miles (or years) driven, it's not simply about how many miles an individual drives; it's also about the miles driven by those around you.

I don't know enough about statistics to be able to calculate the average number of miles driven or number of years driven between fatal accidents for an individual driver. Maybe someone who does can work it out.


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> it is 1.13 deaths for every 100 million miles that all 218 million licensed drivers in the US drive, combined. The figures also include pedestrians, who may drive 0 miles per year.


I think you are getting confused. If a licensed driver chooses to be a pedestrian or a cyclist and drive zero miles, then he could still be a victim, as a pedestrian or as a cyclist, within every 100 million miles driven in the US. This statistic, which is very clear and very explanatory because it gives you context for the total number of victims, shows you how incredibly safe driving already is today. The probability of a person facing a fatal accident is what mathematicians call direct proportion - "a relationship between two variables *(in our case, time and driven distance)* in which one is a constant multiple of the other. In particular, when one variable changes, the other changes in proportion to the first." In other words, the more time you drive and the more distance you cover, the more likely you are to face a fatal accident (as a driver, car passenger, or other traffic participant). For the US, that figure is illustrated by the published spreadsheet.

Essentially, it doesn't matter whether we conclude that one person could drive 8,849 years without facing a fatal accident, or that a fatal accident occurs every hour of the day (so 24 fatal accidents): another 2.123 billion miles were covered safely and 75.8 million people were safe (10,000 miles a year equals ~28 miles/day). *As long as those 100 million miles are driven, in a day* (by 3,571,429 people at 28 miles per day) *or in 10,000 years* (by one person), *only 1.13 fatalities will occur*. I can expand that statistic to be relevant for any metric I want, as long as that metric is part of the mathematical equation used for the calculation.

Edit - the problem with your example is that you need to expand your vision beyond the few people involved, who covered only a few miles before the accident. For that fatal accident to happen, many more millions of miles were safely covered by thousands and thousands of other people.
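For what it's worth, those figures can be cross-checked mechanically (the only inputs are the 1.13 per 100 million miles rate from the NHTSA spreadsheet and the assumed ~28 miles per driver per day):

```python
# Cross-check of the numbers above, using only the assumed inputs:
# 1.13 fatalities per 100 million vehicle-miles, ~28 miles/driver/day.
rate_per_mile = 1.13 / 100_000_000

# 24 fatal accidents in one day implies this many miles driven that day...
miles_in_a_day = 24 / rate_per_mile     # ~2.124 billion miles

# ...spread over this many ~28-mile/day drivers:
drivers_covered = miles_in_a_day / 28   # ~75.9 million people

# And 100 million miles driven in a single day would take:
drivers_for_100m = 100_000_000 / 28     # ~3,571,429 people
```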


----------



## The Gift of Fish (Mar 17, 2017)

I don't know. My knowledge of statistics isn't good, but I still don't think that fatality rates can be converted into expected time until a given person is involved in a fatal crash.

In my example of the five people who crashed, suppose the accident happened on the morning of January 1, and after the accident all of the survivors were airlifted to hospital where they remained until January 1 of the following year - no more driving that year. Of the sample population of those 5 drivers only, total annual mileage would be 5 and annual fatalities 1. That gives a fatality rate of 0.2 deaths per mile. We also know that annual mileage per driver is 1. If we use your methodology to predict the number of years a driver can drive before being involved in a fatal accident, we would get (5/1)/1 = 5 years. However, all of the drivers drove only one mile in the year, and 100% of them were involved in a fatal accident. Their individual fatal accident rate is 1 fatality per mile and one fatality per year, not one per five years.
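That scenario in a few lines (nothing here beyond the numbers in the example above):

```python
# Five drivers, one mile each, one fatality: the population-level rate
# versus the naive per-driver "years between fatal crashes" it implies.
drivers = 5
miles_each = 1
fatalities = 1

total_miles = drivers * miles_each  # 5 vehicle-miles in the year
rate = fatalities / total_miles     # 0.2 deaths per mile

annual_miles_per_driver = 1
predicted_years = (total_miles / fatalities) / annual_miles_per_driver

# Yet all five drivers were in the fatal crash in year one, so the
# population rate cannot be read as an individual expectation.
print(rate, predicted_years)  # 0.2 5.0
```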

¯\_(ツ)_/¯


----------



## jocker12 (May 11, 2017)

The Gift of Fish said:


> I don't know. My knowledge of statistics isn't good, but I still don't think that fatality rates can be converted into expected time until a given person is involved in a fatal crash.
> 
> In my example of the five people who crashed, suppose the accident happened on the morning of January 1, and after the accident all of the survivors were airlifted to hospital where they remained until January 1 of the following year - no more driving that year. Of the sample population of those 5 drivers only, total annual mileage would be 5 and annual fatalities 1. That gives a fatality rate of 0.2 deaths per mile. We also know that annual mileage per driver is 1. If we use your methodology to predict the number of years a driver can drive before being involved in a fatal accident, we would get (5/1)/1 = 5 years. However, all of the drivers drove only one mile in the year, and 100% of them were involved in a fatal accident. Their individual fatal accident rate is 1 fatality per mile and one fatality per year, not one per five years.
> 
> ¯\_(ツ)_/¯


You seem to maintain a tunnel vision, while the NHTSA - FARS Encyclopedia puts you in Earth's orbit to see and understand everything from above. Your model could only be considered if the entire US population were your five people, and all the variables applied to them.

There is no problem though, because I am not the only one explaining how safe driving is today. Here is the interview Recode did with Chris Urmson, former Google X (now Waymo) CTO and today the CEO of Aurora Innovation, a startup dedicated to driverless-car software, data and hardware, though not the cars themselves. During the interview he mentions the same statistic I am telling you about.

Do you think he knows what he is talking about?


----------

