# The first death resulting from a crash involving a self-driving car!!



## DriverX

And so it begins.

http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s

https://www.teslamotors.com/blog/tragic-loss

Clearly Tesla's QA dept. isn't up to the challenge of ensuring the safety of driverless vehicles. Why would it not occur to someone at Tesla that scanning the roadway more than 3 feet off the ground would be a vital safety requirement?

The first death in a series to come. I'd expect a recall soon, and I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.

Fear not, drivers: they will need us for quite some time to come.


----------



## Feisal Mo

What a monumental waste of time and money by Uber and other companies on this BS technology.

http://www.nytimes.com/2016/07/01/b...n-region&region=top-news&WT.nav=top-news&_r=0


----------



## Bart McCoy

Some wild stuff here: detecting the street as being "clear" because it didn't anticipate trailers with high road clearance.


----------



## ChortlingCrison

That'll set it back for a few more years.


----------



## SafeT

Yes, the self driving car is here. Finally!

And the good news is... it drives idiots dumb enough to believe it works straight into semi trucks.


----------



## F213

Ah yes, human error still blueprints itself in our rise to the thinking age.


----------



## LAuberX

the driver was not using his head anyway...


----------



## LAuberX

It was on autopilot with the 40 year old driver/owner behind the wheel...


----------



## Uberchampion

I wonder if the poor guy would have fared any differently if the car wasn't on autopilot?

When it's your time to go....


----------



## There’s no need to tip

Feisal Mo said:


> What a monumental waste of time and money by Uber and other companies on this BS technology.
> 
> http://www.nytimes.com/2016/07/01/b...n-region&region=top-news&WT.nav=top-news&_r=0


This article provides a little more info: http://www.latimes.com/business/la-fi-tesla-crash-20160630-snap-story.html

Yes, this BS technology that has resulted in 1 (ONE) fatality thus far vs. how many with human drivers on a DAILY basis? On my way home from work today, 3 different people almost changed lanes into me. It is a known fact that these systems aren't sophisticated enough yet to handle all situations. The article I posted explains more about some of the issues. The technology is still very young. So I guess, according to you, anything that takes time and money to perfect is just a waste and we shouldn't bother, right? Look how far the technology has come in such a short period of time.


----------



## LAuberX

Uberchampion said:


> I wonder if the poor guy would have fared any differently if the car wasn't on autopilot?
> 
> When it's your time to go....


I'm pretty sure if he was actually looking out the windshield he would have seen the tractor trailer in front of him.

the autopilot in high end cars is amazing... but you can't fall asleep or have your head down in your phone!


----------






## observer

http://jalopnik.com/first-fatal-tesla-autopilot-crash-sparks-nhtsa-investig-1782916450


----------



## painfreepc

Self-driving cars are not ready. They are nowhere near ready, and I've said this in several threads. All you guys who think this is going to happen in a few years are crazy; keep dreaming.

Some of you guys may think I'm nuts or something, but this is why I like the movie Oblivion with Tom Cruise: even the advanced AI in that movie knew that a human being would be more resourceful than an AI drone.

https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk


----------



## Bart McCoy

Not surprised. I still refuse to believe you can program a computer to account for an infinite number of driving scenarios. If they can't account for making sure the roadway is clear more than 3 feet off the ground, then we are still a decade away from those autonomous cars being used for livery.


----------



## Fireguy50

Ironically, they're still safer than human drivers! LOL


----------



## observer

I would not attribute this to the car but to driver error.

The driver should have been paying attention.


----------



## DriverX

There's no need to tip said:


> This article provides a little more info: http://www.latimes.com/business/la-fi-tesla-crash-20160630-snap-story.html
> 
> Yes, this BS technology that has resulted in 1 (ONE) fatality thus far vs how many with human drivers on a DAILY basis? On my way home from work today 3 different people almost changed lanes into me. It is a known fact that these systems aren't sophisticated enough yet to handle all situations. The article I posted explains more about some of the issues. The technology is still very young. So I guess according to you anything that takes time and money to perfect is just a waste and we shouldn't bother right? Look how far the technology has come in such a short period of time.


Hyperbolic defenses of tech like this always come from the lemmings. Look, I'm no Luddite; I've had a career in tech for over a decade. But I'm also very aware of the hype that the marketing departments of these tech companies drum up to convince people how awesome and advanced they are.

If you've ever worked in software development, you see how buggy software really is and how unreliable and based on guesswork it can be. Cases in point: the Fitbit is complete crap for getting any reliable data and it burns your wrist, and Windows installed an update this morning despite me having it set to NOT auto-update, and now my machine has crashed 3 times. Fortunately, until now, most people haven't used software for things that can risk their lives. Go ahead, though: you be the guinea pig while your corporate overlords use you as a pawn in a beta test.


----------



## painfreepc

observer said:


> I would not attribute this to the car but to driver error.
> 
> Driver should have been paying attention.


Why would you put an autopilot into a car and then think the customer is going to pay 100% attention to the road? There are drivers out here now driving normal cars who don't pay 100% attention to the road.

Isn't the point of having autopilot so you don't have to pay 100% attention to the controls?

This technology is stupid; it's illogical and makes absolutely no sense whatsoever.

I first saw this kind of car autopilot in one of the Arnold Schwarzenegger movies. No, I'm not talking about Johnny Cab; it was that movie where he was cloned. He and his friend were sitting in the truck facing each other, and I was watching the movie saying to myself there's no way in hell that's going to happen for real, no one's watching the road. And here we are in 2016 and it's happened for real. How ironic.


----------



## DriverX

LAuberX said:


> I'm pretty sure if he was actually looking out the windshield he would have seen the tractor trailer in front of him.
> 
> the autopilot in high end cars is amazing... but you can't fall asleep or have your head down in your phone!


they call it AUTOPILOT when clearly it isn't.

This will be a HUGE lawsuit that Tesla settles fast.

edit
Mr. Brown apparently posted videos of himself riding in autopilot mode. "The car's doing it all itself," he said in one, smiling as he took his hands from the steering wheel.

DOH, famous last YouTubes.


----------



## SafeT

So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?

A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?


----------



## LAuberX

DriverX said:


> I thought you couldn't take your hands off the wheel while in autopilot, not to mention that they call it AUTOPILOT when clearly it isn't.
> 
> This will be a HUGE lawsuit that Tesla settles fast.


The story mentions a warning that comes up when you engage the autopilot to keep your hands on the wheel?
How about a warning to keep your attention OUTSIDE the car when driving!


----------



## LAuberX

SafeT said:


> So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?
> 
> A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?


And here is the problem: they argue the cars are "safer" or "better drivers" than humans... I think it will be great for ambulance-chasing lawyers!


----------



## painfreepc

SafeT said:


> So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?
> 
> A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?


This question was asked on the George Noory show. Are you a George Noory listener? If you are, good for you.

This is a very serious question, and the people behind the technology are not answering it:

your self-driving car will have to make this decision. Who will it decide in favor of, the driver or the idiot who walked out in front of your car?


----------



## DriverX

LAuberX said:


> And here is the problem: they argue the cars are "safer" or "better drivers" than humans... I think it will be great for ambulance-chasing lawyers!


This sorta thing is always the folly of people who think they are smarter than they are. I doubt the guys who coded it wanted to release it yet, but marketing always trumps common sense.


----------



## Fireguy50

I'm curious what the insurance industry is doing for vehicles with this autopilot option. They would have to cover the loss before suing the manufacturer.


----------



## Bart McCoy

DriverX said:


> they call it AUTOPILOT when clearly it isn't.
> 
> This will be a HUGE lawsuit that Tesla settles fast.
> 
> edit
> Mr. Brown apparently posted videos of himself riding in autopilot mode. "The car's doing it all itself," he said in one, smiling as he took his hands from the steering wheel.
> 
> DOH, famous last YouTubes.


Yeah, they will probably file suit, even though the driver was clearly partially at fault. Somehow he didn't even see a big 18-wheeler, since no brakes were applied. Seems the family could sue the trucking company as well.

But let's say the computer realized a tractor trailer jumped in the road. What would the computer do?

1) Swerve to avoid, but possibly cause another accident with somebody/something else?
2) Jam on the brakes and come to a complete stop on the highway?

How would Google's automated cars have saved the day??


----------



## Jermin8r89

It's gonna get implemented, but people know they like control, and it's just gonna fail in a couple of years. I'm already seeing people going back to taxis because of privacy issues and because they get bigger fares than they expected, including me. Taxi drivers in Boston are already using Priuses.


----------



## DriverX

SafeT said:


> So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?
> 
> A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?





painfreepc said:


> This question was asked on the George Noory show. Are you a George Noory listener? If you are, good for you.
> 
> This is a very serious question, and the people behind the technology are not answering it:
> 
> your self-driving car will have to make this decision. Who will it decide in favor of, the driver or the idiot who walked out in front of your car?


George Noory kinda blows. Coast to Coast was better when it was mostly conspiracy stuff. All the bigfoot and ghost and psychic crap is boring as hell.

I digress, though. I think the obvious answer to the hypothetical is that you program the car to brake and not swerve, regardless of whether or not it can stop in time. Swerving, while it could sometimes save a human who is in control of the swerve, would always be considered more dangerous in a binary system. Braking is really the only option for the machine to consider, and this is why these systems will never be better than humans; they will just be more organized, and potentially safer once all other vehicles are controlled by the same system and the roads have been designed for autonomous vehicles. So that'll happen in like 40+ years.
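That brake-first logic is easy to caricature in code. A minimal sketch, assuming a toy kinematics model and made-up function names (this is nobody's actual software):

```python
# Toy sketch of a "brake, never swerve" policy. The stopping-distance
# formula is basic kinematics: d = v^2 / (2 * mu * g).

def braking_distance_m(speed_mps: float, mu: float = 0.7, g: float = 9.81) -> float:
    """Distance needed to stop from speed_mps on pavement with friction mu."""
    return speed_mps ** 2 / (2 * mu * g)

def choose_action(obstacle_distance_m: float, speed_mps: float) -> str:
    """Binary policy: always brake, even when stopping in time is impossible."""
    return "brake"  # swerving is never considered, per the argument above

# At highway speed (~29 m/s, about 65 mph) the car needs roughly 61 m to stop,
# so an obstacle at 30 m means braking cannot avoid impact, only soften it.
```

The point of the caricature: the policy is trivially predictable, which is exactly why it is "safer" for a machine and also why it can never match a human's judgment in the edge cases.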


----------



## ExpendableAsset

There are going to have to be some really huge leaps forward in AI programming and sensory technology to make this even close to viable. Anyone who really understands how complicated driving is and how unbelievably stupid the smartest AI is knows that self driving cars are pure science fiction. We will likely cure cancer and old age, build orbital shipyards, and achieve FTL communication before self driving cars could be smart enough to do something like drive in a city and not be a deathtrap.


----------



## MattyMikey

Working in insurance this is going to be a nightmare. The good thing is this is not even in the 10 year plan for big companies, so don't think drivers being replaced is going to happen anytime soon. 

There would be multiple people responsible, so who would be primary? The driver, the car manufacturer, or the company that programmed the software?

With so many variables and multiple responsible parties, my guess is insurance would still be around. Then the insurance companies would start the subrogation process to seek reimbursement from others that should be responsible. I would guess that vehicle owners would still pay for it.

I'm not looking forward to this when it comes. Luckily I'm going to retire in about 15-16 years and likely won't have to deal with it much, if at all.


----------



## Rat

LAuberX said:


> I'm pretty sure if he was actually looking out the windshield he would have seen the tractor trailer in front of him.
> 
> the autopilot in high end cars is amazing... but you can't fall asleep or have your head down in your phone!


But that is exactly why people want them.


----------



## observer

painfreepc said:


> Why would you put an autopilot into a car and then think the customer is going to pay 100% attention to the road? There are drivers out here now driving normal cars who don't pay 100% attention to the road.
> 
> Isn't the point of having autopilot so you don't have to pay 100% attention to the controls?
> 
> This technology is stupid; it's illogical and makes absolutely no sense whatsoever.
> 
> I first saw this kind of car autopilot in one of the Arnold Schwarzenegger movies. No, I'm not talking about Johnny Cab; it was that movie where he was cloned. He and his friend were sitting in the truck facing each other, and I was watching the movie saying to myself there's no way in hell that's going to happen for real, no one's watching the road. And here we are in 2016 and it's happened for real. How ironic.


*Always follow manufacturer instructions. Especially when driving a two ton deadly machine.*

When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot "is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle" while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to "Always keep your hands on the wheel. Be prepared to take over at any time." The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

*He had already almost been killed for NOT PAYING ATTENTION WHILE DRIVING.*

_The Verge_ also notes that Tesla referred to Brown as a friend of the company, and that he recorded a somewhat viral video of his Model S nicknamed 'Tessy' having a near crash earlier. He said he had not been watching the road and that his car saved his life.

*In this guy's case he got a second chance and didn't learn from his first mistake.*


----------



## Avenig

I think the role of human drivers will be around for a while yet.

http://mobile.abc.net.au/news/2016-07-01/tesla-driver-killed-while-car-was-in-on-autopilot/7560126


----------



## tohunt4me

DriverX said:


> And so it begins.
> 
> http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s
> 
> https://www.teslamotors.com/blog/tragic-loss
> 
> Clearly Tesla's QA dept. isn't up to the challenge of ensuring the safety of driverless vehicles. Why would it not occur to someone at Tesla that scanning the roadway more than 3 feet off the ground would be a vital safety requirement?
> 
> The first death in a series to come. I'd expect a recall soon, and I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.
> 
> Fear not, drivers: they will need us for quite some time to come.


I saw this on the news.
Can you imagine the horror of a car full of paying customers being DRIVEN INTO AN ACCIDENT?

DRIVEN INTO DEATH BY A ROBOT?

Can you imagine this case before a jury?

Not saying a human driver would have or could have done any better.

Just the thought of this, perishing under ROBOT SUPERVISION!
TERRIBLE.


----------



## tohunt4me

Bart McCoy said:


> Some wild stuff here
> Detecting the street as being "clear" because it didn't anticipate trailers with high road clearance


Well . . . . . now they know.

Congress will probably legislate skirting around 18-wheeler trailers for "safety".

Just like the airbags being recalled for killing people.

Just like the asbestos that was legislated for schools and hospitals.

Wal-Mart and a few other companies are already doing it.
It aids in fuel economy. Some designs will push cars out on a blind-spot left turn.


----------



## tohunt4me

F213 said:


> Ah yes, human error still blueprints it's self in their rise to the thinking age.


It was human error that caused the accident.
Now they will address this issue.

(More children died between crib rails before improvements were made than the number of people who will die this way. More Pinto gas tanks exploded than driverless cars have hit trucks.)


----------



## MattyMikey

tohunt4me said:


> It was human error that caused the accident.
> Now they will address this issue.


Exactly. Learn from these types of unfortunate events. This guy's death will make sure the future is safer. He may have erred in this, but luckily, he will make sure many more people are safer in the future.

With the millions of miles Tesla Autopilot alone has driven before this first death, it is, statistically speaking, MUCH safer than human drivers.


----------



## tohunt4me

DriverX said:


> they call it AUTOPILOT when clearly it isn't.
> 
> This will be a HUGE lawsuit that Tesla settles fast.
> 
> edit
> Mr. Brown apparently posted videos of himself riding in autopilot mode. "The car's doing it all itself," he said in one, smiling as he took his hands from the steering wheel.
> 
> DOH, famous last YouTubes.


" Hold my beer"
"Watch This !"

Compare this guy to the X-plane pilots and the NASA-recruited test pilots.
Testing new systems is risky.
Learning is gained from success and failure.
They learned a Big one out of this.
More will be lost.
More will be learned.


----------



## Uberchampion

Fireguy50 said:


> Ironically, they're still safer than human drivers! LOL


Nothing is 100%, unless you are counting how often Uber screws its drivers....


----------



## observer

Bart McCoy said:


> Not surprised. I still refuse to believe you can program a computer to account for an infinite number of driving scenarios. If they can't account for making sure the roadway is clear more than 3 feet off the ground, then we are still a decade away from being used for livery.


I'm not too sure the 3 foot height had anything to do with this accident.

The car may not "see" the trailer from a few feet away but it definitely should have seen it from 100, 200, 500 feet away where the three foot height should not have been a factor.

Apparently the Tesla comes with 12 sensors that "see" objects for sixteen feet around the vehicle. It also has forward facing cameras and radar units.

Even if the cameras couldn't see the white trailer against the brightly lit sky, the radar units should still have detected it.


----------



## tohunt4me

Jermin8r89 said:


> It's gonna get implemented, but people know they like control, and it's just gonna fail in a couple of years. I'm already seeing people going back to taxis because of privacy issues and because they get bigger fares than they expected, including me. Taxi drivers in Boston are already using Priuses.


I wouldn't be so sure.
Demographics.
America has an aging population.
Industry wants shipping by truck without drivers.
Shipping companies want crewless ships.
It will advance.


----------



## New2This

If he'd done this in Alabama his last words would've been 'HEY Y'ALL CHECK THIS SHIT OUT!'

Too soon?


----------



## painfreepc

observer said:


> *Always follow manufacturer instructions. Especially when driving a two ton deadly machine.*
> 
> When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot "is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle" while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to "Always keep your hands on the wheel. Be prepared to take over at any time." The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
> 
> *He had already almost been killed for NOT PAYING ATTENTION WHILE DRIVING.*
> 
> _The Verge_ also notes that Tesla referred to Brown as a friend of the company, and that he recorded a somewhat viral video of his Model S nicknamed 'Tessy' having a near crash earlier. He said he had not been watching the road and that his car saved his life.
> 
> *In this guys case he got a second chance and didn't learn from his first mistake.*


So that's the way it's going to be: every time one of these cars kills somebody, it's somehow going to be the driver's fault.
But the fact is this technology should not be out for the general public to begin with. It's not ready.


----------



## DriverX

observer said:


> I'm not too sure the 3 foot height had anything to do with this accident.
> 
> The car may not "see" the trailer from a few feet away but it definitely should have seen it from 100, 200, 500 feet away where the three foot height should not have been a factor.
> 
> Apparently the Tesla comes with 12 sensors that "see" objects for sixteen feet around the vehicle. It also has forward facing cameras and radar units.
> 
> Even if the cameras couldn't see "The white trailer and brightly lit sky", the radar units didn't detect it.


Ever seen the radar display on a boat? Not very accurate. The military-grade stuff is probably better, but that ain't in a Tesla. I'd guess for the far-away vision they rely mostly on the camera, tracking the lane lines, road edges, and things entering the frame, so maybe the truck was there not moving and had a brightly lit trailer that blended into the sky or something. Unless it was at night, of course. Image recognition and tracking is still in its tween stage, so there will be lots of unexpected scenarios that don't get considered and tested for when applying it to autonomous driving.

Bottom line: this stuff is beta at best, and you'd have to be nuts to trust your life to a beta release, unless you're getting paid a lot to do it.


----------



## DriverX

ExpendableAsset said:


> There are going to have to be some really huge leaps forward in AI programming and sensory technology to make this even close to viable. Anyone who really understands how complicated driving is and how unbelievably stupid the smartest AI is knows that self driving cars are pure science fiction. We will likely cure cancer and old age, build orbital shipyards, and achieve FTL communication before self driving cars could be smart enough to do something like drive in a city and not be a deathtrap.


unless they are limited to 15 mph


----------



## observer

painfreepc said:


> So that's the way it's going to be: every time one of these cars kills somebody, it's somehow going to be the driver's fault.
> But the fact is this technology should not be out for the general public to begin with. It's not ready.


I do think it was the driver's fault, but I agree the technology is not ready for consumer use. Some of us give too much credit and put too much faith in computers and technology.


----------



## observer

DriverX said:


> Ever seen the radar display on a boat? Not very accurate. The military-grade stuff is probably better, but that ain't in a Tesla. I'd guess for the far-away vision they rely mostly on the camera, tracking the lane lines, road edges, and things entering the frame, so maybe the truck was there not moving and had a brightly lit trailer that blended into the sky or something. Unless it was at night, of course. Image recognition and tracking is still in its tween stage, so there will be lots of unexpected scenarios that don't get considered and tested for when applying it to autonomous driving.
> 
> Bottom line: this stuff is beta at best, and you'd have to be nuts to trust your life to a beta release, unless you're getting paid a lot to do it.


From article,

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.


----------



## DriverX

observer said:


> From article,
> 
> It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.


How they slipped that past the regulators must have taken one helluva lobby.


----------



## Pdrivenyc

http://abcnews.go.com/Technology/wireStory/driving-car-driver-died-crash-florida-40260566


----------



## MattyMikey

observer said:


> I do think it was the drivers fault but I agree the technology is not ready for consumer use. Some of us give too much credit and put too much faith in computers and technology.


I disagree with you. Though the technology is not perfect, it is over 30% safer than human drivers in the US (and 100% safer worldwide).

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.

So basing things on statistics and not emotions, if it is safer now and we know it will only get even safer as they will make changes to avoid this again, I see no problem using it.
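For what it's worth, the arithmetic behind that comparison is simple enough to check against the figures quoted above (variable names are mine; and one fatality is a single data point, so the error bar is huge):

```python
# Sanity-checking the quoted miles-per-fatality figures.
autopilot_miles_per_fatality = 130e6  # ~130M Autopilot miles, 1 fatality
us_miles_per_fatality = 94e6          # US average: 1 fatality per ~94M miles
world_miles_per_fatality = 60e6       # worldwide: 1 fatality per ~60M miles

us_ratio = autopilot_miles_per_fatality / us_miles_per_fatality
world_ratio = autopilot_miles_per_fatality / world_miles_per_fatality

print(f"{us_ratio:.2f}x the US average, {world_ratio:.2f}x the world average")
# -> 1.38x the US average, 2.17x the world average
```

So "over 30% safer in the US" does follow from Tesla's own numbers, with the caveat that a single fatality can't support a precise rate.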

It should however be used as a tool and not completely replace the drivers override capability. At least at this time.


----------



## observer

DriverX said:


> How they slipped that past the regulators must have taken one helluva lobby.


I don't think Tesla believes this needed to be passed by regulators.

Since they require drivers to keep their hands on the steering wheel and stay in control, it's kind of an enhanced cruise control.


----------



## Djc

SafeT said:


> So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?
> 
> A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?


Technically, a proper autonomous car should be able to cycle through scenarios in a split second, just like a human driver, and take the path of least damage, except the computer will have more accurate speed, geometry, and physics data (things the brain handles automatically by learning from experience).

The problem is the computer would have to be programmed to recognize all types of objects and situations. E.g., what if the object that jumps in front is a moose, and the other option is to slightly crash into the metal barrier on the side of the road (the car can squeeze between the moose and the barrier but will hit the barrier along the way)? The car would need to know that going off road was safe and that hitting the barrier is better than going head-on into a moose (knowing there's not enough room to brake).

Unfortunately, self-driving cars will always need manual override for these situations until we can create an AI as smart as humans at learning, evolving, and making decisions, since it is impossible to hard-code all these types of scenarios into a program. Also, if a self-driving car kills someone on the street, who is liable for damages: you as owner/driver, or the car manufacturer? And whose insurance premiums will go up?
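That "path of least damage" idea can be sketched as a cost comparison. Purely illustrative: the maneuvers and damage scores below are invented for the moose example, not from any real planner:

```python
# Toy "scenario cycling": estimate a damage score for each maneuver
# and pick the cheapest. The scores here are made up for illustration.
maneuvers = {
    "brake_straight": 8.0,  # hit the moose at reduced speed
    "swerve_left": 3.0,     # clip the crash barrier
    "swerve_right": 10.0,   # head-on with oncoming traffic
}

best = min(maneuvers, key=maneuvers.get)
print(best)  # -> swerve_left
```

The `min()` is the trivial part; the hard part, as the post says, is producing those damage scores for objects and situations the system was never programmed to recognize.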


----------



## observer

MattyMikey said:


> I disagree with you. Though the technology is not perfect, it is over 30% safer in the US (or 100% in the World) than those driven by human drivers.
> 
> Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.
> 
> So basing things on statistics and not emotions, if it is safer now and we know it will only get even safer as they will make changes to avoid this again, I see no problem using it.
> 
> It should however be used as a tool and not completely replace the drivers override capability. At least at this time.


Even though it may be safer, I don't think it is prudent to just throw out a system like this without more extensive testing or finding some way to make absolutely sure that the driver is paying attention at all times.

All the driverless vehicle proponents need is a couple more freak accidents like this and the public will become very wary of driverless vehicles.


----------



## Jbeck

With that guy dying in the self-driving Tesla in Florida today, I see reluctance in passengers getting into driverless cars in the future. More people will die, and so will self-driving cars. What do you think?


----------



## MattyMikey

observer said:


> Even though it may be safer, I don't think it is prudent to just roll out a system like this without more extensive testing, or without finding some way to make absolutely sure that the driver is paying attention at all times.
> 
> It would only take a couple more freak accidents like this for the public to become very wary of driverless vehicles.


This is how you get the true extensive testing. Though I will say I'm shocked they didn't notice this 3-foot blind spot prior to road tests.

But no, if it is safer now and by real use gets even safer, I disagree. Because it is actually saving lives now. It would be statistically more dangerous to stop its use.

So yes, I agree the public may freak out, because their rationale is emotional and not statistical. If they truly looked at the numbers and took emotion out of the picture entirely, it shouldn't.

I do not disagree with you about having them add more safeguards, as that would make it even safer.

I only disagree with saying it's not ready to be in the field now; it is. Make a few modifications and safeguards now and go back to accumulating fatality-free mileage.

Now if there is another fatality within the US and it happens within 60 million Autopilot miles, then we can revisit this topic, as my opinion would change.


----------



## observer

MattyMikey said:


> I disagree with you. Though the technology is not perfect, it is over 30% safer in the US (over 100% worldwide) than human-driven cars.
> 
> Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.
> 
> So basing things on statistics and not emotions, if it is safer now and we know it will only get even safer as they will make changes to avoid this again, I see no problem using it.
> 
> It should however be used as a tool and not completely replace the driver's override capability. At least at this time.


BTW, if this technology was available today at a reasonable cost, I would use it. As it proved to be safer, I would trust it more and more. It will SOME DAY become the norm.


----------



## Zoplay

observer said:


> BTW, if this technology was available today at a reasonable cost, I would use it. As it proved to be safer, I would trust it more and more. It will SOME DAY become the norm.


When will this technology be available on the market? Technology can make a big change, and nowadays it is advancing at a rapid pace.


----------



## MattyMikey

observer said:


> BTW, if this technology was available today at a reasonable cost, I would use it. As it proved to be safer, I would trust it more and more. It will SOME DAY become the norm.


Tesla is still my dream car and I would use Autopilot myself without question.

In Seattle there was an Uber Select driver whose passenger was saved from a nasty head-on collision.

So this Autopilot I'm sure has saved many, many lives, but unfortunately one person perished.

I know in the article with the Seattle driver, the car stopped before he even had a chance to react, meaning driver involvement is not always required.

If you haven't seen the video, I encourage people to watch it; it puts this Autopilot in a different perspective:

http://www.cbsnews.com/news/teslas-autopilot-helps-seattle-uber-driver-avoid-car-crash/


----------



## OC Lady Uber Driver

The ride you take in a driverless vehicle will only be as safe as all of the situations the programmers allow for. (In that same article, they talked about how Uber wants to use driverless cars in the future.)

The apply-brakes code is probably based on the distance from the front bumper to the other vehicle, not the distance from other points of the vehicle, like the windshield or the roof, which were sheared off as the driverless vehicle traveled under the semi-truck trailer, killing the driver.


----------



## observer

MattyMikey said:


> This is how you get the true extensive testing. Though I will say I'm shocked they didn't notice this 3-foot blind spot prior to road tests.
> 
> But no, if it is safer now and by real use gets even safer, I disagree. Because it is actually saving lives now. It would be statistically more dangerous to stop its use.
> 
> So yes, I agree the public may freak out, because their rationale is emotional and not statistical. If they truly looked at the numbers and took emotion out of the picture entirely, it shouldn't.
> 
> I do not disagree with you about having them add more safeguards, as that would make it even safer.
> 
> I only disagree with saying it's not ready to be in the field now; it is. Make a few modifications and safeguards now and go back to accumulating fatality-free mileage.
> 
> Now if there is another fatality within the US and it happens within 60 million Autopilot miles, then we can revisit this topic, as my opinion would change.


I'm thinking the radar malfunctioned or needs a wider peripheral area. The trailer and truck should have been noticed a long time before the crash even if the cameras didn't see it.

Even if the trailer wasn't spotted because it was 3 ft high, what about the truck?


----------



## MattyMikey

observer said:


> I'm thinking the radar malfunctioned or needs a wider peripheral area. The trailer and truck should have been noticed a long time before the crash even if the cameras didn't see it.


Agree. And I can't wait until the formal investigation is complete. Because if it is more they better recall and fix that immediately.


----------



## observer

MattyMikey said:


> Agree. And I can't wait until the formal investigation is complete. Because if it is more they better recall and fix that immediately.


Sorry, I added a little something after you quoted me.

Why didn't the car notice the truck? It's not 3 feet off the ground.


----------



## DriverX

observer said:


> I don't think Tesla believes this needed to be passed by regulators.
> 
> Since they require drivers to keep hands on steering wheel and in control. It's kind of an enhanced cruise control.


It's street legal in only a few states right now: CA, NV, NY, FL.

Yeah, enhanced cruise control which hypnotizes the lemmings into a false sense of security while driving them toward the edge of the cliff.

Here are some interesting comments on what exactly Tesla Autopilot is compared to Google's LIDAR-based system, which is nowhere near perfect either but much more advanced than Tesla's. Driving a car down a busy highway might be harder than landing a rocket on a moving barge.

Google's system uses an expensive 64-beam LIDAR to localize itself to within 10cm on a detailed pre-existing map. It's incredibly precise. It also uses the LIDAR data to build a 360 degree world model that tracks and predicts movements for all nearby vehicles, pedestrians, and other obstacles, so it's able to plan intelligent paths through complex highway or urban environments.

Tesla's system is reportedly based on monocular forward-looking camera technology from Mobileye. This means it's very unlikely that Tesla's system can localize itself on a map, at least to the degree needed to accomplish lane keeping (GPS isn't reliable enough). However, the forward-looking camera can pick up the location and curvature of highway lane markers, which is more than enough to simply keep the car in its lane and accomplish basic lane change maneuvers.

Each of these systems excels in its own domain. Tesla's is low-cost and will likely be effective at accomplishing Elon Musk's goal of automating 90% of driving within a few years. But handling the last 10% of driving situations is very, very hard. Google created the 90% solution years ago and accurately predicted others would follow soon after, so they decided to focus on solving the really hard problems that will give them a major, potentially even monopolistic advantage in the long run. Kyle Vogt, CEO at Cruise
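The camera-based lane keeping described in that comparison can be reduced, in caricature, to a proportional controller that steers against the measured lateral offset and heading error from the detected lane markers. The function name, gains, and signal conventions below are all invented for illustration; production systems are far more involved.

```python
def lane_keep_steering(lateral_offset_m, heading_error_rad,
                       k_offset=0.5, k_heading=1.5):
    """Toy proportional steering command from camera-derived lane geometry.
    Positive offset means the car sits right of lane center, so the
    command comes out negative (steer left). Gains are made up."""
    return -(k_offset * lateral_offset_m + k_heading * heading_error_rad)

# Car drifted 0.2 m right of center, pointing 0.05 rad right of the lane:
cmd = lane_keep_steering(0.2, 0.05)
print(round(cmd, 3))  # -0.175
```

Note what this sketch *cannot* do: with no world model beyond the lane geometry, there is nothing here that would ever see a trailer crossing the road, which is the gap the thread is about.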

This Musk quote is a real gem considering the current circumstances:

Still, Musk says, "we really need better lane markings in California." He notes Autopilot works best in places where the road markings are very visible, or in traffic, where things move slowly. He emphasizes the importance of the fleet intelligence in creating an autonomous driving system that can succeed, which, of course, brings up questions about his stated concerns over the rise of AI. "If I'm so afraid of AI, why am I doing this?" he joked. "I don't think we have anything to be afraid of with cars driving themselves, they're not going to take over the world… that's a deeper AI, some sort of AI that due to itself or people, tries to drive civilization in a direction that is not good." Long story short, don't worry about your Model S killing you.


----------



## MattyMikey

observer said:


> Srry, I added a little something after you quoted me.
> 
> Why didn't the car notice the truck? It's not 3 feet off the ground.


Great question. The first thing that came to my mind was that part of the truck had already passed the radar, so the driver should have noticed. Not wanting to put blame on the driver, but there likely was at least some shared responsibility. Likely the driver was so used to it being perfect (it saved him in the past) that he was not watching the vehicles in front of him, and may have been browsing the Internet for all we know.


----------



## Optimus Uber

Jbeck said:


> With that guy dying in the self-driving Tesla in Florida today, I see reluctance in passengers getting into driverless cars in the future. More people will die, and so will self-driving cars. What do you think?


Agree. Even if they can make it, will people use it? They have to build trust. Unfortunately, autonomous cars and people driving their own cars aren't going to play well with one another. It has to be all or none.

But I have an idea: Disneyland is looking for a new ride in Tomorrowland. They can set up their cars there.

Remember the Rocket Rods? All electric, wave of the future. They constantly overheated and would catch on fire. Even with electric cars, they can't go 100% full throttle 100% of the time; the quick drain on the battery has an effect on its longevity. The faster they discharge, the more heat builds up. That heat degrades the battery quicker.

Same issue with the Rocket Rods. When they were in development they would overheat and catch on fire. They had to put breakers on them, so if they reached a certain temperature the breaker would trip. They would have to shut the ride down for 30 minutes to let it cool, constantly unloading the passengers. That made it not feasible. It was a short-lived ride, because it was so unreliable.

I really don't think Tesla has been around long enough, or has been 100% forthcoming about the real issues with these cars.


----------



## AllenChicago

Would you guys/gals feel more comfortable riding as a passenger in a self-driving car from SanFrancisco to L.A., or making that trip in a self-flying jet?


----------



## Admitter

Pdrivenyc said:


> http://abcnews.go.com/Technology/wireStory/driving-car-driver-died-crash-florida-40260566


That's sad!!


----------



## observer

AllenChicago said:


> Would you guys/gals feel more comfortable riding as a passenger in a self-driving car from SanFrancisco to L.A., or making that trip in a self-flying jet?


Well since a self driving car has already gone coast to coast,

http://www.gizmag.com/delphi-drive-completed/36859/

I would pick the car.


----------



## Icecool

I knew auto driving is not safe, whether you've got all the sensors or whatever. It is just a computer program; if it gets a virus or crashes, then you die. Nothing can replace a human soul.


----------



## Djc

AllenChicago said:


> Would you guys/gals feel more comfortable riding as a passenger in a self-driving car from SanFrancisco to L.A., or making that trip in a self-flying jet?


Jet, hands down. Commercial airliners fly themselves almost the entire flight now, and it's mandatory that the computer flies/lands in bad weather/visibility. There are also far fewer obstacles and scenarios where the computer will make a mistake.


----------



## F213

tohunt4me said:


> It was human error that caused the accident.
> Now they will address this issue.
> 
> ( more children died between crib rails before improvements were made,than the amount of people who will die this way.More Pinto gas tanks exploded ,than driverless cars hitting trucks )


Oh I know. Patch that mistake out. Find the right coding to avoid oncoming abrupt left turns. It's either us or them. I'm rooting for the underdog.


----------



## No_Username

This is why Uber and their self-driving car idea will never get approved. Two steps backwards for that idea.


----------



## tevee123

I think the only way people will feel safe in those situations is if ALL cars on the road are self driving, therefore taking the human error element out of it.


----------



## NATIVE_INDIAN

Self driving cars are good for DISNEY LAND but not for NYC.


----------



## Fireguy50

SafeT said:


> How would the computer handle this? A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?


As an Emergency Vehicle Driving Instructor, I would advise the programmers to maintain course and apply 100% braking, and hopefully mitigate life-threatening traumatic injuries to the child.

You can't assume the truck would maintain course, nor that the child wouldn't jump or change course. I've been to post-accident investigations where a defensive vehicle tried to navigate to a clear path of safety, but the oncoming at-fault vehicle chose the same emergency course. They still collided head on.

You can't assume two objects will ignore risk and maintain course. So the best suggestion for a computer program is 100% antilock braking while maintaining the original course.

The human brain is still better than any computer. Last I heard, to replace the human brain would take a computer the size of Manhattan island with the power consumption of 1 nuclear reactor. The technology isn't even close.

Human brain is 3 pounds running on 60 watts of power.
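The "brake hard, hold course" advice above rests on simple physics: maximum straight-line braking is predictable, while swerving is not. A rough sketch of the stopping distance under full braking, using the textbook formula d = v²/(2μg); the friction coefficient of 0.8 is my own dry-asphalt guess, not a figure from the post.

```python
def stopping_distance_m(speed_mps, friction_coeff=0.8, g=9.81):
    """Idealized straight-line stopping distance under maximum braking:
    d = v^2 / (2 * mu * g). Ignores reaction time, brake fade, and load
    transfer; friction_coeff=0.8 is a rough dry-asphalt assumption."""
    return speed_mps ** 2 / (2 * friction_coeff * g)

# At highway speed (30 m/s, about 67 mph) the car still needs ~57 m:
print(round(stopping_distance_m(30.0), 1))
```

Even a computer with perfect reflexes cannot beat that distance, which is why "mitigate, don't dodge" is the defensible default.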


----------



## FAC

DriverX said:


> And so it begins.
> 
> http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s
> 
> https://www.teslamotors.com/blog/tragic-loss
> 
> Clearly Tesla's QA dept. isn't up to the challenge of insuring the safety of driverless vehicles. Why would it not occur to someone at Tesla that scanning the roadway over 3 feet off the ground would be a vital safety requirement?
> 
> THe first death in a series to come. I'd expect a recall soon. and I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.
> 
> Fear not drivers they will need us for quite sometime to come.


I didn't want to like your post because someone died. But I do agree with you that self-driving cars are a major concern. It's not like releasing an OS filled with security risks that must be patched frequently. It's technology that has serious, and now deadly, consequences. I've never really supported the autonomous driving thing. Computers do what we tell them to do. AI is still in its infancy. Humans are humans: the only predictable thing about us is that we are unpredictable. Unless all vehicles become autonomous, I can't see how they can program a car to respond to a human driver.


----------



## FAC

tohunt4me said:


> It was human error that caused the accident.
> Now they will address this issue.
> 
> ( more children died between crib rails before improvements were made,than the amount of people who will die this way.More Pinto gas tanks exploded ,than driverless cars hitting trucks )


More Pinto gas tanks exploded than necessary. Ford knew the design was flawed, but did a cost/benefit analysis and determined the loss of human lives was more cost-effective than a recall.


----------



## uberdriverfornow

One of the few times I'm not happy about being right.

https://uberpeople.net/threads/an-u...ll-hit-the-road-‘sooner-than-you-think.84346/


----------



## Fireguy50

Technically it's not a self driving car.
Cruise control with lane departure warning and crash avoidance warning. The driver still has to be involved.

Which is why I'm curious if the auto insurance even mentions auto pilot. And what legal paperwork they signed for that software update.
If there is no extra paperwork then all parties (including the deceased) are/were stupid in a life/death situation.

Every vehicle has warnings!


----------



## uberdriverfornow

Fireguy50 said:


> Technically it's not a self driving car.
> Cruise control with lane departure warning and crash avoidance warning. The driver still has to be involved.
> 
> Which is why I'm curious if the auto insurance even mentions auto pilot. And what legal paperwork they signed for that software update.
> If there is no extra paperwork then all parties (including the deceased) are/were stupid in a life/death situation.
> 
> Every vehicle has warnings!


If they can't get a self-driving car that requires a person in it to work correctly and safely, how exactly are they going to get a self-driving car with no person in it to work correctly and safely?


----------



## Fuzzyelvis

MattyMikey said:


> This is how you get the true extensive testing. Though I will say I'm shocked they didn't notice this glitch about the 3 foot prior to road tests.
> 
> But no, if it is safer now and by real use gets even safer, I disagree. Because it is actually saving lives now. It would be more dangerous statistically to stop its use.
> 
> So yes, I agree that public make freak out because they're rationale is emotional and not statistical. If they truly looked at the numbers and took emotion out of picture entirely, it shouldn't.
> 
> I do not disagree with you about having them add more safeguards as that would make it even safer.
> 
> I only disagree with saying it's not ready to be in the field now, it is. Make s few modifications now and safeguards and go back to adding mileage of fatality free.
> 
> Now if there is another fatality within the US and it happens within 60 million Autopilot miles then we can revisit this topic as my opinion would change.


It doesn't matter if there is another in 60 million miles or 60. It has no statistical significance either way. It's simply not enough miles or accidents to make any determination yet.

It's like if you toss a penny and get heads 3 times in a row, it means nothing. 10 times is a bit suspicious. 100 means either you have a two-headed penny or you should go buy a lottery ticket that day.

One accident or two means nothing right now.
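The penny analogy above is just repeated halving, and the numbers actually back up each step of the comparison:

```python
def p_streak(n):
    """Probability of n heads in a row from a fair coin: 0.5^n."""
    return 0.5 ** n

print(p_streak(3))    # 0.125: happens all the time
print(p_streak(10))   # ~0.001: "a bit suspicious"
print(p_streak(100))  # ~8e-31: the penny has two heads
```

Three in a row happens one time in eight; a hundred in a row is effectively impossible by chance, which is the post's point about small samples proving nothing.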


----------



## LevelX

Well, if both vehicles had been 'auto' then the crash wouldn't have happened at all..... as the truck would have seen the car coming and not crossed in front of it.

In this case the human truck driver was at fault, and yes the car failed to see the issue and react in time due to sensor setup on the Tesla.

The same would have happened with bog standard cruise control, had the driver failed to react and brake in time.

That said, there is always room for improvement on the auto pilot side of things!!!


----------



## Danielson_

Sad to say but it was bound to happen. More accidents like this will occur too. The first but not the last. 

Man is depending on technology way too much.


----------



## sekani

In the meantime, a few thousand people died in human-driven cars and no one cares because they can't blame it on a computer.


----------



## Icecool

LevelX said:


> Well if both units had been 'auto' then the crash wouldn't have happened either..... as the truck would have seen the car coming and not crossed in front of it.
> 
> In this case the human truck driver was at fault, and yes the car failed to see the issue and react in time due to sensor setup on the Tesla.
> 
> The same would have happened with bog standard cruise control, had the driver failed to react and brake in time.
> 
> That said, there is always room for improvement on the auto pilot side of things!!!


You might as well say that if both vehicles had been driven by humans, then either one of them would have avoided the accident.


----------



## LevelX

Icecool said:


> You might as well say that if both vehicles had been driven by humans, then either one of them would have avoided the accident.


If this crash had happened with a human driving each, then the truck would have been deemed at fault. So I fail to see why it's any different with one being auto. Yes, the auto one should have been more alert and avoided the issue, but let's face it, humans are far from perfect and make more mistakes.


----------



## Sydney Uber

Our State Government here in NSW is softening us up for the advent of robot cars. There has been a terrible upwards spike in fatalities here on our roads over the past 12 months. This has come about because Sydney's population is growing by about 80 to 90,000 people per annum, with no new roads built in the last 10 years to accommodate that increase.

They've started to run ads stating "towards zero deaths on New South Wales roads."

Now everybody knows that is impossible, fairy-land stuff, given what occurs on our roads today with humans behind the wheel. But with the rise of the sharing economy, fewer cars on the road that are utilised more and driven by robots, that is the Nirvana Uber is selling to governments around the world.


----------



## Fireguy50

MattyMikey said:


> Though I will say I'm shocked they didn't notice this glitch about the 3 foot prior to road tests.


The trailer should have that DOT red/white tape making its presence glaringly obvious.

observer said:


> From article,
> 
> It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.


Translation:

"Tessy" is the safest car in the world!
Remove seatbelt to walk about the cabin
Drink $14 Starbuck's latte
Watch Harry Potter movies
OH SHIMSCHUYSHENYUJHG.............
_Fin_


----------



## second2noone

Would you entrust your lives or loved ones into the hands of a killing machine?


----------



## LA Cabbie

painfreepc said:


> This question was asked on the George Noory show. Are you a George Noory listener? If you are, good for you.
> 
> This is a very serious question, and the people behind the technology are not answering it:
> 
> your self-driving car will have to make this decision. Who will it decide in favor of: the driver, or the idiot who walked out in front of your car?


Hmmm. I took notice that a lot of cab drivers are coast to coast fans, myself included. You must have been a night driver.


----------



## Icecool

The news didn't say who was at fault for the accident; they are still investigating.


----------



## LevelX

_"The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S."
_
Doesn't take a rocket scientist to work out whose fault it is here....


----------



## OC Lady Uber Driver

Everything we do in life has some element of risk. The first cars weren't perfect, and even today, while we are safer in our vehicles than ever before, once in a while one will burst into flames from electrical issues, have a ruptured gas tank (remember the Pinto!), or the technology will just flat out fail (we are all supposed to be wearing seat belts, which have been required since 1968, but there have been injuries from using those, too). Maybe the driverless conveyance needs to change and become even safer for human transportation, and not just be some converted version of the cars humans drive around today.


----------



## Fireguy50

observer said:


> *He had already almost been killed for NOT PAYING ATTENTION WHILE DRIVING.*
> 
> _The Verge_ also notes that Tesla referred to Brown as a friend of the company, and that he recorded a somewhat viral video of his Model S nicknamed 'Tessy' having a near crash earlier. He said he had not been watching the road and that his car saved his life.
> 
> *In this guys case he got a second chance and didn't learn from his first mistake.*


----------



## observer

Fuzzyelvis said:


> It doesn't matter if there is another in 60 million miles or 60. It has no statistical significance either way. It's simply not enough miles or accidents to make any determination yet.
> 
> It's like if you toss a penny and get heads 3 times in a row, it means nothing. 10 times is a bit suspicious. 100 means either you have a two-headed penny or you should go buy a lottery ticket that day.
> 
> One accident or two means nothing right now.


Statistically it means little, but in terms of how the public perceives driverless tech it would mean a lot.


----------



## Fireguy50

observer said:


> Statistically it means little, but in terms of how the public perceives driverless tech it would mean a lot.


Correct, even though he wasn't following the instructions.
Headline reads:
*SELF-DRIVING CAR KILLS PASSENGER*


----------



## Fireguy50

Not sure which moderator is responsible for this, but my compliments to the chef!


----------



## MattyMikey

Fuzzyelvis said:


> It doesn't matter if there is another in 60 million miles or 60. It has no statistical significance either way. It's simply not enough miles or accidents to make any determination yet.
> 
> It's lile if you toss a penny and get heads 3 times in a row it means nothing. 10 times is a bit suspicious. 100 means either you have a two headed penny or you should go buy a lottery ticket that day.
> 
> 1 accident or 2 means nothing right niw.


You're right, 1-2 accidents mean nothing right now. So improve things, but don't stop them. The technology is ready to be tried, and we can learn from what the Autopilot does. Experience and miles have to come from somewhere, right?
And I wouldn't say 130 million miles has no significant value. That is a lot of mileage and it does have value. It needs more, but don't stop it.

And again, we lost one life. Okay, tragic, but without even searching I know it saved at least 2 others in Seattle with Uber Select. I posted a link. So I'm sure if I did some research on the internet I could find many more cases where the driver would not have had time to react but the Tesla did.

So what does this prove? It proves we are safer today having Tesla Autopilot than not having it. With the programmers learning from their mistakes, it's going to get even safer.

People don't tend to understand that you don't get these real-life learning experiences just on test tracks. It has to be used by the masses.

Another poster talked about autopilot on aircraft. You think it was always perfectly safe? No, but they learned and made adjustments.
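The mileage figures traded back and forth in this thread (one fatality in roughly 130 million Autopilot miles, against Tesla's quoted US baseline of one per 94 million miles) can be sanity-checked with a quick Poisson calculation. This is a back-of-the-envelope sketch using only the numbers already quoted above, not a proper safety analysis.

```python
import math

# Figures quoted in the thread (from Tesla's statement):
autopilot_miles = 130e6             # miles driven on Autopilot
baseline_miles_per_fatality = 94e6  # US average: one fatality per 94M miles

# Expected fatalities over the Autopilot mileage at the baseline rate:
expected = autopilot_miles / baseline_miles_per_fatality

# Under a Poisson model, the chance of at least one fatality even if
# Autopilot were exactly as safe as the average US driver:
p_at_least_one = 1 - math.exp(-expected)

print(round(expected, 2), round(p_at_least_one, 2))  # 1.38 0.75
```

With about 1.4 fatalities expected at the baseline rate, a single fatality is roughly what chance alone would produce, so the data so far can't settle the argument either way; more mileage is needed before the comparison firms up.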


----------



## painfreepc

MattyMikey said:


> You're right, 1-2 accidents mean nothing right now. So improve things, but don't stop them. The technology is ready to be tried, and we can learn from what the Autopilot does. Experience and miles have to come from somewhere, right?
> And I wouldn't say 130 million miles has no significant value. That is a lot of mileage and it does have value. It needs more, but don't stop it.
> 
> And again, we lost one life. Okay, tragic, but without even searching I know it saved at least 2 others in Seattle with Uber Select. I posted a link. So I'm sure if I did some research on the internet I could find many more cases where the driver would not have had time to react but the Tesla did.
> 
> So what does this prove? It proves we are safer today having Tesla Autopilot than not having it. With the programmers learning from their mistakes, it's going to get even safer.
> 
> People don't tend to understand that you don't get these real-life learning experiences just on test tracks. It has to be used by the masses.
> 
> Another poster talked about autopilot on aircraft. You think it was always perfectly safe? No, but they learned and made adjustments.


Yes, I understand that this is not a fully automated autonomous automobile. This is driver-assist braking, keeping the car within the lanes, and maintaining a safe distance from the car in front of you when the cruise control is engaged. Yes, I understand that.

But that is not the way many people are going to use this technology. They're going to use it so they can check Facebook, take selfies, check email, concentrate more on phone conversations, put on makeup, and discipline their kids in the back seat, instead of concentrating on the road.

It's our responsibility to concentrate on the road, to concentrate on driving, not to look for a way so we don't have to.

And speaking of concentrating on the road, why is Uber making us look more and more unprofessional with these non-stop stacked trips, making us divert our eyes from the road to look over at our app for a few seconds and press a few buttons? This is stupid; it needs to stop.

There is no reason why Uber can't hold that call and give it to us after we actually end our trip,

but no, Uber has no intention of doing that, because that would make Uber look bad, since the customer would be looking at the app with a message saying "a car will be available soon" but no actual driver or license plate to show an Uber is actually coming.

Better to make us look unprofessional, like a bunch of idiots playing around with our cell phones looking for the next dollar.


----------



## observer

Eight speeding tickets.

http://www.scpr.org/news/2016/07/01/62212/man-killed-in-tesla-autopilot-crash-had-8-speeding/

Latest was in 2011.


----------



## Jermin8r89

tohunt4me said:


> I wouldn't be so sure.
> Demographics.
> America has an aging population.
> Industry wants shipping by truck without drivers.
> Shipping with ships want crewless ships.
> It will advance.


Yeah, people want things done fast but also want them efficient. Uber is just that. I laugh at people who want privacy, because that died out once smartphones came out. You want more tech? Well, you're going to have no privacy.


----------



## HERR_UBERMENSCH

Can't wait until next year when I can buy a car that drives itself, that will be SO MUCH safer.


----------



## Gung-Ho

No matter to what technical accuracy they are able to fine-tune these driverless cars, there is one thing they still will never have, and that's INTUITION: not just reacting to what's happening, but to what is about to happen. How many times have you been on the highway when you're sure the car a little ahead of you is about to change lanes into yours, which would cause an accident? Intuition tells you to slow. Driverless cars will never have that. In a vacuum they would perform just fine. In a utopian society where every car is driverless, most likely fine. But mix these in with human drivers in hundreds of variable driving conditions and you have potential disaster.


----------



## HERR_UBERMENSCH

Gung-Ho said:


> No matter to what technical accuracy they will be able to fine tune these driverless cars there is one thing they still will never have and that's INTUITION not just reacting to what's happening but what is about to happen. How many times have you been on the highway when you're sure the car a little bit ahead of you is about to change lanes into yours which would cause an accident...intuition tells you to slow. Driverless cars will never have that. In a vacuum they would preform just fine. In a utopian society where every car is driverless most likely fine. But mixing these in with human drivers in hundreds of variable driving conditions and you have potential disaster.


Mixing humans and technology never works out well, ask anyone who has ever done IT Tech Support.


----------



## Jermin8r89

So after everything becomes autonomous, what do we do? It'll close businesses down and make the economy go belly up. So if you're just entering college, you may as well just do robotics or medical.


----------



## UberHammer

Gung-Ho said:


> No matter to what technical accuracy they will be able to fine tune these driverless cars there is one thing they still will never have and that's INTUITION not just reacting to what's happening but what is about to happen. How many times have you been on the highway when you're sure the car a little bit ahead of you is about to change lanes into yours which would cause an accident...intuition tells you to slow. Driverless cars will never have that. In a vacuum they would preform just fine. In a utopian society where every car is driverless most likely fine. But mixing these in with human drivers in hundreds of variable driving conditions and you have potential disaster.


Exactly. This is why the economy has recessions and depressions. There is nothing wrong with the math economists use; downturns result from the degree of unpredictability humans exhibit. There will never be a technology that can navigate human unpredictability without failing, yet the owner of the technology pays the price for the failure. Even the best self-driving technology will come with HUGE litigation costs for its owner.


----------



## HERR_UBERMENSCH

Jermin8r89 said:


> So after everything becomes autonimis what do we do? It'll close businesses down and make the economy go belly up. So if u just entering college u maze well just do robotics or medical


We will need undertakers to handle all of the people jumping from bridges.


----------



## Bart McCoy

observer said:


> I'm not too sure the 3 foot height had anything to do with this accident.
> 
> The car may not "see" the trailer from a few feet away but it definitely should have seen it from 100, 200, 500 feet away where the three foot height should not have been a factor.
> 
> Apparently the Tesla comes with 12 sensors that "see" objects for sixteen feet around the vehicle. It also has forward facing cameras and radar units.
> 
> Even if the cameras couldn't see "The white trailer and brightly lit sky", the radar units didn't detect it.


Why would you say the height didn't have anything to do with it? The car never applied the brakes, either by the driver or by the autopilot. So one can assume the autopilot detected nothing in the way, so no brakes were applied, right? Well, there was a HUGE object in the way. One of the sensors should have seen an obstruction in the road, and SURELY a human being could have seen it coming. These autonomous cars have to be able to do what humans can, and better. Me, I probably would have seen it far down the road and maybe slowed early because I saw the truck creeping (I mean, it's a big truck; it can't really just dart out into the road). A true autonomous vehicle should be able to "sense" danger just like a human would. Tesla failed miserably in this incident.



observer said:


> From article,
> 
> It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.


Clearly to ward off lawsuits. Let's see if it works.



observer said:


> I don't think Tesla believes this needed to be passed by regulators.
> 
> Since they require drivers to keep hands on steering wheel and in control. It's kind of an enhanced cruise control.


Stupidest name in the world: Autopilot. But they require you to have your hands on the wheel. Might as well just call it super cruise control instead of Autopilot.



Icecool said:


> The news didn't say who at fault for the accident they are still investigating


I can't really think of any way the truck could have been in the right if the car was on a highway, as they say. Unless there's a red light or stop sign in the middle of a "highway."


----------



## HERR_UBERMENSCH

> I can't really think of any way the truck could have been in the right, if they say the car was on a highway. Unless there's a red light/stop sign, in the middle of a "highway"


Sure, the unproven technology with the inattentive driver at the controls must have been totally in the right. The professional driver with years of experience driving commercial vehicles should have been watching out for him, probably driving in one of his blind spots. Unfortunately it takes a death to bring this kind of flaw to light. They really couldn't have tested for this situation in the 130+ MILLION miles they supposedly drove this system?


----------



## Bart McCoy

HERR_UBERMENSCH said:


> Professional driver with years of experience driving commercial vehicles should have been watching out for him probably driving in one of his blind spots.?


Whose blind spot? The truck crossed perpendicular to the road the car was on...


----------



## painfreepc

Jermin8r89 said:


> So after everything becomes autonimis what do we do? It'll close businesses down and make the economy go belly up. So if u just entering college u maze well just do robotics or medical


It looks to me like we're actually on the road to creating a society like the ones we see in futuristic sci-fi movies like Robocop and Judge Dredd.

If everything goes robotic, where the hell are all the people going to work? Not everybody is cut out to be a doctor or lawyer or some type of scientist or engineer, and we don't need that many of those jobs.

But I'm not too worried, because the computer technology for this isn't ready. That driver was killed because the computer in his car cannot actually see an object the way we humans see it; it can only react to what it's been programmed to react to.

I love science fiction; my favorite thing to watch is Star Trek. The idea of warp drive and transporters is fascinating. But think about it: if an alien race actually gave us warp drive and transporter technology, we would not have the computer technology to operate it. We could not build a Starfleet; we would not have the computers to operate the ships. As for the transporters, we would not have the computer technology to break the body down into atoms and put it back together again. We're not there yet. We want to believe that we are more than what we are, but we are not.


----------



## LAuberX

In L.A. people pull out in front of you every hour, forcing you to "choose": brake, or hit them!
The truck driver says the Tesla driver was watching/playing "Harry Potter"... he could hear it coming from the crashed car; he said he did not "see" it. And the Tesla was going so fast he never saw it go under his trailer.

So maybe the truck driver did get in the Tesla's "right of way"... if the Tesla was going so fast that it wasn't visible before the truck turned.
Do Teslas have a black box, a.k.a. a "flight recorder," as many cars do?


----------






## Jermin8r89

painfreepc said:


> It looks to me like we're actually on the road of creating a society like we see in futuristic Sci-Fi movies like Robocop and Judge Dredd,
> 
> If everything goes robotic, when the hell are all the people going to work, everybody is not cut out to be a doctor or lawyer or some type of scientist or engineer and we don't need that many of these jobs..
> 
> But I'm not too worried anyway for the computer technology for this is not ready that driver was killed because his computer in his car can not actually see an object the way we humans see it, can only React to what it's been programmed to react to,
> 
> I love science fiction my favorite thing to watch is Star Trek, the idea of warp drive technology and Transporters is fascinating, but think about it if an alien race actually gave us warp drive and transporter technology we would not have the computer technology to operate it, we could not build a Starfleet we would not have the computer technology to operate the ships, for the Transporters we would not have the computer technology to break down the body into atoms and put it back together again, we're not there yet we want to believe that we are more than what we are but we are not..


I'm a sci-fi enthusiast also... I'm a techy person too, but also a trucker. Everything has moved extremely fast within 10 years. Once Apple came out, smartphones exploded. We're seeing things take off too fast. We've had combustion engines for over a hundred years; it took decades to get seatbelts, and some places still don't care about seatbelts. Then hybrids have been out for about 12 years, full electrics like the Leaf for a few years, and now autonomous cars. Social media took off within 4 years. I'm 27, and I've seen a couple of different lifetimes within 10 years. We're seeing social movements we have never seen before. And remember Skynet from Terminator: I'm already seeing Uber and Facebook working together, which is kind of scary in the big scheme of things. We have always explored, but we aren't exploring the oceans or space. The world has also been more violent in the past 6 years than it ever has. We now have cyber attacks, cyberbullying, credit card theft, online stalking. There will be new creativity for criminals. We went to space in the '60s; now we just need to bring a handful of people to space 4 or 5 times a year, to the Moon or Mars, because if you've seen anything like WALL-E, that's what could happen, maybe within 30 years.


----------



## Bart McCoy

LAuberX said:


> View attachment 46882
> 
> 
> In L.A. people pull out in front of you every hour causing you to "choose" brake/or hit them!
> Truck driver says Tesla pilot was watching/playing "Harry Potter"... he could hear it coming from the crashed car, he said he did not "see" it.. and the Tesla was going so fast he never saw it go under his trailer.
> 
> So maybe the truck driver did get in the Tesla's "right of way"... if the Tesla was not going so fast as to not be visible before the truck turned.
> Do Tesla's have a black box? a.k.a. "flight recorder" as many cars do?


If that's the pic, there's no way the Tesla driver did NOT have the right of way. The truck driver is clearly at fault, regardless of the pax not paying attention.


----------



## painfreepc


Jermin8r89 said:


> I'm a sci fye athusist also... I'm also a techy person to but also so a trucker too. Everything has moved extremely fast within 10 years. Once apple came out smart phones exploded. We seeing things takeing off to fast. We also have had compustable engines for over hundred years took decades for seatbelts as some places don't care for seatbelts,then hybrids been out for about 12 years full electric like the leaf a few years now autonomous. Social media took way off with in 4 years. I'm 27 I could see a couple of different life times with in 10 years. We seeing social movements we have never seen before. Now remember terminator skynet I'm already seeing uber and fb working together which is kinda scary in a big sense of things. We have always explored but we ain't exploring oceans or space. Also the world has been the most violent in the past 6 years that it ever has. We have things now as cyber attacks, cyber bulling, credit card theft, child porn stalking. There will be new creativity for criminals. We went to space in the 60s now we need to just bring a hand full of people to space 4 or 5 times a year to the moon or mars cuz if u know anything like wall E that's what could happen maybe within 30 years


Yeah, don't get me started on the cyber war thing.
I ran two DNS servers: a backup authoritative DNS name server for my domains and a non-authoritative caching DNS name server for my home and close friends. I had to shut down the caching DNS server. As for the WALL-E scenario, I think it will take longer than 30 years, but I do see it coming.


----------



## dpv

Leave the autonomous vehicles in the skies. They have no place on the ground; it would just be one extra idiot on the road I'd have to fight with going to and from work.


----------



## dpv

Uberchampion said:


> I wonder if the poor guy would have fared any differently if the car wasn't on autopilot?
> 
> When it's your time to go....


Poor guy? He was an idiot to put his trust in something computer-controlled, probably programmed by someone with horrid driving habits and a bad driving record. These vehicles should be pulled off the road and put in the same category as distracted driving.


----------



## Uberman8263

I feel bad for the family of the driver. That being said, for a young guy like me it's good to know that I will have the opportunity to be an uber independent contractor for the next 30 maybe 40 years.


----------



## Fireguy50

observer said:


> Eight speeding tickets.
> 
> http://www.scpr.org/news/2016/07/01/62212/man-killed-in-tesla-autopilot-crash-had-8-speeding/
> 
> Latest was in 2011.


Was he ticketed for going 88MPH?


----------



## KevinH

I just spoke to a limousine driver who had an attorney passenger in the back seat. Both of them saw a Tesla Model X northbound on HWY 101 in San Mateo, CA yesterday with no driver behind the wheel. They deliberately drove past the car a second time. Maybe someone was in a rear seat, but neither of them could see a person behind the wheel, and a driver behind the wheel is required for autonomous research vehicles in California.


----------



## uberdriverfornow

Gung-Ho said:


> No matter to what technical accuracy they will be able to fine tune these driverless cars there is one thing they still will never have and that's INTUITION not just reacting to what's happening but what is about to happen. How many times have you been on the highway when you're sure the car a little bit ahead of you is about to change lanes into yours which would cause an accident...intuition tells you to slow. Driverless cars will never have that. In a vacuum they would preform just fine. In a utopian society where every car is driverless most likely fine. But mixing these in with human drivers in hundreds of variable driving conditions and you have potential disaster.


Exactly, you can't program common sense.


----------



## Jermin8r89

I'm curious: with Tesla being so groundbreaking, do they give the good old sales pitch with some education for the driver, or do buyers just drive off the lot with a manual inside, or nothing at all, and ride off into the sunset?


----------



## Believe33

DriverX said:


> And so it begins.
> 
> http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s
> 
> https://www.teslamotors.com/blog/tragic-loss
> 
> Clearly Tesla's QA dept. isn't up to the challenge of insuring the safety of driverless vehicles. Why would it not occur to someone at Tesla that scanning the roadway over 3 feet off the ground would be a vital safety requirement?
> 
> THe first death in a series to come. I'd expect a recall soon. and I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.
> 
> Fear not drivers they will need us for quite sometime to come.


This may not be good!


----------



## HERR_UBERMENSCH

Uberman8263 said:


> I feel bad for the family of the driver. That being said, for a young guy like me it's good to know that I will have the opportunity to be an uber independent contractor for the next 30 maybe 40 years.


That is a good thing?


----------



## DriverX

observer said:


> I don't think Tesla believes this needed to be passed by regulators.
> 
> Since they require drivers to keep hands on steering wheel and in control. It's kind of an enhanced cruise control.


They don't require drivers to keep their hands on the wheel. They advise it, but they put no sensors in the steering wheel to enforce it. I'm not saying they should, either, because I believe we have the right to die. Look at all the YouTube videos of jackasses showing off their new Autopilot Teslas, holding their arms up in the air like it's Space Mountain. Too bad for the innocent person who gets taken out by some Mickey Mouse Tesla driver in Space Mountain mode.

Anyway, they had to get approval in the few states that allow it, and that always takes greasing some wheels.


----------



## DriverX

MattyMikey said:


> Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. .


You might want to check that. Tesla doesn't have enough cars with Autopilot on the road to have accrued that many miles in Autopilot yet. That figure might include their internal test miles.
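For context, the comparison in Tesla's statement is simple arithmetic once both figures are normalized to a common denominator. A quick sketch using the numbers from the quote above (whether one fatality is a big enough sample, and whether highway Autopilot miles are comparable to all US driving, are separate questions):

```python
# Rough rate comparison using the figures quoted from Tesla's statement:
# one fatality in ~130 million Autopilot miles vs. the US average of one
# fatality per 94 million miles. A single data point is far too small a
# sample to be statistically meaningful; this only shows the arithmetic.

autopilot_miles = 130e6      # miles Tesla says were driven on Autopilot
autopilot_deaths = 1         # the single fatality in this crash
us_miles_per_death = 94e6    # US average: one fatality per 94M miles

# Normalize both to deaths per 100 million miles
autopilot_rate = autopilot_deaths / autopilot_miles * 100e6
us_rate = 1 / us_miles_per_death * 100e6

print(f"Autopilot:  {autopilot_rate:.2f} deaths per 100M miles")
print(f"US average: {us_rate:.2f} deaths per 100M miles")
```

On Tesla's own numbers the Autopilot rate comes out slightly lower than the national average, which is exactly why the provenance of the 130-million-mile figure matters.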


----------



## HERR_UBERMENSCH

DriverX said:


> THey don't require drivers to keep there hands on the wheel. They advise it but they put no sensors in the steering wheel to enforce it. I'm not saying they should, either becasue I believe we have the right to die. Look at all the youtubes of jackass showing off their new autopilot tesla holding their arms up in the arm like its Space Mountain. Too bad for the innocent person that gets taken out by some Mickey Mouse Tesla driver in space Mtn mode.
> 
> Anyway they had to get approval in the few states that allow it and that always takes greasing some wheels.


According to many sources Tesla DOES require drivers to keep their hands on the wheel...

http://www.teslarati.com/what-happens-ignore-tesla-autopilot-warnings/


----------



## Uberx Vegas

The Tesla owner who died on May 7 in the first fatal crash involving a self-driving vehicle was reportedly watching a Harry Potter movie when he died. The Associated Press interviewed the truck driver whose tractor-trailer was punctured by the vehicle on a Florida highway. The truck driver said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and that, quote, "he went so fast through my trailer I didn't see him." http://www.msn.com/en-nz/news/other...accident-was-watching-harry-potter/vi-AAhS3Ao


----------



## observer

HERR_UBERMENSCH said:


> According to many sources Tesla DOES require drivers to keep their hands on the wheel...
> 
> http://www.teslarati.com/what-happens-ignore-tesla-autopilot-warnings/


And there are sensors in the steering wheel. The car will start to slow down after a certain amount of time has passed. I'm wondering if he didn't jerry-rig the steering wheel somehow.
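An escalation policy like the one described here ("warn first, then slow down") is easy to sketch. The thresholds and action names below are invented for illustration only; they are not Tesla's actual values or firmware logic:

```python
# Toy sketch of a hands-off-wheel escalation policy: nag the driver first,
# then slow the car. All thresholds and action names here are made up for
# illustration; they are not Tesla's real parameters.

def autopilot_action(seconds_hands_off: float) -> str:
    """Map time since steering torque was last detected to a response."""
    if seconds_hands_off < 15:
        return "normal"          # hands detected recently, no action
    elif seconds_hands_off < 30:
        return "visual_warning"  # dashboard message: retake the wheel
    elif seconds_hands_off < 45:
        return "audible_alarm"   # chime, speed still held
    else:
        return "slow_down"       # decelerate until the driver responds

print(autopilot_action(10))  # normal
print(autopilot_action(60))  # slow_down
```

The point of the jerry-rig speculation is that any policy keyed off steering torque can be defeated by anything that fakes the torque signal, which is exactly the gap such a sketch makes visible.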


----------



## Fireguy50

observer said:


> I don't think Tesla believes this needed to be passed by regulators.
> 
> Since they require drivers to keep hands on steering wheel and in control. It's kind of an enhanced cruise control.


Lawmakers will probably have to get their hands on this now. Tesla just forced the issue ahead of the planned schedule that the government, Uber, Google, and the rest were working toward on self-driving cars.


observer said:


> And there are sensors in the steering wheel. The car will start to slow down after a certain amount of time has passed. I'm wondering if he didn't Jerryrig the steering wheel some how.


Some vehicles have eye-tracking sensors that detect whether the driver's gaze is fixed (not scanning the roadway) or their eyelids are drooping.
Not sure if Tesla has this technology, or why it wouldn't detect a driver watching a Harry Potter movie!?

Questions still to be answered, but the key takeaway is that "self-driving" car technology can be tricked, fooled, tampered with, disabled, or simply malfunction.


----------



## tohunt4me

observer said:


> Eight speeding tickets.
> 
> http://www.scpr.org/news/2016/07/01/62212/man-killed-in-tesla-autopilot-crash-had-8-speeding/
> 
> Latest was in 2011.


If he had been driving instead of the robot, he would have been in FRONT of that slow truck and alive today . . .


----------



## HERR_UBERMENSCH

This accident happened on May 7th, nearly two months ago.


----------



## painfreepc

I've got a question that no one has asked:
why is this car allowed to operate on automatic pilot way beyond the posted speed limit?


----------



## HERR_UBERMENSCH

painfreepc said:


> I got a question that no one has asked,
> why is this car allowed to operate on automatic pilot Way Beyond the posted speed limit..


I wondered what the speed limit was myself. Waze reminds me whenever I exceed the posted limit; is Tesla's Autopilot not as advanced as Waze?


----------



## DriverX

observer said:


> I'm thinking the radar malfunctioned or needs a wider peripheral area. The trailer and truck should have been noticed a long time before the crash even if the cameras didn't see it.
> 
> ** Even if the trailer wasn't spotted because it was 3 ft high, what about the truck?


I don't think the radar works the way you think it does.


----------



## observer

HERR_UBERMENSCH said:


> I wondered what the speed limit was myself. Waze reminds me whenever I exceed the posted limit, Tesla's autopilot isn't as advanced as Waze?


The driver knew he was speeding. Didn't need to be reminded.


----------



## HERR_UBERMENSCH

observer said:


> The driver knew he was speeding. Didn't need to be reminded.


My point is that the car, while in autopilot, should know the speed limit and not exceed it.


----------



## observer

DriverX said:


> I don't think the radar works the way you think it does.


Apparently not, the radar thought the trailer was an overhead road sign.


----------



## DriverX

HERR_UBERMENSCH said:


> According to many sources Tesla DOES require drivers to keep their hands on the wheel...
> 
> http://www.teslarati.com/what-happens-ignore-tesla-autopilot-warnings/


"His cruising speed was 70mph and as he recalls, he had been hands off for more than 10 minutes. "

So plenty of time to rub one out at 90 mph!


----------



## observer

HERR_UBERMENSCH said:


> My point is that the car, while in autopilot, should know the speed limit and not exceed it.


I wonder if the driver is allowed to accelerate while the car drives.

The driver would be under the impression that if something dangerous came up, the car would automatically slow down, so he would think it's OK to speed.


----------



## DriverX

Fireguy50 said:


> Lawmakers will probably have to get their hands on this now. Tesla just forced the issue ahead of the planned schedule that govt, Uber, Google, and the rest were working on self driving cars.
> 
> Some vehicles have eye sensors to see if they are fixed (not scanning the roadway) and drooping eyelids.
> Not sure if Tesla has this technology, and why it wouldn't detect a driver watching a Harry Potter movie!?
> 
> Questions to the be answered, but the key takeaway is "self driving" car technology can be tricked/fooled/malfunction/tampered/disabled.


Apply a mild electric shock through the seat warmer if the driver lets go of the wheel for more than 30 seconds.

http://cdn.fansided.com/wp-content/blogs.dir/340/files/2016/04/Clockwork.jpg


----------



## DriverX

Jermin8r89 said:


> I'm curious with Tesla being being ground breaking do they just tell the good ole sales pitch with some education to the driver also or they just drive off the lot with a manual inside or nothing at all and just go off in the sunset


The manual has shirtless pics of Musk for them to enjoy while autopiloting to Whole Foods.


----------



## DriverX

Gung-Ho said:


> No matter to what technical accuracy they will be able to fine tune these driverless cars there is one thing they still will never have and that's INTUITION not just reacting to what's happening but what is about to happen. How many times have you been on the highway when you're sure the car a little bit ahead of you is about to change lanes into yours which would cause an accident...intuition tells you to slow. Driverless cars will never have that. In a vacuum they would preform just fine. In a utopian society where every car is driverless most likely fine. But mixing these in with human drivers in hundreds of variable driving conditions and you have potential disaster.


Hard to say if the death toll wouldn't just balance out. I agree, though, that humans can notice things like "the guy who just passed me was drunk" and slow down to let him get far away. Or we see smoke on the horizon and can anticipate braking. Or we can distinguish ice or oil on the road long before the software has been refined enough to cover all those variables.


----------



## observer

DriverX said:


> Hard to say if the death toll wouldn't just balance out. I agree though that humans can notice stuff like that the guy that just passed me was drunk and then slow down to let them get far away from you. Or maybe we see smoke on the horizon and can anticipate braking. Or we would be able to distinguish ice on the road or oil before the software will have been refined enough to cover all those variables.


Intuition and gut feelings should always be taken seriously. About 10-12 years ago I was driving southbound on the 405 near those long hilly curves by Pyramid Lake. I was in the outside lane driving up a long curve. I looked to my left and saw a mini pickup, towing another mini pick up, towing a small boat. I was sandwiched between this minitrain and the guard rail.

I remember thinking, hmmm this doesn't look safe, so I accelerated. Just as I pulled a hundred feet in front of the mini train, I looked in the rear view mirror and saw it crash into the guard rail.


----------



## PoorBasterd

DriverX said:


> I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.


I would consider that a very optimistic estimate. Mine is... never!


----------



## Sam023

I see Tesla stock falling like a rock on Tuesday morning when the markets open


----------



## Fireguy50

DriverX said:


> Apply a mild electric shock through the seat warmer if the driver lets go of the wheel for more than 30 seconds.


Actually, I believe Mercedes has a seat vibrator as part of their system. I think Top Gear or Fifth Gear talked about it.
https://en.m.wikipedia.org/wiki/Driver_drowsiness_detection


Sam023 said:


> I see Tesla stock falling like a rock on Tuesday morning when the markets open


Maybe, maybe not.
That's a lot of vacation days for stockholders to forget about this, or for Tesla to put a positive spin on the story.
I'll bet their marketing and technology departments just got their weekend off cancelled. EMERGENCY, BACK TO WORK!


----------



## Fireguy50

DriverX said:


> Humans can notice stuff like that the guy that just passed me was drunk and then slow down to let them get far away from you. Or maybe we see smoke on the horizon and can anticipate braking. Or we would be able to distinguish ice on the road or oil before the software will have been refined enough to cover all those variables.


Agree, the human brain is just too complex for computers to compete with. Right now they just want to survive a simple commute; they're decades away from competitive auto racing!
But there is NO excuse for this crash. If that truck had the proper DOT reflective tape I posted on page 5, any simplified automated detection system _should_ have easily seen it. This _should_ be an easy software or hardware glitch to fix.
Unless the truck did NOT have the proper markings, in which case it would carry some of the fault.


----------



## UberKim

F213 said:


> Ah yes, human error still blueprints it's self in their rise to the thinking age.


I love your avatar and your amazing quote...


----------



## DriverX

Fireguy50 said:


> Agree, human brain is just to complex for computers to compete. Right now they just want to survive a simple commute, they're decades from competitive auto racing!
> But there is NO excuse for this crash. If that truck had the proper DOT reflective tape I posted on page 5 any simplified automated detection system _should_ have easily seen that truck. This _should_ be an easy software or hardware glitch to fix.
> Unless the truck did NOT have the proper markings, then they would carry some fault.
> View attachment 46962
> 
> 
> View attachment 46963


Yes, something like that could be useful, but I'm not interested in the band-aid approach to solving this problem. Google is developing the tech that will become the standard; this Tesla stuff is for hobbyists who want a taste of the future. Personally I don't care what people do, so long as it doesn't harm anyone besides themselves, and fortunately the only one killed was the inattentive driver. The question we should be asking is: do we really want this proto-tech available to teenagers and drunks?


----------



## Fireguy50

DriverX said:


> Yes something like that could be useful, but I'm not interested in the band-aid approach to solving this problem. Google is developing the tech that will become the standard. This Tesla stuff is for hobbyists who want to get a taste of the future. Personally I don't care what people do so long as it doesn't harm anyone besides themselves. Fortunately, the only one killed was the inattentive driver. Do we really want this proto-tech available to teenagers and drunks; is the question we should be asking.


Which is why the legislative branch of government has to catch up with technology.
The NHTSA and DOT will have new regulations before fall, and will probably fine Tesla after the investigation.


----------



## Fuzzyelvis

observer said:


> Intuition and gut feelings should always be taken seriously. About 10-12 years ago I was driving southbound on the 405 near those long hilly curves by Pyramid Lake. I was in the outside lane driving up a long curve. I looked to my left and saw a mini pickup, towing another mini pick up, towing a small boat. I was sandwiched between this minitrain and the guard rail.
> 
> I remember thinking, hmmm this doesn't look safe, so I accelerated. Just as I pulled a hundred feet in front of the mini train, I looked in the rear view mirror and saw it crash into the guard rail.


There are many similar systems being worked on to allow blind folks to safely walk around a city, crossing streets, avoiding obstacles, including hanging ones, trip hazards, other pedestrians, cyclists, stepping up and down curbs, etc.

Right now, a dog is still better at this.

Plus you can't hack into a dog's brain the way you can a computer. And if a dog makes a mistake it likely ONLY affects the blind person. It won't take out a city block at 90 mph.


----------



## Fireguy50

http://fortune.com/2016/05/26/tesla-autopilot-crash/
They'll have to check the black box.


----------



## 808master

So we've probably got a few more years before we're replaced by self-driving cars. Sweet.


----------



## Jermin8r89

https://transportevolved.com/2016/0...lying-car-technology-but-its-not-a-hover-car/
Doing autonomous in the air seems more practical than on roads.


----------



## Golfer Lou

It will be 10+ years before they let a fully automated car, without a person behind the wheel ready to take over operations, drive a passenger.


----------



## sellkatsell44

Sam023 said:


> I see Tesla stock falling like a rock on Tuesday morning when the markets open


Like how Target's did after the announcement of the breach?

Or as dramatically as, say, Bank of America's after the Wall Street fallout of '08?

I would like to think that most Americans are smart enough to realize that crashes will happen. It's an unfortunate event. However, unless the city is solely operating driverless vehicles, accidents and subsequently deaths will occur, because on the other side there's a human who will cause error.


----------



## Jermin8r89

sellkatsell44 said:


> Like how Target's did after the announcement of the breach?
> 
> Or as dramatically as, say, Bank of America's after the Wall Street fallout of '08?
> 
> I would like to think that most Americans are smart enough to realize that crashes will happen. It's an unfortunate event. However, unless the city is solely operating driverless vehicles, accidents and subsequently deaths will occur, because on the other side there's a human who will cause error.


We are heading in the wrong direction. We should just go to Mars; we've got the tech to do it, and if the lazy people want to be run by robots then they can. I think it's BS how jetpacks aren't available to the public yet. Drones have lots of restrictions, and that's just remote control. It's hard for these to be made available because they give us more freedom. What's the big push for autonomous cars? Are they harder to accomplish than drones and jetpacks? They've got hydro jetpacks, but mostly for commercial use, and they cost as much as a car, even though all they are is a combustion engine with high-powered water. A complete jetpack is about the same thing, except using premium fuel. The whole safety argument is stupid. They have been trying for years to get gun control and they can't, so they're doing this instead. I'm getting a jetpack over a self-driving car any day. That's how you solve road rage, traffic, waiting around, and malfunctions.


----------



## Old Rocker

Jermin8r89 said:


> We are heading in the wrong direction. We should just go to Mars; we've got the tech to do it, and if the lazy people want to be run by robots then they can. I think it's BS how jetpacks aren't available to the public yet. Drones have lots of restrictions, and that's just remote control. It's hard for these to be made available because they give us more freedom. What's the big push for autonomous cars? Are they harder to accomplish than drones and jetpacks? They've got hydro jetpacks, but mostly for commercial use, and they cost as much as a car, even though all they are is a combustion engine with high-powered water. A complete jetpack is about the same thing, except using premium fuel. The whole safety argument is stupid. They have been trying for years to get gun control and they can't, so they're doing this instead. I'm getting a jetpack over a self-driving car any day. That's how you solve road rage, traffic, waiting around, and malfunctions.


If someone cuts you off while jetpack flying, you can just turn around and melt them with your exhaust!


----------



## DriverX

LevelX said:


> Well if both units had been 'auto' then the crash wouldn't have happened either..... as the truck would have seen the car coming and not crossed in front of it.
> 
> In this case the human truck driver was at fault, and yes the car failed to see the issue and react in time due to sensor setup on the Tesla.
> 
> The same would have happened with bog standard cruise control, had the driver failed to react and brake in time.
> 
> That said, there is always room for improvement on the auto pilot side of things!!!


The fact that the Tesla was speeding means it's dual fault at the least, and I would think the legal department at the trucking company could put it all on the Tesla driver. He was speeding and not attending to the vehicle. Who knows, he may have been sending a text too, which is illegal, or watching porn.

Anyway, this will never get that far; Tesla will settle whatever it has to.


----------



## DriverX

painfreepc said:


> It looks to me like we're actually on the road to creating a society like we see in futuristic sci-fi movies like Robocop and Judge Dredd.
> 
> If everything goes robotic, where the hell are all the people going to work? Not everybody is cut out to be a doctor or lawyer or some type of scientist or engineer, and we don't need that many of those jobs.
> 
> But I'm not too worried anyway, because the computer technology for this is not ready. That driver was killed because the computer in his car cannot actually see an object the way we humans see it; it can only react to what it's been programmed to react to.
> 
> I love science fiction. My favorite thing to watch is Star Trek; the idea of warp drive technology and transporters is fascinating. But think about it: if an alien race actually gave us warp drive and transporter technology, we would not have the computer technology to operate it. We could not build a Starfleet; we would not have the computer technology to operate the ships. For the transporters, we would not have the computer technology to break the body down into atoms and put it back together again. We're not there yet. We want to believe that we are more than what we are, but we are not.


The people shouldn't have to work if everything is automated. That's really how we should be thinking. Why are we wasting our lives working so a few people at the top of a corporate pyramid scheme can live lavish lives of luxury and hold enormous power over everyone else? Those Dominionists will try to use robots to enslave us rather than free us from a life of wage labor.

As far as warp drive goes, alien tech transcends what we currently know about the physical world. Space exploration will not be done by physically sending our drones and then ourselves to these places by rocket transport. It's more likely that you would just project your consciousness at light speed wherever you want to go.


----------



## painfreepc

DriverX said:


> The people shouldn't have to work if everything is automated. That's really how we should be thinking. Why are we wasting our lives working so a few people at the top of a corporate pyramid scheme can live lavish lives of luxury and hold enormous power over everyone else? Those Dominionists will try to use robots to enslave us rather than free us from a life of wage labor.
> 
> As far as warp drive goes, alien tech transcends what we currently know about the physical world. Space exploration will not be done by physically sending our drones and then ourselves to these places by rocket transport. It's more likely that you would just project your consciousness at light speed wherever you want to go.


And some of my friends think my mind is in the Twilight Zone... lol


----------



## mweiss10

Golfer Lou said:


> It will be 10+ years before they let a fully automated car, without a person behind the wheel ready to take over operations, drive a passenger.


I feel like an automated person. I can't believe Lyft hasn't even been decent enough to get back to me and let me know what the story is. I think they were ripping me off on my referrals too. I gave them like 30 referrals this past weekend and didn't get one referral fee out of it, and people said they were going to use it left and right.


----------



## Uberx Vegas

808master said:


> So we've probably got a few more years before we're replaced by self-driving cars. Sweet.


I have no problem if they use them to deliver pizza to my house. I will not get in one even for a free ride.


----------



## yourgrace

https://t.co/D80KRXJvzK

Sarcasm-on
Well, I think it's "good" news for us. We'll keep our jobs longer before TK introduces auto-Mercs.
Sarcasm-off


----------



## Bart McCoy

DriverX said:


> The fact that the Tesla was speeding means it's dual fault at the least, and I would think the legal department at the trucking company could put it all on the Tesla driver. He was speeding and not attending to the vehicle. Who knows, he may have been sending a text too, which is illegal, or watching porn.
> 
> Anyway, this will never get that far; Tesla will settle whatever it has to.


The truck should never have crossed the road, whether the Tesla was speeding or not. It's not like it was a motorcycle going 140 mph. And truck drivers should already know they need an even clearer road to cross (when looking for oncoming vehicles), because they are slower and much longer, of course. So why should Tesla pay money because a truck clearly didn't yield the right of way?


----------



## sellkatsell44

Jermin8r89 said:


> We are heading in wrong direction. We need to just go to mars we go the tech to do it if the lazy ppl want to just be run by robots then they can. I think its bs how jetbacks aren't available to the public yet. Drones have lots of restrictions its just remote. Its hard for these to be available cuz it gives more freedom to us. What's the big push for autonomous cars? Its harder to accomplish then drones and jetpacks? They got hydro jetpacks but mostly for commercial use or they as much as a car but all it is is compostion engine with high powered water. Also to make a complete jetpack its about same thing except u using premium fuel. The whole safety bs is stupid. They have been trying for years to get gun control and they can't so they doing this. I'm getting a jetpack over a self driving car any day that's how u solve road rage, traffic ,waiting, around malfunctions


Who knows; depending on who you talk to, you'll get different answers all around.

If you ask me, theoretically, you streamline functions and make things easier and more seamless so that you're able to spend time doing the more crucial things. E.g., instead of driving, you can have the car on autopilot (not to sleep!!) so you can relax a bit better behind the wheel.

Or a better example: they've made banking easier. Instead of going to the teller and waiting 10-15 minutes, you can take a picture with your phone in the comfort of your home or on the go, and make the deposit that way. It's meant to make your life easier by eliminating the more mundane things.

What I'm finding, though, is that it frees up your time so they can pile more work on you.


----------



## Jermin8r89

Bart McCoy said:


> The truck should never have crossed the road, whether the Tesla was speeding or not. It's not like it was a motorcycle going 140 mph. And truck drivers should already know they need an even clearer road to cross (when looking for oncoming vehicles), because they are slower and much longer, of course. So why should Tesla pay money because a truck clearly didn't yield the right of way?


I'll use the California Driver's Test answer, as it matches that recommended by the National Safety Council.

The CA Driver's test calls for a 10-12 second passing time for safety. (answer "C")

That is assuming you pull out to pass at your 2 second safety gap distance behind the car being passed and pull back in after passing allowing the car you passed to have a 2 second safety gap between their hood and the rear end of your car. You are also supposed to do your accelerating to your passing speed BEFORE pulling into the opposing traffic's lane and maintain a constant speed throughout the maneuver.

BTW: To pass LEGALLY, you must be able to drive at least 5 mph faster than the vehicle being passed and NOT exceed the speed limit. That means that you can not legally pass a car driving at 51 mph on a road with a 55 mph speed limit. Exceeding the speed limit can be ticketed for speeding while passing just as easily as when alone on the road. If you have accelerated to the speed limit for passing, there is no reason to have to slow after passing.

Keep in mind also that they are talking about average ca
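
The passing arithmetic quoted above can be sanity-checked with a quick sketch. The 2-second gaps and 16-foot car length are illustrative assumptions, not official DMV figures:

```python
# Rough sketch of the passing arithmetic described above. The 2-second
# gaps and 16-foot car length are illustrative assumptions, not official
# DMV figures.

MPH_TO_FPS = 5280 / 3600  # feet per second per 1 mph

def passing_time(passer_mph: float, passed_mph: float,
                 gap_s: float = 2.0, car_len_ft: float = 16.0) -> float:
    """Seconds to move from a 2-second gap behind the other car to a
    2-second gap ahead of it, at a constant speed differential."""
    diff_fps = (passer_mph - passed_mph) * MPH_TO_FPS
    if diff_fps <= 0:
        raise ValueError("must be faster than the car being passed")
    # Ground to make up: trailing gap + both car lengths + leading gap,
    # with the time gaps measured at the passed car's speed.
    gap_ft = gap_s * passed_mph * MPH_TO_FPS
    return (2 * gap_ft + 2 * car_len_ft) / diff_fps

# Passing a 40 mph vehicle at 55 mph takes roughly 12 seconds,
# in line with the 10-12 second figure quoted above.
print(round(passing_time(55, 40), 1))  # 12.1
```

Note that at the minimum legal 5 mph differential the same maneuver takes several times longer, which is why the handbook stresses accelerating to passing speed before pulling out.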


----------



## DriverX

Bart McCoy said:


> The truck should never have crossed the road, whether the Tesla was speeding or not. It's not like it was a motorcycle going 140 mph. And truck drivers should already know they need an even clearer road to cross (when looking for oncoming vehicles), because they are slower and much longer, of course. So why should Tesla pay money because a truck clearly didn't yield the right of way?


Tesla was speeding. Dual fault. Go ask any traffic cop or insurance agent what that means. Case in point:

*Proportional Comparative Fault at 51 Percent*
The states that have adopted proportional comparative fault bar recovery if you are more than 51% at fault for the accident. In other words, you cannot file a liability claim and lawsuit against the other driver's negligence if you were more than 51% at fault.

For example, Dennis hit Teri's car while driving in excess of 25 miles per hour over the speed limit while Teri was attempting to cross the road. Even though Teri was partially at fault for not waiting until the road was completely clear before crossing, the insurance company allocated fault to Dennis at 60% due to his excessive speed. Even though Dennis suffered a broken arm from the accident, he is not entitled to recover for his injury due to the fact that he was more than 51% at fault for the accident.

_States: Connecticut, Delaware, Hawaii, Illinois, Indiana, Iowa, Massachusetts, Michigan, Minnesota, Montana, Nevada, New Hampshire, New Jersey, Ohio, Oregon, Pennsylvania, South Carolina, Texas, Vermont, Wisconsin and Wyoming._
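
The "51% bar" rule quoted above is simple enough to sketch. The percentages and damages below are hypothetical inputs; in practice, fault allocation is decided by insurers, juries, and state law:

```python
# Minimal sketch of the "51% bar" proportional comparative fault rule
# described above. Fault percentages here are hypothetical inputs; real
# allocation is decided by insurers, juries, and state law.

def can_recover(own_fault_pct: float, bar: float = 51.0) -> bool:
    """True if a claimant may still recover damages: recovery is barred
    once their share of fault exceeds the bar."""
    return own_fault_pct <= bar

def recoverable_damages(damages: float, own_fault_pct: float) -> float:
    """Damages are reduced by the claimant's own share of fault, and
    barred entirely above the 51% threshold."""
    if not can_recover(own_fault_pct):
        return 0.0
    return damages * (1 - own_fault_pct / 100)

# The Dennis/Teri example from the post: Dennis is 60% at fault, so he
# recovers nothing; Teri, at 40%, could recover 60% of her damages.
print(recoverable_damages(10_000, 60))  # 0.0
print(recoverable_damages(10_000, 40))  # 6000.0
```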


----------



## Jermin8r89

WTF, how did driving become so difficult? It's like those couples who split a pizza in half and get technical... you got more cheese or pepperoni... oh, you ripped it in half without measuring. Eyeball it. You speed up ahead, and if they slow down then you slow down. All this just for passing someone is stupid; I'm laughing.


----------



## Old Rocker

I know the Tesla driver was reported as having had several speeding tickets, but was he actually speeding at the time of the crash?


----------



## Lincoln Navigator L

This story has been spun so badly. Tesla's autopilot feature is adaptive cruise control with lane keeping. It's not fully autonomous because according to the terms and conditions, as a driver you must still pay attention to the driving task, to be ready to take manual control at any moment. It's supposed to relieve the driver of some fatigue but it is not for sleeping behind the wheel nor for watching movies.


----------



## Fuzzyelvis

Lincoln Navigator L said:


> This story has been spun so badly. Tesla's autopilot feature is adaptive cruise control with lane keeping. It's not fully autonomous because according to the terms and conditions, as a driver you must still pay attention to the driving task, to be ready to take manual control at any moment. It's supposed to relieve the driver of some fatigue but it is not for sleeping behind the wheel nor for watching movies.


Maybe they should stop calling it "autopilot" ?


----------



## Fuzzyelvis

I read an article about this and had not realised that the Tesla kept going after the crash and could easily have killed someone else (luckily it didn't).

Anyway, what I don't get is that there appear to be no sensors to tell the car it has BEEN in an accident and should STOP.

THAT makes no sense and would definitely leave them open to liability, regardless of who was responsible for the initial accident.


----------



## lubi571

Imagine how it would have been spun if nothing happened. I stated on another thread that self-driving works in simple applications; clearly that isn't true. The system never recognized the truck: no brakes, no swerving to try to avoid it, etc. I've read all the excuses: no reflectors on the truck, white color, too high off the ground, 130 million miles and no one died, and so on. Most drivers would have recognized a potential problem from 500 to 300 feet away. Much work needs to be done just to handle simple situations.


----------



## F213

lubi571 said:


> Imagine how it would have been spun if nothing happened. I stated on another thread that self-driving works in simple applications; clearly that isn't true. The system never recognized the truck: no brakes, no swerving to try to avoid it, etc. I've read all the excuses: no reflectors on the truck, white color, too high off the ground, 130 million miles and no one died, and so on. Most drivers would have recognized a potential problem from 500 to 300 feet away. Much work needs to be done just to handle simple situations.


They need to code the vehicle software to avoid these things and anything related to how this accident occurred. I am not on the driver's side, but Tesla also needs to figure this out, because humans will continue to be humans.


----------



## RamzFanz

My god people, really?

This is not a self driving car. The DRIVER is in charge, not the car. He was watching a DVD in a non self driving car.

So yes, if you hurl your body down a road in a human driven car with your eyes closed, bad things will happen.

Meanwhile, while the deniers post click bait hit pieces, MIT has probably solved at least half of the driving in the snow and rain hurdle. Self driving doesn't exist in the US yet, so no, the sky isn't falling.


----------



## Jermin8r89

Once cars go autonomous without drivers I'm just gonna say **** it, give me money, government, since you're replacing us so we don't have to work anymore.


----------



## Bart McCoy

Jermin8r89 said:


> I'll use the California Driver's Test answer, as it matches that recommended by the National Safety Council.
> 
> The CA Driver's test calls for a 10-12 second passing time for safety. (answer "C")
> 
> That is assuming you pull out to pass at your 2 second safety gap distance behind the car being passed and pull back in after passing allowing the car you passed to have a 2 second safety gap between their hood and the rear end of your car. You are also supposed to do your accelerating to your passing speed BEFORE pulling into the opposing traffic's lane and maintain a constant speed throughout the maneuver.
> 
> BTW: To pass LEGALLY, you must be able to drive at least 5 mph faster than the vehicle being passed and NOT exceed the speed limit. That means that you can not legally pass a car driving at 51 mph on a road with a 55 mph speed limit. Exceeding the speed limit can be ticketed for speeding while passing just as easily as when alone on the road. If you have accelerated to the speed limit for passing, there is no reason to have to slow after passing.
> 
> Keep in mind also that they are talking about average ca


Except we aren't talking about passing. The truck crossed the street perpendicular to the road the Tesla was on.



DriverX said:


> Tesla was speeding. Dual fault. Go ask any traffic cop or insurance agent what that means. Case in point:
> 
> *Proportional Comparative Fault at 51 Percent*
> The states that have adopted proportional comparative fault bar recovery if you are more than 51% at fault for the accident. In other words, you cannot file a liability claim and lawsuit against the other driver's negligence if you were more than 51% at fault.
> 
> For example, Dennis hit Teri's car while driving in excess of 25 miles per hour over the speed limit while Teri was attempting to cross the road. Even though Teri was partially at fault for not waiting until the road was completely clear before crossing, the insurance company allocated fault to Dennis at 60% due to his excessive speed. Even though Dennis suffered a broken arm from the accident, he is not entitled to recover for his injury due to the fact that he was more than 51% at fault for the accident.
> 
> _States: Connecticut, Delaware, Hawaii, Illinois, Indiana, Iowa, Massachusetts, Michigan, Minnesota, Montana, Nevada, New Hampshire, New Jersey, Ohio, Oregon, Pennsylvania, South Carolina, Texas, Vermont, Wisconsin and Wyoming._


Interesting, MD is not on the list.
So if this is true, do Tesla and the driver split the fault at 25% each?
It said the insurance company allocated fault at 60%. WHOSE insurance company sets that? Who gets to determine who was more at fault?

And in this case in question, Tesla or truck: who was more wrong?

And I guess it was proven the driver was speeding, right?


----------



## Lincoln Navigator L

Fuzzyelvis said:


> Maybe they should stop calling it "autopilot" ?


Autopilot is an appropriate term for what it is. In aviation, when autopilot is engaged, at least one pilot is required to be at the controls, monitoring and paying attention in case he/she needs to take over.

It's not a blowup doll like you saw in the movie "Airplane!"


----------



## Spy & Mash

'Tesla' are out to 'steal' your brain.
Don't let them have it!


----------



## Oleg 77

LAuberX said:


> the driver was not using his head anyway...


He was sleeping.


----------



## DriverX

Bart McCoy said:


> Except we aren't talking about passing. The truck crossed the street perpendicular to the road the Tesla was on.
> 
> Interesting, MD is not on the list.
> So if this is true, do Tesla and the driver split the fault at 25% each?
> It said the insurance company allocated fault at 60%. WHOSE insurance company sets that? Who gets to determine who was more at fault?
> 
> And in this case in question, Tesla or truck: who was more wrong?
> 
> And I guess it was proven the driver was speeding, right?


I'm just explaining the law to you. It's up to the lawyers to make a deal.

As soon as you are in violation of a traffic law while involved in an accident, you're at least partially at fault. Pretty sure they go over that in driver's ed.


----------



## LAuberX

Oleg 77 said:


> He was sleeping.


Or watching a movie on his portable DVD player... Harry Potter? Really, 40-year-olds watch that??


----------



## Bart McCoy

DriverX said:


> I'm just explaining the law to you. It's up to the lawyers to make a deal.
> 
> As soon as you are in violation of a traffic law while involved in an accident, you're at least partially at fault. Pretty sure they go over that in driver's ed.


Not my driver's ed in Maryland.

But it will be interesting to see what happens, because Tesla has a lot of money, so of course lawyers would want to go after them first.
But Tesla put in the clause to "always have hands on the wheel" and be attentive, so the speeding and inattention fall on the driver. It seems the driver would be more at fault than anybody else if California law works like that...

The way Tesla has worded their autopilot, they are no more responsible for the accident than if somebody were using regular cruise control on their car and got into an accident...


----------



## FrankMartin

observer said:


> I would not attribute this to the car but to driver error.
> 
> Driver should have been paying attention.


DON'T BLAME THE VICTIM!!!

If you have to be totally engaged in second-guessing the autopilot then what's the point!?

TESLA is damned lucky with this one. Only the fully waivered driver was killed - the truck driver and bystanders could easily have been killed too.

And even though the TESLA driver was fully waivered his surviving family should be handsomely compensated with hush money... if TESLA doesn't want to see this technology permanently banned.


----------



## FrankMartin

DriverX said:


> they call it AUTOPILOT when clearly it isn't.
> 
> This will be a HUGE lawsuit that Tesla settles fast.
> 
> edit
> Mr. Brown apparently posted videos of himself riding in autopilot mode. "The car's doing it all itself,'' he said in one, smiling as he took his hands from the steering wheel.
> 
> DOH famous last youtubes


And at the end of that youtube he says "... there's nothing to worry about ..." Well I guess he's got no worries now.


----------



## Michael - Cleveland

There's no need to tip said:


> BS technology that has resulted in 1 (ONE) fatality thus far vs how many with human drivers on a DAILY basis?


1 death in 130,000,000 miles (autonomous) vs. 1 death in 93,000,000 miles.
Those are the stats, thus far.
http://nymag.com/selectall/2016/07/...self-driving-car-death.html?mid=twitter_nymag
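
Those per-mile figures can be restated as fatalities per 100 million miles, the unit NHTSA usually reports. A quick sanity check of the arithmetic:

```python
# Back-of-the-envelope restatement of the figures quoted above:
# one fatality per 130 million Autopilot miles vs. one per 93 million
# miles for US driving overall.

def deaths_per_100m_miles(deaths: int, miles: float) -> float:
    """Convert a raw fatality count over a mileage total to the
    per-100-million-mile rate commonly reported by NHTSA."""
    return deaths / miles * 100_000_000

autopilot_rate = deaths_per_100m_miles(1, 130_000_000)
overall_rate = deaths_per_100m_miles(1, 93_000_000)
print(round(autopilot_rate, 2))  # 0.77
print(round(overall_rate, 2))    # 1.08
```

With a single Autopilot fatality the difference is not statistically meaningful; the sketch only shows the unit conversion behind the comparison.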


----------



## Bart McCoy

FrankMartin said:


> DON'T BLAME THE VICTIM!!!
> 
> If you have to be totally engaged in second-guessing the autopilot then what's the point!?
> 
> TESLA is damned lucky with this one. Only the fully waivered driver was killed - the truck driver and bystanders could easily have been killed too.
> 
> And even though the TESLA driver was fully waivered his surviving family should be handsomely compensated with hush money... if TESLA doesn't want to see this technology permanently banned.


are you serious?


----------



## FrankMartin

Michael - Cleveland said:


> 1 death in 130,000,000 miles (autonomous) vs. 1 death in 93,000,000 miles.
> Those are the stats, thus far.
> http://nymag.com/selectall/2016/07/...self-driving-car-death.html?mid=twitter_nymag


When it comes to fiery crashes don't expect the public to embrace actuarial statistics- otherwise the power grid would be 100% nuclear and 0% coal.


----------



## lubi571

Michael - Cleveland said:


> 1 death in 130,000,000 miles (autonomous) vs. 1 death in 93,000,000 miles.
> Those are the stats, thus far.
> http://nymag.com/selectall/2016/07/...self-driving-car-death.html?mid=twitter_nymag


Those are the stats, but what we are not told is how many of those 130 million miles were clinical. We know there weren't any trucks crossing in front of the vehicle, that's for sure.


----------



## observer

FrankMartin said:


> DON'T BLAME THE VICTIM!!!
> 
> If you have to be totally engaged in second-guessing the autopilot then what's the point!?
> 
> TESLA is damned lucky with this one. Only the fully waivered driver was killed - the truck driver and bystanders could easily have been killed too.
> 
> And even though the TESLA driver was fully waivered his surviving family should be handsomely compensated with hush money... if TESLA doesn't want to see this technology permanently banned.


I'm not blaming the victim, the car didn't know any better so it wasn't its fault. In this case the car was the victim.


----------



## RamzFanz

FrankMartin said:


> DON'T BLAME THE VICTIM!!!
> 
> If you have to be totally engaged in second-guessing the autopilot then what's the point!?
> 
> TESLA is damned lucky with this one. Only the fully waivered driver was killed - the truck driver and bystanders could easily have been killed too.
> 
> And even though the TESLA driver was fully waivered his surviving family should be handsomely compensated with hush money... if TESLA doesn't want to see this technology permanently banned.


Nope. The driver isn't taking over; the driver is ALWAYS in charge. These are driver-assist functions that back up the driver; NONE of them ever actually drives.


----------



## RamzFanz

HERR_UBERMENSCH said:


> My point is that the car, while in autopilot, should know the speed limit and not exceed it.


The Tesla driver-assist features are never driving; the driver is. You can set your cruise control over the speed limit; same thing.


----------



## RamzFanz

DriverX said:


> And so it begins.
> 
> http://www.theverge.com/2016/6/30/12072408/tesla-autopilot-car-crash-death-autonomous-model-s
> 
> https://www.teslamotors.com/blog/tragic-loss
> 
> Clearly Tesla's QA dept. isn't up to the challenge of insuring the safety of driverless vehicles. Why would it not occur to someone at Tesla that scanning the roadway over 3 feet off the ground would be a vital safety requirement?
> 
> THe first death in a series to come. I'd expect a recall soon. and I'm sticking to my guns on the advent of truly autonomous vehicles being about 40 years out.
> 
> Fear not drivers they will need us for quite sometime to come.


Would you care to define a self-driving car? Do you realise that an actual SDC would have seen that truck in at least three different ways? Do you know those ways?

Who operates a Tesla in Autopilot: the human driver or the car?

Does Tesla allow a driver to watch DVDs in lane-assist mode? Remove his/her hands from the steering wheel? Why or why not?


----------



## RamzFanz

LAuberX said:


> And here is the problem: they argue the cars are "safer" or "better drivers" than humans... I think it will be great for ambulance-chasing lawyers!


Two errors:

1) They are.

2) This wasn't a SDC and it was 100% human error.


----------



## RamzFanz

SafeT said:


> So if this driverless car myth ever happens 100 years from now.. how would the computer handle this?
> 
> A 6 year old runs out in front of the car. The only two options are to run over and kill the 6 year old kid, or run into an on-coming semi truck and kill the driver. How should the car be programmed for that? Kill the kid or the driver? Who would buy a car knowing it would kill you first? Who would allow a car to be sold that is programmed to run over kids and grannies rather than risk possibly killing the driver?


The answer is whichever would cause the least damage. What did you think it would be? Did you really think it would be a moral decision?


----------



## RamzFanz

painfreepc said:


> This question was asked on the George Noory show. Are you a George Noory listener? If you are, good for you.
> 
> This is a very serious question, and the people behind the technology are not answering it.
> 
> Your self-driving car will have to make this decision. Who will it make the decision in favor of: the driver, or the idiot who walked out in front of your car?


This is the same old tired trolley question, which has 0% relevance to SDCs.

The answer is that the car is responsible for protecting its passenger and will choose the least-energy impact. Period. And there is no responsibility to do otherwise.
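
As a toy illustration of that "least energy impact" idea (a hypothetical sketch, not the decision logic of any real self-driving system), collision energy scales with the square of relative speed:

```python
# Toy illustration of the "least energy impact" idea above. This is a
# hypothetical sketch, not how any real self-driving stack decides;
# the option labels, masses, and speeds are made up.

def impact_energy_j(mass_kg: float, rel_speed_ms: float) -> float:
    """Kinetic energy (joules) of a collision at a given relative speed."""
    return 0.5 * mass_kg * rel_speed_ms ** 2

def least_energy_option(options):
    """options: list of (label, mass_kg, rel_speed_ms) tuples. Returns
    the option whose collision carries the least kinetic energy."""
    return min(options, key=lambda o: impact_energy_j(o[1], o[2]))

options = [
    ("hard braking into a glancing barrier hit", 1800, 5.0),   # 22.5 kJ
    ("swerving into an oncoming truck", 1800, 40.0),           # 1440 kJ
]
print(least_energy_option(options)[0])  # hard braking into a glancing barrier hit
```

The squared-speed term is why shedding speed before any impact dominates every other choice in this framing.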


----------



## RamzFanz

SafeT said:


> So if this driverless car myth ever happens 100 years from now..


2-4 years, absolutely no later.


----------



## LAuberX

RamzFanz said:


> Nope. The driver isn't taking over; the driver is ALWAYS in charge. These are driver-assist functions that back up the driver; NONE of them ever actually drives.


wrong:


----------



## RamzFanz

painfreepc said:


> So that's the way it's going to be: every time one of these cars kills somebody, it's going to be somehow the driver's fault.
> But the fact is this technology should not be out for the general public to begin with. It's not ready.


You forget that, even after this death, they are saving lives. The airbag didn't save him either; should we get rid of airbags?


----------



## RamzFanz

LAuberX said:


> wrong:


Which proves the point that humans are the problem, not the technology, which ALREADY has a lower death rate in BETA.

You proved that it actually works better than advertised.


----------



## Bart McCoy

RamzFanz said:


> Two errors:
> 
> 1) They are.
> 
> 2) This wasn't a SDC and it was 100% human error.


So Tesla's Autopilot function, which is supposed to slow down when cars are in front, didn't see a big ole tractor trailer in the road, applied 0% brakes, and is 0% responsible? Really?

If so, then there's no way Tesla can be sued (well, no way they can lose; you'll always win if you're not at fault)


----------



## LAuberX

RamzFanz said:


> You forgot that, even after this death, they are saving lives. The airbag didn't save him either; should we get rid of it?


Airbags are killing people:

http://autoweek.com/article/recalls...airbags-honda-urges-owners-participate-recall


----------



## LAuberX

Bart McCoy said:


> So Tesla's Autopilot function, which is supposed to slow down when cars are in front, didn't see a big ole tractor trailer in the road, applied 0% brakes, and is 0% responsible? Really?
> 
> If so, then there's no way Tesla can be sued (well, no way they can lose; you'll always win if you're not at fault)


the Tesla is not looking 36" above the road for obstacles... so it went right under the trailer.

What exactly was the Tesla driver looking at?


----------



## Bart McCoy

LAuberX said:


> the Tesla is not looking 36" above the road for obstacles... so it went right under the trailer.
> 
> What exactly was the Tesla driver looking at?


Why would computer-driven cars not check for objects up to the car's height and width? That makes no sense.

When humans drive, they clearly check way above 3 feet, aka for large tractor trailers in the road... No normal human would drive their car through any pathway that is not clear to the total height and width of their vehicle. Key phrase: normal human.


----------



## LAuberX

Bart McCoy said:


> Why would computer-driven cars not check for objects up to the car's height and width? That makes no sense.


Maybe they will after this... more CPUs, more software, more sensors... it CAN be done, and now they have a reason WHY it should be done... so the Tesla driver can watch Harry Potter and take a nap.


----------



## RamzFanz

Bart McCoy said:


> So Tesla's Autopilot function, which is supposed to slow down when cars are in front, didn't see a big ole tractor trailer in the road, applied 0% brakes, and is 0% responsible? Really?
> 
> If so, then there's no way Tesla can be sued (well, no way they can lose; you'll always win if you're not at fault)


Yes, that is correct. The driver is 100% responsible for driving a Tesla. The driver assist, well, ASSISTS, when it can. It's a very rudimentary feature that helps when it can, period. There is NOTHING self driving about these features. Nada, none, zip, zero.

If it were self driving, why would they insist you keep your hands on the wheel and drive?


----------



## RamzFanz

LAuberX said:


> Airbags are killing people:
> 
> http://autoweek.com/article/recalls...airbags-honda-urges-owners-participate-recall


So are seatbelts. Should we get rid of them?

So let's get real: they are saving bad human drivers far more often than harming them, so, no, they aren't going away.


----------



## LAuberX

Hands FREE:


----------



## RamzFanz

Bart McCoy said:


> Why would computer-driven cars not check for objects up to the car's height and width? That makes no sense.
> 
> When humans drive, they clearly check way above 3 feet, aka for large tractor trailers in the road... No normal human would drive their car through any pathway that is not clear to the total height and width of their vehicle. Key phrase: normal human.


Because, in exact opposite of what the 100% false title of this thread claims, the computer wasn't driving; the human was. Yet he didn't do as you claimed he would. Why?

You're saying he WOULD, but he didn't? Why?


----------



## RamzFanz

LAuberX said:


> Hands FREE:


As a moderator, you're being disingenuous and deceitful. The Tesla is not self driving in any way. Your videos only prove it is better than intended.


----------



## LAuberX

RamzFanz said:


> As a moderator, you're being disingenuous and deceitful. The Tesla is not self driving in any way. Your videos only prove it is better than intended.


lol, no I'm drooling over the self driving capabilities of this car! 10 second 1/4 mile! 0-60 in UNDER 3 seconds!!

I drive a MBZ for a client that has adaptive cruise control, I set it for 3 car lengths and it drives me to LAX with no throttle or brake inputs.... nice.

watch the videos... very impressive.


----------



## RamzFanz

LAuberX said:


> maybe they will after this... more cpu's, more software, more sensors... it CAN be done, now they have a reason WHY it should be done... so the Tesla driver can watch Harry Potter and take a nap.


Except, of course, "computer driven cars" do have way more sensors, CPUs (you are way behind, try GPUs), and software (you mean programming). It IS being done, but the Tesla S is barely a distant relative of a SDC. This is a human-driven car, and the humans are acting like monkeys with shotguns.


----------



## LAuberX

RamzFanz said:


> Except, of course, "computer driven cars" do have way more sensors, CPUs (you are way behind, try GPUs), and software (you mean programming). It IS being done, but the Tesla S is barely a distant relative of a SDC. This is a human-driven car, and the humans are acting like monkeys with shotguns.


um... NO. GPU means graphics processing unit.

Software IS programming... _Software_ means computer instructions or data. Anything that can be stored electronically is _software_, in contrast to storage devices and display devices, which are called hardware.

Better report me for correcting you!


----------



## Fuzzyelvis

Lincoln Navigator L said:


> Autopilot is an appropriate term for what it is. In aviation, when autopilot is engaged, at least one pilot is required to be at the controls, monitoring and paying attention in case he/she needs to take over.
> 
> It's not a blowup doll like you saw in the movie "Airplane!"


The point is many idiots who buy one of these and hear the term will do just what this guy did.


----------



## RamzFanz

LAuberX said:


> um... NO. GPU means graphics processing unit
> 
> software IS programming... _Software_ means computer instructions or data. Anything that can be stored electronically is _software_, in contrast to storage devices and display devices which are called hardware
> 
> better report me for correcting you!


Yes, and SDCs are using GPUs. One gold star!

You should be correcting this thread as being false, not supporting it.

_Software_ operates on top of a _programmed_ operating system. _Software_ is written using the operating system's language, not _programmed_.


----------



## lubi571

RamzFanz said:


> 2-4 years, absolutly no later.


The real problem with the programming arises when there is human intervention. How do you program for a police officer's hand signaling stop when there is a green light? What does a flashlight being waved at night mean? Etc. Google MIT professors and driverless car technology; some think these variables and more could never be programmed.


----------



## RamzFanz

lubi571 said:


> The real problem with the programming arises when there is human intervention. How do you program for a police officer's hand signaling stop when there is a green light? What does a flashlight being waved at night mean? Etc. Google MIT professors and driverless car technology; some think these variables and more could never be programmed.


People ask these questions over and over as if they aren't, or haven't already been, addressed.

No, there are no experts actually in the field who believe this can never be addressed.

Google cars already recognize emergency vehicles and hand signals. This is a decade old in technology years. Have you ever watched their TED presentation?






The above video explains how the Tesla driver died and why Elon Musk is making a mistake handing shotguns to monkeys. The Shotgun works fine, the monkey is unreliable.


----------



## LAuberX

RamzFanz said:


> Yes, and SDCs are using GPUs. One gold star!
> 
> You should be correcting this thread as being false, not supporting it.
> 
> _Software_ operates on top of a _programmed_ operating system. _Software_ is written using the operating system's language, not _programmed_.


Thanks for correcting your mistake about "programming".

The car was self driving right up to the point it was stopped by a telephone pole.

He will never know how Harry Potter turns out! (A witness said he heard it playing in the crashed Tesla)


----------



## LAuberX

Fuzzyelvis said:


> The point is many idiots who buy one of these and hear the term will do just what this guy did.


Peeps will buy this car BECAUSE of this! Any publicity is good publicity!


----------



## Lincoln Navigator L

Fuzzyelvis said:


> The point is many idiots who buy one of these and hear the term will do just what this guy did.


That right there is the exception to the rule of "slippery slope" being a logical fallacy. Because once words stop having objective meaning, idiotic and mistaken "reasoning" is elevated and chaos ensues.


----------



## lubi571

RamzFanz, thanks for the sales pitch video. I'm sure these professors at MIT have been working on these issues as long as anyone. I'm not pretending to be an expert on this technology, just passing along information. It's clear to me that after 10 years of programming, the car could not detect a truck crossing in front of it. How they will address a policeman's hand in the air or some other scenario is not simple; maybe it's possible one day, but not according to some who are smarter than both of us.


----------



## RamzFanz

lubi571 said:


> RamzFanz, thanks for the sales pitch video. I'm sure these professors at MIT have been working on these issues as long as anyone. I'm not pretending to be an expert on this technology, just passing along information. It's clear to me that after 10 years of programming, the car could not detect a truck crossing in front of it. How they will address a policeman's hand in the air or some other scenario is not simple; maybe it's possible one day, but not according to some who are smarter than both of us.


Seriously....what?!?

The technology that MIT offered is amazing.

The person who killed themselves is crayons. You do grasp that was human caused, no?


----------



## RamzFanz

LAuberX said:


> thanks for correcting your mistake about "programming"
> 
> The car was self driving right up to the point it was stopped by a telephone pole.
> 
> he will never know how Harry Potter turns out!


If this site won't stop you from lying, I will.

You're a liar.

No self driving car has ever caused an accident. I know this site has become taxi and union leaning, but what you claim has never happened. Ever. 1,200,000 humans dead from human driving each year since inception of the SDC, 0 from computer driving.

Cry, wiggle, *****...you lose every human argument.


----------



## LAuberX

RamzFanz said:


> If this site won't stop you from lying, I will.
> 
> You're a liar.
> 
> No self driving car has ever caused an accident. I know this site has become taxi and union leaning, but what you claim has never happened. Ever. 1,200,000 humans dead from human driving each year since inception, 0 from computer driving.
> 
> Cry, wiggle, *****...you lose every human argument.


The Tesla had autopilot engaged, it was self driving and got into an accident.

The computer was operating the Tesla while the driver watched Harry Potter.


----------



## RamzFanz

LAuberX said:


> The Tesla had autopilot engaged, it was self driving.


The Tesla isn't even capable of self driving so how did that happen? Do you even know the definition of self driving?

You're a moderator? You are free to spread lies? What kind of site is this?

I'm getting tired of moderators who promote positions with no knowledge. I will probably get banned again for calling you cabbies out on what is supposed to be an Uber drivers forum, but prove me wrong.

Notice to all moderators everywhere: Don't comment on subjects you know absolutely nothing about or you will undermine your authority.


----------



## LAuberX

http://www.wired.com/2015/10/tesla-self-driving-over-air-update-live/

Yep, self driving!


----------



## RamzFanz

LAuberX said:


> The Tesla had autopilot engaged, it was self driving.
> 
> The computer was operating the Tesla while the driver watched Harry Potter.


The Tesla S IS NOT CAPABLE OF SELF DRIVING. At no time and in no way has any person involved said it was. It doesn't have the sensors that would even allow it to be self driving.

STOP spreading lies, or continue and devolve this site to uselessness.


----------



## LAuberX

Another Tesla self driving video:


----------



## AllenChicago

observer said:


> Well since a self driving car has already gone coast to coast,
> 
> http://www.gizmag.com/delphi-drive-completed/36859/
> 
> I would pick the car.


You're far braver than I am, LOL!


----------



## AllenChicago

LAuberX said:


> The Tesla had autopilot engaged, it was self driving and got into an accident.
> 
> The computer was operating the Tesla while *the driver watched Harry Potter*.


The truck driver said that after the accident, he could hear a Harry Potter movie still playing. I'm not sure if the Tesla betrayed its owner, Joshua Brown... or if Joshua Brown put too much faith in his Tesla's abilities. I'm sure all parties involved with the car and the Autopilot will blame 100% of the accident on Mr. Brown's "blatant negligence".

Harry Potter was playing at the time of the crash: *http://nypost.com/2016/07/02/tesla-driver-killed-in-autopilot-crash-was-watching-harry-potter/*


----------



## DriverX

RamzFanz said:


> Would you care to define a self-driving car? Do you realise that an actual SDC would have seen that truck in at least 3 different ways? Do you know those ways?
> 
> Who operates a Tesla in auto-pilot? The human driver or the car?
> 
> Does Tesla allow a driver to watch DVD's in lane assist mode? Remove his/her hands from the steering wheel? Why or why not?


LOL read the thread fanboy


----------



## Lincoln Navigator L

"Self driving" does not accurately describe Tesla's autopilot feature.

News media outlets and bloggers having used the wrong term does not make autopilot and self driving the same thing.

The purpose of autopilot is to reduce operator fatigue. It's not to take over 100% of the driving task.

Tesla's own website is very measured in its description of autopilot, that it's an "incremental" step towards self-driving, meaning that it's only partially self driving, and not self-driving. The same way flour by itself is not a cake, but it's a step in the right direction towards a cake.

But you know what, it's like movie advertisements which partially quote film critics. The critic says the movie is "fantastic in its awfulness," and the movie advertisement only quotes "fantastic."

Another quote from the same Tesla web page: "While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car."


----------



## DriverX

Michael - Cleveland said:


> 1 death in 130,000,000 miles (autonomous) vs. 1 death in 93,000,000 miles.
> Those are the stats, thus far.
> http://nymag.com/selectall/2016/07/...self-driving-car-death.html?mid=twitter_nymag


That's a BS quote from Musk. Those numbers are not all from the released beta test. They don't have that many cars using the tech yet to generate that mileage.


----------



## lubi571

RamzFanz, I do grasp that it was human error on multiple fronts. The driver overestimated the capabilities of the software (call it what you want). The programmers need to figure out what a truck crossing in front of them means. Human error in judgment, human error in programming.

You do realize that if no accident had occurred, everyone with a vested interest would be touting this ride as a great success: "The driver was watching a movie while going from point A to B without mishap."


----------



## Miguel Aprender

I can't wait for this technology to become good. Once it is working correctly, it really will save a bunch of lives.

But what happens when the auto-pilot needs to make ethical decisions and sacrifice the driver to save two pedestrians? 
https://www.technologyreview.com/s/539731/how-to-help-self-driving-cars-make-ethical-decisions/


----------



## painfreepc

This whole thing is really starting to PMF. I don't care if he was speeding; the Autopilot technology drove straight into a damn truck. And even if the car was speeding, why is the technology allowed to stay engaged way above the speed limit?


----------



## RamzFanz

DriverX said:


> LOL read the thread fanboy


Exactly.


----------



## painfreepc

RamzFanz said:


> Yes, that is correct. The driver is 100% responsible for driving a Tesla. The driver assist, well, ASSISTS, when it can. It's a very rudimentary feature that helps when it can, period. There is NOTHING self driving about these features. Nada, none, zip, zero.
> 
> If it were self driving, why would they insist you keep your hands on the wheel and drive?


So you're actually saying that Tesla has no responsibility if the technology they are promoting in their hundred-thousand-dollar car cannot see an object in front of it? So you're saying it does not need to be reliable at all? Then what is the point of this cruise control that is supposed to automatically slow down or brake the car if an object gets too close to the front of it?

I have normal cruise control in my 2015 Ford Fusion SE hybrid; I step the speed up or down manually as needed. What is the point of the computerized cruise control if it's not going to do what it's supposed to do? This s*** makes absolutely no sense.

I hope this man has family, and I hope they can sue Tesla for everything they have. I would like to see the whole company go out of business; this technology is not needed..


----------



## Jermin8r89

Ok, we already know how it's gonna happen! This thread is starting to get pointless... humans are mistake-prone, machines are not. Machines slowly take our lives, and in 100 years humans will be extinct and AI will rule the world. It's already happening; we humans are stupid and lazy, nothing else to say.


----------



## painfreepc

Jermin8r89 said:


> Ok, we already know how it's gonna happen! This thread is starting to get pointless... humans are mistake-prone, machines are not. Machines slowly take our lives, and in 100 years humans will be extinct and AI will rule the world. It's already happening; we humans are stupid and lazy, nothing else to say.


There will never be AI. Stop dreaming; it will never happen.
It will only happen when we can use the human brain as a processor for a computer.

That brain will most likely have no human experience;
there may even be room to argue that that is not truly AI.


----------



## FrankMartin

Bart McCoy said:


> are you serious?


Dead serious


----------



## FrankMartin

RamzFanz said:


> Nope. The driver isn't taking over, the driver is ALWAYS in charge. These are driver assist functions that back up the driver, NONE ever actually drives.


Well then the Tesla auto-whatever is pointless and Musk is an idiot.


----------



## Zoplay

observer said:


> I would not attribute this to the car but to driver error.
> 
> The driver should have been paying attention.


You are right, my friend. Whenever an accident happens on the road, it is because of the drivers only, since they are the ones responsible for taking care of their car and their passengers.


----------



## Hunt to Eat

What could possibly go wrong?
Oh wait...never mind.


----------



## uberdriverfornow

Michael - Cleveland said:


> 1 death in 130,000,000 miles (autonomous) vs. 1 death in 93,000,000 miles.
> Those are the stats, thus far.
> http://nymag.com/selectall/2016/07/...self-driving-car-death.html?mid=twitter_nymag


Is it safe to assume you're not seriously believing the word of the CEO with regard to accident data that is being released after a fatality?


----------



## painfreepc

Blog source
https://www.teslamotors.com/blog/tragic-loss

*A Tragic Loss*
The Tesla Team June 30, 2016
We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot "is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle" while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to "Always keep your hands on the wheel. Be prepared to take over at any time." The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla's mission. We would like to extend our deepest sympathies to his family and friends.
-------------------------------------------------------------------------

*So I guess it's safe to assume that Tesla will be looking for even more human guinea pigs, please put down your hundred thousand dollars and sign up to be a crash test dummy.*


----------



## Miguel Aprender

painfreepc said:


> It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things,


Whelp, it finally has happened. My habit of not reading, really not even thinking, and just clicking software [OK] boxes now has the potential to kill me. <TESLA spin doctors, both hands in air> "Hey, I mean, he clicked the OK box, it's not our fault!"

(Edit - Spin Doctors. Now there's a 90's band that I don't think I'll buy on iTunes.)


----------



## Michael - Cleveland

uberdriverfornow said:


> Is it safe to assume you're not seriously believing the word of the CEO with regard to accident data that is being released after a fatality?


I've no idea what difference it makes to you what I believe or don't believe. (and I don't care)

Facts are facts:
In the US, there are *1.08 deaths* per *100 million vehicle* miles traveled
The Tesla involved in the accident reported above was not a fully autonomous vehicle
(see the 7/5/16 Washington Post article "The Tesla Didn't Really Crash Itself")

Since "self driving" cars are regulated by each state for permission to drive on public streets, I see no reason to not take the number of miles they have driven, as reported, at face value.

And then, there's this: "Adding Some Statistical Perspective..." from Forbes 7/5/16

You can draw your own conclusions.


----------



## observer

If the car slows down after not detecting hands on the steering wheel, what happened?

I think this driver figured out a way to bypass this feature, perhaps by winding something tightly around the steering wheel.


----------



## LAuberX

observer said:


> If the car slows down after not detecting hands on the steering wheel, what happened?
> 
> I think this driver figured out a way to bypass this feature, perhaps by winding something tightly around the steering wheel.


In all the YouTube videos, I've never seen any mention of the car slowing down when they have no hands on the wheel... I wonder how long it takes to sense there is nobody holding the wheel? An hour? lol


----------



## Miguel Aprender

I'm so sorry for the pun. But I'm a dad, I have to.

I wonder if there is a savior detector unit for when Jesus takes the wheel.


----------



## observer

There were questions about Autopilot's safety before the accident.

http://mobile.reuters.com/article/idUSKCN0UO0NM20160110


----------



## Miguel Aprender

I mean, that question has been on everyone's mind since the advent of self-driving cars..

I did like this quote from the article.

_"In a bold pronouncement characteristic of Musk, who also has interests in space travel, he said advances in the "summon" technology would allow Tesla owners to summon their cars from New York to Los Angeles within two years. " _​


----------



## RamzFanz

painfreepc said:


> So you're actually saying that Tesla has no responsibility if the technology They are promoting in their hundred-thousand-dollar car cannot see an object in front of it, so you saying it does not need to be reliable at al, so what is the point of this cruise control that is supposed to automatically slow down or brake the car if an object gets too close to the front of it,
> 
> I have normal cruise control in my 2015 Ford Fusion SE hybrid, I step up to speed or step down the speed manually as needed, what is the point of the computerized cruise control if it's not going to do what they're supposed to do, this s*** makes absolutely no sense,
> 
> I hope this man has family I hope they can suit tesla for everything they have, i would like to see the whole company go out of business, this technology is not needed..


1,200,000 deaths a year screams that it is indeed needed.

The Tesla ships _without_ the autopilot features activated. You have to seek them out to have them activated, and you are warned over and over that they are beta and _not_ self driving. So, no, they won't sue and win, because it wasn't Tesla's fault the owner abused the equipment.


----------



## RamzFanz

FrankMartin said:


> Well then the Tesla auto-whatever is pointless and Musk is an idiot.


It is indeed useless at this point. It's in beta and is not even activated when you buy the car.


----------



## painfreepc

RamzFanz said:


> 1,200,000 deaths a year screams that it is indeed needed.
> 
> The Tesla ships _without_ the autopilot features activated. You have to seek out to have them activated and you are warned over and over they are beta and _not_ self driving. So, no, they won't sue and win because it wasn't Tesla's fault the owner abused the equipment.


I hope his family has enough money for this to have a serious court trial.

Number 1: why is this Autopilot allowed to stay engaged after the car greatly exceeds the legal speed limit?

Number 2: why is the Autopilot allowed to stay engaged for a length of time after you remove your hands from the steering wheel? It should disengage almost immediately, and you as a driver would immediately get the point: hey, this is not a self-driving car, is it?

But of course, disengaging the Autopilot too soon wouldn't give any real feedback to let them know how the autonomous features are actually working. As far as I'm concerned, the company is using the drivers as human crash test dummies..


----------



## painfreepc

RamzFanz said:


> It is indeed useless at this point. It's in beta and is not even activated when you buy the car.


What is the point of specifying that the Autopilot is not activated? All you do is accept it and check a box.

They're out there with the rest of us driving cars. This truck driver could have been killed. If it's not ready, then it has no business being available to the public..

I do not wish to die because some idiot was driving down the street at 80, 90, 100-plus miles an hour with his Autopilot engaged..

What would the company tell my mom or my girlfriend?
"We're sorry for your loss, but it's not our fault because the driver checked the box."

I can imagine them now, sitting in the next big board meeting, breathing a sigh of relief because only the driver was killed.


----------



## observer

LAuberX said:


> In all the YouTube videos, I've never seen any mention of the car slowing down when they have no hands on the wheel... I wonder how long it takes to sense there is nobody holding the wheel? An hour? lol





painfreepc said:


> I hope his family has enough money for this to have a serious court trial.
> 
> Number 1: why is this Autopilot allowed to stay engaged after the car greatly exceeds the legal speed limit?
> 
> Number 2: why is the Autopilot allowed to stay engaged for a length of time after you remove your hands from the steering wheel? It should disengage almost immediately, and you as a driver would immediately get the point: hey, this is not a self-driving car, is it?
> 
> But of course, disengaging the Autopilot too soon wouldn't give any real feedback to let them know how the autonomous features are actually working. As far as I'm concerned, the company is using the drivers as human crash test dummies..


Both of you raise valid points. This car is constantly sending information back to Tesla in real or near real time.

Tesla knew this driver was speeding. Tesla knew his hands were off the steering wheel for long periods of time. Tesla bears some responsibility in this respect.

Where does Tesla's responsibility begin and end?


----------



## RamzFanz

painfreepc said:


> What is the point of specifying that the Autopilot is not activated? All you do is accept it and check a box.
> 
> They're out there with the rest of us driving cars. This truck driver could have been killed. If it's not ready, then it has no business being available to the public..
> 
> I do not wish to die because some idiot was driving down the street at 80, 90, 100-plus miles an hour with his Autopilot engaged..
> 
> What would the company tell my mom or my girlfriend?
> "We're sorry for your loss, but it's not our fault because the driver checked the box."
> 
> I can imagine them now, sitting in the next big board meeting, breathing a sigh of relief because only the driver was killed.


No, you don't just have to check a box, you have to pay thousands of dollars and agree to multiple warnings. This includes another warning when you turn it on.

130,000,000 miles with autopilot on, in beta, and one death, not caused by the car. Worldwide, there is one death per 60,000,000 miles from human driving. Tesla, in beta, is already over twice as safe as humans. Quite frankly, if someone is going to drive 80-100 I would prefer they have autopilot on.
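For what it's worth, the ratio claimed above can be checked with back-of-the-envelope arithmetic. Both figures are the ones quoted in this thread (and disputed by other posters), not independently verified:

```python
# Figures as quoted in the thread, not independently verified.
autopilot_miles = 130_000_000       # claimed total miles on Autopilot
autopilot_deaths = 1                # the single fatality discussed here
human_miles_per_death = 60_000_000  # quoted worldwide human-driving average

autopilot_miles_per_death = autopilot_miles / autopilot_deaths
ratio = autopilot_miles_per_death / human_miles_per_death
print(f"{ratio:.2f}x the human miles-per-fatality figure")  # ~2.17x
```

With only one fatality in the sample, the ratio is suggestive at best; a single additional death would roughly halve it.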


----------



## painfreepc

RamzFanz said:


> No, you don't just have to check a box, you have to pay thousands of dollars and agree to multiple warnings. This includes another warning when you turn it on.
> 
> 130,000,000 miles with autopilot on, in beta, and one death, not caused by the car. Worldwide, there is one death per 60,000,000 miles from human driving. Tesla, in beta, is already over twice as safe as humans. Quite frankly, if someone is going to drive 80-100 I would prefer they have autopilot on.


Sorry, I forgot to add the point that you have to pay money for the feature, but I thought that was an understood fact already, so I didn't really have to say it. But anyway:

First of all, as another poster already pointed out, how can it possibly be 130 million miles? They don't have that many cars on the road. Are you sure they're not including simulated miles from the computer along with the real-world miles?

And just for the sake of argument, even if it is 130 million real-world miles, the people who paid money for this feature, checked the box, and read the warning are automatically going to drive a little safer. It's kind of like giving somebody a placebo instead of a real drug.


----------



## naplestom75

observer said:


> Apparently not, the radar thought the trailer was an overhead road sign.


Exactly, the autopilot was unable to correctly identify what was in front of it. People, this is why this will never work unless you set up roads and/or lanes for these vehicles only, where all of the vehicles are self-driving and of like make and model. Otherwise there will always be ways they can be fooled.


----------



## painfreepc

Miguel Aprender said:


> Why would anything but the miles per fatality figure matter? We are so litigious and emotional about our vehicles and driving.
> 
> I should take the time to calculate the total costs to society if we took someone who has had their second accident in the same year and buy them a self-driving car possibly instead of jail time for DUI. It's not the Teslas that I am worried about, it's the drunk 17 year old in the IROC Z.
> 
> On the topic of like make and model: perhaps there is an opportunity for a similar "internet of vehicles" (IoV), where vehicles recognize obstacles and dangerous situations in a standardized way (not depending on vehicle make). Software would have the advantage of seeing a lot more than we do from the cockpit.
> 
> Speaking of cockpits, I assume that none of you mind that over 90% of the time, the commercial plane you are flying in is on autopilot.


I am getting so sick and tired of hearing about the airplane autopilot.
There are no parked tractor-trailers in the sky,
there are no pedestrians crossing the street in the sky,
there are no obstacles of any kind in the sky except for other planes, and that's what air traffic control is for. So stop with the airplane autopilot; it has not a damn thing to do with a car's autopilot system.


----------



## Miguel Aprender




----------



## RamzFanz

painfreepc said:


> First of all, as another poster already pointed out, how can it possibly be 130 million miles? They don't have that many cars on the road. Are you sure they're not including simulated miles from the computer along with the real-world miles?


??? I don't know where that idea is coming from. There are over a hundred thousand autopilot-capable cars on the road.



painfreepc said:


> And just for the sake of argument, even if it is 130 million real-world miles, the people who paid money for this feature, checked the box, and read the warning are automatically going to drive a little safer. It's kind of like giving somebody a placebo instead of a real drug.


No, actually, it's pretty well known that recklessness is common. There are many videos out there demonstrating it; there are even videos of Tesla drivers sleeping. This crash is a perfect example.


----------



## uber strike

We already have self-driving cars, apparently. With more and more pay cuts, drivers have to work longer hours to make the same money. Look at this driver asleep at the wheel...


----------



## DriverX

painfreepc said:


> I hope his family has enough money for this to go to a serious court trial.
> 
> Number 1: why is this autopilot allowed to stay engaged after the car greatly exceeds the legal speed limit?
> 
> Number 2: why is the autopilot allowed to stay engaged for any length of time after you remove your hands from the steering wheel? It should disengage almost immediately, and then you as a driver would immediately get the point: hey, this is not a self-driving car, is it?
> 
> But of course disengaging the autopilot too soon wouldn't give them any real feedback on how the autonomous features are actually working. As far as I'm concerned, the company is using the drivers as human crash test dummies.


You got it. They can't totally NERF the autopilot if they want people using it, either. It's gotta be fun, right? And danger adds excitement, which equals adrenaline rush and feels fun. Look at me and my self-driving Tesla! I'm the most futuristic guy on the road! SMASH

Anyway, that's what beta test means: you're a guinea pig, but if their software crashes, so do you!


----------



## DriverX

uber strike said:


> we already have self driving cars apparently. with more and more pay cuts drivers have to work longer hours to make the same money. look at this driver asleep at the wheel...


Was he parked legit? Maybe just on a break, but it's true, after 11 hours on the road I'm fading fast.


----------



## DriverX

Miguel Aprender said:


> Why would anything but the miles per fatality figure matter? We are so litigious and emotional about our vehicles and driving.
> 
> I should take the time to calculate the total costs to society if we took someone who has had their second accident in the same year and buy them a self-driving car possibly instead of jail time for DUI. It's not the Teslas that I am worried about, it's the drunk 17 year old in the IROC Z.
> 
> On the topic of like make and model: perhaps there is an opportunity for a similar "internet of vehicles" (IoV), where vehicles recognize obstacles and dangerous situations in a standardized way (not depending on vehicle make). Software would have the advantage of seeing a lot more than we do from the cockpit.
> 
> Speaking of cockpits, I assume that none of you mind that over 90% of the time, the commercial plane you are flying in is on autopilot.


It's not a valid comparison. I used to set the autopilot on my boat and go below to open a bottle of wine. In the ocean, as in the air, there is nothing around you for miles.

The software and hardware you describe are being developed by Google, but they are years away from working perfectly; they are also already far beyond Tesla's autopilot.

The numbers Musk gave are false. There aren't enough Teslas in the beta test now to have generated that many test miles.


----------



## uber strike

DriverX said:


> was he parked legit? Maybe just on a break, but it's true after 11 hours on the road I'm fading fast.


not parked legit. look at the curve. it's red.


----------



## DriverX

RamzFanz said:


> No, you don't just have to check a box, you have to pay thousands of dollars and agree to multiple warnings. This includes another warning when you turn it on.
> 
> 130,000,000 miles with autopilot on, in beta, and one death, not caused by the car. Worldwide, there is one death per 60,000,000 miles from human driving. Tesla, in beta, is already over twice as safe as humans. Quite frankly, if someone is going to drive 80-100 I would prefer they have autopilot on.


Those numbers aren't correct. There aren't nearly enough Teslas in the beta to generate that many miles. Only 90,000 Model S have been delivered. If every one of them were running the paid beta autopilot, they would each need to have driven about 1,450 miles in autopilot.

How many are in the very expensive paid beta? 15%? Can you see how the facts we actually know do not support such a claim as you and Musk have made?
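The per-car mileage implied by the disputed figure can be checked directly. Both numbers below are the ones quoted in this thread, not verified facts:

```python
# Implied average Autopilot miles per delivered Model S,
# using the thread's own (disputed) figures.
total_autopilot_miles = 130_000_000
model_s_delivered = 90_000

miles_per_car = total_autopilot_miles / model_s_delivered
print(f"~{miles_per_car:.0f} Autopilot miles per car")  # ~1444
```

So the 130M figure requires roughly 1,450 Autopilot miles per delivered car; whether that is plausible depends on the opt-in rate, which neither poster actually knows.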


----------



## DriverX

uber strike said:


> not parked legit. look at the curve. it's red.


curb? couldn't see it on the vertically challenged dark video. thx


----------



## RamzFanz

DriverX said:


> Those numbers aren't correct. There aren't nearly enough Teslas in the beta to generate that many miles. Only 90,000 Model S have been delivered. If every one of them were running the paid beta autopilot, they would each need to have driven about 1,450 miles in autopilot.
> 
> How many are in the very expensive paid beta? 15%? Can you see how the facts we actually know do not support such a claim as you and Musk have made?


Expensive? These are $80,000 cars, so yes, I could see a lot of drivers opting in for a couple of grand. Not only that, but they gave it away for free to every car for a month. The X is also autopilot-capable. I have no idea how many have opted in; do you?

Next we need to find a motive for them to tell a blatant lie.


----------



## painfreepc

RamzFanz said:


> Expensive? These are $80,000 cars so, yes, I could see a lot of drivers opting in for a couple of grand. Not only that but they gave it away for free to every car for a month. The X is also autopilot capable. I have no idea how many have opted in, do you?
> 
> Next we need to find a motive for them to tell a blatant lie.


You're absolutely right, there's no motive in trying to accelerate a technology from which they could potentially make billions. You're right, that's no motive at all...


----------



## DriverX

RamzFanz said:


> Expensive? These are $80,000 cars so, yes, I could see a lot of drivers opting in for a couple of grand. Not only that but they gave it away for free to every car for a month. The X is also autopilot capable. I have no idea how many have opted in, do you?
> 
> Next we need to find a motive for them to tell a blatant lie.


The motive's obvious. Why do you think they waited 2 months to reveal the problem?

Musk is the millennial version of Trump. He'll be running for king of space soon enough.

http://www.huffingtonpost.com/entry/elon-musk-fortune_us_577bc4b0e4b041646410a3c0


----------



## Gibman73

Is it the immediate solution? No. Is it perfect? No. But this is the result of one incident, and one in which the owner was allegedly not following safe protocol. How many other Tesla owners out there have just been lucky so far engaging in similar habits? Hopefully this is a wake-up call for them.

This tech may or may not be 10 years out, but it will be prevalent before ya know it. I don't entirely wish to give up driving, but getting from San Francisco to Oakland on any given weekday afternoon would be a ton more relaxing if a lot of the cars could handle the autonomous task. Humans are horrible in rush hour: we cut each other off; we hug the bumper in front of us and refuse to look left or right, so we don't see the guy with his blinker on trying to get into our lane; we fill up intersections when we know we're not going anywhere for at least 3 red light cycles, because we're doing it for 3 more traffic lights ahead of us too; we're horrible at zipper merges; we can't follow each other for shit in traffic. Like it or not, it is right around the corner, and I see a ton of drivers every day who would be better off if the robot could take over.


----------



## RamzFanz

painfreepc said:


> You're absolutely right there's no motive in trying to accelerate a technology that they could potentially make billions, you're right that's no motive at all..


Tesla reports both the data gathered with the autopilot on and the data collected when it's off, so there is a clear line of demarcation between the two. In reality, though, there is no difference between them as far as the system's testing and programming are concerned: simulators have no idea whether miles are live and real or replays of real-world data.

A startup like Tesla telling a whopping lie has no tangible upside. It would be a stock and company crusher if it came out. The "acceleration", not sure how that works if it's not real data, would be stopped dead if they were lying about what they were doing and could have very serious criminal stock fraud implications. I also have more faith in Elon Musk's honesty than most CEOs, but hey, I'm an admitted fanboy.


----------



## painfreepc

RamzFanz said:


> A startup like Tesla telling a whopping lie has no tangible upside. It would be a stock and company crusher if it came out. The "acceleration", not sure how that works if it's not real data, would be stopped dead if they were lying about what they were doing and could have very serious criminal stock fraud implications. I also have more faith in Elon Musk's honesty than most CEOs, but hey, I'm an admitted fanboy.


You sound like one of those people who talk like there are no shades of grey, like everything is either black or white.

Why do the numbers have to be a bald-faced lie? How about maybe they're just bending the truth a little bit?

The miles being quoted: are these miles actually driven by the customers who bought the car and have it on autopilot, or do these miles include Tesla's own testing?

And I'm sorry if this makes fanboys like you upset, but I still say that if this car could not see a tractor-trailer on the highway, there's a serious problem. I don't care if he was watching Harry Potter; I don't care if he was speeding.

If Tesla's car technology can't see an 18-wheel truck crossing the road in broad daylight, then as they said in the movie Apollo 13, "Houston, we have a problem."

Having a driver check a box with a warning saying, hey, this technology may not work, so keep your hands on the wheel, and then putting that car and driver on the road with me? I did not check a box giving my OK. I have driven a car now for 40 years, I have never had one single accident, and I do not wish to die because some fool is using an autopilot which the company itself says may not be 100% ready. This is very irresponsible, and this CEO's head needs to roll.

As I said earlier, only the driver died, and yes, the driver was an idiot. So I guess we're going to have to wait until this autopilot technology crashes into an innocent family's car and kills everybody inside; and by the way, let's not forget, they would not have checked the box.


----------



## Miguel Aprender

painfreepc said:


> [....]
> If Tesla's car technology can't see an 18-wheel truck crossing the road in broad daylight, then as they said in the movie Apollo 13, "Houston, we have a problem."
> 
> Having a driver check a box with a warning saying, hey, this technology may not work, so keep your hands on the wheel, and then putting that car and driver on the road with me? I have driven a car now for 40 years, I have never had one single accident, and I do not wish to die because some fool is using an autopilot which the company itself says may not be 100% ready. This is very irresponsible, and this CEO's head needs to roll.
> 
> As I said earlier, only the driver died, and yes, the driver was an idiot. So I guess we're going to have to wait until this autopilot technology crashes into an innocent family's car and kills everybody inside; and by the way, let's not forget, they would not have checked the box.


People smash their cars into trees they don't see all the time. Or into moms with strollers, etc. What is so heinous about this situation is that it was automated, and our standards for an automated car are WAY higher than those for a new teenage driver. Thank God it was a truck, and that the driver was the only fatality.

I bet in the time I have written this response, somewhere in the US, a dumb driver has crashed into an innocent family's car.

I agree with you: the idea that having someone who just bought their $70k car check a checkbox somehow makes the company not responsible for a stupid vehicle operator is totally hilarious to me. So off-base.

Statistical evidence seems to support that automated vehicles are safe enough right now. We are learning that we as a society are not ready to accept them yet. They need to rise to our expected level of perfection first. I am pretty sure they can get there - to the point of saving a bunch of lives. I am willing to give my prediction a decade to come into the mainstream.

Remember 10 years ago the state of tech? I find it surprising to note that the release of the first iPhone is less than a decade old. Tech will continue to amaze us.


----------



## RamzFanz

DriverX said:


> why do you think they waited 2 months to reveal the problem?


I'm confused... did his family not know he was dead? The police? Reporters? No one noticed? Or is this a tinhatter moment where Tesla paid them all off or threatened them?

Why would I wait 2 months? To get the details. You didn't notice all of the inaccurate reporting? First "self-driving car" death and all that? Ohhhh, wait, that's what you did too, isn't it?


----------



## RamzFanz

painfreepc said:


> You sound like one of those people who talk like there are no shades of grey, like everything is either black or white.
> 
> Why do the numbers have to be a bald-faced lie? How about maybe they're just bending the truth a little bit?
> 
> The miles being quoted: are these miles actually driven by the customers who bought the car and have it on autopilot, or do these miles include Tesla's own testing?
> 
> And I'm sorry if this makes fanboys like you upset, but I still say that if this car could not see a tractor-trailer on the highway, there's a serious problem. I don't care if he was watching Harry Potter; I don't care if he was speeding.
> 
> If Tesla's car technology can't see an 18-wheel truck crossing the road in broad daylight, then as they said in the movie Apollo 13, "Houston, we have a problem."
> 
> Having a driver check a box with a warning saying, hey, this technology may not work, so keep your hands on the wheel, and then putting that car and driver on the road with me? I did not check a box giving my OK. I have driven a car now for 40 years, I have never had one single accident, and I do not wish to die because some fool is using an autopilot which the company itself says may not be 100% ready. This is very irresponsible, and this CEO's head needs to roll.
> 
> As I said earlier, only the driver died, and yes, the driver was an idiot. So I guess we're going to have to wait until this autopilot technology crashes into an innocent family's car and kills everybody inside; and by the way, let's not forget, they would not have checked the box.


Oh sure, the car failed. No doubt about it. That's why they warned him it would. It's in beta; it's not a standard feature but an opt-in trial, and if you don't use it correctly, then, like with all things, the bad things they told you would happen, will.


----------



## DriverX

RamzFanz said:


> I'm confused... did his family not know he was dead? The police? Reporters? No one noticed? Or is this a tinhatter moment where Tesla paid them all off or threatened them?
> 
> Why would I wait 2 months? To get the details. You didn't notice all of the inaccurate reporting? First "self-driving car" death and all that? Ohhhh, wait, that's what you did too, isn't it?


What are you blabbering about? Did you even read the article I posted? Tesla made no public announcement and gave no additional warnings to drivers with autopilot that this had happened, and HOW it happened, until 2 months after the fact. End of story.

Thanks for the heads up, Elon!


----------



## Laughingatyoufoolsdaily

Apparently one of the safety features missing on the Tesla is "recline"


----------



## Miguel Aprender

Click [Accept] or [Recline]


----------



## Fireguy50

painfreepc said:


> You sound like one of those people who speak like there's no Shades of Grey everything is either black or white


Sounds like RamzFanz is causing a stir. I put him on ignore, so my experience here is more peaceful: I only see the multiple replies about what he thinks/assumed/said, where he read it, or the accusation that he misread your post.
Life is easier with ignore.


----------



## RamzFanz

DriverX said:


> What are you blabbering about? Did you even read the article I posted? Tesla made no public announcement and gave no additional warnings to drivers with autopilot that this had happened, and HOW it happened, until 2 months after the fact. End of story.
> 
> Thanks for the heads up, Elon!


What would they say if they didn't know the answers of how it happened? Don't watch DVDs or speed? Pay attention like you agreed to a dozen times before we activated it?

As much as you want to sensationalise this, it's nothing new and changes nothing. It's not a self driving car, not even close even though you tried to claim it was. The driver is responsible for safe operation. Period. It's abundantly clear, there is no gray area. A man killed himself being reckless, end of story.

I did read the article. What's your point? That Musk doesn't appreciate people who sensationalise tragedies for clicks?

Two humans made mistakes. The truck driver and the Tesla driver. The truck driver being the most culpable. The Tesla beta system did not work as hoped, but the entire world knew it wouldn't work 100% because Tesla has been completely forthcoming about its capabilities. So far, autopilot is at least 30% safer than humans and so far, it hasn't caused an accident.


----------



## RamzFanz

Laughingatyoufoolsdaily said:


> Apparently one of the safety features missing on the Tesla is "recline"


----------



## DriverX

RamzFanz said:


> What would they say if they didn't know the answers of how it happened? Don't watch DVDs or speed? Pay attention like you agreed to a dozen times before we activated it?
> 
> As much as you want to sensationalise this, it's nothing new and changes nothing. It's not a self driving car, not even close even though you tried to claim it was. The driver is responsible for safe operation. Period. It's abundantly clear, there is no gray area. A man killed himself being reckless, end of story.
> 
> I did read the article. What's your point? That Musk doesn't appreciate people who sensationalise tragedies for clicks?
> 
> Two humans made mistakes. The truck driver and the Tesla driver. The truck driver being the most culpable. The Tesla beta system did not work as hoped, but the entire world knew it wouldn't work 100%. So far, autopilot is at least 30% safer than humans and so far, it hasn't caused an accident.


You and Elon share a penchant for pulling numbers out of your ass. I never claimed that Tesla was at fault. I maintain it would be very easy for any lawyer to make a good case for why they are partially responsible, which will inevitably mean a lawsuit and/or settlement, if there hasn't already been some cash payment made in secret to the victim's family. Who knows, though: if Elon doesn't make real nice with the NHTSA, they might force a recall and fines.


----------



## RamzFanz

DriverX said:


> You and Elon share a penchant for pulling numbers out of your ass. I never claimed that Tesla was at fault. I maintain it would be very easy for any lawyer to make a good case for why they are partially responsible, which will inevitably mean a lawsuit and/or settlement, if there hasn't already been some cash payment made in secret to the victim's family. Who knows, though: if Elon doesn't make real nice with the NHTSA, they might force a recall and fines.


Yeah, put that on your wishlist with the unicorn letterhead. Cars fail all the time with far less information and zero waivers not in _opt in beta testing_. Shit happens, nothing's perfect, and everyone in the world was told this system was not perfect, would make mistakes, and was in _opt in beta testing_. Josh Brown was _extremely well informed_ on the subject and spent time teaching others in video and in person of the system's capabilities and limitations. He became so familiar, he could consistently predict when the system would return control to him. Who is going to be convinced he didn't know exactly what he was doing and risking? No one. No sensible lawyer will take this case. The trucking company, on the other hand, is more exposed.

They aren't "victims", they are the perpetrators. The two involved parties, the truck driver and Josh Brown, both made horrible decisions. This is why human driving is about to become obsolete.


----------



## painfreepc

I totally agree this driver who was killed was a complete, total idiot; as George Carlin, God bless his soul, used to say, "It's God thinning the herd."

I hope we can agree that the major problem with this technology, since it is in its infancy, is that it enables idiots to be reckless.

I did not check the checkbox that allows an idiot to be reckless with my life, since we are sharing the same road.

My other problem is that this technology should be able to see a truck. If it can't see an 18-wheel truck, it's not ready.

And what is the allowable speed limit for this technology? Can I step on the gas and go a hundred and twenty miles an hour and still have the autopilot engaged? I'm just asking...


----------



## painfreepc

RamzFanz said:


> Yeah, put that on your wishlist with the unicorn letterhead. Cars fail all the time with far less information and zero waivers not in _opt in beta testing_. Shit happens, nothing's perfect, and everyone in the world was told this system was not perfect, would make mistakes, and was in _opt in beta testing_. Josh Brown was _extremely well informed_ on the subject and spent time teaching others in video and in person of the system's capabilities and limitations. He became so familiar, he could consistently predict when the system would return control to him. Who is going to be convinced he didn't know exactly what he was doing and risking? No one. No sensible lawyer will take this case. The trucking company, on the other hand, is more exposed.
> 
> They aren't "victims", they are the perpetrators. The two involved parties, the truck driver and Josh Brown, both made horrible decisions. This is why human driving is about to become obsolete.


Really, you actually believe everyone who loves to drive is going to willingly turn in their driving license and their car? I've been driving for 40 years and never had one accident, not even a goddamn fender bender, and you think I'm going to let Uber's auto car drive me around?


----------



## RamzFanz

Miguel Aprender said:


> What is so heinous about this situation is the fact that it was automated and our standards for an automated car are WAY higher than that of a new teenage driver.


It's not automated in the sense of autonomous. These are just driver assist features akin to cruise control or anti-lock brakes. These are also driver assist features _in testing_.



painfreepc said:


> I hope we can agree that the major problem with this technology, since it is in its infancy, is that it enables idiots to be reckless.
> 
> I did not check the checkbox that allows an idiot to be reckless with my life, since we are sharing the same road.


Yep. I agree. Musk may know technology, but he doesn't know human nature. The biggest roadblock to SDCs now is Tesla because the media and the naysayers don't know the difference, or choose to ignore the difference, to play gotcha.

You check the _I accept reckless idiots_ box every time you drive. It's just a fact of life but I agree, I don't want so much driver assist for humans, it's like handing monkeys shotguns, and we become even worse drivers than we already are.

_However_, in fairness, the stats right now appear to show this technology, even in beta, is better than humans. This guy Josh Brown was saved from an accident by his Tesla once already.



painfreepc said:


> My other problem is that this technology should be able to see a truck. If it can't see an 18-wheel truck, it's not ready.


It should have seen the truck. In fact, from what I understand, it did see the truck: the radar saw it, but the camera didn't agree, and, as I understand it, with the Autopilot they have to agree. This is where a true SDC like a Google car is far superior. They have lasers that would have painted the truck and confirmed it, and FLIR, which would also have seen it.
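The "both sensors must agree" behavior described above can be sketched as a tiny consensus check. Everything here (the `Detection` type, `should_brake`, the threshold) is invented for illustration; it is not Tesla's actual code or architecture:

```python
# Minimal sketch of a two-sensor agreement rule, assuming hypothetical
# names and thresholds. Not Tesla's implementation.
from dataclasses import dataclass

@dataclass
class Detection:
    seen: bool          # did this sensor report an obstacle?
    confidence: float   # 0.0 - 1.0

def should_brake(radar: Detection, camera: Detection,
                 threshold: float = 0.5) -> bool:
    """Brake only when both sensors agree an obstacle is present."""
    radar_ok = radar.seen and radar.confidence >= threshold
    camera_ok = camera.seen and camera.confidence >= threshold
    return radar_ok and camera_ok

# The failure mode discussed in this thread: radar reports the trailer,
# but the camera classifies it as an overhead sign, so no consensus.
radar = Detection(seen=True, confidence=0.9)
camera = Detection(seen=False, confidence=0.2)
print(should_brake(radar, camera))  # False: no agreement, no braking
```

This is also why adding an independent third sensor (the lidar mentioned above) changes the picture: a 2-of-3 vote can override a single sensor's misclassification, while a strict 2-of-2 rule cannot.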



painfreepc said:


> And what is the allowable speed limit for this technology can I step on the gas and go a hundred and twenty miles an hour and still have the autopilot engaged I'm just asking..


I know you can't engage autopilot while speeding. I don't know if you can increase the speed over the limit while AP is engaged.

Edit: It does allow you to speed after it is engaged.


----------



## DriverX

RamzFanz said:


> Yeah, put that on your wishlist with the unicorn letterhead. Cars fail all the time with far less information and zero waivers not in _opt in beta testing_. Shit happens, nothing's perfect, and everyone in the world was told this system was not perfect, would make mistakes, and was in _opt in beta testing_. Josh Brown was _extremely well informed_ on the subject and spent time teaching others in video and in person of the system's capabilities and limitations. He became so familiar, he could consistently predict when the system would return control to him. Who is going to be convinced he didn't know exactly what he was doing and risking? No one. No sensible lawyer will take this case. The trucking company, on the other hand, is more exposed.
> 
> They aren't "victims", they are the perpetrators. The two involved parties, the truck driver and Josh Brown, both made horrible decisions. This is why human driving is about to become obsolete.


Google's tech will make Tesla's illegal because it's comparatively unsafe. Josh Brown is lucky he didn't kill innocent bystanders as well as himself. Tesla tracks the beta testers' data; they knew he was speeding, and he posted many YouTube videos in which he displayed himself acting recklessly with autopilot. Tesla could have pulled his access to the beta, but they let him continue, probably because of the free marketing he was doing for them.

There's lots of room to make a claim for negligence in that alone; it's not like Elon is Hillary Clinton or something.


----------



## DriverX

RamzFanz said:


> This guy Josh Brown was saved from an accident by his Tesla once already.


Well, if he had been driving the car himself, he might not have gotten into that close call in the first place. The fact that he was driving around on autopilot all the time and "predicting," as you said, when it would fail makes it pretty clear what kind of a-hole dangerous driving this tech is enabling. In all my time outside the beta, I never once needed autopilot to save me, because I was able to avoid accidents myself.

I'm all for using this tech at slow speeds, especially in parking lots, to auto-brake or whatever. It's just not ready for high-speed driving yet. The beta should be recalled immediately or nerfed.

It's just way too easy for a-holes like Josh to start mixing drinks while DJ'ing in the car on the way to da club.
Consider that roughly 30% of the public has an IQ below 90. Clearly your average Tesla owner is probably 20 points higher, but as soon as Honda knocks off their version, every idiot in America will be running it.
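The "30% below IQ 90" figure above can be sanity-checked against the standard IQ model (scores normally distributed with mean 100 and standard deviation 15):

```python
# Share of the population below IQ 90 under the usual IQ model
# (normal distribution, mean 100, SD 15).
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)
share_below_90 = iq.cdf(90)
print(f"{share_below_90:.1%} of the population below IQ 90")  # ~25.2%
```

So the quoted 30% is a slight overstatement under the standard model, but the order of magnitude holds: roughly a quarter of drivers.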


----------



## painfreepc

DriverX said:


> Well, if he had been driving the car himself, he might not have gotten into that close call in the first place. The fact that he's driving around on autopilot all the time and "predicting," as you said, when it will fail makes it pretty clear what kind of a-hole dangerous driving this tech is enabling. In all my years without this beta, I never once needed autopilot to save me, because I was able to avoid accidents myself.
> 
> I'm all for using this tech at slow speeds, especially in parking lots, to auto-brake or whatever. It's just not ready for high-speed driving yet. The beta should be recalled immediately or nerfed.
> 
> It's just way too easy for a-holes like Josh to start mixing drinks while DJing in the car on the way to da club.
> Consider that 30% of the public has an IQ below 90. Sure, your average Tesla owner is probably 20 points higher, but as soon as Honda knocks off their version, every idiot in America will be running it.


I'm sensing a Ken and John listener


----------



## DriverX

painfreepc said:


> I'm sensing a Ken and John listener


I'm sensing a hover board owner.


----------



## observer

painfreepc said:


> I'm sensing a Ken and John listener


John and Ken.


----------



## RamzFanz

painfreepc said:


> Really? You actually believe everyone who loves to drive is going to willingly turn in their driver's license and their car? I've been driving for 40 years and never had one accident, not even a God damn fender bender, and you think I'm going to let Uber's auto car drive me around?


Nope. Never said that. Driving will become a recreational activity rather than a necessity. There will probably be plenty of rural places that still allow human driving 10-20 years from now. Transportation, though, will be taken over by SDCs to save lives. Urban areas will hold human drivers to higher and higher standards until driving there becomes untenable. Drivers are elevator operators: they were needed when elevators didn't have automation. Today, we don't even think about it.


----------



## RamzFanz

DriverX said:


> Google's tech will make Tesla's illegal because it's comparatively unsafe. Josh Brown is lucky he didn't kill innocent bystanders as well as himself. Tesla tracks its beta testers' data, so they knew he was speeding, and he posted many YouTube videos in which he acted recklessly with Autopilot. Tesla could have pulled his access to the beta, but they let him continue, probably because of the free marketing he was doing for them.
> 
> Lots of room to make a negligence claim in that alone; it's not like Elon is Hillary Clinton or something.


Negligence? So Tesla is supposed to monitor YouTube and make moral judgments?

Josh was well informed, well educated, negligent, reckless, and the victim of a negligent truck driver.

I speed every day. Is the manufacturer of my car supposed to pull it from me because I use their cruise control and anti-lock brakes?

You are presenting contradictory arguments: Tesla's tech is so weak the company won't survive, and yet their cars are to be treated as self-driving?

The driver makes the decisions, just the same as all drivers today. If a driver is wearing his seatbelt, even if he's speeding and not attentive, and the seatbelt then fails to save him, we don't blame the seatbelt. Ironically, autopilot has been shown to be more effective at saving lives than the seatbelt. So we throw it out?


----------



## RamzFanz

DriverX said:


> Well, if he had been driving the car himself, he might not have gotten into that close call in the first place. The fact that he's driving around on autopilot all the time and "predicting," as you said, when it will fail makes it pretty clear what kind of a-hole dangerous driving this tech is enabling. In all my years without this beta, I never once needed autopilot to save me, because I was able to avoid accidents myself.
> 
> I'm all for using this tech at slow speeds, especially in parking lots, to auto-brake or whatever. It's just not ready for high-speed driving yet. The beta should be recalled immediately or nerfed.
> 
> It's just way too easy for a-holes like Josh to start mixing drinks while DJing in the car on the way to da club.
> Consider that 30% of the public has an IQ below 90. Sure, your average Tesla owner is probably 20 points higher, but as soon as Honda knocks off their version, every idiot in America will be running it.


Yes, I figured you had no idea. He had a lifted pickup truck swerve into his blind spot, fully intending to go straight through him to the interstate exit. The car saved him from a high-speed interstate accident.


So, fairly, Josh had his life or quality of life extended by the Tesla S.

He abused his car and crashed, just as many abuse their sportscar and crash. No one blames the sportscar, they blame the idiot.

A-hole driving? Have we banned sport bikes or sports cars? Should we, based on the a-holes who abuse them? How far over any US speed limit can your car go?

I'm just saying, this is not a self-driving car, it is driver assist, and anyone who abuses that takes all the responsibility on himself.

Yes, Tesla is screwing up handing it to hairless monkeys, especially rich ones.


----------



## DriverX

RamzFanz said:


> Yes, I figured you had no idea. He had a lifted pickup truck swerve into his blind spot, fully intending to go straight through him to the interstate exit. The car saved him from a high-speed interstate accident.
> 
> So, fairly, Josh had his life or quality of life extended by the Tesla S.
> 
> He abused his car and crashed, just as many abuse their sportscar and crash. No one blames the sportscar, they blame the idiot.
> 
> A-hole driving? Have we banned sport bikes or sports cars? Should we, based on the a-holes who abuse them? How far over any US speed limit can your car go?
> 
> I'm just saying, this is not a self-driving car, it is driver assist, and anyone who abuses that takes all the responsibility on himself.
> 
> Yes, Tesla is screwing up handing it to hairless monkeys, especially rich ones.


Well, there ya go, the guy was watching a TV show while driving, again. Anyone watching the road would have seen the truck clearly moving over into their lane and avoided it. It only saved him because he wasn't paying attention, as usual.

Most drivers would probably have noticed the truck moving over and anticipated its need to exit, allowing it safe distance just in case, but Tesla's systems aren't forward-thinking, are they? It's a reaction-based system that needs a lot of fine-tuning; so much so, in fact, that Google ditched the approach altogether for a better rig.


----------



## RamzFanz

DriverX said:


> Well, there ya go, the guy was watching a TV show while driving, again. Anyone watching the road would have seen the truck clearly moving over into their lane and avoided it. It only saved him because he wasn't paying attention, as usual.
> 
> Most drivers would probably have noticed the truck moving over and anticipated its need to exit, allowing it safe distance just in case, but Tesla's systems aren't forward-thinking, are they? It's a reaction-based system that needs a lot of fine-tuning; so much so, in fact, that Google ditched the approach altogether for a better rig.


Who knows. It's easy to look at a video and make hindsight assumptions. No, they aren't forward-thinking, and they aren't self-driving as you claimed; they are driver assist.


----------



## painfreepc

"Tesla Autopilot relieves drivers of the most tedious and potentially dangerous aspects of road travel. We're building Autopilot to give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable. While truly driverless cars are still a few years away, *Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. *The driver is still responsible for, and ultimately in control of, the car. What's more, *you always have intuitive access to the information your car is using to inform its actions*."
Source: https://www.teslamotors.com/blog/your-autopilot-has-arrived
-------------------

You say it's like the systems airplane pilots use when conditions are clear. I've been driving for 40 years; when the hell are conditions totally clear on the highway? When does that ever happen?

So your car will give me the information on why it's doing what it's about to do. Let me get this straight: I'm traveling down the freeway at 70 miles an hour, or going around a hairpin turn, and I'm supposed to look at your display to figure out why your car is doing what it's about to do, and then act on that information? Do I act on it before my head goes through the windshield, or afterwards?


----------



## UberEricLong

*https://www.teslamotors.com/blog/misfortune*

*Misfortune*
The Tesla Team July 6, 2016
Fortune's article is fundamentally incorrect.

First, Fortune mischaracterizes Tesla's SEC filing. Here is what Tesla's SEC filing actually says: *"We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims."* [full text included below] This is just stating the obvious. One of the risks facing Tesla (or any company) is that someone could bring product liability claims against it. However, neither at the time of this SEC filing, nor in the several weeks to date, has anyone brought a product liability claim against Tesla relating to the crash in Florida.

Next, Fortune entirely ignores what Tesla knew and when, nor have they even asked the questions. Instead, they simply assume that Tesla had complete information from the moment this accident occurred. This was a physical impossibility given that the damage sustained by the Model S in the crash limited Tesla's ability to recover data from it remotely.

When Tesla told NHTSA about the accident on May 16th, we had barely started our investigation. Tesla informed NHTSA because it wanted to let NHTSA know about a death that had taken place in one of its vehicles. It was not until May 18th that a Tesla investigator was able to go to Florida to inspect the car and the crash site and pull the complete vehicle logs from the car, and it was not until the last week of May that Tesla was able to finish its review of those logs and complete its investigation. When Fortune contacted Tesla for comment on this story during the July 4th holiday, Fortune never asked any of these questions and instead just made assumptions. Tesla asked Fortune to give it a day to confirm these facts before it rushed its story to print. They declined and instead ran a misleading article.

Here's what we did know at the time of the accident and subsequent filing:


- That Tesla Autopilot had been safely used in over 100 million miles of driving by tens of thousands of customers worldwide, with zero confirmed fatalities and a wealth of internal data demonstrating safer, more predictable vehicle control performance when the system is properly used.
- That contrasted against worldwide accident data, customers using Autopilot are statistically safer than those not using it at all.
- That given its nature as a driver assistance system, a collision on Autopilot was a statistical inevitability, though by this point, not one that would alter the conclusion already borne out over millions of miles that the system provided a net safety benefit to society.
- Given the fact that the "better-than-human" threshold had been crossed and robustly validated internally, news of a statistical inevitability did not materially change any statements previously made about the Autopilot system, its capabilities, or net impact on roadway safety.

Finally, the Fortune article makes two other false assumptions. First, they assume that this accident was caused by an Autopilot failure. To be clear, this accident was the result of a semi-tractor trailer crossing both lanes of a divided highway in front of an oncoming car. Whether driven under manual or assisted mode, this presented a challenging and unexpected emergency braking scenario for the driver to respond to. In the moments leading up to the collision, there is no evidence to suggest that Autopilot was not operating as designed and as described to users: specifically, as a driver assistance system that maintains a vehicle's position in lane and adjusts the vehicle's speed to match surrounding traffic.

Fortune never even addresses that point. Second, Fortune assumes that, putting all of these other problems aside, a single accident involving Autopilot, regardless of how many accidents Autopilot has stopped and how many lives it has saved, is material to Tesla's investors. On the day the news broke about NHTSA's decision to initiate a preliminary evaluation into the incident, Tesla's stock traded up, not down, confirming that not only did our investors know better, but that our own internal assessment of the performance and risk profile of Autopilot were in line with market expectations.

The bottom line is that Fortune jumped the gun on a story before they had the facts. They then sought wrongly to defend that position by plucking boilerplate language from SEC filings that have no bearing on what happened, while failing to correct or acknowledge their original omissions and errors.

Full text referenced above:

We may become subject to product liability claims, which could harm our financial condition and liquidity if we are not able to successfully defend or insure against such claims.

Product liability claims could harm our business, prospects, operating results and financial condition. The automobile industry experiences significant product liability claims and we face inherent risk of exposure to claims in the event our vehicles do not perform as expected resulting in personal injury or death. We also may face similar claims related to any misuse or failures of new technologies that we are pioneering, including autopilot in our vehicles and our Tesla Energy products. A successful product liability claim against us with respect to any aspect of our products could require us to pay a substantial monetary award. Our risks in this area are particularly pronounced given the limited number of vehicles and energy storage products delivered to date and limited field experience of our products. Moreover, a product liability claim could generate substantial negative publicity about our products and business and would have material adverse effect on our brand, business, prospects and operating results. We self-insure against the risk of product liability claims, meaning that any product liability claims will have to be paid from company funds, not by insurance.
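Tesla's "statistical inevitability" argument above is, at bottom, a per-mile rate comparison. A back-of-the-envelope sketch of that comparison follows; the mileage figure and the US baseline are rough assumptions for illustration, not audited data:

```python
# Back-of-the-envelope fatality-rate comparison, per 100 million vehicle
# miles traveled (VMT). Inputs are assumptions: Tesla cited roughly
# 130 million Autopilot miles and one fatality at the time of the crash;
# the US all-roads average in 2015 was roughly 1.1 fatalities per 100M VMT.

def fatalities_per_100m_miles(fatalities: int, miles: float) -> float:
    """Normalize a raw fatality count to a per-100M-mile rate."""
    return fatalities / (miles / 100_000_000)

autopilot_rate = fatalities_per_100m_miles(1, 130_000_000)  # ~0.77
us_average_rate = 1.1  # assumed all-roads baseline

safer_than_average = autopilot_rate < us_average_rate
```

Note the confounder several posters circle around: Autopilot miles are overwhelmingly highway miles, which are already safer per mile than the all-roads average, so a raw comparison like this flatters the system.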


----------



## painfreepc

Many of you here say the car is not advertised as self-driving. That is an outright lie, or missing information on your part. The car is advertised as self-driving ramp to ramp,

which means as long as you're going down Main Street, as long as you're going down 1st Street, as long as you stay on the 101 freeway, the car is self-driving. As long as you don't turn down the next street or exit the freeway, the car is totally self-driving.

So get your facts straight.


----------



## DriverX

RamzFanz said:


> Who knows. It's easy to look at a video and make hindsight assumptions. No, they aren't forward-thinking, and they aren't self-driving as you claimed; they are driver assist.


Well EXCUUUUSE me for using the common vernacular. Good luck convincing the planet to use more specific language so that you aren't triggered.

https://www.google.com/webhp?source...95&ion=1&espv=2&ie=UTF-8#q=tesla self driving

http://www.nytimes.com/2016/07/05/b...different-roads-to-self-driving-car.html?_r=0


----------



## UberEricLong

painfreepc said:


> Many of you here say the car is not advertised as self-driving. That is an outright lie, or missing information on your part. The car is advertised as self-driving ramp to ramp,
> 
> which means as long as you're going down Main Street, as long as you're going down 1st Street, as long as you stay on the 101 freeway, the car is self-driving. As long as you don't turn down the next street or exit the freeway, the car is totally self-driving.
> 
> So get your facts straight.


It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot "is an assist feature that requires you to keep your hands on the steering wheel at all times," and that "you need to maintain control and responsibility for your vehicle" while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to "Always keep your hands on the wheel. Be prepared to take over at any time." The system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

https://www.teslamotors.com/blog/tragic-loss
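The enforcement behavior described in that excerpt (visual alert, then audible alert, then a gradual slowdown until hands-on is detected again) is essentially an escalation loop. A toy sketch of that logic, with every threshold, name, and speed step invented for illustration; Tesla's actual firmware behavior is not public:

```python
# Toy model of a hands-on-wheel escalation policy, loosely following the
# behavior described in Tesla's blog post. All timing thresholds and the
# 5 mph decay step are made up for illustration.

def escalation_state(seconds_hands_off: float) -> str:
    """Map time since last detected wheel torque to an alert state."""
    if seconds_hands_off < 15:
        return "normal"
    if seconds_hands_off < 30:
        return "visual_alert"
    if seconds_hands_off < 45:
        return "audible_alert"
    return "slow_down"  # gradually reduce speed until hands-on returns

def next_speed(current_mph: float, state: str) -> float:
    """Decay speed only in the slow-down state; otherwise hold it."""
    return max(current_mph - 5, 0) if state == "slow_down" else current_mph
```

In this sketch, detecting hands on the wheel would reset the timer to zero and return the state to "normal"; repeated `next_speed` calls in the `"slow_down"` state walk the car toward a stop.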


----------



## DriverX

painfreepc said:


> Many of you here say the car is not advertised as self-driving. That is an outright lie, or missing information on your part. The car is advertised as self-driving ramp to ramp,
> 
> which means as long as you're going down Main Street, as long as you're going down 1st Street, as long as you stay on the 101 freeway, the car is self-driving. As long as you don't turn down the next street or exit the freeway, the car is totally self-driving.
> 
> So get your facts straight.


Good point


----------



## LAuberX

painfreepc said:


> Many of you here say the car is not advertised as self-driving. That is an outright lie, or missing information on your part. The car is advertised as self-driving ramp to ramp,
> 
> which means as long as you're going down Main Street, as long as you're going down 1st Street, as long as you stay on the 101 freeway, the car is self-driving. As long as you don't turn down the next street or exit the freeway, the car is totally self-driving.
> 
> So get your facts straight.


I agree 100%

the title of the thread is not about an "autonomous" car.


----------



## RamzFanz

DriverX said:


> Well EXCUUUUSE me for using the common vernacular. Good luck convincing the planet to use more specific language so that you aren't triggered.
> 
> https://www.google.com/webhp?sourceid=chrome-instant&rlz=1C1TSND_enUS695US695&ion=1&espv=2&ie=UTF-8#q=tesla self driving
> 
> http://www.nytimes.com/2016/07/05/b...different-roads-to-self-driving-car.html?_r=0


As if it wasn't intentional. Please. You know full well all Teslas are human driven, clickbait articles notwithstanding.

Here, I'll help you:

*The first death resulting from a crash involving a Tesla on autopilot where the system didn't save the idiot's life!!*


----------



## RamzFanz

painfreepc said:


> Many of you here say the car is not advertised as self-driving. That is an outright lie, or missing information on your part. The car is advertised as self-driving ramp to ramp,
> 
> which means as long as you're going down Main Street, as long as you're going down 1st Street, as long as you stay on the 101 freeway, the car is self-driving. As long as you don't turn down the next street or exit the freeway, the car is totally self-driving.
> 
> So get your facts straight.


Yes, you are correct. In the right conditions it is capable of driving you from ramp to ramp (primarily after the update to 8.0), and they do state that, but not as self-driving.

However, even though it is sometimes capable of accomplishing this, it is not self-driving, because these are driver assist features that can disengage at any time, and the car requires a human driver at all times. Speed and lane control are only two of the many features an actual self-driving car would need, and it wouldn't be limited to specific situations. The Tesla doesn't have the sensors, system, or programming to be self-driving. It can't make lane changes on its own or drive off the highway.

Self-driving is level 4, and implies it _could_ be a car with no human controls. The Tesla is level 2, or arguably perhaps 3.

"When Elon Musk first introduced the Autopilot, he made it clear that hardware limitations will not allow for a fully self-driving system..."

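The level numbers being thrown around here come from the SAE J3016 driving-automation taxonomy. A quick lookup table; the level names are standard SAE terminology, while calling the 2016 Autopilot "level 2" reflects the post's claim, not an official SAE ruling:

```python
# SAE J3016 driving-automation levels. At levels 0-2 a human must
# supervise the road at all times; at level 3 and above the system
# performs the driving task within its operational domain.
SAE_LEVELS = {
    0: "No Automation",
    1: "Driver Assistance",
    2: "Partial Automation",      # e.g. lane centering + adaptive cruise together
    3: "Conditional Automation",
    4: "High Automation",         # no human controls needed within its domain
    5: "Full Automation",
}

def human_must_supervise(level: int) -> bool:
    """Levels 0-2 require constant human supervision of the road."""
    return level <= 2
```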

----------



## UberEricLong

painfreepc said:


> the car is advertised as self-driving ramp to ramp





painfreepc said:


> So get your facts straight.


Please enlighten us so we can get our facts straight by posting a link to any Tesla advertisement, let alone one for Autopilot, which advertises "ramp to ramp" self-driving. You can't, because no such advertisement exists. Tesla spends exactly $0 on advertising. It appears that you are the one who needs to get their facts straight. If, instead, you are referring to Tesla's description of how Autopilot works, here is a fact directly from TeslaMotors.com.


----------



## LAuberX

"system that pilots the Model S along the highway, staying within your lane, even in stop and go traffic"

Summon feature: "lets you call your car from your phone so it can come greet you at the front door"

yep, self driving.


----------



## RamzFanz

LAuberX said:


> "system that pilots the Model S along the highway, staying within your lane, even in stop and go traffic"
> 
> Summon feature: "lets you call your car from your phone so it can come greet you at the front door"
> 
> yep, self driving.


Negative. Summon requires a human watching and ensuring the route is clear. Lane control is fundamentally no different from cruise control: both require a driver in control and only assist the driver.

"When Elon Musk first introduced the Autopilot, he made it clear that hardware limitations will not allow for a fully self-driving system..."

The Tesla cars don't even have the hardware to self drive. Autopilot 2 isn't due out until at least 2018.

"This is where Autopilot 2.0 comes in. A new sensor suite on which Tesla can release more advanced autonomous and semi-autonomous features built on the architecture developed through the current Autopilot program."


----------



## painfreepc

LAuberX said:


> "system that pilots the Model S along the highway, staying within your lane, even in stop and go traffic"
> 
> Summon feature: "lets you call your car from your phone so it can come greet you at the front door"
> 
> yep, self driving.


http://electrek.co/2016/06/30/tesla-8-0-update-new-autopilot-features-ui-refresh-more-model-s-x/

This article is about the 8.0 update, so maybe the author of that news article got his facts wrong.


----------



## DriverX

RamzFanz said:


> As if it wasn't intentional. Please. You know full well all Teslas are human driven, clickbait articles notwithstanding.
> 
> Here, I'll help you:
> 
> *The first death resulting from a crash involving a Tesla on autopilot where the system didn't save the idiot's life!!*


You are paranoid and delusional. Who cares about Tesla's shitty autopilot, or whatever you want to call it? I used the common terminology to describe what everyone agrees is shite tech for self-driving cars; even Musk has said as much. You're mincing language just to be right and have the last word.

Guess what: everyone calls RC helicopters DRONES when clearly they aren't drones!


----------



## painfreepc

DriverX said:


> You are paranoid and delusional. Who cares about Tesla's shitty autopilot, or whatever you want to call it? I used the common terminology to describe what everyone agrees is shite tech for self-driving cars; even Musk has said as much. You're mincing language just to be right and have the last word.
> 
> Guess what: everyone calls RC helicopters DRONES when clearly they aren't drones!


Wow, that one hit home. I used to fly radio-control planes and plan to get back into it pretty soon. These damn things they call drones, especially the lower-end ones, are not drones by any stretch of the imagination; they're not even as good as mid-priced actual RC helicopters.


----------



## DriverX

RamzFanz said:


> Negative. Summon requires a human watching and ensuring the route is clear. Lane control is fundamentally no different than cruise control because they both require a driver in control and only assist the driver.
> 
> "When Elon Musk first introduced the Autopilot, he made it clear that hardware limitations will not allow for a fully self-driving system..."
> 
> The Tesla cars don't even have the hardware to self drive. Autopilot 2 isn't due out until at least 2018.
> 
> "This is where Autopilot 2.0 comes in. A new sensor suite on which Tesla can release more advanced autonomous and semi-autonomous features built on the architecture developed through the current Autopilot program."


BS, it says right there in the marketing material:

Summon feature: "lets you call your car from your phone so it can come greet you at the front door in the morning"

To any normal person that implies the car is driving itself, and it adds nothing about what you say requiring you to watch it. Stop making stuff up that isn't there. Why are you so invested in protecting your precious Tesla? Or is it Musk that you feel the need to champion? Is Elon your desktop image?


----------



## RamzFanz

DriverX said:


> You are paranoid and delusional. Who cares about Tesla's shitty autopilot, or whatever you want to call it? I used the common terminology to describe what everyone agrees is shite tech for self-driving cars; even Musk has said as much. You're mincing language just to be right and have the last word.
> 
> Guess what: everyone calls RC helicopters DRONES when clearly they aren't drones!


Except it isn't a self driving car, so there is that. I'm sorry I spoiled your party.


----------



## RamzFanz

DriverX said:


> BS, it says right there in the marketing material:
> 
> Summon feature: "lets you call your car from your phone so it can come greet you at the front door in the morning"
> 
> To any normal person that implies the car is driving itself, and it adds nothing about what you say requiring you to watch it. Stop making stuff up that isn't there. Why are you so invested in protecting your precious Tesla? Or is it Musk that you feel the need to champion? Is Elon your desktop image?


"_This feature will park Model S while the driver is outside the vehicle. Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. *As such, Summon requires that you continually monitor your vehicle's movement and surroundings while it is in progress and that you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle. You must maintain control and responsibility for your vehicle when using this feature* and should only use it on private property."
_
I'm not invested in Tesla or Musk. I just find the constant misleading statements on this board useless and damaging to the community. Some people actually like to discuss subjects without all the hyperbole.

No one has died in a self-driving car. Teslas, _which are not self-driving_, have been and remain safer than human drivers.


----------



## DriverX

RamzFanz said:


> "_This feature will park Model S while the driver is outside the vehicle. Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. *As such, Summon requires that you continually monitor your vehicle's movement and surroundings while it is in progress and that you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle. You must maintain control and responsibility for your vehicle when using this feature* and should only use it on private property."
> _
> I'm not invested in Tesla or Musk. I just find the constant misleading statements on this board useless and damaging to the community. Some people actually like to discuss subjects without all the hyperbole.


_"Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. "_

So this leads me to believe a semi-truck would be detected. The gory details are left in the fine print of the user agreement that everyone reads thoroughly before checking the agree box, I'm sure. Drive your Tesla while baking a cake, WTF do I care, until I'm the innocent bystander crossing the street that your autopilot didn't detect.

The beginning of the AI takeover that Musk fears so much is already here: the machines are trying to dumb us down so much that we can't park or operate a car. We will be their pets soon. Musk has basically said that we are just part of a computer simulation, and he and Travis seem to have an unhealthy fascination with TRON. It's very obvious in Uber's marketing art, and clearly Elon is a sci-tech freak.


----------



## painfreepc

RamzFanz said:


> "_This feature will park Model S while the driver is outside the vehicle. Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. *As such, Summon requires that you continually monitor your vehicle's movement and surroundings while it is in progress and that you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle. You must maintain control and responsibility for your vehicle when using this feature* and should only use it on private property."
> _
> I'm not invested in Tesla or Musk. I just find the constant misleading statements on this board useless and damaging to the community. Some people actually like to discuss subjects without all the hyperbole.
> 
> No one has died in a self-driving car. Teslas, _which are not self-driving_, have been and remain safer than human drivers.


How are we going to ensure that people only use it on their own private property?

Control it with a cell phone? You mean the same cell phones we use to run Uber and Lyft, the same cell phones that freeze, that glitch, that go off by themselves? This is getting more and more ridiculous by the second.


----------



## painfreepc

And please tell me why only the fatality numbers are quoted, given that there's only one fatality.

How many major accidents have there been without fatalities? How many people have been in accidents and gone to the hospital with long-term injuries that were not fatal?

How many cars have been salvaged, and how many cars have been totaled, that did not involve a fatality? Where are those numbers?


----------



## RamzFanz

painfreepc said:


> How are we going to ensure that people only use it on their own private property?
> 
> Control it with a cell phone? You mean the same cell phones we use to run Uber and Lyft, the same cell phones that freeze, that glitch, that go off by themselves? This is getting more and more ridiculous by the second.


I agree. I don't like what Musk is doing. It's dangerous and counterproductive because humans are untrustworthy and summon is not ready.

You can't ensure anything about human behavior. Myself, I take the guard off my table saw. I won't blame the manufacturer if I lose a finger or two.


----------



## RamzFanz

painfreepc said:


> And please tell me why only the fatality numbers are quoted, given that there's been just one fatality.
> 
> How many major accidents have there been without fatalities? How many people have been hospitalized, even long-term, in crashes that weren't fatal?
> 
> How many cars have been salvaged or totaled without a fatality involved? Where are those numbers?


Why would you only ask Tesla that? Do all car companies release that data? Why would we assume they even have that data?

I personally have not read of any accidents actually caused by a properly operated Tesla on autopilot. It doesn't mean they don't exist.


----------



## RamzFanz

DriverX said:


> _"Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. "_
> 
> So this leads me to believe a semi-truck would be detected. The gory details are left in the fine print of the user agreement that everyone reads thoroughly before checking the agree box, I'm sure. Drive your Tesla while baking a cake, WTF do I care, until I'm the innocent bystander crossing the street that your autopilot didn't detect.
> 
> The beginning of the AI takeover that Musk fears so much is already here: the machines are trying to dumb us down so much that we can't park or operate a car. We will be their pets soon. Musk has basically said that we are just part of a computer simulation, and he and Travis seem to have an unhealthy fascination with TRON. It's very obvious in Uber's marketing art, and Elon is clearly a sci-tech freak.


More hyperbole. They are specifically warned every time they engage Autopilot of its limitations and their responsibilities. There's no need to pretend this is something it's not. The truck driver failed, the driver failed, and the beta Autopilot failed to save them from themselves, but that is to be expected.


----------



## RamzFanz

By the way, in your effort to sensationalise, you have violated all of the rules for creating a post in the news forum. There's a reason they want you to use the article's headline rather than your own title: they probably want honesty and accuracy. There's also a reason they ask you not to add your opinion to the original post, probably so people aren't confused by others who want to be deceptive about the actual subject and facts and just want to make grandiose, agenda-serving claims. But I'm guessing here.

The UberPeople.NET News forum is for news relating to Uber and driving for Uber. Please make posts here that include the following:​
1) A link to your source
2) *The headline in the title and in the thread*
3) *Please do not post your opinion in the first post*
4) Include a screen capture of the news article if you can​
All content that is not news will be moved to other forum sections.​
Thanks
UPNet​
Let me clean it up for you and inject some actual facts without the hyperbole:

*Tesla driver killed in crash with Autopilot active, NHTSA investigating*

A Tesla Model S with the Autopilot system activated was involved in a fatal crash, the first known fatality in a Tesla where Autopilot was active. The company revealed the crash in a blog post posted today and says it informed the National Highway Transportation Safety Administration (NHTSA) of the incident, which is now investigating.

The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver - who Tesla notes is ultimately responsible for the vehicle's actions, even with Autopilot on - nor the car noticed the big rig or the trailer "against a brightly lit sky" and brakes were not applied. In a tweet, Tesla CEO Elon Musk said that the vehicle's radar didn't help in this case because it "tunes out what looks like an overhead road sign to avoid false braking events."

Because of the high ride-height of the trailer, as well as its positioning across the road, the Model S passed under the trailer and the first impact was between the windshield and the trailer. Tesla writes that if the car had impacted the front or rear of the trailer, even at high speed, the car's safety systems "would likely have prevented serious injury as it has in numerous other similar incidents."

"AUTOPILOT IS GETTING BETTER ALL THE TIME, BUT IT IS NOT PERFECT AND STILL REQUIRES THE DRIVER TO REMAIN ALERT."

The accident occurred May 7th in Williston, Florida with 40-year-old Ohio resident Joshua Brown driving. The truck driver was not injured.

Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide. The NHTSA investigation, Tesla says, is a "preliminary evaluation" to determine if the Autopilot system was working properly, which can be a precursor to a safety action like a recall.
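The mileage figures above reduce to a quick division. As a rough sketch (and with only a single fatality in the sample, the Autopilot rate is hardly statistically meaningful either way):

```python
# Fatality-rate comparison using the figures Tesla cites: one known
# Autopilot fatality over ~130M Autopilot miles, versus one fatality
# per 94M vehicle miles in the US and per 60M miles worldwide.
# Miles per fatality: higher is better.
autopilot_miles = 130_000_000
autopilot_fatalities = 1
us_miles_per_fatality = 94_000_000
world_miles_per_fatality = 60_000_000

autopilot_miles_per_fatality = autopilot_miles / autopilot_fatalities

print(autopilot_miles_per_fatality > us_miles_per_fatality)     # True
print(autopilot_miles_per_fatality > world_miles_per_fatality)  # True
```

On these numbers Autopilot looks better than the averages, but one data point against population-wide baselines (which mix all road types, vehicles, and conditions) proves little in either direction.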


Follow
Elon Musk

✔@elonmusk
Our condolences for the tragic loss https://www.teslamotors.com/blog/tragic-loss …

3:53 PM - 30 Jun 2016

A Tragic Loss
We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality...



In the blog post, Tesla reiterates that customers are required to agree that the system is in a "public beta phase" before they can use it, and that the system was designed with the expectation that drivers keep their hands on the wheel and that the driver is required to "maintain control and responsibility for your vehicle." Safety-critical vehicle features rolled out in public betas are new territory for regulators, and rules haven't been set.

THE FIRST FATALITY IN A TESLA IN AUTOPILOT MODE

Some autonomous driving experts have criticized Tesla for introducing the Autopilot feature so early, with a Volvo engineer saying the system "gives you the impression that it's doing more than it is." In other words, the car handles most situations so smoothly that drivers are led to believe that the car can handle any situation it might encounter. That is not the case, and the driver must remain responsible for the actions of the vehicle, even with Autopilot active. Several automakers working on systems similar to Autopilot - GM with Super Cruise, for instance - have only tested the feature privately and have said they won't deploy until they're ready.

Volvo has said that it will take full legal liability for all its cars when they are operating in fully autonomous mode, and plans to launch a limited trial of its autonomous Drive Me technology next year.

NHTSA issued the following statement to _The Verge_:

NHTSA's Office of Defects Investigation is opening a Preliminary Evaluation of the design and performance of automated driving systems in the Tesla Model S.

NHTSA recently learned of a fatal highway crash involving a 2015 Tesla Model S, which, according to the manufacturer, was operating with the vehicle's 'Autopilot' automated driving systems activated. The incident, which occurred on May 7 in Williston, Florida, was reported to NHTSA by Tesla. NHTSA deployed its Special Crash Investigations Team to investigate the vehicle and crash scene, and is in communication with the Florida Highway Patrol. Preliminary reports indicate the vehicle crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway. The driver of the Tesla died due to injuries sustained in the crash.

NHTSA's Office of Defects Investigation will examine the design and performance of the automated driving systems in use at the time of the crash. During the Preliminary Evaluation, NHTSA will gather additional data regarding this incident and other information regarding the automated driving systems.

The opening of the Preliminary Evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles


----------



## DriverX

RamzFanz said:


> More hyperbol. They are specifically warned every time they engage autopilot of it's limitations and their responsibilities. There's no need to pretend this is something it's not. The truck driver failed, the driver failed, and the beta autopilot failed to save them from themselves, but that is to be expected.


And you failed at winning whatever argument you're having with yourself, seeing as you've admitted to all parties failing and thus shared liability. It'll be settled before it's ever argued in court, which is a failure of the justice system because it leaves the matter undecided and does nothing to prevent further failures.


----------



## DriverX

RamzFanz said:


> By the way, in your effort to sensationalise, you have violated all of the rules of creating a post in the news forum. There's a reason they want you to use the article's headline and not your own title and that is they probably want honesty and accuracy. There's also a reason they ask you not to add your opinion to the original post. That reason is probably so people aren't confused by others who want to be deceptive about the actual subject, facts, and just want to make grandiose agenda serving claims. But I'm guessing here. …


OMG, it never ends with you. It was a copy-and-paste of the original headline, which is probably why the moderators allowed it and made it a FEATURED post. It's not my fault if the Verge went and changed their headline. Get over it; no one but you and Tesla's legal department cares about the distinction in the language.

So lame to try to use vocabulary as a non-starter loophole for bad ideas. We have friggin' leash laws for dogs in this country because people are stupid, so why should Elon get a free pass to beta test his deadly product on the unsuspecting public?

If I can't blow my vapor all over your lunch table, then you can't use your autopilot on the public roads!


----------



## Ca$h4

*Laptop in wreckage of Tesla Autopilot car: Florida investigators*

A Tesla Model S involved in the fatal crash on May 7, 2016 is shown with the top third of the car sheared off by the impact of the collision with a tractor-trailer truck on a nearby highway; the car came to rest in the yard of Robert and Chrissy VanKavelaar in...
Reuters

*http://www.reuters.com/article/us-tesla-autopilot-idUSKCN0ZN1XX*


----------



## DriverX

Ca$h4 said:


> *Laptop in wreckage of Tesla Autopilot car: Florida investigators*
> 
> A Tesla Model S involved in the fatal crash on May 7, 2016 is shown with the top third of the car sheared off by the impact of the collision with a tractor-trailer truck on a nearby highway; the car came to rest in the yard of Robert and Chrissy VanKavelaar in...
> Reuters
> 
> *http://www.reuters.com/article/us-tesla-autopilot-idUSKCN0ZN1XX*


Self driving cars don't need Max Headroom


----------



## Swj

RamzFanz said:


> By the way, in your effort to sensationalise, you have violated all of the rules of creating a post in the news forum. There's a reason they want you to use the article's headline and not your own title and that is they probably want honesty and accuracy. There's also a reason they ask you not to add your opinion to the original post. That reason is probably so people aren't confused by others who want to be deceptive about the actual subject, facts, and just want to make grandiose agenda serving claims. But I'm guessing here. …


Really, get a life, buddy.


----------



## RamzFanz

DriverX said:


> And you failed at winning whatever argument you're having with yourself, seeing as you've admitted to all parties failing and thus shared liability. It'll be settled before it's ever argued in court, which is a failure of the justice system because it leaves the matter undecided and does nothing to prevent further failures.


The car failed to stop the human driver, who is 100% responsible for driving, from being an idiot, but the car is not responsible at all. If you drive your car at 100 MPH and you crash and the airbag fails to save you, the car is not responsible for the death. Same thing.

A settlement is just bean counters saving beans; it's in no way an admission of anything. A civil judgment against Tesla would also not be a finding of actual guilt: it's a coin flip based on the jury, and the next suit or appeal could go the exact opposite way. Now, if they were taken to criminal court by a state or the feds, that would mean something.


----------



## DriverX

RamzFanz said:


> The car failed to stop the human driver, who is 100% responsible for driving, from being an idiot, but the car is not responsible at all. If you drive your car at 100 MPH and you crash and the airbag fails to save you, the car is not responsible for the death. Same thing.
> 
> A settlement is just bean counters saving beans; it's in no way an admission of anything. A civil judgment against Tesla would also not be a finding of actual guilt: it's a coin flip based on the jury, and the next suit or appeal could go the exact opposite way. Now, if they were taken to criminal court by a state or the feds, that would mean something.


They recall cars all the time for failed airbags. OJ wasn't criminally convicted, but we all know he did it. It's typical for settlements to happen in private to keep things out of the press. It shows the settling party didn't feel they had a slam-dunk defense and opted for an easy loss.

Just stop now; you're really looking foolish.


----------



## RamzFanz

DriverX said:


> OMG, it never ends with you. It was a copy-and-paste of the original headline, which is probably why the moderators allowed it and made it a FEATURED post. It's not my fault if the Verge went and changed their headline. Get over it; no one but you and Tesla's legal department cares about the distinction in the language.
> 
> So lame to try to use vocabulary as a non-starter loophole for bad ideas. We have friggin' leash laws for dogs in this country because people are stupid, so why should Elon get a free pass to beta test his deadly product on the unsuspecting public?
> 
> If I can't blow my vapor all over your lunch table, then you can't use your autopilot on the public roads!


Uh, what? The headline itself is in the actual link you posted, so no, it wasn't changed. Does it never end with you?


----------



## RamzFanz

DriverX said:


> It shows the settling party didn't feel they had a slam-dunk defense and opted for an easy loss.


Sometimes they pay out because they can't win, but mostly they settle because the battle costs more than the settlement. They also settle to avoid a runaway jury acting out of sympathy and not following the law.



DriverX said:


> Just stop now you're really looking foolish.


Tell us again about how they changed the headline that's IN THE URL you posted, that's a good one! Did they change your post too?

For reference, you have now gone from "I was using common vernacular" to "they changed the headline" (oops, busted).

What's next? "I was hacked?" "Elon Musk did it?"


----------



## RamzFanz

Swj said:


> Really get a life buddy


Sorry, I won't ignore people trying to deceive others. It's not in my genes. Honesty or be called out. DriverX has a long history of saying untrue things, possibly harming other drivers, just because he's mad at SDCs and Uber.


----------



## RamzFanz

DriverX said:


> They recall cars all the time for failed airbags.


Not when the driver is the cause of the failure and not even if the airbag failed once.



DriverX said:


> OJ wasn't criminally convicted but we all know he did it.


Did this have a point? I said nothing about being found guilty or not. Just being taken to court for a long drawn out and expensive battle would harm Tesla.


----------



## oregonuberduber

RamzFanz said:


> Not when the driver is the cause of the failure and not even if the airbag failed once.
> 
> Did this have a point?


Yes, the point is that self-driving cars are just a futuristic fantasy.


----------



## RamzFanz

oregonuberduber said:


> Yes, the point is that self-driving cars are just a futuristic fantasy.


Yes, true. 2-4 years in the future according to, well, just about everyone who follows the technology and understands it.


----------



## oregonuberduber

Dream on.


----------



## RamzFanz

oregonuberduber said:


> Dream on.


Oh, wait, did you mean worldwide? Because the Netherlands already has self-driving vehicles in use on public roads, so you're a little late with your prediction. Zero driver, zero human controls.


----------



## oregonuberduber

So why don't you go to the Netherlands, have a driverless car chauffeur you around the countryside, and tell us all about it.


----------



## RamzFanz

oregonuberduber said:


> So why don't you go to the Netherlands, have a driverless car chauffeur you around the countryside, and tell us all about it.


Na, I like the good ol' USA. I can wait the 2-4 years.


----------



## painfreepc

RamzFanz
I'm not trying to be an A-hole or anything, but I really would like to know: what is your fascination with accelerating autonomous technology?

Did you have a family member or loved one die in a tragic car accident?
Do you have all your stocks invested in Tesla?

That Tesla couldn't distinguish the side of a truck from a billboard, and just a few days ago another one crashed into a guardrail exiting a freeway. And you think the technology will be ready in 2 years?


----------



## RamzFanz

DriverX said:


> Don't cry when it's not ready for another 15 years; you won't be able to afford it anyway.


I won't Alan. I don't know if you are the Alan that tried to convince me that the internet was a fad, but you sound so much like him, I hope you don't mind that I call you that.

I can afford anything I want. I sold my company and retired years ago at 50. I Uber because I like it and it makes money for my daughter's college fund, though not so much lately because of a pool installation and a new privacy fence at my house.

If you are too stubborn to see what is happening, I feel bad for you. I've seen so many of you along the way. "The internet is a fad". "We don't need email or a website, we've always done it this way". "I prefer faxes". "I don't trust debit cards, checks keep me safe."

Hey, keep on keepin on. I have no doubt you know what you are talking about.


----------



## RamzFanz

painfreepc said:


> RamzFanz
> I'm not trying to be an A-hole or anything, but I really would like to know: what is your fascination with accelerating autonomous technology?
> 
> Did you have a family member or loved one die in a tragic car accident?
> Do you have all your stocks invested in Tesla?
> 
> That Tesla couldn't distinguish the side of a truck from a billboard, and just a few days ago another one crashed into a guardrail exiting a freeway. And you think the technology will be ready in 2 years?


I have no investment (yet) in SDCs or Tesla and probably will invest in the suppliers vs the manufacturers. Nor do I have an investment in SDCs coming about and killing one of my profitable pastimes, Ubering.

What I do invest in is honesty and reality. I want drivers to be prepared and, perhaps, even profit from the changes coming, versus living in denial and getting crushed. Some people on here immerse themselves, and try to immerse others, in denial. SDCs are coming. They are coming soon. They may even come in large quantities soon, but that is yet to be seen.

I have a daughter who wants to drive this year and hates that I want SDCs. I know that she will suck as a driver for many years or, perhaps, for life, as her mother does. There's nothing devious about wanting safer roads. 1.2 million people die every year from road "accidents," and I propose that most of them are human-caused and not in any way "accidents."

Yes, I lost 2 high school friends in a road "accident" where they made a bad decision passing friends on a wet road. I lost a really awesome girlfriend in a rollover because she drove drunk. My wife absolutely sucks at driving and I fear for her and others whenever she takes the wheel, which, while legal, will never happen with me in the car. Seriously.

Humans suck at driving. I live near a six flags where an entire family waiting in line to park was killed by an inattentive dump truck driver. Do we really need better reasons to end that? I mean, for the sake of god, wouldn't we rather pay attention to our family and arrive alive?

Musk and Tesla are making a huge mistake in my mind and I've said it over and over. SDCs, however, are looking to be pretty damn safe and they are coming, thank god, no matter what any taxi or Uber driver wants. I will trade any job for lives. I will trade all Uber drivers for one grieving parent. So be it, that's who I am.

No one who challenges me is an a-hole. I appreciate it. The a-holes lie and deceive drivers for their agenda.


----------



## painfreepc

RamzFanz said:


> I have no investment (yet) in SDCs or Tesla, and will probably invest in the suppliers versus the manufacturers. Nor do I have an investment in SDCs coming about and killing one of my profitable pastimes, Ubering.
> 
> What I do invest in is honesty and reality. I want drivers to be prepared and, perhaps, even profit from the changes coming, versus living in denial and getting crushed. Some people on here immerse themselves, and attempt to immerse others, in denial. SDCs are coming. They are coming soon. They may even come in large quantities soon, but that is yet to be seen.
> 
> I have a daughter who wants to drive this year and hates that I want SDCs. I know that she will suck as a driver for many years or, perhaps, for life, as her mother does. There's nothing devious about wanting safer roads. 1.2 million people die every year from road "accidents," and I propose that most of them are human-caused and not in any way "accidents."
> 
> Yes, I lost 2 high school friends in a road "accident" where they made a bad decision passing friends on a wet road. I lost a really awesome girlfriend in a rollover because she drove drunk. My wife absolutely sucks at driving, and I fear for her and others whenever she takes the wheel, which, while legal, will never happen with me in the car. Seriously.
> 
> Humans suck at driving. I live near a Six Flags where an entire family waiting in line to park was killed by an inattentive dump truck driver. Do we really need better reasons to end that? I mean, for the sake of god, wouldn't we rather pay attention to our family and arrive alive?
> 
> Musk and Tesla are making a huge mistake in my mind, and I've said it over and over. SDCs, however, are looking to be pretty damn safe, and they are coming, thank god, no matter what any taxi or Uber driver wants. I will trade any job for lives. I will trade all Uber drivers for one grieving parent. So be it; that's who I am.
> 
> No one who challenges me is an a-hole. I appreciate it. The a-holes are the ones who lie and deceive drivers for their agenda.


Self-driving car technology is not about saving lives. If you believe that, you really are kidding yourself; you're living in a pipe dream. This technology is being accelerated to make rich people richer and poor people poorer.

Do you even realize, the way you dream about this technology, how many people won't have a job if it actually becomes a reality?

Who needs a UPS driver when a droid UPS truck can just pull up to your door and a little robot UPS man can bring your package to your door?

Hell, you won't even need the ice cream man driving through your neighborhood anymore. Who needs that when a droid ice cream truck will do? More money for the person who manufactures the ice cream.

You won't even need hot dog vendors anymore. Who needs an actual person selling you a hot dog when hot dog vending machines can drive down the street to many locations on their own?

And please explain how you think this dream of yours is going to happen in a few years. Do you think every transportation service in the world is just going to throw out their cars and buy robot-operated cars? Who's going to finance this?

Those things you quoted earlier about how technology has changed over the years and made things better, hey, I agree. But this is the first time a technology has come along that's going to put masses of people out of work. Do you actually think people who drive for a living, and people who, I guess, enjoy driving, are just going to turn in their keys without a fight?


----------



## Another Uber Driver

Of course, the things are going to have their problems, which includes destruction of property and deaths. Recall that in the late nineteenth and early twentieth centuries, the first automobiles exploded, caught fire and broke down with disastrous results. In some jurisdictions, if you wanted to drive your car through the streets, you had to have a flagman precede you.

As users and builders became more familiar with the technology, the cars became safer.

It is the same thing with these self-driving cars. There are going to be teething problems. There will be more destruction of property and people will die. It is how things happen. It happened with aeroplanes. It happened with rockets.

Much of the property loss and loss of life on the roads is due to human error. The self-driving technology can eliminate much of that. It will not eliminate all of it, if, for no other reason than the "garbage in/garbage out" rule. You can see this to-day in the GPS when it directs you off a cliff or to the incorrect address. Eventually, it will improve.

While I do not agree with some posters as to the timetable, one thing on which we do agree is that this self-driving technology is coming, will get better and will be in common use, at some point.


----------



## DriverX

Watching Musk or Google or whoever take on the Teamsters will be very interesting.


----------



## Terminator

many posts deleted, warnings issued.

NO PERSONAL ATTACKS !


----------



## painfreepc

I used to watch movies like Robocop and Judge Dredd and think, wow, this is great fantasy. It's starting to look to me like the fantasy is about to become a reality.

It's not just automated cars; it looks like they want to have everything automated.

And don't tell me everybody just needs to go to school. Everybody can't be doctors, lawyers, engineers, nurses. The fact is about 30 to 40% of the population has an IQ of 90 or below. Sorry, but that's the reality..

How many times have you talked to someone on the street and realized their elevator doesn't go all the way to the top?

I mean, you give one of these people a small book of logic puzzles and they would not be able to complete one page, but yet, from a distance, they seem perfectly normal..

I run a DNS server in my home to help get rid of some of the ads I see on some of the popular sites I go to. I tried to explain to a few of my friends one evening in my home how DNS works; all three of them looked at me like a deer caught in the headlights. DNS is not that hard to understand when you explain it to someone in total layman's terms; it's how the internet works.
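For anyone curious, the idea behind a home ad-blocking DNS server is simple enough to sketch in a few lines of Python. This is only a toy illustration of the principle, not a real resolver, and the domain names and blocklist here are made up:

```python
import socket

# Toy sketch of how DNS-level ad blocking works: the resolver answers
# queries for blocked domains with a dead address (0.0.0.0) so the
# browser has nothing to fetch, and forwards everything else to a
# normal upstream lookup. The blocklist entries are just examples.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def resolve(hostname: str) -> str:
    """Return an IP address for hostname, sinkholing blocked domains."""
    if hostname in BLOCKLIST:
        return "0.0.0.0"                       # blocked: dead-end answer
    return socket.gethostbyname(hostname)      # normal upstream lookup

print(resolve("ads.example.com"))  # blocked -> 0.0.0.0
```

A real setup (Pi-hole, dnsmasq, etc.) does the same thing at the protocol level for every device on the network, which is why the ads never even get requested.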

sorry I'm going off on a tangent..


----------



## painfreepc

Yes, I guess I don't get it. Even our most high-end home computers and our most high-end smartphones don't always work properly.

When these technologies work properly, and I stop getting "GPS signal lost" from my smartphone, we can talk about self-driving cars. Until then, dream on.

And please stop using autopilots and drones in the air as reasons why this technology should work. An autopilot or drone in the air needs to avoid any object that it sees in its path; there are no billboards or overhead freeway signs in the air. That car did not even attempt to apply the brakes because it could not make a decision.

A droid or autopilot in the air has no decision to make: any object in its path spells danger, and it moves out of the way instantly.


----------



## DriverX

painfreepc said:


> Yes, I guess I don't get it. Even our most high-end home computers and our most high-end smartphones don't always work properly.
> 
> When these technologies work properly, and I stop getting "GPS signal lost" from my smartphone, we can talk about self-driving cars. Until then, dream on.


Exactly. Remember when they pushed "Virtual Reality" on us in the 80's? What a joke. Now it's slightly better but still nowhere near the VR experience they've been promising, sorta like how a hoverboard doesn't actually HOVER. In the 90s they were saying video games would be indistinguishable from reality by now. lol, not even close yet. One day, sure, but in our lifetime? Doubtful. The best we will see is, like I've already said, HOV lanes converted for SDC vehicles, adopted first for shipping goods, like what they've already got started in Europe.

The only way you'll see this Disney Tomorrowland fantasy of SDCs everywhere and no human drivers will be if the Feds take over the entire transportation system and implement a standardized system. I was talking about that long before Uber or Tesla jumped into the mix, but that's not the way things work in a capitalist system. So we will end up with some sh*tty use of the tech that benefits corporations first and human beings second, in about 15 years.


----------



## painfreepc

It really bothers me that a company is allowed to have an experimental technology on the street. It's called beta by their own admission; they have the customer simply check a box and promise to keep hands on the steering wheel, so they can continue to collect more and more data and make billions.

If people want to buy these cars and indulge themselves in beta technology, they should have to go to a special class and sign documents taking liability for any accidents or deaths that may occur.

There are going to be deaths involving innocent people because of this beta technology. You mark my words..


----------



## There’s no need to tip

Another Uber Driver said:


> Of course, the things are going to have their problems, which includes destruction of property and deaths. Recall that in the late nineteenth and early twentieth centuries, the first automobiles exploded, caught fire and broke down with disastrous results. In some jurisdictions, if you wanted to drive your car through the streets, you had to have a flagman precede you.
> 
> As users and builders became more familiar with the technology, the cars became safer.
> 
> It is the same thing with these self-driving cars. There are going to be teething problems. There will be more destruction of property and people will die. It is how things happen. It happened with aeroplanes. It happened with rockets.
> 
> Much of the property loss and loss of life on the roads is due to human error. The self-driving technology can eliminate much of that. It will not eliminate all of it, if , for no other reason than the "garbage in/garbage out" rule. You can see this t0-day in the GPS when it directs you off a cliff or to the incorrect address. Eventually, it will improve.
> 
> While I do not agree with some posters as to the timetable, one thing on which we do agree is that this self-driving technology is coming, will get better and will be in common use, at some point.


This, exactly this! I was saying the same thing at the start of this thread.



painfreepc said:


> It really bothers me that a company is allowed to have an experimental technology on the street. It's called beta by their own admission; they have the customer simply check a box and promise to keep hands on the steering wheel, so they can continue to collect more and more data and make billions


I actually kind of agree with this point. I know how stupid humans can be, and there are some, like this driver, who will misuse the technology with life-altering consequences for innocent people.

I'm not exactly sure where the line should be drawn, though. For example, regular cruise control. I'm sure there are drivers who set it and zone the hell out, slamming into other drivers. Should we not allow that technology because some drivers are tools? However, cruise control isn't marketed as "autopilot," so perhaps there is less likelihood it will be abused.

I guess the question is just how BETA this tech is and how dangerous it can be in the wrong hands. It is only limited to a small pool of wealthy individuals right now. What happens when the car and feature become affordable enough that every Tom, Dick, and Harry will have access to them? What if the tech still isn't "ready" at that point? When should we be protected from ourselves and, more importantly, from other idiots on the road? I'm not sure I know exactly where I stand on that issue yet, but it does raise a very valid point.


----------



## painfreepc

For those of you who don't even bother to read what the features of this car are: the current Autopilot can't stop at stop signs or red lights; it needs a car in front of it to do that, which is why most people only use it on the freeway. The new Autopilot version will have two cameras and will recognize stop signs and red lights, so there will be more and more people using this on main streets. God help all of us. Have a good day.


----------



## painfreepc

*I went hands-free in Tesla's Model S on Autopilot, even though I wasn't supposed to*

*http://mashable.com/2015/10/14/tesla-auotpilot-hands-on/#u6PbBEeM_aqh*

Please point out to me where in this video the driver is being forced to keep his hands on the steering wheel.

*Got to love the web browser right there in the center console. Makes it very easy to check your stocks, Facebook, and watch Harry Potter.*


----------



## There’s no need to tip

painfreepc said:


> *I went hands-free in Tesla's Model S on Autopilot, even though I wasn't supposed to*
> 
> *http://mashable.com/2015/10/14/tesla-auotpilot-hands-on/#u6PbBEeM_aqh*
> 
> Please point out to me where in this video the driver is being forced to keep his hands on the steering wheel.
> 
> *Got to love the web browser right there in the center console. Makes it very easy to check your stocks, Facebook, and watch Harry Potter.*


He talks about the steering wheel check later in the video, but it doesn't happen. I watched a different video where the car asked him to place his hands on the wheel, and if you don't do it in the allotted time, the car allegedly slows, then stops and turns on its flashers. I'm not exactly sure what the trigger is, but this post said curves can cause it: http://www.teslarati.com/what-happens-ignore-tesla-autopilot-warnings/

Also, as of the last info I read, the car did not yet have a seat sensor to prevent someone from leaving the seat with Autopilot engaged. That seems VERY dumb not to have implemented already.
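The escalation described in that video can be sketched as a little state machine. To be clear, the thresholds below are invented for illustration; Tesla has not published its actual timing logic:

```python
# Rough sketch of a hands-on-wheel warning escalation like the one
# described above: cruise, then warn, then slow, then stop with
# flashers. The states are real behaviors reported in the video, but
# the timing values are assumptions, not Tesla's implementation.
WARN_AFTER = 10   # seconds hands-off before a visual/audible warning
SLOW_AFTER = 20   # seconds hands-off before the car starts slowing
STOP_AFTER = 30   # seconds hands-off before it stops, flashers on

def autopilot_state(seconds_hands_off: float) -> str:
    """Return what the car would be doing after this long without hands detected."""
    if seconds_hands_off < WARN_AFTER:
        return "cruising"
    if seconds_hands_off < SLOW_AFTER:
        return "warning: place hands on wheel"
    if seconds_hands_off < STOP_AFTER:
        return "slowing down"
    return "stopped, flashers on"

print(autopilot_state(25))  # -> slowing down
```

The point of the design, whatever the real numbers are, is that ignoring the warning degrades gracefully to a safe stop instead of letting the car barrel on driverless.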


----------



## RamzFanz

painfreepc said:


> Self-driving car technology is not about saving lives if you believe that you really are kidding yourself you live in a pipe dream
> This technology is being accelerated to make rich people richer and to make poor people poorer,
> 
> Do you even realize the way you dream about this technology if this actually becomes a reality how many people won't have a job,
> 
> Who needs a UPS driver when a Droid UPS truck can just pull up to your door and have a little robot UPS man can bring your package to your door,
> 
> Hell you won't even need the ice cream man driving through your neighborhood anymore who needs that when droid ice cream truck will do, more money for the person who manufactured ice cream,
> 
> So you won't even need hot dog vendors anymore, who needs the actual person selling you a hot dog, hot dog vending machines can even drive down the street to go too many locations on there own,
> 
> And please explain how you think this dream of yours is going to happen in a few years, do you think every transportation service in the world is just going to throw out there cars and buy robot operated cars, who's going to finance this,
> 
> Those things you quoted earlier about how technology has changed over the years and have made things better, hey I agree, but this is first time a technology has come along that's going to put mass people out of work, do you actually think people who drive for a living and people who guess enjoy driving or just going to turn in keys without a fight..


The full transition to no human driving allowed will probably take decades. UPS isn't going to robots anytime soon, and neither are ice cream truck drivers or hot dog vendors. Here's the thing about jobs like those: just because you have a human on board doesn't mean they need to drive. The attendant will make less because they don't need a license or commercial license, but that's OK, as I will explain later in this post. Many vehicles will still need people on board for a long time to come, to deliver the package the last few feet or up the stairs or elevator, make the hot dog, hand out the ice cream, help the elderly, etc.

For taxis and Uber drivers though, our time is probably much much shorter, IMO. Why? Because SDC TNC is the low hanging fruit. Cheap to roll out and very profitable. An untapped multi-trillion dollar market.

GM bought part of Lyft and dropped a billion dollars on SDC research. They expect to live-test next year with actual passengers. That's _next_ year. GM can crank out the cars and put them out as Lyft vehicles _at actual cost_, with no profit or dealership costs and no cost to Lyft. Lyft is likely to get a fleet of free SDCs, and GM gets to waltz into the TNC business. GM has changed its very business model to eventually become primarily an SDC TNC company rather than primarily a car manufacturer for public consumption.

Uber has partnered with Toyota to develop the cars, not sure if they will do what GM and Lyft are, but it makes too much sense for them not to.

Truck drivers are also on a short timeline. Trucks are already being tested on the interstate.

So, yes, millions of jobs will be lost, and yes, many will be protesting and fighting the change, but, in the end, the jobs will be gone. Some customers will insist on human drivers. My father-in-law refused to use self-checkout even for a single item. He'd stand in line to help keep people employed.

But, as people see how much safer they are in SDCs, and how much cheaper it is, they will eventually switch for the most part.

This is a disaster, right? Probably in the short term if we don't prepare for it, on par with the jobs lost in the year after 9/11 or the housing crash, but not in the long term. Despite dire predictions after every technology change or industry collapse, we always create new jobs. There are almost no telephone operators anymore, but there are cell phone tower technicians, salespeople, and repair technicians. America always finds a way through entrepreneurship and free(ish) markets.

BUT, that's not the good part. The good part is the cost of living is about to plummet. In 10 years you will probably pay so much less for everything, you simply won't need to earn as much money. This is one huge step to becoming a race that needs to work very little where we can do the things we want to more than need to. Wouldn't you rather work 10-15 hours a week and have the same standard of living as working full time OR double or triple your buying power from the same income level?

By the way, save every dollar you can now and grow it if possible because it's going to buy a heck of a lot more down the road.

No truck drivers to pay, which is HUGE because raw materials are shipped to parts manufacturers, parts are shipped to manufacturers, assembled products are shipped to wholesale distributors, then shipped to retail distribution warehouses, then shipped to stores. UPS drivers will eventually be replaced by single package delivery self driving vehicles but that will cost them a fortune, unless they partner with a manufacturer like Lyft did. Or perhaps Ubers and Lyfts will be so cheap, they will just use them while they transition to their own fleet.

You're right about profit being the motivation. Regardless, it's the safety and the massive cost savings that will convince the people and governments to get on board. Who's going to want to put their child on a human-driven school bus when a self-driving one is thousands of times safer? Not me. Who's going to pay $10 for a $2 ride? Very few people. Who is going to want to commute when they could work or play and be driven door to door? I don't.

So that's all I'm saying. People who ignore this and don't train and transition out of the fields of TNC, taxi driving, trucking, bus driving, etc, are headed for a huge disappointment in the next 5+ years. Maybe 7 years if SDCs run into stiff government issues which I could see in the more liberal cities who eagerly take away free market rights, but not because the technology won't be ready. So we all have plenty of time to prepare if we don't buy into the crazy talk about 30-40 years. I could easily see a close to 100% transition for transportation services in 10ish years.

Nobody's going to stop it, if anything, it's accelerating. These are the most powerful companies in the world and it's a race to the finish. The benefits will be massive and undeniable.

By the way, fleets of existing trucks and cars can be made self-driving, probably for $5,000 - $10,000 each, so that will probably be what they do early on while we transition away from gas and diesel. It will mean financing or making payments to the manufacturer, but that's money they are already paying out to the driver, not a new burden on their profit.


----------



## Just one more trip

There's no need to tip said:


> Yes, this BS technology that has resulted in 1 (ONE) fatality thus far vs how many with human drivers on a DAILY basis? On my way home from work today 3 different people almost changed lanes into me. It is a known fact that these systems aren't sophisticated enough yet to handle all situations. The article I posted explains more about some of the issues. The technology is still very young. So I guess according to you anything that takes time and money to perfect is just a waste and we shouldn't bother right? Look how far the technology has come in such a short period of time.


----------



## There’s no need to tip

RamzFanz said:


> BUT, that's not the good part. The good part is the cost of living is about to plummet. In 10 years you will probably pay so much less for everything, you simply won't need to earn as much money. This is one huge step to becoming a race that needs to work very little where we can do the things we want to more than need to. Wouldn't you rather work 10-15 hours a week and have the same standard of living as working full time OR double or triple your buying power from the same income level?
> 
> By the way, save every dollar you can now and grow it if possible because it's going to buy a heck of a lot more down the road.


Many of the world's leading economists have seriously been discussing Universal Basic Income recently, with the advancements being made in robotics and, eventually, AI.


----------



## RamzFanz

There's no need to tip said:


> Many of the world's leading economists have seriously been discussing Universal Basic Income recently, with the advancements being made in robotics and, eventually, AI.


UBI is inevitable at some point. Somewhere down the road, humans will, for the most part, not have jobs. We will mostly do creative things instead of labor.


----------



## painfreepc

RamzFanz said:


> UBI is inevitable at some point. Somewhere down the road, humans will, for the most part, not have jobs. We will mostly do creative things instead of labor.


No. In reality, many people will just be sitting on their asses getting stoned on drugs, and others will be running violent gangs just for the fun of it, like in the movie Judge Dredd.

Even with money out of the equation, there will still be things of value to trade: human flesh being the most obvious, the second most obvious being illegal drugs.

But you go ahead and Dream On, Dream On, Dream On, Dream until your dream comes true..

This technology will be used to make rich people richer and poor people poorer, and to keep the masses in line. You will not be able to go anywhere without there being a record of it. Like the Tesla, remember, it's tethered to the mothership..


----------



## There’s no need to tip

painfreepc said:


> This technology will be used to make rich people richer and poor people poorer and to keep the masses in line


That is certainly the other position several economists and professors are taking. It could go either way at this point.


----------



## RamzFanz

painfreepc said:


> No. In reality, many people will just be sitting on their asses getting stoned on drugs, and others will be running violent gangs just for the fun of it, like in the movie Judge Dredd.
> 
> Even with money out of the equation, there will still be things of value to trade: human flesh being the most obvious, the second most obvious being illegal drugs.
> 
> But you go ahead and Dream On, Dream On, Dream On, Dream until your dream comes true..
> 
> This technology will be used to make rich people richer and poor people poorer, and to keep the masses in line. You will not be able to go anywhere without there being a record of it. Like the Tesla, remember, it's tethered to the mothership..


Nope. Poor people are about to become better off, because the cost of living is going to go down dramatically and they will have access to products and services they can't afford right now. Food, clothing, transportation, just about everything is going to be cheaper.


----------



## observer

http://www.marketwatch.com/story/wh...d-on-teslas-autopilot-2016-07-08?link=sfmw_tw


----------



## Ca$h4

This is what Autopilot looks like. Tesla didn't disclose the accident because it was issuing $2 billion of stock.

*Legal Trouble Accelerating for Tesla Motors*

http://www.law.com/sites/almstaff/2...rt&src=EMC-Email&et=editorial&bu=The Recorder

*Tesla told regulators about Autopilot crash nine days after accident*

http://www.reuters.com/article/us-tesla-autopilot-disclosure-idUSKCN0ZL2UC


----------



## painfreepc

I would like to know about the Model S with Autopilot: what is its total number of accidents? Is it more, the same, or less than other cars in its class?


----------



## WeirdBob

RamzFanz said:


> UBI is inevitable at some point. Somewhere down the road, humans will, for the most part, not have jobs. We will mostly do creative things instead of labor.


Or sit around getting stoned and drunk and play video games. Some of column A, a LOT of column B


----------



## observer

Consumer reports wants Tesla to rename Autopilot. Tesla says no way.

http://www.recode.net/2016/7/14/12187856/tesla-autopilot-consumer-reports


----------



## observer

Most people want manufacturers to take liability for crashes involving self-driving cars.

http://www.recode.net/2016/6/29/12061492/volvo-self-driving-car-crashes-liability


----------



## RamzFanz

WeirdBob said:


> Or sit around getting stoned and drunk and play video games. Some of column A, a LOT of column B


Could be. The point is not what we do with the free time, just that it is inevitable.


----------



## LAuberX

RutRoh:

https://www.washingtonpost.com/news...any-questions-about-teslas-autopilot-feature/

most "questions" a politician has can be solved with a check. "UberStyle"


----------

