# Tesla in Autopilot self-driving mode crashes into parked police cruiser



## jocker12 (May 11, 2017)

The latest crash involving a Tesla in "Autopilot" mode didn't turn tragic, as some past ones have, but certainly was embarrassing.

A Tesla Model S veered into a parked police cruiser Tuesday in Laguna Beach, Calif., a coastal community south of Los Angeles, severely damaging both vehicles.

The driver, a 65-year-old from Laguna Niguel, Calif., told officers that he had engaged the car's partial self-driving system, called Autopilot. "He told us in his own statement he was in driver-assisted mode," police Sgt. Jim Cota said.

The driver suffered minor injuries, Cota said. The parked cruiser was unoccupied, the officer standing about 100 feet away off Laguna Canyon Road as he responded to a call.

Cota said the luxury electric car crashed in almost the same place as another Tesla about a year ago. The driver, he said, also pointed to the Autopilot system as being engaged.

The crash comes as Tesla has been facing scrutiny involving Autopilot. In March, a driver died when his Model X SUV crossed the center divider in Mountain View, Calif., while in Autopilot mode. And in May, another driver in Salt Lake City was injured in a crash when her car hit a parked fire truck. In both cases, Tesla says it can tell from its logs that drivers were either distracted or ignored the car's warnings to take control.

Regarding the latest crash, Tesla issued a statement saying "when using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times."

The statement also said that Tesla "has always been clear that *Autopilot doesn't make the car impervious to all accidents*, and before a driver can use Autopilot, they must accept a dialogue box which states that 'Autopilot is designed for use on highways that have a center divider and clear lane markings.'"

https://www.usatoday.com/story/mone...-driving-crashes-parked-police-car/654209002/


----------



## Wonkytonk (Jan 28, 2018)

"when using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times."

Talk about wanting to have your cake and eat it too. What the heck is the point of an autopilot that you have to maintain control of at all times? I mean, isn't that just called driving?


----------



## goneubering (Aug 17, 2017)

This accident is another very bad look for Tesla. I think this makes two stationary fire engines and one cop car they’ve hit. It makes me wonder if there’s a software flaw. Why can’t these cars “see” and avoid things that should be extremely obvious??!!


----------



## tomatopaste (Apr 11, 2017)

goneubering said:


> This accident is another very bad look for Tesla. I think this makes two stationary fire engines and one cop car they've hit. It makes me wonder if there's a software flaw. Why can't these cars "see" and avoid things that should be extremely obvious??!!


It's a feature, not a bug. By hitting firetrucks and police cars it saves you the hassle of having to call 911. Just, "eh, a little help over here, please."


----------



## jocker12 (May 11, 2017)

Wonkytonk said:


> I mean what the heck is the point of an auto pilot that you have to maintain control at all times


For most drivers it makes it more difficult to drive, because it says you can disconnect yourself from the driving (that would be the purpose of such a system) and at the same time instructs you to stay connected and pay attention. Then, what is the purpose of Autopilot if you still need to pay attention and stay alert, ready to take over? (big dilemma for the car manufacturer and the customers paying for the software).



goneubering said:


> It makes me wonder if there's a software flaw.


It is the FALSE POSITIVE detection problem.

"These systems are designed to ignore static obstacles because otherwise, they couldn't work at all.

"You always have to make a balance between braking when it's not really needed, and not braking when it is needed," says Erik Coelingh, head of new technologies at Zenuity, a partnership between Volvo and Autoliv formed to develop driver assistance technologies and self-driving cars. He's talking about *false positives*.

Raj Rajkumar, who researches autonomous driving at Carnegie Mellon University, thinks those assumptions concern one of Tesla's key sensors. "The radars they use are apparently meant for detecting moving objects (as typically used in adaptive cruise control systems), and seem to be not very good in detecting stationary objects," he says.

That's not nearly as crazy as it may seem. Radar knows the speed of any object it sees, and is also simple, cheap, robust, and easy to build into a front bumper. But it also detects lots of things a car rolling down the highway needn't worry about, like overhead highway signs, loose hubcaps, or speed limit signs. So engineers make a choice, *telling the car to ignore these things* and keep its eyes on the other cars on the road: *they program the system to focus on the stuff that's moving.*

This unsettling compromise may be better than nothing, given evidence that these systems prevent other kinds of crashes and save lives. And it wouldn't be much of a problem if every human in a semi-autonomous vehicle followed the automakers' explicit, insistent instructions to pay attention at all times and take back control if they see a stationary vehicle up ahead.

The long-term solution is to combine several sensors, with different abilities, with more computing power. Key among them is lidar. These sensors use lasers to build a precise, detailed map of the world around the car, and can easily distinguish between a hubcap and a cop car. (*obviously not that easily anymore*)"
from https://www.wired.com/story/tesla-autopilot-why-crash-radar/
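The radar compromise the article describes can be sketched in a few lines. This is a hypothetical illustration, not Tesla's actual code: the `RadarReturn` type, the speeds, and the 1 m/s tolerance are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float         # range to the detected object
    closing_speed_mps: float  # how fast the gap to it is shrinking

def moving_targets(returns, ego_speed_mps, tolerance_mps=1.0):
    """Keep only objects that are themselves moving.

    A parked car (or an overhead sign) closes on us at exactly our own
    speed, so it gets filtered out -- the compromise that can make a
    stationary vehicle 'invisible' to a radar-based system.
    """
    return [
        r for r in returns
        if abs(r.closing_speed_mps - ego_speed_mps) > tolerance_mps
    ]

ego = 30.0  # our own speed, m/s (~67 mph)
scene = [
    RadarReturn(80.0, 30.0),   # parked cruiser: closes at exactly our speed
    RadarReturn(60.0, 5.0),    # slower car ahead: closes at only 5 m/s
    RadarReturn(120.0, 30.2),  # overhead sign: also closes at ~our speed
]
tracked = moving_targets(scene, ego)
# Only the slower-moving car survives the filter; the parked cruiser
# and the sign are both discarded as static clutter.
```

The point of the sketch is that nothing here is a bug: dropping returns that close at ego speed is a deliberate trade against false-positive braking, exactly as Coelingh describes.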

Rajkumar recommends "a long-term solution" that could actually create more problems. Adding more and different sensors to the car could lead to conflicting readings, and could also overload the systems, delaying the software's reaction time.
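The conflicting-readings worry can be illustrated with a toy voting scheme. This is a hypothetical sketch, not any real vendor's fusion logic; the `fuse` function and its three boolean inputs are invented for the example.

```python
def fuse(radar_sees: bool, camera_sees: bool, lidar_sees: bool) -> str:
    """Toy sensor fusion: three sensors vote on an obstacle ahead."""
    votes = [radar_sees, camera_sees, lidar_sees]
    if all(votes):
        return "BRAKE"   # unanimous agreement: act immediately
    if any(votes):
        return "VERIFY"  # sensors disagree: spend time re-checking
    return "CRUISE"      # nothing detected by anyone

# A parked car that radar filters out but camera and lidar detect
# lands in the ambiguous state instead of triggering an immediate brake.
print(fuse(radar_sees=False, camera_sees=True, lidar_sees=True))  # VERIFY
```

Requiring agreement suppresses false alarms, but disagreement stalls the decision, which is the delayed-reaction risk the paragraph above raises.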


----------



## uberdriverfornow (Jan 10, 2016)

jocker12 said:


> For most drivers it makes it more difficult to drive, because it says you can disconnect yourself from the driving (that would be the purpose of such a system) and at the same time instructs you to stay connected and pay attention. Then, what is the purpose of Autopilot if you still need to pay attention and stay alert, ready to take over? (big dilemma for the car manufacturer and the customers paying for the software).
> 
> It is the FALSE POSITIVES detection.
> 
> ...


you can't program a human brain into software, period, end of story

that's why sdc's will never ever work


----------



## jocker12 (May 11, 2017)

uberdriverfornow said:


> you can't program a human brain into software, period, end of story
> 
> that's why sdc's will never ever work


I keep saying this is not going to work and the project is doomed to collect dust in the history books and the museums.

This is only one of their problems. All companies involved use the same approach and their lack of answers when the robots crash shows they've hit the wall in terms of system development. At this point, they need to go back and start the "computer vision" "disruption" "transportation network" delusional charade from scratch. These imposters hoping to make fortunes at the end of their tunnel are facing their deep bankruptcies.

They've disrupted themselves.

This article - Yes, Autonomous Cars Are Going To Kill People Before They Save Lives, Expert Says - is another painful PR effort to flip the BS into a possible golden opportunity. The "expert" is "Mark Rosekind, a former administrator of the National Highway Traffic Safety Administration during the Obama Administration." A year ago, and please pay attention to this, Recode reported how *Mark Rosekind, the former head of the National Highway Traffic Safety Administration, is now at secretive self-driving startup Zoox as its chief safety innovation officer*. Futurism.com doesn't mention Rosekind's last career achievement, but trumpets how the sooner more people die, the better chances we have to save more lives. This is like the stupid slogan "The more you spend, the more you save!"

I challenge the idiot in charge named Mark Rosekind, to sacrifice his entire family testing unsafe robots, to feel the burden of having innocents killed for corporate greed. Or come out of his deep rat hole and tell people how much money is being paid by Zoox to eat BS like it's whipped cream on top of a cherry pie.


----------



## uberdriverfornow (Jan 10, 2016)

jocker12 said:


> I keep saying this is not going to work and the project is doomed to collect dust in the history books and the museums.
> 
> This is only one of their problems. All companies involved use the same approach and their lack of answers when the robots crash shows they've hit the wall in terms of system development. At this point, they need to go back and start the "computer vision" "disruption" "transportation network" delusional charade from scratch. These imposters hoping to make fortunes at the end of their tunnel are facing their deep bankruptcies.
> 
> ...


sorry, though i quoted your post, i didn't mean to make it appear you were saying sdc's would work


----------



## Hackenstein (Dec 16, 2014)

They keep pushing it and investing Billions, but the US economy is such that a whole lot of people rely on driving for a living. 

And a lot of people rely in part on someone who drives for a living. 

What exactly do they think is going to happen here? Eliminate an incomprehensible number of jobs and replace them with what? AI is making the job pool smaller by the day.


----------



## jocker12 (May 11, 2017)

Hackenstein said:


> They keep pushing it and investing Billions, but the US economy is such that a whole lot of people rely on driving for a living.
> 
> And a lot of people rely in part on someone who drives for a living.
> 
> What exactly do they think is going to happen here? Eliminate an incomprehensible number of jobs and replace them with what? AI is making the job pool smaller by the day.


It is a scam.

When you're scamming people, you only focus on how to get away with it: get the money, run, and disappear. If you ask what they think is going to happen here, I am telling you, they have no clue yet how they are going to clean the cars between rides. - https://uberpeople.net/threads/the-dirty-truth-coming-for-self-driving.260675/

A bunch of incompetents hyping the nerds about how their future is about getting drunk or getting laid (tell nerds what they like to hear) while a "more responsible" robot takes them to their parents' home for 10 cents a mile. Hahahaha.....

Remember this?


----------



## tomatopaste (Apr 11, 2017)

Hackenstein said:


> They keep pushing it and investing Billions, but the US economy is such that a whole lot of people rely on driving for a living.
> 
> And a lot of people rely in part on someone who drives for a living.
> 
> What exactly do they think is going to happen here? Eliminate an incomprehensible number of jobs and replace them with what? AI is making the job pool smaller by the day.


Almost 6 billion more was invested today alone. Waymo ordered 65 thousand more self-driving Pacificas worth 2.6 billion. Softbank and GM dumped another 3.5 billion into Cruise Automation. If only they had signed up for UP they'd realize just how stupid they really are. Eh, live and learn.



Hackenstein said:


> They keep pushing it and investing Billions, but the US economy is such that a whole lot of people rely on driving for a living.
> 
> And a lot of people rely in part on someone who drives for a living.
> 
> What exactly do they think is going to happen here? Eliminate an incomprehensible number of jobs and replace them with what? AI is making the job pool smaller by the day.


Have you ever considered dog grooming?


----------



## jocker12 (May 11, 2017)

Oh... and about those invested billions.... ask these guys where their money went - $1.1 Trillion In Assets Were Impacted By Software Failures In 2016

Softbank money is coming from Asia and Saudi Arabia - "SoftBank recently took a hit from its Sprint acquisition. But Son has lost big before. As the dot.com bubble burst, he reportedly *lost $70 billion in one day.* He admits that 99% of his net worth was wiped out in 2000.

His latest big venture is a $100 billion fund launched by SoftBank and the government of Saudi Arabia in October. "Life's too short" to do anything small, Son said recently in India." - http://money.cnn.com/2016/12/07/technology/masayoshi-son-trump-softbank-japan/index.html

Masayoshi Son is a god for Silicon Valley tech types eager to waste foreign billions, only because Son's life is too short anyway.


----------



## uberdriverfornow (Jan 10, 2016)

Softbank is the resident idiot when it comes to investing money. There are literally no videos longer than 5 minutes from any sdc company that show the cars actually work, yet they are pumping all this money into these companies and this technology. It'll all come crashing down as soon as that drive.ai company goes live in Texas and the first death happens.


----------



## Hackenstein (Dec 16, 2014)

tomatopaste said:


> Have you ever considered dog grooming?


So much anger. Sad.


----------



## tomatopaste (Apr 11, 2017)

Hackenstein said:


> So much anger. Sad.


----------



## transporter007 (Feb 19, 2018)

jocker12 said:


> The latest crash involving a Tesla in "Autopilot" mode didn't turn tragic, as some past ones have, but certainly was embarrassing.
> 
> A Tesla Model S veered into a parked police cruiser Tuesday, severely damaging both vehicles in Laguna Beach, Calif., a coastal community south of Los Angeles.
> 
> ...


Yet the SDC didn't pull out a weapon and shoot the passenger.


----------



## tomatopaste (Apr 11, 2017)

uberdriverfornow said:


> you can't program a human brain into software, period, end of story
> 
> that's why sdc's will never ever work


No, that's why self-driving cars are much, much safer. There's no emotion, no anger, no panic. More and better sensors, and a clinical reaction to the input they receive.


----------



## uberdriverfornow (Jan 10, 2016)

tomatopaste said:


> No, that's why self-driving cars are much much safer. There's no emotion, no anger, no panic. More and better sensors, clinical reaction to input it receives.


anger and panic have no real negative effect on driving, they are just emotions

and the common sense that humans have far outweighs any negative effects you can argue that anger and panic could have


----------



## tomatopaste (Apr 11, 2017)

uberdriverfornow said:


> anger and panic have no real negative effect on driving, they are just emotions


Really?

*Uber driver held after fatal shooting of passenger in Denver*

*https://www.cnn.com/2018/06/01/us/uber-driver-shooting-denver/index.html*


----------



## uberdriverfornow (Jan 10, 2016)

tomatopaste said:


> Really?
> 
> *Uber driver held after fatal shooting of passenger in Denver*
> 
> *https://www.cnn.com/2018/06/01/us/uber-driver-shooting-denver/index.html*


I think being forced to shoot someone while driving is going to affect driving. I think that's common sense. But common sense is clearly overrated.

But that doesn't mean that simply being angered while driving is going to affect your driving.


----------



## tomatopaste (Apr 11, 2017)

uberdriverfornow said:


> I think being forced to shoot someone while driving is going to affect driving. I think that's common sense. But common sense is clearly overrated.
> 
> But that doesn't mean that simply being angered while driving is going to affect your driving.


So what you're saying is, if someone is eating potato chips in the car, the car will have no choice but to shoot him? Ok, I get your point.


----------



## transporter007 (Feb 19, 2018)

uberdriverfornow said:


> anger and panic have no real negative effect on driving, they are just emotions
> 
> and the common sense that humans have far outweighs any negative effects you can argue that anger and panic could have


"*Anger and panic have no real effect on driving*"
uberdriverfornow, renaissance man



tomatopaste said:


> So what you're saying is, if someone is eating potato chips in the car, the car will have no choice but to shoot him? Ok, I get your point.


Side note: these comments seem to originate from a foreign parallel universe.

Up is down, black is white, day is night, Kirk & Spock are bloodthirsty opportunists (and Spock has a beard)


----------



## getawaycar (Jul 10, 2017)

Another day, another major accident involving Tesla's dangerous autopilot feature. What else is new?

Why the heck haven't these cars been recalled yet? They are a clear threat to public safety. Did Tesla pay someone off at the DOT? Cars from every other automaker routinely get recalled for problems that are much less serious, but for some reason Tesla is treated with kid gloves and enjoys special treatment even when their defective products are killing people on a routine basis.


----------



## heynow321 (Sep 3, 2015)

getawaycar said:


> Another day, another major accident involving Tesla's dangerous autopilot feature. What else is new?
> 
> Why the heck haven't these cars been recalled yet? They are a clear threat to public safety. Did Tesla pay someone off at the DOT? Cars from every other automaker routinely get recalled for problems that are much less serious, but for some reason Tesla is treated with kid gloves and enjoys special treatment even when their defective products are killing people on a routine basis.


b/c millennials would have a mass crying fit. they love elons d#$%


----------

