# Tesla in Autopilot mode crashes into fire truck



## goneubering (Aug 17, 2017)

http://money.cnn.com/2018/01/23/technology/tesla-fire-truck-crash/index.html

A Tesla Model S crashed into a fire truck while driving down a California highway, according to Culver City, California, firefighters.

A tweet by the local firefighter's union Monday showed a photo of a Tesla Model S with its nose wedged under the back end of a fire truck and its hood badly wrinkled. The car had been traveling at 65 miles an hour, the tweet said.


----------



## heynow321 (Sep 3, 2015)

of course it did.


----------



## jocker12 (May 11, 2017)

"Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead."

Volvo's semi-autonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. "Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed," Volvo's manual reads, meaning the cruise speed the driver punched in. "The driver must then intervene and apply the brakes." In other words, your Volvo won't brake to avoid hitting a stopped car that suddenly appears up ahead. It might even accelerate towards it.

The same is true for any car currently equipped with adaptive cruise control, or automated emergency braking. It sounds like a glaring flaw, the kind of horrible mistake engineers race to eliminate. Nope. These systems are designed to ignore static obstacles because otherwise, they couldn't work at all.

From https://www.wired.com/story/tesla-autopilot-why-crash-radar/
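The filtering both manuals describe can be sketched in a few lines of Python. Every name, number, and threshold below is made up for illustration; this is not how any real vehicle's software is written:

```python
# Toy sketch of why radar-based cruise control ignores stationary objects.

EGO_SPEED_FILTER_MPH = 50     # above this, stationary returns get discarded
STATIONARY_TOLERANCE_MPH = 2  # ground speed below this counts as stationary

def relevant_targets(ego_speed_mph, radar_returns):
    """Keep only the radar returns a simple ACC would track.

    Each return is (range_m, closing_speed_mph), where closing speed is
    how fast the gap is shrinking. Ground speed = ego speed - closing
    speed. A target with ground speed near zero is stationary: overhead
    signs, bridges, parked cars -- and stopped fire trucks -- all look
    the same to the radar, so at highway speed they are all dropped
    rather than risk phantom braking at every sign and bridge.
    """
    kept = []
    for rng, closing in radar_returns:
        ground_speed = ego_speed_mph - closing
        if abs(ground_speed) < STATIONARY_TOLERANCE_MPH and ego_speed_mph > EGO_SPEED_FILTER_MPH:
            continue  # stationary target at highway speed: ignored
        kept.append((rng, closing))
    return kept

# At 65 mph, a stopped truck closes at 65 mph (ground speed 0) and is
# filtered out; a car ahead doing 45 mph (closing at 20 mph) is kept.
print(relevant_targets(65, [(100.0, 65.0), (80.0, 20.0)]))  # [(80.0, 20.0)]
```

Below 50 mph the same stopped truck would be kept, which matches the manuals' "especially over 50 mph" wording.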


----------



## getawaycar (Jul 10, 2017)

how do you crash into a fire truck LMAO


----------



## RamzFanz (Jan 31, 2015)

So, another human caused accident?



getawaycar said:


> how do you crash into a fire truck LMAO


I know right?!

God, humans sure can suck at driving!


----------



## getawaycar (Jul 10, 2017)

RamzFanz said:


> So, another human caused accident?
> 
> I know right?!
> 
> God, humans sure can suck at driving!


Speak for yourself. The car was in autopilot mode, for the reading impaired. No human was driving it. Tesla says their autopilot can't detect stationary objects which implies the fire truck was parked or stopped at a light when the Tesla ran into it. So better pray you don't get rear-ended by a Tesla in autopilot mode lmao. Some people may suck at driving but autonomous cars suck even more.


----------



## RamzFanz (Jan 31, 2015)

getawaycar said:


> Speak for yourself. The car was in autopilot mode, for the reading impaired. No human was driving it. Tesla says their autopilot can't detect stationary objects which implies the fire truck was parked when the Tesla ran into it. So better pray you don't get rear-ended by a Tesla in autopilot mode lmao. Some people may suck at driving but autonomous cars suck even more.


Per the many many agreements you have to sign to activate autopilot in your tesla and actually use it, the human is ALWAYS the driver, just like with cruise control.

Teslas are NOT autonomous or even self driving. If a Tesla crashes because it was in autopilot, it's the DRIVER'S fault.

Driver, NOT passenger.

I've yet to hear of an accident actually caused by a self-driving car, and they launched in May 2016.


----------



## observer (Dec 11, 2014)

getawaycar said:


> how do you crash into a fire truck LMAO


Especially going 65 MPH and not getting hurt.


----------



## goneubering (Aug 17, 2017)

observer said:


> Especially going 65 MPH and not getting hurt.


That part is AMAZING!!


----------



## getawaycar (Jul 10, 2017)

RamzFanz said:


> Per the many many agreements you have to sign to activate autopilot in your tesla and actually use it, the human is ALWAYS the driver, just like with cruise control.


That agreement is a joke. If you are required to be fully attentive at all times in autopilot mode then there is no point in having an autopilot or self-driving car.

There is no point of robo-cars when they can't even detect that there is a giant fire truck in their path. You think humans are bad at driving but this is at least the second time a Tesla in autopilot mode has failed to detect a large vehicle in its path resulting in a serious accident. The first time it happened a couple years ago resulted in a fatality, and that time the truck was moving and the Tesla still failed to detect it. And they still have not fixed the problem.



RamzFanz said:


> Teslas are NOT autonomous or even self driving. If a Tesla crashes because it was in autopilot, it's the DRIVER'S fault.


Then the term autopilot is highly misleading. Tesla could be sued for the name alone. Secondly, it is not humanly possible for a person to slam the brakes fast enough at 65 mph to avoid a collision when the autopilot fails to do so. The Tesla disclaimer is a total copout.


----------



## Sydney Uber (Apr 15, 2014)

jocker12 said:


> "Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead."
> 
> Volvo's semi-autonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. "Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed," Volvo's manual reads, meaning the cruise speed the driver punched in. "The driver must then intervene and apply the brakes." In other words, your Volvo won't brake to avoid hitting a stopped car that suddenly appears up ahead. It might even accelerate towards it.
> 
> ...


It's happened to me a number of times in my X. It's the scariest thing when a car begins to accelerate towards a stationary object.


----------



## jocker12 (May 11, 2017)

Sydney Uber said:


> It's happened to me a number of times in my X. It's the scariest thing when a car begins to accelerate towards a stationary object.


Take care with that software. Maybe it's better not to use it at all.


----------



## heynow321 (Sep 3, 2015)

RamzFanz said:


> Per the many many agreements you have to sign to activate autopilot in your tesla and actually use it, the human is ALWAYS the driver, just like with cruise control.
> 
> Teslas are NOT autonomous or even self driving. If a Tesla crashes because it was in autopilot, it's the DRIVER'S fault.
> 
> ...


Uh how about that car that crashed into a bus.


----------



## goneubering (Aug 17, 2017)

heynow321 said:


> Uh how about that car that crashed into a bus.


The one in Las Vegas?


----------



## RamzFanz (Jan 31, 2015)

getawaycar said:


> That agreement is a joke. If you are required to be fully attentive at all times in autopilot mode then there is no point of a having an autopilot or self-driving car.
> 
> There is no point of robo-cars when they can't even detect that there is a giant fire truck in their path. You think humans are bad at driving but this is at least the second time a Tesla in autopilot mode has failed to detect a large vehicle in its path resulting in a serious accident. The first time it happened a couple years ago resulted in a fatality, and that time the truck was moving and the Tesla still failed to detect it. And they still have not fixed the problem.
> 
> Then the term autopilot is highly misleading. Tesla could be sued for the name alone. Secondly it is not humanly possible for a person to slam the brakes fast enough at 65mph to avoid a collision when the autopilot fails to do so. The Tesla disclaimer is a total copout.


So cruise control is also a joke?

A Tesla isn't a "robo-car." It has some driver assist features. DRIVER assist. I, personally, think driver assist is a terrible idea. Humans are not trustworthy.

Pilots use autopilot. Are they not still responsible for flying the plane and the proper use of autopilot?


----------



## RamzFanz (Jan 31, 2015)

heynow321 said:


> Uh how about that car that crashed into a bus.


The car had legal right of way.


----------



## heynow321 (Sep 3, 2015)

RamzFanz said:


> The car had legal right of way.


no, you yield to buses.


----------



## RamzFanz (Jan 31, 2015)

heynow321 said:


> no, you yield to buses.


Yes, that's the lesson they learned, but the car had the legal right of way.


----------



## heynow321 (Sep 3, 2015)

RamzFanz said:


> Yes, that's the lesson they learned, but the car had the legal right of way.


that doesn't matter. yet another example of the giant gap between real world driving and "driving on paper".


----------



## observer (Dec 11, 2014)

RamzFanz said:


> The car had legal right of way.





heynow321 said:


> no, you yield to buses.


While I mostly agree with RamzFanz, RamzFanz and I had already agreed that the car was in the wrong.

I reaaaallly don't want to dig back through the old threads where you agreed that the car went into the bus' path. The car made a mistake and it knowingly went into the lane as evidenced by the car having its turn signal on. Signalling IT wanted to go into the bus' path. The bus was in its correct lane.


----------



## getawaycar (Jul 10, 2017)

RamzFanz said:


> So cruise control is also a joke?


CC forces you to pay attention because you are manually steering. It's not comparable.
With autopilot the car is doing everything for you so it's very easy to lose your attention because you aren't doing anything to drive it. Which is why it's a bad idea. It is next to impossible to maintain your attention on the road for long when you aren't doing anything.



RamzFanz said:


> Pilots use autopilot. Are they not still responsible for flying the plane and the proper use of autopilot?


Again apples and oranges. Airplanes don't have to navigate through heavy bumper to bumper, stop and go traffic. Not a good comparison.


----------



## RamzFanz (Jan 31, 2015)

heynow321 said:


> that doesn't matter. yet another example of the giant gap between real world driving and "driving on paper".


I think what you're saying is the difference between driving on paper and _learning_ real-world lessons. The only question is whether they learned that lesson.

The only people who expected them to be perfect in testing were the naysayers. Incidents like these were inevitable and anticipated. There WILL be more. A LOT more before level 5.

But they didn't launch thinking they were going to crash every 5,000 miles and not yield to buses. That's my bet.



observer said:


> While I mostly agree with RamzFanz, RamzFanz and I had already agreed that the car was in the wrong.
> 
> I reaaaallly don't want to dig back through the old threads where you agreed that the car went into the bus' path. The car made a mistake and it knowingly went into the lane as evidenced by the car having its turn signal on. Signalling IT wanted to go into the bus' path. The bus was in its correct lane.


In no way am I saying the car wasn't in the wrong in the real world.

It wasn't ticketed and was not found at fault, because it was technically not at fault. It had right of way. It was a single lane; tradition or common courtesy notwithstanding, it had every legal right to the lane.

What we've seen since the incident is no more Waymo SDCs demanding right of way, just because it's theirs. Not that I've seen anyways.



getawaycar said:


> CC forces you to pay attention because you are manually steering. It's not comparable.


So does driver assist with many features including pulling over if you don't. Imperfect, to be sure, but so was the man in an RV who sued and won because he put it on cruise control and went to fix a meal.

I think driver assist is just not smart.



getawaycar said:


> With autopilot the car is doing everything for you so it's very easy to lose your attention because you aren't doing anything to drive it. Which is why it's a bad idea. It is next to impossible to maintain your attention on the road for long when you aren't doing anything.


I agree. Waymo agrees.



getawaycar said:


> Again apples and oranges. Airplanes don't have to navigate through heavy bumper to bumper, stop and go traffic. Not a good comparison.


You've never been on a tarmac at LAX?

The question remains. Who is in charge?

Not the car or plane. Level 4, where Waymo is, means it IS the car. At any level below that, it's the driver's fault. You can watch Tesla drivers sleep in stop-and-go LA traffic on youtube, but that doesn't mean they aren't in legal control.


----------



## observer (Dec 11, 2014)

RamzFanz said:


> I think you're saying is the difference between driving on paper and _learning _real world lessons. The only question is if they learned that lesson.
> 
> The only people who expected them to be perfect in testing were the naysayers. Incidents like these were inevitable and anticipated. There WILL be more. A LOT more before level 5.
> 
> ...


https://www.engadget.com/2016/02/29/google-self-driving-car-accident/

Google's statement on this accident:

Our self-driving cars spend a lot of time on El Camino Real, a wide boulevard of three lanes in each direction that runs through Google's hometown of Mountain View and up the peninsula along San Francisco Bay. With hundreds of sets of traffic lights and hundreds more intersections, this busy and historic artery has helped us learn a lot over the years. And on Valentine's Day we ran into a tricky set of circumstances on El Camino that's helped us improve an important skill for navigating similar roads.

El Camino has quite a few right-hand lanes wide enough to allow two lines of traffic. Most of the time it makes sense to drive in the middle of a lane. But when you're teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you. So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane. This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It's vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.

On February 14, our vehicle was driving autonomously and had pulled toward the right-hand curb to prepare for a right turn. It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph -- and made contact with the side of a passing bus traveling at 15 mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it. (You can read the details below in the report we submitted to the CA DMV.)

Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that's a normal part of driving -- we're all trying to predict each other's movements. In this case, we clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

We've now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.

"We *clearly* bear some responsibility"

I think they bear ALL responsibility for THIS accident. Their statement was made to limit their liability.
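For what it's worth, the refinement they describe at the end — treating buses and other large vehicles as less likely to yield — boils down to something like this. The vehicle types, probabilities, and threshold here are purely hypothetical, not Waymo's actual model:

```python
# Hypothetical sketch of a yield-aware merge decision. All priors and the
# threshold are invented for illustration only.

YIELD_PRIOR = {"car": 0.8, "truck": 0.35, "bus": 0.3}  # chance the other vehicle yields
MERGE_THRESHOLD = 0.6  # only commit to the merge if yielding is likely

def should_merge(other_vehicle_type):
    """Return True if the planner commits to merging in front of the other vehicle."""
    p_yield = YIELD_PRIOR.get(other_vehicle_type, 0.5)  # unknown types: coin flip
    return p_yield >= MERGE_THRESHOLD

# With a uniform prior the car would have angled out in front of the bus;
# with the lowered bus prior it waits instead.
print(should_merge("car"))  # True
print(should_merge("bus"))  # False
```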


----------



## RamzFanz (Jan 31, 2015)

observer said:


> https://www.engadget.com/2016/02/29/google-self-driving-car-accident/
> 
> Googles statement on this accident.
> 
> ...


What liability? They were in the right.


----------



## htboston (Feb 22, 2016)

Hopefully the Tesla owner switches to Geico and saves 15% on his car insurance, because his premiums are going to rise


----------



## observer (Dec 11, 2014)

RamzFanz said:


> What liability? They were in the right.


The bus was in the lane as far left as he could be, right against the dividing line, like he should be. The car was to the far right of the lane; regardless of whether the lane is wide or not, the proper place to be in any lane is against the left line.

The autonomous car was too far to the right, which is why it *signalled left*, to merge back into traffic. The car knew it was out of the lane *or else it wouldn't have turned on its left blinker in the first place.*

Like I told you in that thread months ago, you lose credibility when you debate lost causes like this one, where it is abundantly clear you are wrong.

*One *accident caused by an autonomous vehicle is nothing in the bigger scheme of autonomous vehicles.


----------

