# Tesla predicts accident seconds before it happens.



## RamzFanz (Jan 31, 2015)

Just a few weeks ago, we published a report about how Tesla's new radar technology for the Autopilot is already proving useful in some potentially dangerous situations. We now have a new piece of evidence that is so spectacularly clear that it's worth updating that report.

The video of an accident on the highway in the Netherlands caught on the dashcam of a Tesla Model X shows the Autopilot's forward collision warning predicting an accident before it could be detected by the driver.

With the release of Tesla's version 8.0 software update in September, the automaker announced a new radar processing technology that was directly pushed over-the-air to all its vehicles equipped with the first generation Autopilot hardware.

One of the main features enabled by the new radar processing capacity is the ability for the system to see ahead of the car in front of you and basically track two cars ahead on the road. The radar is able to bounce underneath or around the vehicle in front of the Tesla Model S or X and see where the driver potentially cannot, because the leading vehicle is obstructing the view.
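In effect, the system watches the range and closing speed of a vehicle tracked two cars ahead and alerts when the projected time-to-collision gets too short. Here is a minimal illustrative sketch of that idea; this is not Tesla's actual algorithm, and the function names and the 2-second threshold are assumptions for the example:

```python
# Illustrative forward-collision-warning logic based on a radar track of the
# vehicle two cars ahead. NOT Tesla's algorithm; the 2.0 s time-to-collision
# threshold is an assumption chosen for the example.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until contact if the closing speed stays constant."""
    if closing_speed_mps <= 0:          # gap is steady or opening
        return float("inf")
    return range_m / closing_speed_mps

def should_warn(range_m: float, closing_speed_mps: float,
                ttc_threshold_s: float = 2.0) -> bool:
    """Fire the warning when projected time-to-collision drops below threshold."""
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s

# Example: the car two ahead is 40 m away and closing at 25 m/s because it
# braked hard. TTC is 1.6 s, so the alert fires before the driver could
# even see brake lights through the car directly in front.
```

The point of the sketch is only that radar gives the system the range and closing-speed numbers directly, while a human has to infer them visually through an obstructed view.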

That's demonstrated clearly in this real-world situation on the highway in the Netherlands.

In the video embedded below, we can hear the Tesla Autopilot's Forward Collision Warning sounding an alert for seemingly no reason. A fraction of a second later we understand why, when the vehicle in front of the Tesla crashes into an SUV that wasn't visible from the Tesla driver's vantage point, but apparently was to the Autopilot's radar:

Hans Noordsij, the Tesla driver from the Netherlands who reported the video, said that everyone involved in the accident "turned out to be OK" despite the fact that the SUV rolled over.

What is most impressive is the fact that we can clearly hear the Forward Collision Warning alert before the lead vehicle even applied the brakes, which shows that the Autopilot wasn't using only the lead vehicle to plan the path, but also the vehicle in front of it: the black SUV.

The driver of the Tesla also reported that Autopilot started braking before he could apply the brakes himself, according to Noordsij.

Again, this new feature was pushed via an over-the-air software update to all Tesla vehicles equipped with the first generation Autopilot and it should soon be pushed to the vehicles equipped with the second generation Autopilot hardware.

While collision warning and automatic emergency braking systems are evidently useful safety features, they are no substitute for staying vigilant and being ready to take control at any time.

https://electrek.co/2016/12/27/tesla-autopilot-radar-technology-predict-accident-dashcam/


----------



## RamzFanz (Jan 31, 2015)

...an accident with a car the human driver couldn't have even seen. Wild stuff.


----------



## tohunt4me (Nov 23, 2015)

RamzFanz said:


> Just a few weeks ago, we published a report about how Tesla's new radar technology for the Autopilot is already proving useful in some potentially dangerous situations. We now have a new piece of evidence that is so spectacularly clear that it's worth updating that report.
> 
> The video of an accident on the highway in the Netherlands caught on the dashcam of a Tesla Model X shows the Autopilot's forward collision warning predicting an accident before it could be detected by the driver.
> 
> ...


Tesla Radar causes Testicular Cancer !


----------



## andaas (May 19, 2015)

I can clearly see the brake lights through the windows of the car that struck the SUV about 3-4 seconds before the Tesla audible alert is fired. I would have let off the gas at that point.

I'm glad the technology exists, but doubt it would have had any impact on me unless I was distracted at the time of the incident.

*edit* I should add that I noticed the brake lights the first time I watched the video early this morning on Gizmodo, not on one of several repeat viewings. I watched it a second time because I thought I had noticed the lights before the alert, and that confirmed my suspicion.


----------



## Jermin8r89 (Mar 10, 2016)

What if the roads were wet or icy? It wouldn't have mattered. Other cars already have this feature, so I'm not impressed.


----------



## Jermin8r89 (Mar 10, 2016)

tohunt4me said:


> Tesla Radar causes Testicular Cancer !


All technology gives us cancer, which is why millennials will not live as long as past generations.


----------



## Danny3xd (Nov 7, 2016)

tohunt4me said:


> Tesla Radar causes Testicular Cancer !


Don't blaspheme our overlords!


----------



## RamzFanz (Jan 31, 2015)

andaas said:


> I can clearly see the brake lights through the windows of the car that struck the SUV about 3-4 seconds before the Tesla audible alert is fired. I would have let off the gas at that point.
> 
> I'm glad the technology exists, but doubt it would have had any impact on me unless I was distracted at the time of the incident.
> 
> *edit* I should add that I noticed the brake lights the first time I watched the video early this morning on Gizmodo, not on one of several repeat viewings. I watched it a second time because I thought I had noticed the lights before the alert, and that confirmed my suspicion.


Yes, I see your point. If this were a shorter vehicle up front or a taller vehicle behind it though, you probably couldn't have seen the lights. The point being they will be able to observe things we simply can't and react faster than humans.

Then you might consider if these were all SDCs, every car for miles could instantly know what was happening from each car's perspective and this is one more accident that will become unnecessary in the near future. If an accident still occurred, traffic could reroute on the fly within seconds, emergency services notified with video feed so they could see what they are getting into, and audio and video from every perspective archived for the investigation.


----------



## RamzFanz (Jan 31, 2015)

Jermin8r89 said:


> What if the roads were wet or icy? It wouldn't have mattered. Other cars already have this feature, so I'm not impressed.


Wet or icy? The cars would travel at safer speeds, I suppose. Besides, if it were unexpected water or ice, an SDC can react faster and avoid better, with a constant 360-degree view, awareness of its options, and the ability to coordinate avoidance with all of the other cars.

Other cars have radar that would know a car ahead of the one in front of you is decelerating quickly? I don't think so. Not in public use.


----------



## Jermin8r89 (Mar 10, 2016)

I'd support SDCs a bit more if there were evidence of these vehicles handling weather better. These cars are based on learning, and as a New Englander I know every storm is different. I'd be curious how one would react in a snowstorm to a light snow coating on the street while going down a steep hill. I know how to handle that even in a Focus. And what if it can get through snow, but the snow builds up while it's going down a long, steep street? If I encounter that, and I know I won't be able to stop safely at the intersection at the bottom, I don't go down it.

I feel driving is like sex: you're either really good or really bad.
I think there are too many variables for this to work, even though we want it to.


----------



## RamzFanz (Jan 31, 2015)

Jermin8r89 said:


> I'd be curious how one would react in a snowstorm to a light snow coating on the street

Baby steps.

Baby steps in technology take days, weeks, months, and even years, but not decades.

Yes, you get better at sex with intent and practice. The entire industrialized world, using _massive_ brainpower and investment, is applying intent and practicing. There's no chance at all that they are all wrong and a few discontents are correct.


----------



## Gung-Ho (Jun 2, 2015)

Big deal. I can predict an accident seconds before it happens too, because that's about all the time you have to avoid one. If these genius cars could predict an accident a minute before it happened, then I'd be impressed.


----------



## elelegido (Sep 24, 2014)

RamzFanz said:


> Just a few weeks ago, we published a report about how Tesla's new radar technology for the Autopilot is already proving useful in some potentially dangerous situations. We now have a new piece of evidence that is so spectacularly clear that it's worth updating that report.
> 
> The video of an accident on the highway in the Netherlands caught on the dashcam of a Tesla Model X shows the Autopilot's forward collision warning predicting an accident before it could be detected by the driver.
> 
> ...


Of course, the skilled and experienced driver does not just blindly follow the car in front, but will be looking at traffic and road conditions up to several hundred yards in front of him.

Had the Tesla driver been doing that then he wouldn't have had to rely on this gizmo to tell him he was at imminent risk of crashing his car.


----------



## Do tell (Nov 11, 2016)




----------



## Jermin8r89 (Mar 10, 2016)

Do tell said:


>


If he's stoked about them, then why is he trying to sway people's minds?


----------



## RamzFanz (Jan 31, 2015)

elelegido said:


> Of course, the skilled and experienced driver does not just blindly follow the car in front, but will be looking at traffic and road conditions up to several hundred yards in front of him.
> 
> Had the Tesla driver been doing that then he wouldn't have had to rely on this gizmo to tell him he was at imminent risk of crashing his car.


You can't observe what you can't see. The point of the story is the SDC could observe what it couldn't "see" using senses we simply don't have.


----------



## Jermin8r89 (Mar 10, 2016)

RamzFanz said:


> You can't observe what you can't see. The point of the story is the SDC could observe what it couldn't "see" using senses we simply don't have.


The point of the story is for big corporations to control us.


----------



## elelegido (Sep 24, 2014)

RamzFanz said:


> You can't observe what you can't see. The point of the story is the SDC could observe what it couldn't "see" using senses we simply don't have.


And my point is that it's a matter of driving standards. In other countries, where drivers are tested against significantly higher standards than in the US in order to obtain their licenses, all drivers are taught what are, in fact, extremely basic and common sense driving rules, including that:

- a driver should always be able to stop within the distance they can see to be clear. Thus, driving blindly behind a vehicle at speed, without being able to stop or avoid it (just in case, you know, it crashes right in front of you), is _not_ a good idea.
- if you don't have a good, clear view of the road ahead, slow down and increase the gap between your vehicle and the one in front until you do.

These are very, very basic driving skills, which the crashing drivers in the video failed to employ. It really doesn't get much simpler than, "If you can't see what's ahead of you, slow down", and even if drivers are not taught this, it doesn't take a genius to work it out.
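The "stop in the distance you can see to be clear" rule can be made concrete with a rough stopping-distance estimate: reaction distance plus braking distance. A sketch, where the 1.5 s reaction time and 0.7 friction coefficient are typical textbook assumptions rather than measured values:

```python
# Rough stopping distance = reaction distance + braking distance.
# Assumptions (not measured values): 1.5 s driver reaction time,
# 0.7 tire-road friction coefficient on dry pavement.

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mps: float,
                        reaction_time_s: float = 1.5,
                        friction: float = 0.7) -> float:
    reaction = speed_mps * reaction_time_s          # distance covered before braking
    braking = speed_mps ** 2 / (2 * friction * G)   # distance to brake to a stop
    return reaction + braking

# At 120 km/h (about 33.3 m/s) that works out to roughly 50 m of reaction
# distance plus about 81 m of braking distance: about 130 m of clear road
# needed, far more than a typical following gap.
```

Which is exactly why following closely behind a vehicle that blocks your view leaves you no margin at all.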

Yes, technology can indeed be used as a last-resort corrector for poor driving, but there is a great risk that it will cause driving standards to actually worsen: "It's OK, I can drive carelessly/like a tool; the gizmos will keep me out of trouble." Given the choice, I would favor raising driving standards to international levels.


----------



## Do tell (Nov 11, 2016)

elelegido said:


> And my point is that it's a matter of driving standards. In other countries, where drivers are tested against significantly higher standards than in the US in order to obtain their licenses, all drivers are taught what are, in fact, extremely basic and common sense driving rules, including that:
> 
> - a driver should always be able to stop in the distance they can see to be clear. Thus, driving blindly behind a vehicle at speed without being able to stop or avoid it (just in case, you know, it crashes right in front of you) is _not _a good idea.
> - if you don't have a good, clear view of the road ahead, slow down and increase the gap between your vehicle and the one in front. until you do have a good view of the road ahead.
> ...


The poor guy driving the Tesla that crashed into a tractor-trailer in Florida must have thought this.


----------



## RamzFanz (Jan 31, 2015)

elelegido said:


> Yes, technology can indeed be used as last resort correctors for drivers' poor driving


Or to replace humans with a superior driving system.


----------



## elelegido (Sep 24, 2014)

RamzFanz said:


> Or to replace humans with a superior driving system.


SDCs' time will definitely come, but for now that's still science fiction.


----------



## RamzFanz (Jan 31, 2015)

elelegido said:


> SDC's time will definitely come, but for now that's still science fiction


2-3 years, so says almost the entire industry.


----------



## Jermin8r89 (Mar 10, 2016)

RamzFanz said:


> 2-3 years, so says almost the entire industry.


Agenda 21


----------



## WeirdBob (Jan 2, 2016)

RamzFanz said:


> 2-3 years, so says almost the entire industry.


Except for the head of R&D at Nissan.

*Nissan says driverless cars will never match human skills, so it's using humans to back them up.*

https://www.wired.com/2017/01/nissans-self-driving-teleoperation/

_Alex Davies | Transportation | 01.05.17 | 7:00 pm_

"This is it!" Maarten Sierhuis says. "I mean, look at this." He points to a photo of road construction at an intersection in Sunnyvale, California, near Nissan's Silicon Valley research center, which Sierhuis runs. A line of cones shunts traffic to the left side of the double yellow line. The light is red. A worker holds a "Slow" sign. It's the sort of seemingly unremarkable situation that can trigger convulsions in the brain of an autonomous vehicle.

"There is so much cognition that you need here," Sierhuis says. The driver-or the car-has to interpret the placement of the cones and the behavior of the human worker to understand that in this case, it's OK to drive through a red light on the wrong side of the road. "This is not gonna happen in the next five to ten years."

It's a stunning admission, in its way: Nissan's R&D chief believes the truly driverless car, something many carmakers and tech giants have promised to deliver within five years or fewer, is an unreachable short-term goal. Reality: one; robots: zero. Even a system that could handle 99 percent of driving situations will cause trouble for the company trying to promote, and make money off, the technology. "We will always need the human in the loop," Sierhuis says.

But Nissan has a solution: a call center with human meatbags ready to take command via remote control.
. . .


----------



## WeirdBob (Jan 2, 2016)

RamzFanz said:


> 2-3 years, so says almost the entire industry.


Except for the CEO of the Toyota Research Institute.

*Automakers are slowing their self-driving car plans - and that could be a good thing* 

_Danielle Muoio | Jan 8, 2017_

http://www.businessinsider.com/self-driving-cars-not-feasible-in-5-years-automakers-say-2017-1

It was barely two years ago that self-driving car companies were putting forth a Utopian vision of driverless cars whizzing through streets allowing passengers to sleep in steering-wheel-less cars.
. . .

But at this year's Consumer Electronics Show, Toyota pushed back on the idea that we are just a few years off from an autonomous reality.

"I need to make it perfectly clear, [full autonomy is] a wonderful, wonderful goal. But none of us in the automobile or IT industries are close to achieving true Level 5 autonomy. We are not even close," Gill Pratt, the CEO of the Toyota Research Institute, said at CES. Level 5 is an industry term for cars that are fully autonomous and do not require human supervision.
. . .

But automakers are catching on to these risks and reacting accordingly. Toyota is exploring AI that can keep a driver engaged while autonomy is still in its relative infancy. Nissan is exploring using call centers so humans can remotely intervene when self-driving cars fail. Google is keeping driver controls.
. . .


----------



## RamzFanz (Jan 31, 2015)

WeirdBob said:


> Except for the head of R&D at Nissan
> 
> *Nissan says driverless cars will never match human skills-so it's using humans to back them up.*
> 
> ...


Yet another outlier as an example. Oh, and one they are prepared to solve in the short term with a temporary solution, so it delays nothing. This scenario would require a signal from the car to a human that it needs assistance, a human negotiating the obstacle, and then all the other cars following suit.

OR

The construction company notifies a central control of the lane closure, which notifies all SDCs. Issue solved before the cones even go up. They will know the path to take, or avoid the area altogether.

Again, not a roadblock.

I noticed you forgot this line: _Knowing there's a person, somewhere, ready to help if the technology falters could *accelerate the shift* toward a mostly human-light future._

You notice the headline says _never_ and no one said never? Did you change it? Because that's not the actual headline. Why would you? Ohhhhh, never mind, I know why.


----------



## RamzFanz (Jan 31, 2015)

WeirdBob said:


> Except for the CEO of the Toyota Research Institute.
> 
> *Automakers are slowing their self-driving car plans - and that could be a good thing*
> 
> ...


One man's opinion in the face of 20 major corporations predicting 2020 or sooner. Of course they'll have limitations at first. Did you think they wouldn't?

News flash: Level 5 is not needed to start taking our jobs en masse. The low-hanging fruit is low-speed urban cars, which is also the bulk of our work. Level 5 means they can go anywhere and do anything in reasonable conditions and at high speeds. It's just not necessary.


----------



## Grahamcracker (Nov 2, 2016)

RamzFanz said:


> Just a few weeks ago, we published a report about how Tesla's new radar technology for the Autopilot is already proving useful in some potentially dangerous situations. We now have a new piece of evidence that is so spectacularly clear that it's worth updating that report.
> 
> The video of an accident on the highway in the Netherlands caught on the dashcam of a Tesla Model X shows the Autopilot's forward collision warning predicting an accident before it could be detected by the driver.
> 
> ...


A very similar incident happened to me.

I was in the left lane following a Suburban in a 50 mph zone when the Suburban suddenly jumps into the right lane. OH $#@%! It's a stopped car trying to make a left!

I slam on my brakes and horn, thinking maybe the stopped car can attempt to pull forward, but I'm too close. I take a nanosecond to see if I can jump into the right lane. I can't, because I'm slowing down and there's traffic passing me on the right.

So, with not a second to lose before impact, I steer my car between the stopped vehicle and the traffic on the right, straddling the center of the road.

I was hoping the traffic on the right could see me and help compensate. I figured a side swipe would cause less damage than rear ending a stopped vehicle.

When I stopped, the front of my vehicle was about 2 feet past the rear of the stopped car, and traffic on the right did compensate, but not without horns blowing.

Tesla's software would have prevented that close call during my incident. I'm sure it will save lives.


----------

