# Replacing human-driven cars with fully autonomous vehicles will take 30 years or more, Aurora CEO Chris Urmson says



## jocker12 (May 11, 2017)

https://www.recode.net/2017/9/4/162...le-human-safety-recode-decode-podcast-swisher

And full podcast - https://www.recode.net/2017/9/8/162...iving-car-engineer-chris-urmson-recode-decode


----------



## jocker12 (May 11, 2017)

I think this podcast needs some analysis in order to understand how the "smart" guys from Silicon Valley want to trick the other "smart" guys from the car industry, and also how the Silicon Valley nerds lie to the general public in order to feed people's imagination and keep the hype for a flawed technology alive.

Part I

*First - The "statistical" fallacy*
In order to make the autonomous cars idea appealing to the general public, Silicon Valley's main argument from the beginning was that those robot cars will save thousands of lives. To support that, they used something they are very good at - statistics. They claimed that if you want to understand the good in self-driving cars, you need to understand how, _statistically_ (and I am going to use this website's info - http://asirt.org/initiatives/informing-road-users/road-safety-facts/road-crash-statistics), 3,287 people die from car crashes every single day. The general public got its punch in the face, because any sane person, looking at those numbers, will accept self-driving car engineers as angels meant to save human lives with their technology.

Hummmmmm... Let's take a break and listen to Chris Urmson here for a second - _"*So you know in America, somebody dies in a car accident about 1.15 times per 100 million miles. That's like 10,000 years of an average person's driving*. So, let's say the technology is pretty good but not that good. You know, someone dies once every 50 million miles."_ According to the Insurance Institute for Highway Safety - Highway Loss Data Institute, "There were 32,166 fatal motor vehicle crashes in the United States in 2015 in which 35,092 deaths occurred. This resulted in 10.9 deaths per 100,000 people and 1.13 deaths per 100 million miles traveled", so Chris Urmson is correct, and a person needs to drive for 10,000 years on average to reach the unfortunate point of possibly being killed in a car accident. So cars, and the real people driving them, are incredibly safe at this point, and there is no reason for anybody to actually panic because "driving is not safe" and needs to be replaced.
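Urmson's "10,000 years" figure is easy to sanity-check. A back-of-the-envelope sketch, assuming the average American drives about 10,000 miles per year (that mileage is my assumption, not a figure from the interview):

```python
# Sanity-check of Urmson's "10,000 years of driving per fatality" claim.
deaths_per_100m_miles = 1.15  # Urmson's figure; IIHS/HLDI reports 1.13 for 2015

# Miles driven, on average, between fatal accidents.
miles_per_death = 100_000_000 / deaths_per_100m_miles

annual_miles = 10_000  # assumed average annual mileage per driver
years_per_death = miles_per_death / annual_miles

print(round(miles_per_death))   # ~87 million miles between fatalities
print(round(years_per_death))   # ~8,700 years, i.e. roughly "10,000 years"
```

So the order of magnitude checks out: at roughly 10,000 miles a year, an average driver would need thousands of years behind the wheel to reach one statistical fatality.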

*Second - The "road rage" fallacy*
Here is what Chris Urmson, former Google self-driving project CTO, had to say about driving - _"So, people like to drive some of the time, right? There's an awful lot of driving that people do that they really hate. *Road rage would not be a thing if people truly liked driving*."_

Whaaat? The National Highway Traffic Safety Administration defines road rage as when a driver "commits moving traffic offenses so as to endanger other persons or property; an assault with a motor vehicle or other dangerous weapon by the operator or passenger of one motor vehicle on the operator or passengers of another motor vehicle". So Aurora CEO Chris Urmson tries to convince the idiots that the majority of drivers suffer from road rage (want to endanger other people's lives or property - essentially need mental health treatment) and that, subsequently, there are not many people who like driving. Well, by driving every day, every single driver can agree that road rage, even if it occurs, is not a constant state of mind or a permanent "ecosystem" (to use one of Chris's favorite words from this interview). Road rage is rare, unpredictable and, in very many cases, lasts seconds. No question - it is dangerous, but implying that road rage wouldn't be a thing if people truly liked driving is not even close to reality. Chris Urmson, like many self-driving car advocates, uses *generalization* and *exaggeration*, meant to scare people and trigger their emotional response, which in this case is about embracing autonomous car technology as a blessing to be generally accepted and supported.

In reality, and Chris Urmson says it in this interview - _"*The company really wants to make a product*. I believe that deeply. There's very little value in working on technology if you don't get it out there and get it helping people."_ You are so generous, Chris, but maybe it would be better to stick with engineering and stop trying to scare people into buying your product....

*Third - Human psychology problem*
It is interesting to learn how the self driving cars engineers already know what the main problem is - human psychology. Something they barely understand and can't control. Here is Chris again - _"Well, it's not even that they grab for it, it's that they experience it for a while and it works, right? And maybe it works perfectly every day for a month. The next day it may not work, but their experience now is, "Oh this works," *and so they're not prepared to take over and so their ability to kind of save it and monitor it decays with time*." _

In other words, once adjusted to a non-threatening environment, the human brain relaxes and disconnects its focus from the initial potential source of danger. People change their focus, and even if they don't, their ability to reconnect and react to a sudden change in the environment, when a vehicle is in motion, is not quick enough. The self-driving system fails and the brain cannot react quickly enough (hands on the steering wheel, feet on the pedals, eyes on the road, assess the situation and properly react to it), resulting in a potentially fatal situation. Joshua Brown's death speaks for itself - *Tesla driver in fatal 'Autopilot' crash got numerous warnings: U.S. government*
(https://www.reuters.com/article/us-...umerous-warnings-u-s-government-idUSKBN19A2XC)
*Fourth - The self-driving engineers' sneaky approach*
These guys don't know what they are doing. They know what they want, but don't know how to achieve those goals. They try to lie to the car manufacturers and the general public, because they feel like they "sit" on something valuable, and if their PR pitch is done right, they might end up as billionaires. Let's listen to Chris again - _"Yeah, I think on the one hand, you'd like to be able to be the one company doing this and pushing it forward, and on the other, it's awesome. Because you know a - what is it? - a rising sea floats all boats. And so if you think about the social values of the increased safety on the road, the better access to mobility for people, we want to see this happen, right? I think it's really important for society, *and so as more companies get involved, there's a broader ecosystem, there's more likelihood that one of them succeeds*. So I think that's fantastic, right? That's very desirable."_
If they had known how to do it - if it had been only about building the hardware and writing simple software with binary options - self-driving cars would have been on the roads a long time ago. But they are not, and those guys saying they will change the world of transportation are clueless about how to do it. They think, according to Chris Urmson, the more companies involved, the better the chances to succeed. So, according to that logic, it is not about the quality of the brains involved, it is about the number of organizations involved....


----------



## jocker12 (May 11, 2017)

Part II

If we take a closer look at what Chris is saying, the industry is already questioning Tesla's Autopilot approach - _"So some of what he's doing really makes a ton of sense, *but it's hard for me to believe that with just cameras they're going to, any time in the near future*, get to the level of reliability you really need to launch a car where you can kind of go to sleep in it and let it take you where you want to go."_

People familiar enough with this technology remember how Anthony Levandowski, the Uber engineer who left Google with 14,000 documents, also questioned Tesla's approach - "_Yo! I'm back at 80%, super pumped ... _*we've got to start calling Elon on his shit. I'm not on social media but let's start "faketesla" and start giving physics lessons about stupid shit Elon says*_ like this: "we do not anticipate using lidar. Just to make it clear, lidar essentially is active photon generation in the visible spectrum - radar is active photon generation in essentially the radio spectrum. But lidar doesn't penetrate intrusions so it does not penetrate rain, fog, dust and snow, whereas a radar does. Radar also bounces and lidar doesn't bounce very well. You can't do the "look in front of the car in front of you" thing. So I think the obvious thing is to use radar and not use lidar._"(https://www.recode.net/2017/8/15/16...-levandowski-texts-tesla-google-waymo-lawsuit)

Also, the Wall Street Journal noted in its Aug. 27, 2017 article - *Tesla's Push to Build a Self-Driving Car Sparked Dissent Among Its Engineers* - Elon Musk's ambitious goals for Autopilot technology have prompted safety warnings and resignations - "In a meeting after the October announcement, someone asked Autopilot director Sterling Anderson how Tesla could brand the product "Full Self-Driving," several employees recall. "This was Elon's decision," they said he responded. Two months later, Mr. Anderson resigned. Behind the scenes, the Autopilot team has clashed over deadlines and design and marketing decisions, according to more than a dozen people who worked on the project and documents reviewed by The Wall Street Journal. In recent months, the team has lost at least 10 engineers and four top managers - including Mr. Anderson's successor, who lasted less than six months before leaving in June."(https://www.wsj.com/articles/teslas...sparks-dissent-among-its-engineers-1503593742)

*Fifth - Ignoring the corporate thing - Planned Obsolescence*

So what is Planned Obsolescence? According to Wikipedia - _"*Planned obsolescence*, or *built-in obsolescence*, in industrial design and economics is a policy of planning or designing a product with an artificially limited useful life, so it will become obsolete (that is, unfashionable or no longer functional) after a certain period of time. The rationale behind the strategy is to generate long-term sales volume by reducing the time between repeat purchases (referred to as "shortening the replacement cycle")."_

When referring to car manufacturers and cars in general, planned obsolescence is called "depreciation" - in value and in quality. It is planned and religiously followed by the industry.

Now what has depreciation to do with Chris Urmson's interview? Let's take a look and listen to Aurora's CEO -
_"I don't know if you own a car or not.

Yeah, I do.

You probably use it an hour, maybe two hours a day at most. *If you're in the ride-sharing business and you operate it effectively, you might be operating that car 16, 18 hours a day*. And so you get much higher utilization so there's much higher value you create out of that vehicle."_

Well, what Chris is intentionally ignoring here is exactly the car's "depreciation" at a much higher rate if you "operate it effectively". Of course you can do that, but the product's life will be much shorter than it normally is. Realistically, your product ages at a much faster rate than the manufacturer projected, so your losses will add up much quicker than you are used to. And you will need a new product much sooner than you think.
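The utilization gap Chris describes is easy to quantify. A minimal sketch, taking mid-points of the hours quoted above and assuming (my assumption, not anything from the interview) that wear scales roughly with hours of operation and that a privately used car is designed to last about 15 years:

```python
# Rough comparison of ride-share vs. private-car wear, using the hours
# quoted in the interview ("an hour, maybe two" vs. "16, 18 hours a day").
private_hours_per_day = 1.5   # mid-point of private use
fleet_hours_per_day = 17      # mid-point of ride-share use

wear_ratio = fleet_hours_per_day / private_hours_per_day

design_life_years = 15        # assumed design life under private use
fleet_life_years = design_life_years / wear_ratio

print(round(wear_ratio, 1))       # ~11.3x the utilization
print(round(fleet_life_years, 1)) # ~1.3 years to accumulate the same wear
```

Under those assumptions a ride-share vehicle burns through its design life an order of magnitude faster, which is the accelerated depreciation the post is pointing at.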

*Sixth - The lie of mutual respect between car manufacturers and Silicon Valley nerds*
Chris Urmson again - "_So I think it's really important to understand the context, right? And *I think the biggest thing that's lacking - honestly, on both sides - is mutual respect*. Because I think it's very easy for Silicon Valley to look at the car companies and say, "Oh my goodness, they're so slow." They're going to disrupt them, right? And then it's very easy for the car companies to say, "Oh my goodness, look at those Silicon Valley guys, they're so seat-of-their-pants," right? "How can they actually ever do anything big and complicated?" Obviously both of those statements are completely false._"

As I've already mentioned above, driving is already very safe as it is, or as Chris puts it - _"*So you know in America, somebody dies in a car accident about 1.15 times per 100 million miles. That's like 10,000 years of an average person's driving*"_ - and the car manufacturers know it. They also know, like the good corporate people they are, that as long as the product doesn't need a change, there is no reason to make one. That is a simple, logical, almost surgical way of corporate thinking. Realistically, they want the most profit with the least investment.

But the Silicon Valley nerds decided that in order to potentially make more billions, they will need the "next big thing" to happen, and transportation could be the right industry to disrupt for it. If Uber had the luck to do it at that scale, why not change the whole thing altogether?

If we want to put all the puzzle pieces together, we need to go back to Anthony Levandowski, former Google engineer and former Uber self-driving car unit leader, who left Google with 14,000 stolen documents. He was "One of the pioneers of autonomous vehicles, the brains behind much of Google's self-driving efforts. He left Google last year to create his own startup, Otto, which retrofits trucks to make them autonomous. Three months after its public debut, Otto was bought by Uber in exchange for 1% of the company - a deal valued at nearly $680 million. Plus they get a 20% cut of all future profits from the trucking division."(http://www.businessinsider.com/anthony-levandowski-uber-interview-2016-12)

The whole idea of self driving cars was pushed forward inside Google by a crook. According to his Wikipedia page - "In 2007 Levandowski joined Google to work on Google Street View with Sebastian Thrun, whom he had met at the 2005 DARPA Grand Challenge. While still working at Google he founded 510 Systems, a mobile mapping start-up that experimented with Lidar technology. Then in 2008 he founded Anthony's Robots to build a self-driving Toyota Prius called the "Pribot." According to The Guardian, it was "a self-driving Toyota Prius with one of the first spinning Lidar laser ranging units, and the first ever to drive on public roads." Google acquired both 510 Systems and Anthony's Robots to advance the development of its self-driving car project"
According to The New York Times, Levandowski "said that he had decided to leave Google because he was eager to commercialize a self-driving vehicle as quickly as possible."(https://www.nytimes.com/2016/05/17/...a-self-driving-car-trucks-may-come-first.html). The more detailed reason was that he was pushing Google to do more testing in real-life environments, but Google didn't want to take the risk. So Anthony left for Uber, creating Otto in January 2016, which was acquired by Uber in late July 2016.

Infested by the self-driving cars virus, Silicon Valley wants to create the next big thing out of the blue, by pushing the car manufacturers into an unnecessary change. Meanwhile, the car manufacturers, scared by the hype initiated by Silicon Valley, started their own self-driving car programs, only to make sure that if the "next big thing" happens and it's about autonomous cars, they will be part of it and not out of it, like Kodak during the digital photography revolution (https://dealbook.nytimes.com/2012/01/19/eastman-kodak-files-for-bankruptcy/?mcubz=0).

As Chris acknowledges, "_*I don't think that's necessarily the right path, for Google to make a car*. I think Google is very good at the technology side of this, the self-driving technology ..._". Apple, too, dropped its ambitions of building a self-driving car from scratch. This is the car manufacturers' job, and in this "ecosystem" Elon Musk should be greatly appreciated for taking the risk to challenge the industry with an entirely new product against all odds.

Because the electric car, not the self-driving car, is the future.


----------



## heynow321 (Sep 3, 2015)

Most of the smarter people around here already know all of this; it's not far removed from common sense. Only the mental toddlers here think this crap is coming to fruition any time soon.


----------



## RamzFanz (Jan 31, 2015)

jocker12 said:


> *First - The "statistical" fallacy*
> In order to make the autonomous cars idea appealing to the general public, Silicon Valley's main argument from the beginning was that those robot cars will save thousands of lives. To support that, they used something they are very good at - statistics. They claimed that if you want to understand the good in self-driving cars, you need to understand how, _statistically_ (and I am going to use this website's info - http://asirt.org/initiatives/informing-road-users/road-safety-facts/road-crash-statistics), 3,287 people die from car crashes every single day. The general public got its punch in the face, because any sane person, looking at those numbers, will accept self-driving car engineers as angels meant to save human lives with their technology.
> 
> Hummmmmm... Let's take a break and listen to Chris Urmson here for a second - _"*So you know in America, somebody dies in a car accident about 1.15 times per 100 million miles. That's like 10,000 years of an average person's driving*. So, let's say the technology is pretty good but not that good. You know, someone dies once every 50 million miles."_ According to the Insurance Institute for Highway Safety - Highway Loss Data Institute, "There were 32,166 fatal motor vehicle crashes in the United States in 2015 in which 35,092 deaths occurred. This resulted in 10.9 deaths per 100,000 people and 1.13 deaths per 100 million miles traveled", so Chris Urmson is correct, and a person needs to drive for 10,000 years on average to reach the unfortunate point of possibly being killed in a car accident. So cars, and the real people driving them, are incredibly safe at this point, and there is no reason for anybody to actually panic because "driving is not safe" and needs to be replaced.


You focus on deaths because it suits your argument. Deaths are down not because we are good drivers - we aren't - but because of improved safety standards.

Now consider the worldwide 20-50 million injuries per year, most of them permanent.



jocker12 said:


> *Second - The "road rage" fallacy*
> Here is what Chris Urmson, former Google self-driving project CTO, had to say about driving - _"So, people like to drive some of the time, right? There's an awful lot of driving that people do that they really hate. *Road rage would not be a thing if people truly liked driving*."_
> 
> Whaaat? The National Highway Traffic Safety Administration defines road rage as when a driver "commits moving traffic offenses so as to endanger other persons or property; an assault with a motor vehicle or other dangerous weapon by the operator or passenger of one motor vehicle on the operator or passengers of another motor vehicle". So Aurora CEO Chris Urmson tries to convince the idiots that the majority of drivers suffer from road rage (want to endanger other people's lives or property - essentially need mental health treatment) and that, subsequently, there are not many people who like driving. Well, by driving every day, every single driver can agree that road rage, even if it occurs, is not a constant state of mind or a permanent "ecosystem" (to use one of Chris's favorite words from this interview). Road rage is rare, unpredictable and, in very many cases, lasts seconds. No question - it is dangerous, but implying that road rage wouldn't be a thing if people truly liked driving is not even close to reality. Chris Urmson, like many self-driving car advocates, uses *generalization* and *exaggeration*, meant to scare people and trigger their emotional response, which in this case is about embracing autonomous car technology as a blessing to be generally accepted and supported.
> ...


This is taking Chris out of context. _There's an awful lot of driving that people do that they really hate_ and _Road rage would not be a thing if people truly liked driving_ are two separate thoughts.

Are you trying to say most people don't hate most miles they drive? Congestion? Long boring commutes? Staying alert and trapped?

Of course we do.



jocker12 said:


> *Third - Human psychology problem*
> It is interesting to learn how the self driving cars engineers already know what the main problem is - human psychology. Something they barely understand and can't control. Here is Chris again - _"Well, it's not even that they grab for it, it's that they experience it for a while and it works, right? And maybe it works perfectly every day for a month. The next day it may not work, but their experience now is, "Oh this works," *and so they're not prepared to take over and so their ability to kind of save it and monitor it decays with time*." _
> 
> In other words, once adjusted to a non-threatening environment, the human brain relaxes and disconnects its focus from the initial potential source of danger. People change their focus, and even if they don't, their ability to reconnect and react to a sudden change in the environment, when a vehicle is in motion, is not quick enough. The self-driving system fails and the brain cannot react quickly enough (hands on the steering wheel, feet on the pedals, eyes on the road, assess the situation and properly react to it), resulting in a potentially fatal situation. Joshua Brown's death speaks for itself - *Tesla driver in fatal 'Autopilot' crash got numerous warnings: U.S. government*
> (https://www.reuters.com/article/us-...umerous-warnings-u-s-government-idUSKBN19A2XC)


----------



## RamzFanz (Jan 31, 2015)

jocker12 said:


> *Fourth - The self-driving engineers' sneaky approach*
> These guys don't know what they are doing. They know what they want, but don't know how to achieve those goals. They try to lie to the car manufacturers and the general public, because they feel like they "sit" on something valuable, and if their PR pitch is done right, they might end up as billionaires. Let's listen to Chris again - _"Yeah, I think on the one hand, you'd like to be able to be the one company doing this and pushing it forward, and on the other, it's awesome. Because you know a - what is it? - a rising sea floats all boats. And so if you think about the social values of the increased safety on the road, the better access to mobility for people, we want to see this happen, right? I think it's really important for society, *and so as more companies get involved, there's a broader ecosystem, there's more likelihood that one of them succeeds*. So I think that's fantastic, right? That's very desirable."_
> If they had known how to do it - if it had been only about building the hardware and writing simple software with binary options - self-driving cars would have been on the roads a long time ago. But they are not, and those guys saying they will change the world of transportation are clueless about how to do it. They think, according to Chris Urmson, the more companies involved, the better the chances to succeed. So, according to that logic, it is not about the quality of the brains involved, it is about the number of organizations involved....


I don't even know where you're going with this. It's a proven fact that our minds and attention wander if we auto-pilot around 98% of the time but have to take over 2% of the time. There are already studies.

And no, that's not what Chris is saying. The quality of brains involved on this single goal probably exceeds any undertaking by corporations in history. They didn't do it before because the technology wasn't there. Now it is.



jocker12 said:


> If we take a closer look at what Chris is saying, the industry is already questioning Tesla's Autopilot approach - _"So some of what he's doing really makes a ton of sense, *but it's hard for me to believe that with just cameras they're going to, any time in the near future*, get to the level of reliability you really need to launch a car where you can kind of go to sleep in it and let it take you where you want to go."_
> 
> People familiar enough with this technology remember how Anthony Levandowski, the Uber engineer who left Google with 14,000 documents, also questioned Tesla's approach - "_Yo! I'm back at 80%, super pumped ... _*we've got to start calling Elon on his shit. I'm not on social media but let's start "faketesla" and start giving physics lessons about stupid shit Elon says*_ like this: "we do not anticipate using lidar. Just to make it clear, lidar essentially is active photon generation in the visible spectrum - radar is active photon generation in essentially the radio spectrum. But lidar doesn't penetrate intrusions so it does not penetrate rain, fog, dust and snow, whereas a radar does. Radar also bounces and lidar doesn't bounce very well. You can't do the "look in front of the car in front of you" thing. So I think the obvious thing is to use radar and not use lidar._"(https://www.recode.net/2017/8/15/16...-levandowski-texts-tesla-google-waymo-lawsuit)
> 
> Also, the Wall Street Journal noted in its Aug. 27, 2017 article - *Tesla's Push to Build a Self-Driving Car Sparked Dissent Among Its Engineers* - Elon Musk's ambitious goals for Autopilot technology have prompted safety warnings and resignations - "In a meeting after the October announcement, someone asked Autopilot director Sterling Anderson how Tesla could brand the product "Full Self-Driving," several employees recall. "This was Elon's decision," they said he responded. Two months later, Mr. Anderson resigned. Behind the scenes, the Autopilot team has clashed over deadlines and design and marketing decisions, according to more than a dozen people who worked on the project and documents reviewed by The Wall Street Journal. In recent months, the team has lost at least 10 engineers and four top managers - including Mr. Anderson's successor, who lasted less than six months before leaving in June."(https://www.wsj.com/articles/teslas...sparks-dissent-among-its-engineers-1503593742)


Not sure where you're going here. People disagree. People come and go. What's your point?


----------



## RamzFanz (Jan 31, 2015)

jocker12 said:


> *Fifth - Ignoring the corporate thing - Planned Obsolescence*
> 
> So what is Planned Obsolescence? According to Wikipedia - _"*Planned obsolescence*, or *built-in obsolescence*, in industrial design and economics is a policy of planning or designing a product with an artificially limited useful life, so it will become obsolete (that is, unfashionable or no longer functional) after a certain period of time. The rationale behind the strategy is to generate long-term sales volume by reducing the time between repeat purchases (referred to as "shortening the replacement cycle")."_
> 
> ...


OK? So? If you're earning profit per mile, who cares that they will be high mileage and/or replaced more often? More miles faster means more profit faster.
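RamzFanz's per-mile framing can be made concrete. A toy model, with every number invented for illustration: if depreciation is booked per mile rather than per year, running the car more hours per day doesn't change the cost per mile, only how fast revenue and costs both accrue.

```python
# Toy per-mile economics: depreciation per mile is independent of utilization.
vehicle_cost = 30_000         # assumed purchase price
useful_life_miles = 200_000   # assumed design life in miles

depreciation_per_mile = vehicle_cost / useful_life_miles  # $/mile

revenue_per_mile = 1.00       # assumed ride-share revenue per mile
other_costs_per_mile = 0.60   # assumed fuel, maintenance, insurance per mile

margin_per_mile = revenue_per_mile - other_costs_per_mile - depreciation_per_mile

print(depreciation_per_mile)      # 0.15 $/mile, whether driven 2 or 17 h/day
print(round(margin_per_mile, 2))  # 0.25 $/mile of margin
```

Under this (entirely hypothetical) model, higher utilization shortens the vehicle's calendar life but leaves the margin per mile untouched, which is the point being argued here.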



jocker12 said:


> *Sixth - The lie of mutual respect between car manufacturers and Silicon Valley nerds*
> Chris Urmson again - "_So I think it's really important to understand the context, right? And *I think the biggest thing that's lacking - honestly, on both sides - is mutual respect*. Because I think it's very easy for Silicon Valley to look at the car companies and say, "Oh my goodness, they're so slow." They're going to disrupt them, right? And then it's very easy for the car companies to say, "Oh my goodness, look at those Silicon Valley guys, they're so seat-of-their-pants," right? "How can they actually ever do anything big and complicated?" Obviously both of those statements are completely false._"
> 
> ...


I see you have convicted Anthony Levandowski without him even being charged. Your obvious tilt is obvious. Forget that he had a perfectly reasonable explanation for taking the documents.

Once again, I don't know what your point is here.

The electric self driving TNC car is the future because it makes economic sense and will be safer and more environmentally friendly.


----------



## jocker12 (May 11, 2017)

RamzFanz said:


> Not sure where you're going here. People disagree. People come and go. What's your point?


Because of their greed, some of the people involved are starting to explain the flaws their competitors have. It is difficult for a regular consumer to understand the complexity of the guidance systems used and their weaknesses. It takes a very good engineer to explain it, and when you have a few former top Google lidar development guys expressing doubts about the Tesla Autopilot project, you can understand that nobody knows exactly what they are doing. This is not a matter of time to do the testing and the fine-tuning, which is going to go on forever. At this point it is a matter of being able to convince the car manufacturers and the public that your idea, which you are not entirely sure about, could be better. They are an army of blind people guided, in their self-declared progressive efforts, by a blind ideal.



RamzFanz said:


> OK? So? If you're earning profit per mile, who cares that they will be high mileage and/or replaced more often? More miles faster means more profit faster


Hahahahaha.... First, we have yet to see any profit out of any TNC platform.....

Second, the quicker the product reaches its inevitable projected death, the sooner the "profit" you very gracefully but totally unrealistically mention here gets recycled into the new product - so it won't be "profit" anymore. This scenario only benefits the corporation selling the product, in this case the cars, because it assures their customers will be back for more. The sooner you devalue the product by pushing its utilization over its projected average daily use, the sooner you'll buy another one to replace it. You need to understand what "planned obsolescence" was meant for - "Planned obsolescence tends to work best when a producer has at least an oligopoly (*a market form wherein a market or industry is dominated by a small number of sellers*). Before introducing a planned obsolescence, the producer has to know that the consumer is at least somewhat likely to buy a replacement from them. In these cases of planned obsolescence, *there is an information asymmetry between the producer - who knows how long the product was designed to last - and the consumer, who does not*."

Also - "By the late 1950s, _planned obsolescence_ had become a commonly used term for products designed to break easily or to quickly go out of style."
and "In 1960, cultural critic Vance Packard published _The Waste Makers_, promoted as an exposé of "*the systematic attempt of business to make us wasteful, debt-ridden, permanently discontented individuals*".



RamzFanz said:


> I see you have convicted Anthony Levandowski without him even being charged. Your obvious tilt is obvious. Forget that he had a perfectly reasonable explanation for taking the documents.


Whaaat? A perfectly reasonable explanation? To steal the documents in order to make sure Google would pay his $120 million bonus? Hahahaha... Was there any strong enough suspicion that Google wouldn't pay him the bonus in the first place? And is this practice of stealing sensitive information from your employer ever acceptable, under any circumstances? When he pleaded the Fifth, he told the world who he really is. In December 2016, months before this scandal started, I said on a different forum that Levandowski is an idiot. If you have a different opinion, please hire him at your company and wait for him to wreck your life, like the "good and innocent" guy he is.



RamzFanz said:


> This is taking Chris out of context. _There's an awful lot of driving that people do that they really hate _and _Road rage would not be a thing if people truly liked driving _are two separate thoughts.
> 
> Are you trying to say most people don't hate most miles they drive? Congestion? Long boring commutes? Staying alert and trapped?


I am not. You are the one taking what he says out of context. He mentions "road rage" for the psychological effect the term has. When you want to trigger an emotional response from the audience, you use strong terms. Go back to the definition: road rage has nothing to do with congestion or boredom. It is all about aggression and endangering other people's lives.



RamzFanz said:


> You focus on deaths because it suits your argument. Deaths are down not because we are good drivers, we aren't, but because of improved safety standards.


You have my answer here.


----------



## Oscar Levant (Aug 15, 2014)

jocker12 said:


> https://www.recode.net/2017/9/4/162...le-human-safety-recode-decode-podcast-swisher
> 
> And full podcast - https://www.recode.net/2017/9/8/162...iving-car-engineer-chris-urmson-recode-decode


My gut feeling is that it will take a generation for it to be accepted, and by that time all cars will be SDCs, able to talk to the road, the signs, and other SDCs, and to work as one huge synchronized system. THEN it will work. But today Americans love their cars, and the vast majority are not giving them up for cheap transportation. We've had cheap transportation, and it never replaces cars in most areas of the US.



jocker12 said:


> Because of their greed, some of the people involved are starting to explain the flaws their competitors have. It is difficult for a regular consumer to understand the complexity of the guidance systems used and their weaknesses. It takes a very good engineer to explain it, and when you have a few former top Google lidar developers expressing doubts about the Tesla Autopilot project, you can understand how nobody knows exactly what they are doing. This is not a matter of time to do the testing and the fine tuning, which is going to go on forever. At this point it is a matter of being capable of convincing the car manufacturers and the public that your idea, which you are not entirely sure about, could be better. They are an army of blind people guided, in their self-declared progressive efforts, by a blind ideal.
> 
> Hahahahaha.... First, we are yet to see any profit out of any TNC platform.....
> 
> ...


It states: "It's humbling, as someone working in this space, how easy some of these tasks are for humans to do, and how hard they are to actually get software and technology to solve," he said.

That's just it: humans can THINK. AI learns fast, and AI can process data much faster than humans, but the variables are infinite, and there will be times when the machine faces situations that require THOUGHT, and no machine will ever be able to do that. If you think otherwise, you fail to grasp the difference between a machine and life. I had an argument with a scientist who argued, "given Moore's law, it's only a matter of time before machines will be able to 'think'", to which I replied that this would be true if there were a finite distance, a linear path, between human consciousness and machine, but the true distance is this: infinity, i.e., they are two different realms and there is no line from one to the other. A scientist could not grasp the fundamental difference between consciousness and data processing.


----------



## jocker12 (May 11, 2017)

Oscar Levant said:


> My gut feeling is that it will take a generation for it to be accepted, and by that time all cars will be SDCs, able to talk to the road, the signs, and other SDCs, and to work as one huge synchronized system. THEN it will work. But today Americans love their cars, and the vast majority are not giving them up for cheap transportation. We've had cheap transportation, and it never replaces cars in most areas of the US.
> 
> It states: "It's humbling, as someone working in this space, how easy some of these tasks are for humans to do, and how hard they are to actually get software and technology to solve," he said.
> 
> That's just it: humans can THINK. AI learns fast, and AI can process data much faster than humans, but the variables are infinite, and there will be times when the machine faces situations that require THOUGHT, and no machine will ever be able to do that. If you think otherwise, you fail to grasp the difference between a machine and life. I had an argument with a scientist who argued, "given Moore's law, it's only a matter of time before machines will be able to 'think'", to which I replied that this would be true if there were a finite distance, a linear path, between human consciousness and machine, but the true distance is this: infinity, i.e., they are two different realms and there is no line from one to the other. A scientist could not grasp the fundamental difference between consciousness and data processing.


There is no AI. The software needs to be as simple as possible, because the more complicated you make it, the more conflicting it gets. Imagine there are 2 or 3 equally good options to follow in a given situation. The learning machine will crash, because it lacks CONTEXT. You say humans think; I say we are capable of understanding context and making decisions based on it. When it comes to driving, that involves geometry, geography, weather, past, present and future (anticipation). A computer will not be able to detect and understand context.

Fundamentally, the biggest problem autonomous cars have is not the software (meant to operate continuously and without failure only in ideal conditions) or the logistics (impossible to estimate given the variety of the forces involved). The problem lies with those forces. Whoever knows how corporations function understands the impossibility of autonomous cars becoming reality.

People involved with this technology say how easy it would be without the existing obstacles on the road: traffic, lights, road signs, pedestrians, bicyclists, bridges, lines on the asphalt, or buildings. In simulations of perfect environments with 100% self-driving cars on the road, you don't need any lines on the road, or lights, or a steering wheel, or even a windshield with wipers on it. In theory, the computer will be able to manage the driving based on real-time analysis of the surrounding environment. Nobody wants to come out and admit that the perfect environment, with very few insignificant obstacles to no obstacles at all, already exists... IF you put the car in the air and fly it as an aircraft. No lines, no lights, not too many tall buildings, no unpredictable pedestrians.
Why is nobody willing to talk about it? Because of the FAA's very strict regulations and the immense liability in case of an accident. On the road, a minor malfunction in an autonomous car won't necessarily equate to a loss of life every single time. But a minor malfunction in an autonomous aircraft will for sure end in a messy crash with casualties every single time. Any normal individual can see that, and the self-driving crooks could not get away with their fantasies for long if they advocated for autonomous aircraft. The danger is way too obvious.

Corporations will get to test their robots, but the same people hyping them up today will be the ones avoiding them after they have had the chance to sit in one for a ride, because of the technology's limitations and inconsistencies. I really feel bad for these naive people who are getting excited for nothing.


----------



## peteyvavs (Nov 18, 2015)

Silicon Valley tech wizards are like a woman with silicone implants: it looks good until someone sticks a pin in them. Self-driving cars will be acceptable to some people until one is rear-ended or a flat tire sends one out of control with a passenger.


----------



## Mars Troll Number 4 (Oct 30, 2015)

The cars WILL depreciate at an incredible rate once put in autonomous mode 24/7.

For a car driven like a cab, well, 450,000 miles in 5 years isn't unexpected.
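A back-of-the-envelope sketch of that figure (in Python). Only the 450,000 miles over 5 years comes from the post; the 20 mph average in-service speed is an illustrative assumption, not a number from the thread:

```python
# Back-of-the-envelope check of the "450,000 miles in 5 years" robotaxi claim.
# AVG_SPEED_MPH is an assumed value for illustration only.

MILES_TOTAL = 450_000
YEARS = 5
AVG_SPEED_MPH = 20  # assumed average in-service speed for an urban cab

miles_per_year = MILES_TOTAL / YEARS            # 90,000 miles/year
miles_per_day = miles_per_year / 365            # ~247 miles/day
hours_per_day = miles_per_day / AVG_SPEED_MPH   # ~12.3 driving hours/day

print(f"{miles_per_year:,.0f} miles/year, "
      f"{miles_per_day:.0f} miles/day, "
      f"{hours_per_day:.1f} driving hours/day")
```

Under those assumptions, a 24/7 robotaxi only has to average about half the day in motion to hit that mileage, which is roughly six times the ~13,500 miles/year a typical US private car accumulates.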


----------

