# I have always noticed



## Bobbyk5487 (Jan 28, 2019)

I posted before that these same algorithms try to send minority drivers to minority pax as much as possible... the airport can be busy with no drivers... and I'll still be getting calls to go 20 miles away to the "hood"... God bless the USA

https://venturebeat.com/2020/06/12/...cing-algorithms-used-by-uber-lyft-and-others/


----------



## mbd (Aug 27, 2018)

Hood: closest driver first; then if the driver declines, it goes to the highest-AR driver close by, which could be 5 minutes or 20 minutes away.
The algo keeps track of the pax's decline rate by other drivers. Rentals have high AR, and most rentals are driven by minorities.
The algo does not know if the pax is a 👽 Martian, but it knows the pax has a high decline rate. The algo's job is to learn and guess which driver will pick up the pax.
Airports: you can skip the queue if the perfect Martian requests a ride.
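The matching logic described above can be sketched as a toy (pure speculation about the dispatch algo; `Driver`, `pick_driver`, and the 0.5 decline-rate threshold are all made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class Driver:
    name: str
    eta_minutes: float  # how far away the driver is
    ar: float           # acceptance rate, 0.0-1.0

def pick_driver(drivers, pax_decline_rate, declined=()):
    """Toy dispatch: closest driver first; once a high-decline pax has
    been declined, fall back to the highest-AR driver close by, even if
    that driver is 20 minutes out."""
    available = [d for d in drivers if d.name not in declined]
    if not available:
        return None
    if declined and pax_decline_rate > 0.5:
        # Pax gets declined a lot: route to whoever accepts everything.
        return max(available, key=lambda d: d.ar)
    # Normal case: closest driver wins, regardless of AR.
    return min(available, key=lambda d: d.eta_minutes)

drivers = [Driver("A", 3, 0.40), Driver("B", 18, 0.95)]
first = pick_driver(drivers, pax_decline_rate=0.7)        # closest: A
retry = pick_driver(drivers, 0.7, declined={first.name})  # highest AR: B
```

That's the whole claim in a dozen lines: the algo never needs to know anything about the pax except a decline rate.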


----------



## Bobbyk5487 (Jan 28, 2019)

So when you download apps and they want access to your pics... why do you think that is?


----------



## mbd (Aug 27, 2018)

Bobbyk5487 said:


> So when you download apps and they want access to your pics... why do you think that is?


It does not care about your pics... it only cares about your number tendencies. MSFT, IBM and AMZN are all moving away from developing facial recognition technologies 😊
Uber does not care about anybody's pics or data... it is worthless to them 😄


----------



## Bobbyk5487 (Jan 28, 2019)

mbd said:


> It does not care about your pics... it only cares about your number tendencies. MSFT, IBM and AMZN are all moving away from developing facial recognition technologies 😊
> Uber does not care about anybody's pics or data... it is worthless to them 😄


-A blind sheep



mbd said:


> It does not care about your pics... it only cares about your number tendencies. MSFT, IBM and AMZN are all moving away from developing facial recognition technologies 😊
> Uber does not care about anybody's pics or data... it is worthless to them 😄


Well, exactly why does it ask for access to your pictures?


----------



## mbd (Aug 27, 2018)

Another trend:
If the pax got charged 10 dollars extra on the last trip, the algo will charge the exact same 10 dollars extra for the same trip next time. If the pax declines the ride and tries again, the rates will go down. 👍 The algo thinks: this pax paid $10 extra last time, so why not charge extra again?
You can have two pax at the same spot, but they could get charged differently.
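A minimal sketch of that pricing behavior, assuming it works the way described (the `AnchoredPricer` class and every number in it are hypothetical; nobody outside the company knows the real pricing code):

```python
class AnchoredPricer:
    """Hypothetical quote engine: re-applies whatever markup a pax
    accepted on their last identical trip, and backs the markup off
    when they decline and re-request."""

    def __init__(self, base_fare):
        self.base_fare = base_fare
        self.markup = {}  # pax_id -> last extra charge, in dollars

    def quote(self, pax_id):
        return self.base_fare + self.markup.get(pax_id, 0.0)

    def accepted(self, pax_id, extra):
        # Pax paid `extra` over base: anchor on it for next time.
        self.markup[pax_id] = extra

    def declined(self, pax_id):
        # Pax balked: cut the markup so the retry looks cheaper.
        self.markup[pax_id] = self.markup.get(pax_id, 0.0) * 0.5

p = AnchoredPricer(base_fare=20.0)
p.accepted("alice", 10.0)
p.quote("alice")   # 30.0 -- same $10 extra as last time
p.quote("bob")     # 20.0 -- same spot, different price
p.declined("alice")
p.quote("alice")   # 25.0 -- declining brought the rate down
```

Note that two pax standing at the same spot get different quotes purely from their own payment history, exactly the two-pax scenario above.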


----------



## waldowainthrop (Oct 25, 2019)

Bobbyk5487 said:


> Well, exactly why does it ask for access to your pictures?


So that they can request photos of your face for identity verification. The application needs permission to access your camera and/or photo gallery to use those pictures. All phones require this permission as a basic security measure, so that third party applications can't access the camera by default.

If they want to know your race, they don't need access to the camera of your phone to do it. They can profile drivers by their driver's license photos, or use the photo you submit for your profile picture, or find that information in the mandatory background check. The demographic data on drivers is ubiquitous and they don't need access to your phone to get it.


----------



## mbd (Aug 27, 2018)

They don’t profile by race, but by tendencies.
Certain zip codes will have certain tendencies.


----------



## Bobbyk5487 (Jan 28, 2019)

waldowainthrop said:


> So that they can request photos of your face for identity verification. The application needs permission to access your camera and/or photo gallery to use those pictures. All phones require this permission as a basic security measure, so that third party applications can't access the camera by default.
> 
> If they want to know your race, they don't need access to the camera of your phone to do it. They can profile drivers by their driver's license photos, or use the photo you submit for your profile picture, or find that information in the mandatory background check. The demographic data on drivers is ubiquitous and they don't need access to your phone to get it.


I mean, why do they need access to the passengers' pictures also?



mbd said:


> They don't profile by race, but by tendencies.
> Certain zip codes will have certain tendencies.


So this study got it wrong... And you got it right, okay.


----------



## mbd (Aug 27, 2018)

Bobbyk5487 said:


> I mean, why do they need access to the passengers' pictures also?


It's looking for nude pics 👍



Bobbyk5487 said:


> I mean, why do they need access to the passengers' pictures also?
> 
> 
> So this study got it wrong... And you got it right, okay.


You are welcome 🙏


----------



## waldowainthrop (Oct 25, 2019)

Bobbyk5487 said:


> I mean, why do they need access to the passengers' pictures also?


_That_ is a good question.

Are we talking Lyft? Lyft used to have a rider profile feature, which would explain that detail. They could certainly misuse the pictures that passengers voluntarily submit, but most passengers never provided one, and I'd have to check how the feature works on iOS and Android.

Uber doesn't ask for access to the camera on the passenger app, at least in my experience.

There are a lot of other methods and sources of information that they have available to discriminate with, both legally and extra-legally.


----------



## mbd (Aug 27, 2018)

waldowainthrop said:


> _That_ is a good question.
> 
> Are we talking Lyft? Lyft used to have a rider profile feature, which would explain that detail. They could certainly misuse the pictures that passengers voluntarily submit, but most passengers never provided one, and I'd have to check how the feature works on iOS and Android.
> 
> Uber doesn't ask for access to the camera on the passenger app, at least in my experience.


They don't ask for access to your pics 👍


----------



## waldowainthrop (Oct 25, 2019)

mbd said:


> They don't ask for access to your pics 👍


Lyft passenger app does under some circumstances, or at least they used to.


----------



## Bobbyk5487 (Jan 28, 2019)

waldowainthrop said:


> _That_ is a good question.
> 
> Are we talking Lyft? Lyft used to have a rider profile feature, which would explain that detail. They could certainly misuse the pictures that passengers voluntarily submit, but most passengers never provided one.
> 
> Uber doesn't ask for access to the camera on the passenger app, at least in my experience.


Lyft and Uber... they demand access to everything your phone has to offer before you can download the app... including video, pictures, and audio



waldowainthrop said:


> Lyft passenger app does under some circumstances, or at least they used to.


They still do... he's just throwing sht at the wall to see what sticks


----------



## waldowainthrop (Oct 25, 2019)

Bobbyk5487 said:


> Lyft and Uber... they demand access to everything your phone has to offer before you can download the app... including video, pictures, and audio


Are you using an Android device?

The iOS Uber passenger app does not ask for or require camera or photos access. Lyft has asked for both. I'm looking at permissions on my phone right now, and the Lyft passenger app has those permissions while the Uber passenger app does not.


----------



## Bobbyk5487 (Jan 28, 2019)

waldowainthrop said:


> Are you on Android?
> 
> iOS Uber app does not ask for or require camera or photos access. Lyft has asked for both.


Yea android


----------



## mbd (Aug 27, 2018)

waldowainthrop said:


> Lyft passenger app does under some circumstances, or at least they used to.


It will just ask for access to your camera so the pic can be taken.


----------



## waldowainthrop (Oct 25, 2019)

Confirmed: Lyft still asks passengers for photo access.

I still think the apps have other methods for discrimination, without needing photos.


----------



## mbd (Aug 27, 2018)

waldowainthrop said:


> Confirmed: Lyft still asks passengers for photo access.
> 
> I still think the apps have other methods for discrimination, without needing photos.


It just wants a pic to put on your profile.
Discrimination: it is not discrimination, the algo is looking for tendencies. In football it is called analytics 👍


----------



## Bobbyk5487 (Jan 28, 2019)

waldowainthrop said:


> Confirmed: Lyft still asks passengers for photo access.
> 
> I still think the apps have other methods for discrimination, without needing photos.


Like when your name is Tyrone or Juqiun... or Raj or ismal... or Shaquina


----------



## waldowainthrop (Oct 25, 2019)

Bobbyk5487 said:


> Like when your name is Tyrone or Juqiun... or Raj or ismal... or Shaquina


Yes, they _could_ have an algorithm that distinguishes passengers based on names. It wouldn't be that accurate, but it could do the job at determining something about their identity. Guaranteed there are black people with my first name who would not make the cut, though.


----------



## mbd (Aug 27, 2018)

Bobbyk5487 said:


> Like when your name is Tyrone or Juqiun... or Raj or ismal... or Shaquina


Drivers decline those names, so that particular name/account holder builds up a tendency. The algo does not know which race it is. Uber's algo knew (pre-COVID) how many pings would come from each sq ft of NYC, and at what time.


----------



## waldowainthrop (Oct 25, 2019)

mbd said:


> Drivers decline those names


----------



## Bobbyk5487 (Jan 28, 2019)

mbd said:


> Drivers decline those names, so that particular name/account holder builds up a tendency. The algo does not know which race it is.


Those are the only names that come to me


----------



## mbd (Aug 27, 2018)

1st time pax will always get a good rated driver.


----------



## Bobbyk5487 (Jan 28, 2019)

mbd said:


> Drivers decline those names, so that particular name/account holder builds up a tendency. The algo does not know which race it is. Uber's algo knew (pre-COVID) how many pings would come from each sq ft of NYC, and at what time.


What do you think some Pro-Trumper who does the hiring at some place does with those names... Decline them also... And that's why we kneel



mbd said:


> 1st time pax will always get a good rated driver.


These passengers catch Uber to work every day; none of them are first-time riders.


----------



## waldowainthrop (Oct 25, 2019)

mbd said:


> 1st time pax will always get a good rated driver.


That's an interesting possibility. I got a _lot_ of first-time riders, and I'm top 5% by rating with an 80% average AR. I don't know if I got an abnormal number of new passengers, but it seemed like a lot.


----------



## mbd (Aug 27, 2018)

waldowainthrop said:


> That's an interesting possibility. I got a _lot_ of first-time riders, and I'm top 5% by rating with an 80% average AR. I don't know if I got an abnormal number of new passengers, but it seemed like a lot.


On Lyft, a pop-up window will tell you it is a local first-time pax, but at the airport the pop-up window will not appear because the pax is from out of state. You will not know if it is a first-time pax at the airport unless the pax tells you.


----------



## dmoney155 (Jun 12, 2017)

My name is Graham, I put in the info that I grew up in England, and I speak with a perfect English accent... no wonder I keep getting stacked, long-distance pings from upper-class neighborhoods then. You too can make your Uber experience easy and maximize your earnings by buying my guide to Uber for the low price of $999.99 and following tips such as the above


----------



## Ssgcraig (Jul 8, 2015)

Bobbyk5487 said:


> I posted before that these same algorithms try to send minority drivers to minority pax as much as possible... the airport can be busy with no drivers... and I'll still be getting calls to go 20 miles away to the "hood"... God bless the USA
> 
> https://venturebeat.com/2020/06/12/...cing-algorithms-used-by-uber-lyft-and-others/


What is a minority driver/pax? The driver is a minority of what? What is the majority?


----------



## Trafficat (Dec 19, 2016)

Humans can be racist, and an algorithm can pick up on human cues. Suppose an algorithm is designed to determine what causes a passenger to rate a driver highly. It stores a list of 100 variable traits for each driver and looks for statistical correlations between passenger ratings of drivers and those traits. Suppose one of those traits is race, and the algorithm finds a statistically significant trend where a passenger consistently downrates drivers of a certain race, so it sends a driver of a different race to pick them up. Racially unfair? Yes. But is it a good thing or a bad thing? I know I personally would rather not be paired with a passenger who is going to downrate me due to my race. And the passenger apparently would be happier if his racist preferences are matched as well.
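That hypothetical is easy to make concrete (a toy illustration of the idea only, not anything Uber or Lyft is known to do; `RatingBiasMatcher` and its thresholds are invented):

```python
from collections import defaultdict
from statistics import mean

class RatingBiasMatcher:
    """Toy version of the hypothetical: learn, per passenger, which
    driver traits correlate with that passenger's low ratings, then
    avoid pairing them with drivers who carry those traits."""

    def __init__(self, threshold=3.5, min_samples=3):
        self.history = defaultdict(list)  # (pax, trait_value) -> ratings
        self.threshold = threshold
        self.min_samples = min_samples

    def record(self, pax, trait_value, rating):
        self.history[(pax, trait_value)].append(rating)

    def penalized(self, pax, trait_value):
        # A trait is penalized once this pax has consistently
        # downrated enough drivers who share it.
        ratings = self.history[(pax, trait_value)]
        return (len(ratings) >= self.min_samples
                and mean(ratings) < self.threshold)

    def match(self, pax, drivers):
        # Prefer any driver whose trait this pax hasn't downrated.
        ok = [d for d in drivers if not self.penalized(pax, d["trait"])]
        return (ok or drivers)[0]

m = RatingBiasMatcher()
for r in (1, 2, 1):
    m.record("pax1", "trait_x", r)  # pax1 consistently downrates trait_x
candidates = [{"name": "d1", "trait": "trait_x"},
              {"name": "d2", "trait": "trait_y"}]
m.match("pax1", candidates)  # steers pax1 toward d2
```

Nothing in the code mentions race; the discriminatory pairing falls out purely from the correlation, which is exactly the point of the hypothetical.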


----------



## Cdub2k (Nov 22, 2017)

waldowainthrop said:


> Yes, they _could_ have an algorithm that distinguishes passengers based on names. It wouldn't be that accurate, but it could do the job at determining something about their identity. Guaranteed there are black people with my first name who would not make the cut, though.


If they are involved with this type of profiling (I'm sure they are), it would be based on the demographics of the neighborhood from which the ride request originates. That kind of info is free to the public, can easily be implemented into the algorithm, and then matched to a driver somewhat in the area who meets the criteria of the ride request.

We'll never know the specifics, but it is not out of the realm of possibility, and at this point it's more probable than not.
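The mechanism described here is just a table join, which takes a few lines (the zip codes and figures below are made up; this only shows how trivially public census data attaches to a request):

```python
# Made-up public census data, keyed by pickup zip code.
CENSUS = {
    "60601": {"median_income": 85000, "pct_over_40": 0.38},
    "60621": {"median_income": 28000, "pct_over_40": 0.31},
}

def profile_request(pickup_zip):
    """Attach free public neighborhood demographics to a ride request.
    This is the whole 'profiling' step: a dictionary lookup."""
    return {"pickup_zip": pickup_zip, **CENSUS.get(pickup_zip, {})}

req = profile_request("60621")
# req now carries income/age fields a matcher or pricer could key off of.
```

Once the request carries those fields, any downstream matching or pricing step can condition on them without ever storing anything about an individual rider.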


----------



## waldowainthrop (Oct 25, 2019)

Cdub2k said:


> If they are involved with this type of profiling (I'm sure they are), it would be based on the demographics of the neighborhood from which the ride request originates. That kind of info is free to the public, can easily be implemented into the algorithm, and then matched to a driver somewhat in the area who meets the criteria of the ride request.
> 
> We'll never know the specifics, but it is not out of the realm of possibility, and at this point it's more probable than not.


I totally agree that it's plausible. It's far more plausible that they alter the algorithms in intentionally or unintentionally discriminatory ways using geographic data and general demographic information than that they use "name databases" or photos to discriminate. Geographic discrimination is much easier to accomplish (and get away with without legal issues) than some other forms of discrimination. They could be using any of these methods, or others not discussed here, but some are more plausible and substantiated than others.

I always have to add this disclaimer, but not all discrimination is bad or illegal. The main kind of discrimination that people think about is racial discrimination, but an algorithm can also discriminate against drivers who have worse ratings or do certain things that the company doesn't prefer them to do. That's just as much discrimination, but most people understand and excuse that more than they excuse discrimination based on bigotry.


----------



## CJfrom619 (Apr 27, 2017)

mbd said:


> Another trend:
> If the pax got charged 10 dollars extra on the last trip, the algo will charge the exact same 10 dollars extra for the same trip next time. If the pax declines the ride and tries again, the rates will go down. 👍 The algo thinks: this pax paid $10 extra last time, so why not charge extra again?
> You can have two pax at the same spot, but they could get charged differently.


Lol wow, you really put too much thought into the algorithms. This is not accurate, by the way, but if you think so then that's good enough for me.


----------



## mbd (Aug 27, 2018)

CJfrom619 said:


> Lol wow, you really put too much thought into the algorithms. This is not accurate, by the way, but if you think so then that's good enough for me.


😊 I've given rides to the same pax multiple times, and they talk. You can have the same exact locations and distance, and two different pax can get two different quotes.


----------



## CJfrom619 (Apr 27, 2017)

mbd said:


> 😊 I've given rides to the same pax multiple times, and they talk. You can have the same exact locations and distance, and two different pax can get two different quotes.


Time of day and day of week. A ride at 4:00 on Tuesday might cost a different amount on Wednesday or Saturday, even if it's the same exact trip.


----------



## mbd (Aug 27, 2018)

CJfrom619 said:


> Time of day and day of week. A ride at 4:00 on Tuesday might cost a different amount on Wednesday or Saturday, even if it's the same exact trip.


Same day, same time, same building, going to the same place: two different people can get quoted two different prices.


----------



## CJfrom619 (Apr 27, 2017)

mbd said:


> Same day, same time, same building, going to the same place: two different people can get quoted two different prices.


Ok friend, if that's what you believe then that's all good. I believe in an algorithm, but I think you put too much thought into it as far as how they distribute trips to certain drivers. I don't think it's that complicated a system.


----------



## mbd (Aug 27, 2018)

The driver gets paid for distance and time; after that, the algo will try to get the max out of the pax.
Somebody going from Palo Alto/Cupertino to Manhattan will be charged differently just because of the income/zip-code profile. That pax at LaGuardia or Newark will be targeted.
Most of it is a business expense and they know it.


----------



## CarpeNoctem (Sep 12, 2018)

Or, it could have nothing to do with race. Maybe the algo is using insurance data that says certain areas have a higher risk of accidents or carjackings or whatever... and they offset the fares to cover those risks.


----------



## Bobbyk5487 (Jan 28, 2019)

CarpeNoctem said:


> Or, it could have nothing to do with race. Maybe the algo is using insurance data that says certain areas have a higher risk of accidents or carjackings or whatever... and they offset the fares to cover those risks.


Nothing has anything to do with race if you leave it to people who post racist crap like, ummm..... YOU!!!... I guess you posting racist things has nothing to do with race either


----------



## Tony73 (Oct 12, 2016)

The algorithm learns your work pattern. Drivers who frequently work in the hood will keep going there.

Drivers that go into LAX or NYC will keep getting rides that way. After you cancel 10 of these you won’t be bothered for a very long time. That could also backfire if you live near these spots as it will take you far away from them. Might have some empty miles going home.

Also, saying **** Uber x3 and turning the Lyft app on grants you better rides on both, as Uber will compete with Lyft by giving you the same type of rides.


----------



## CarpeNoctem (Sep 12, 2018)

Bobbyk5487 said:


> Nothing has anything to do with race if you leave it to people who post racist crap like, ummm..... YOU!!!... I guess you posting racist things has nothing to do with race either


ROFL!!!!! I give a possible explanation and get called a racist. Everything is racist to you. I could blow my nose and you would say it is racist. I could donate to the Salvation Army and you would say it is racist. I could knit mittens for handicapped homeless children and you would say it is racist. I guess I have nothing to lose being accused of being a racist BY A RACIST!

Take your critical theory BS and extemporaneous correlations and, well, you know...


----------



## Bobbyk5487 (Jan 28, 2019)

CarpeNoctem said:


> ROFL!!!!! I give a possible explanation and get called a racist. Everything is racist to you. I could blow my nose and you would say it is racist. I could donate to the Salvation Army and you would say it is racist. I could knit mittens for handicapped homeless children and you would say it is racist. I guess I have nothing to lose being accused of being a racist BY A RACIST!
> 
> Take your critical theory BS and extemporaneous correlations and, well, you know...


I'll just never forget you telling me 600k dying in the civil war was more than the 4 million slaves America had at the time...and you proudly stood on that and wouldn't admit that you were lying

The white American Way... let's just ignore and deny the mistreatment of others and see what type of benefits we can get out of it...

https://www.newscientist.com/articl...ng-algorithms-charge-more-in-non-white-areas/


----------



## BunnyK (Dec 12, 2017)

Bobbyk5487 said:


> I'll just never forget you telling me 600k dying in the civil war was more than the 4 million slaves America had at the time...and you proudly stood on that and wouldn't admit that you were lying
> 
> The white American Way... let's just ignore and deny the mistreatment of others and see what type of benefits we can get out of it...
> 
> https://www.newscientist.com/articl...ng-algorithms-charge-more-in-non-white-areas/


People ignore and deny the black on black crime stats. Why should people care if their own communities don't?


----------



## Bobbyk5487 (Jan 28, 2019)

BunnyK said:


> People ignore and deny the black on black crime stats. Why should people care if their own communities don't?


You are so right... That's why I don't hear any of this nonsense when people tell me things don't have to do with race... On a public forum you just admitted that everything is based on race and you don't care about black people... Thank you!



BunnyK said:


> People ignore and deny the black on black crime stats. Why should people care if their own communities don't?


The PC answer is Americans are Americans regardless of race... but you kept it real 👍


----------



## BunnyK (Dec 12, 2017)

Bobbyk5487 said:


> You are so right... That's why I don't hear any of this nonsense when people tell me things don't have to do with race... On a public forum you just admitted that everything is based on race and you don't care about black people... Thank you!


You didn't answer the question. It was simple. Why should other races care about blacks when they do not care about themselves enough to tackle the real issues facing their community, and instead elect the same crooked people into power in their cities that further worsen their situation?

People who care do not lie to you, or avoid uncomfortable truths based in reality.


----------



## Bobbyk5487 (Jan 28, 2019)

BunnyK said:


> You didn't answer the question. It was simple. Why should other races care about blacks when they do not care about themselves enough to tackle the real issues facing their community, and instead elect the same crooked people into power in their cities that further worsen their situation?
> 
> People who care do not lie to you, or avoid uncomfortable truths based in reality.


Because in America we see all Americans as Americans, one nation undivided under God... the sad part about it is that young black kids pledge that Allegiance every morning... and as they grow up they slowly realize it's all BS... and people like you say things like they don't have a reason to care about Black Americans


----------



## CarpeNoctem (Sep 12, 2018)

Bobbyk5487 said:


> I'll just never forget you telling me 600k dying in the civil war was more than the 4 million slaves America had at the time...and you proudly stood on that and wouldn't admit that you were lying
> 
> The white American Way... let's just ignore and deny the mistreatment of others and see what type of benefits we can get out of it...
> 
> https://www.newscientist.com/articl...ng-algorithms-charge-more-in-non-white-areas/


I don't know where you are getting your numbers from.

However, it is very apparent you didn't even read the report behind that BS article. You are presenting it as fact, and the article forwards it as fact that this is currently happening, but the study is a preprint, which means it has not yet been vetted via peer review. It is a study of machine learning perhaps picking up some bias. The cost of fares is just a MacGuffin.

7 Conclusion
We introduce a new method, IESB for social datasets, that measures the effect of continuous variables
on prediction outcomes to measure social bias on populations. While *demand and speed have the
highest correlation with ridehailing fares*, analysis shows that users of ridehailing applications in the
city of Chicago *may be* experiencing social bias with regard to fare prices when they are picked up or
dropped off in neighborhoods with a *low percentage of individuals over 40* or a *low percentage of
individuals with a high school diploma or less*. In addition users may be facing social bias *if *picked
up in a neighborhood with a *low percentage of houses priced less than the median house price* of
Chicago, *or dropped off in a neighborhood with a low percentage of white people.* This means that
users in neighborhoods with lower percentages of these demographics *may be* experiencing disparate
impact and have to pay more for rides than users in other neighborhoods. Our analysis of ACS census
data and protected attributes also *suggests* that the difference in fare prices based on neighborhoods
*may* also extend to *individual attributes of age, education, home prices, and ethnicity.*

8 Broader Impact
Any algorithmic decision making process based on geolocation data in the United States *has the
potential to learn and express demographic biases*, due to demographic differences in locations
throughout the United States. When geolocation is used in conjunction with dynamic pricing
algorithms, applications *may learn* to charge people belonging to a particular demographic more
than others [4, 5]. The methodologies *proposed* in this work can be used to audit applications and
determine where biases *may exist* in a geolocation-based algorithmic decision maker, such as a
dynamic pricing model. But an audit of algorithmic bias is only the first step to ensuring algorithmic
fairness. *Debiasing any discovered bias* requires a separate concerted effort - one that can only be
completed after an audit, but yet* requires further detail of a model's inner workings.* Performing an
audit of algorithmic decision makers can help reduce the *possible disparate impact* of applications
that result in unfair outcomes for users downstream given that any discovered bias is subsequently
dealt with using a specifically tailored solution for each individual decision maker.

I don't think I have read a less definitive study.
_correlation, may be, low percentage, if, or, may be, suggests, may, potential, may learn, proposed, may exist, any discovered, requires further detail._

Note there are few direct references to race. Race is just one point of a set of possible variables. While you are trying to use the article and study as a club to gaslight us all into submission of white guilt, you are really just swinging around a duck that is crapping on you. Lucky you!


----------



## Bobbyk5487 (Jan 28, 2019)

BunnyK said:


> You didn't answer the question. It was simple. Why should other races care about blacks when they do not care about themselves enough to tackle the real issues facing their community, and instead elect the same crooked people into power in their cities that further worsen their situation?
> 
> People who care do not lie to you, or avoid uncomfortable truths based in reality.


And people like you say Democrats run cities where blacks are doing badly... they also run cities where blacks are doing well... because when blacks move to a city, white Republicans run to the rural areas...



CarpeNoctem said:


> I don't know where you are getting your numbers from.
> 
> However, it is very apparent you didn't even read the report behind that BS article. You are presenting it as fact, and the article forwards it as fact that this is currently happening, but the study is a preprint, which means it has not yet been vetted via peer review. It is a study of machine learning perhaps picking up some bias. The cost of fares is just a MacGuffin.
> 
> ...


Where did you get the idea that more died in the Civil War than there were slaves?... oh I know... from where the sun don't shine


----------

