When your boss is an algorithm

A senior operations manager for Uber directs drivers at the Seattle-Tacoma International Airport.
Ted S. Warren / AP

Updated April 25, 2023 at 7:57 AM ET

Two brothers who drive for Uber recently conducted an experiment. They opened their Uber apps while sitting in the same room, and tested which brother could earn more money for doing the same work.

In a video published on The Rideshare Guy YouTube channel, the brothers recorded themselves looking for rides on the app. They found that Uber showed them nearly identical jobs, but offered to pay one of them a little better. The siblings could only guess why. Had Uber's algorithm somehow calculated their worth differently?

University of California College of the Law professor Veena Dubal says that's exactly what's going on. In a recent paper, she says rideshare apps promote "algorithmic wage discrimination" by personalizing wages for each driver based on data they gather from them. The algorithms are proprietary, so workers have no way of knowing how their data is being used, Dubal says.

"The app is their boss," Dubal told Morning Edition's A Martinez. "But unlike a human boss who you can negotiate with or withhold information from, the algorithms know so much about these workers."

Uber says workers who drive electric vehicles get a $1 bonus per ride, but the company does not use drivers' personal data to set their pay rates. "Uber does not personalize fares to individual drivers, and a driver's race, ethnicity, acceptance rate, total earnings, or prior trip history are not considered when calculating fares," a spokesperson writes in a statement. A representative for Lyft calls Dubal's paper "biased," saying it relies on cherry-picked data and debunked anecdotal information.

Personalized digitized pay is already the new normal in some workplaces, according to Dubal, and it's begun to attract attention from regulators.

This conversation has been lightly edited for clarity and length.


Interview highlights

On how algorithmic wage discrimination works

Rideshare drivers say the app is their boss. And unlike a human boss who you can negotiate with or withhold information from, the algorithms know so much about these workers. They know how much a worker is willing to accept for a particular ride. They know how much workers try to earn on any given day. They can really personalize how much that worker makes in order to influence their behavior in particular ways.

I look specifically at ride-hailing firms to discuss the phenomenon of digitalized variable pay, but it's happening across the on-demand economy and maybe even beyond it. Basically, these firms, because they treat their workers as independent contractors, cannot tell them what to do and where to go. Instead, they use these pay mechanisms to influence their behavior. They learn everything that they can about particular workers and use that knowledge to shape how workers get paid.

On how this differs from other forms of unequal pay

In a more familiar employment setting where workers make different amounts of money [to do similar jobs], we still have a legal norm: equal pay for equal work. In those contexts, there is often some logic to why people are earning more, whether it's seniority or experience or skill, and that is often transparent. There are also laws that ensure that companies check their own practices to make sure people are earning roughly the same amounts. What's complicated about algorithmic pay is that there is no logic. Instead, it might be that the person who works for a really long time, works really hard, and has the most experience is earning the least, and we just can't know. The logic is all hidden behind black box algorithms.

On how algorithms can reproduce discrimination

Some of these firms have, in their own research, found that these practices can lead to women earning less than men. They ascribe these differences to the algorithms. But if these algorithms are recreating traditional wage differences that are illegal under employment laws, then something is deeply wrong.

On whether workers can access how algorithms calculate their pay

In Europe, under GDPR [General Data Protection Regulation], some workers have, after litigation, won the right to have some access to what data companies are extracting from their work to determine particular prices. None of that has been revealed yet. In the U.S. there are similar privacy laws, but none of this has been litigated yet, and attempts by regulators to get at it have largely been met with resistance. Companies maintain that this is, oddly, about privacy, that they don't want to unveil their practices because it might lead to information about workers being leaked. They also maintain that these systems they've developed are their intellectual property.

On whether variable digitized pay is illegal

It might be illegal under antitrust laws. There is the potential to say that some of this is price fixing, if all of these workers are independent contractors. That is being litigated in California courts right now. But absent a finding on an antitrust violation, this isn't necessarily illegal. It's a brave new world.

On whether regulators could intervene

The Federal Trade Commission is looking into this. They're very interested in whether or not this violates antitrust laws. And I think that any number of lawmakers who are generally interested in economic equality – from Sen. Elizabeth Warren to Sen. Bernie Sanders – are likely very interested in what amounts to a dystopic system of work.

Ziad Buchh produced the audio version of this story.

Copyright 2024 NPR. To see more, visit https://www.npr.org.

Ally Schweitzer (she/her) is an editor with NPR's Morning Edition. She joined the show in October 2022 after eight years at WAMU, the NPR affiliate in Washington.