In June 2020, following a Chicago law requiring ride-hailing apps to disclose their prices, researchers at George Washington University published an analysis of the algorithms that ride-sharing companies such as Uber and Lyft use to set fares. It highlighted evidence that the algorithms charged riders from neighbourhoods with older, lower-income and less-educated populations more than those from affluent areas, an effect the researchers attributed to the high popularity of, and thus the high demand for, ride-sharing in wealthier neighbourhoods.
Uber and Lyft rejected the study's findings, claiming its methodology was flawed. But it was hardly the first study to identify troubling inconsistencies in the apps' algorithmic decision-making.
Riders aren't the only ones affected by routing and pricing algorithms. Uber was recently criticized for rolling out "prepay" for drivers, which uses an algorithm to pre-calculate fares based on factors that don't always work in drivers' favor.
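To make the driver-side concern concrete, here is a minimal sketch of how a pre-calculated fare might work. The formula, factor names, and weights are purely illustrative assumptions, not Uber's actual (proprietary) pricing model; the point is that the fare is locked in from predicted trip conditions, so if the real trip runs longer than predicted, the shortfall falls on the driver.

```python
# Hypothetical sketch of a pre-calculated ("prepay"-style) fare.
# All rates and factors below are illustrative assumptions.

def precalculated_fare(base_fare, est_miles, est_minutes,
                       per_mile, per_minute, demand_multiplier):
    """Quote a fare up front from *predicted* distance and time,
    scaled by a demand multiplier. The quote does not change if the
    actual trip takes longer than estimated."""
    fare = base_fare + est_miles * per_mile + est_minutes * per_minute
    return round(fare * demand_multiplier, 2)

# Fare quoted from a 15-minute prediction.
quote = precalculated_fare(2.50, 5.0, 15.0, 1.20, 0.30, 1.0)

# If traffic stretches the trip to 30 minutes, the payout stays fixed,
# so the driver's effective pay per minute drops roughly in half.
pay_per_minute_predicted = quote / 15.0
pay_per_minute_actual = quote / 30.0
```

Because the rider's price (and often the driver's payout) is fixed at quote time, prediction errors and opaque factor choices shift risk onto drivers, which is the core of the criticism described above.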
On the delivery side, Amazon's routing system reportedly encourages drivers to make dangerous road decisions in pursuit of shorter delivery windows. Meanwhile, apps like DoorDash and Instacart use algorithms to calculate couriers' pay, algorithms that some workers say have made their earnings harder to predict.
As experts like Amos Toh, a senior researcher at Human Rights Watch who studies the effects of AI and algorithms on workers, point out, the more opaque these algorithms are, the harder it is for regulators and the public to hold companies accountable.