A workers' union has filed a claim alleging that a facial recognition algorithm used by Uber is leading to the termination of darker-skinned workers at a disproportionate rate.
The Independent Workers' Union of Great Britain (IWGB) alleges that the ride-sharing company's Real-Time ID Check software is five times more likely to cause the wrongful termination of darker-skinned workers than of lighter-skinned workers.
At the beginning of the month, the union filed a claim for indirect racial discrimination on behalf of one of its members. According to the union, the member's account was wrongly terminated following a facial recognition error.
Two further claims were made on the worker's behalf. One alleges that Uber denied the claimant the right to paid holiday while they worked for the company, up until March 2021, when Uber agreed to pay all its drivers for annual leave. The other alleges that Uber paid the claimant below the national minimum wage until March 2021, when the company agreed to pay all its drivers at least the minimum wage.
The case was made possible by the Supreme Court's ruling earlier this year, which found that Uber drivers were working under workers' contracts for the company.
The union's claim of indirect discrimination, which it hopes will lead to Uber scrapping the technology entirely, coincided with a 24-hour boycott of Uber organised by the union.
An increase in earnings and a more transparent process for account terminations were among the demands set by the union.
The software under fire is designed to ensure the correct person is using the driver's account. Drivers are asked to take a real-time photo of themselves for verification, which is then matched against the account holder's profile picture.
According to Uber, it is used in conjunction with a process of human review, and no one can be deactivated based on the technology alone.
A spokesperson for Uber said: "Our Real-Time ID Check is designed to protect the safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel. The system includes robust human review to make sure that this algorithm is not making decisions about someone's livelihood in a vacuum, without oversight."
Microsoft, whose facial recognition technology underpins Uber's Real-Time ID Check, said it cannot comment on ongoing legal matters.
The preliminary hearing is set for 28 February 2022.