Former Uber drivers have accused the taxi app firm of using automated “robo-firing” algorithms to dismiss them.
British drivers want courts in the Netherlands – where Uber's data is held – to overrule the algorithm that they say caused them to be fired.
Experts say the legal challenge is the first of its kind to test the protections of GDPR Article 22.
Uber told the BBC that drivers’ accounts were only deactivated following manual review by humans.
“Uber provides requested personal data and information that individuals are entitled to,” said a spokeswoman for Uber.
“We will give explanations when we cannot provide certain data, such as when it doesn’t exist or disclosing it would infringe on the rights of another person under GDPR.”
The European Union’s (EU) General Data Protection Regulation (GDPR), which took effect in 2018, imposes obligations on companies that collect people’s personal information, wherever in the world those companies are based, if that data relates to people in the EU.
“As part of our regular processes, the drivers in this case were only deactivated after manual reviews by our specialist team,” the spokeswoman added.
The App Drivers & Couriers Union (ADCU), which is bringing the legal challenge, says that since 2018 it has seen well over 1,000 individual cases in which drivers were allegedly wrongly accused of fraudulent activity and immediately had their accounts terminated without a right of appeal.
“For any private hire operator in London, if they fire someone, there is a requirement where they have to report the driver to Transport for London (TfL),” James Farrar, the ADCU’s general secretary, told the BBC.
“This is putting drivers in a Kafkaesque situation where they may be called in by TfL, they’re given 14 days to explain the situation and why they should keep their licence. Our drivers are in a terrible position because they don’t know what the issue is, Uber hasn’t told them.”
Mr Farrar further claims that when TfL asked for additional details, Uber told TfL that it could not provide them, because it would compromise Uber’s security.
The ADCU adds that Uber has not reported any of the drivers it represents in this lawsuit to the police after terminating their accounts.
A former Uber driver with ADCU, who has asked not to be named, told the BBC that he had been driving with Uber for about two years and had a customer rating of 4.94 when he was suddenly terminated from the app.
“The day it happened, I went to work and on my app, it said I wasn’t allowed to log in. The app said to call customer support,” he said.
“I rang customer support and I was told that my account was deactivated because I had been engaging in fraudulent activities.”
He said that he contacted Uber more than 50 times over a year and a half, by phone and email, but claims he was never told what he had done that was “fraudulent”.
When he called customer support, he was told that a “specialised team” was dealing with the issue, and that they would call him back. They never called, he says.
“I was pleading with them in my emails repeatedly. I even asked if I could have a face-to-face meeting with the specialised team. I was willing to travel to another country to meet them,” he said.
“I have a family to feed. I’m not a fraudster or a criminal.”
After being terminated, the driver was reported to TfL by Uber. But the taxi app firm did not report him to the police.
TfL wrote to the driver to ask him to answer the allegations in writing. When the driver explained, TfL dropped the matter and did not revoke his licence.
Anton Ekker is a privacy lawyer based in Amsterdam who is representing the British former Uber drivers.
“We know for sure that Uber is using algorithms for decisions about fraud and deactivation of drivers. This is happening everywhere,” he said.
On Uber’s claims that its termination decisions are made by humans, Mr Ekker said: “If it is automated decision-making, then the GDPR says they must have legal grounds to use such technology, and they must give drivers the possibility to object to an automated decision, which they clearly did not do.”
Mr Ekker added that on Twitter he had seen thousands of complaints from Uber drivers all over the world, saying they had been automatically terminated, accused of fraud, without any explanation.
His intention is to seek a ruling from the Dutch courts, which, if successful, would then make it possible to bring a class action lawsuit against Uber.
According to Prof Lilian Edwards, chair of Law, Innovation and Society at Newcastle University, ADCU’s legal challenge could set a precedent at the European Court of Justice.
“This is probably the biggest case we’ve had so far on Article 22 of the GDPR that’s ever gotten to the courts,” she told the BBC.
In 2017, Prof Edwards, together with Dr Michael Veale of University College London, published an academic paper exploring the challenges relating to transparency and fairness when it comes to the use of computer algorithms to make decisions that affect people’s lives.
“Article 22 is really important because this is the provision that arguably gives you the right to an explanation about why an automated decision was made about you,” she explained.
“There’s been huge debate for years about whether the law could give people some rights over it, and this is a way for us to get some control over it and to be able to challenge it if it’s wrong.
“So this is really big news,” she says.