TikTok facing London lawsuit over child privacy concerns

Action accuses video app of violating UK, European data protection laws

TikTok said that the claims in the London case “lacked merit” and the company would vigorously defend the action. Photograph: iStock

TikTok faces a London lawsuit filed on behalf of millions of children in the UK and Europe over privacy concerns.

The suit accuses the popular video app and its parent company ByteDance of violating UK and European Union data protection laws. It seeks to stop TikTok from “illegally processing millions of children’s information” and demands that any personal information be deleted, the group behind the lawsuit said in a statement on Wednesday.

Every child who has used the app since May 2018, regardless of account status or privacy settings, may have had their private personal information collected for the benefit of unknown third parties, according to the suit filed by Anne Longfield, England’s former Children’s Commissioner.

The case follows increased scrutiny of the app by several EU data watchdogs. Last year, EU data-protection regulators pledged to coordinate potential investigations into the Chinese company, establishing a task force to get a better understanding of “TikTok’s processing and practices”.

In the US, ByteDance was fined $5.7 million in 2019 by the Federal Trade Commission to settle allegations that Musical.ly, which ByteDance bought and renamed TikTok, illegally collected information from minors. It was the largest FTC penalty in a children’s privacy case.

The company is also seeking permission to settle a privacy suit in the US.

TikTok said that the claims in the London case “lacked merit” and the company would vigorously defend the action.

“Privacy and safety are top priorities for TikTok,” the company said in the statement. “We have robust policies, processes and technologies in place to help protect all users, and our teenage users in particular.”

The suit was filed in December, but details were only released on Wednesday. If the case is successful, children could be entitled to thousands of pounds in compensation. The claimants estimate that more than 3.5 million children are affected in the UK alone, meaning a potentially hefty bill for the app if it loses.

Practices ‘hidden and shady’

The former children’s commissioner for England told the PA news agency she felt the app’s data collection policies were, in general, “excessive for a video-sharing app”, but was most troubled by the “collection of data on an industrial scale without either the kids or the parents realising”. Although TikTok’s policy on data collection was listed on its website, Ms Longfield said she felt TikTok’s practices were “hidden” and “shady”.

“In terms of what they take there are addresses, names, date of birth information, their likes, their interests, who they follow, their habits — all of these — the profiling stuff, but also the exact geolocation, that is very much outside what would be deemed appropriate,” she said. “You shouldn’t be doing that when it’s kids.”

The claim accuses TikTok and ByteDance of being “deliberately opaque” about who has access to the data they collect, but notes that the company makes billions of dollars from advertising revenue generated by providing advertisers with information about users.

‘Powerful test case’

Ms Longfield said she hoped the case would be a “powerful test case” and a “landmark” that would serve as a “wake-up call” for other social media platforms. She added that she hoped to force TikTok to delete the data and put new measures in place to protect children. “I’d like to see them acknowledge the problem, stop collecting the illegal data, delete the illegal data they have and put safeguards in place, so they can demonstrate that they’re acting responsibly,” she said. “I’d like to see them reassure parents — they have introduced some measures over recent months — great, I’m pleased when people take action, but while this is absolutely at the core of what the business model is, any action won’t get to the heart of what needs to be done.

“So I think they need to communicate that to parents, they need to stop doing it, they need to delete it and put measures in place and then look at how they’re going to rebuild trust — I think that really is what we’re talking about.”

Last year, a London judge granted a 12-year-old girl anonymity in the case to protect her from online bullying by other users of the app. She is the lead claimant, and the case has been stayed pending a Supreme Court decision in a similar case against Google. – Bloomberg and PA