TikTok sued for misuse of private information
Children’s commissioner brings representative claim on behalf of 12-year-old London girl
The children’s commissioner for England, Anne Longfield, has begun legal action against the social media platform TikTok on behalf of a 12-year-old girl from London.
Longfield, who is bringing the claim as the child’s “litigation friend”, alleges that the video-sharing platform has misused the girl’s private information. She is seeking damages for “loss of control of personal data”. She also wants an order for the data in question to be erased. No further details of the data were given in today’s judgment.
The claim, if it goes ahead, would be a representative action on behalf of all children under 16 who are, or were, users of TikTok or Musical.ly — its predecessor. Longfield’s lawyers are waiting for a ruling from the UK Supreme Court in a similar case before deciding whether to proceed.
The children’s commissioner has a legal duty to promote and protect the rights of all children in England with a particular focus on children and young people who are in or leaving care, living away from home or receiving social care services.
Although TikTok’s terms of service say it is “only for people 13 years old and over”, the company says that “users of all ages can find a place on the app”:
We accommodate users under the age of 13 in a limited app experience — “TikTok for Younger Users” — that introduces additional safety and privacy protections designed specifically for an audience that is under 13 years old.
At a High Court hearing today, Longfield asked Mr Justice Warby for permission to issue the proceedings without naming the child. The claim was heard urgently because Longfield’s lawyers argued that they might be at a disadvantage after the Brexit transition period ends tomorrow.
Warby said that did not explain why this “novel if not unique” claim was filed late on Sunday 20 December, one day before the end of the legal term. He also noted that the lawyers had not, at that stage, identified the claimant. “A right to bring a claim on behalf of a person whose identity is known but kept secret from the court has never yet been recognised,” the judge said.
Warby accepted that the claim was likely to attract a good deal of public attention:
It is a direct challenge to the practices of a very well-known and highly influential social media platform. It is reasonable to suppose that some of that attention would be focussed on the claimant, if their identity was known. But that is not enough of itself to justify anonymity. Nor is the mere fact that the claimant is 12 years old. It is necessary to consider the nature of the likely attention, and the harm that it could cause.
But Longfield had said there was “a risk of direct online bullying by other children or users of the TikTok app; and a risk of negative or hostile reactions from social media influencers who might feel their status or earnings were under threat”.
The question for the court, said Warby, was whether there was sufficient general public interest in publishing a report identifying the girl to justify curtailing her right, and her family’s right, to respect for their private and family life.
In Warby’s view, there was no sufficient general public interest. In some cases, it was important for the media to name a party to engage the interest of the public and support open justice. “This is not a case in which that aim could justify the risk of harm,” he said:
The commissioner will take steps to publicise the case, in order to ensure that represented parties become aware of it. The topic is in any case one that will interest the public, whether or not the claimant is identified by name. This is not a case in which the identity or other singular attributes of the claimant are central, or even important aspects of the claim, that therefore need to be known and understood by the public.
Concluding his judgment, Warby said:
This is a public judgment, and I have not made, nor have I been asked to make, any reporting restriction order. That is why… there has been no need to give notice of this application to the media or any other person who might be affected.
That said, the hearing was conducted with discretion, and I have so far not disclosed any information about the claimant other than their age…
In the course of argument, however, it was accepted on behalf of the claimant that the public can be told that the claimant is a 12-year-old girl from London. Disclosure of that information is a lesser measure than total elimination of all personal information other than her age, and one that does not create a material risk of the harms identified above.
TikTok was not represented at the preliminary hearing, which was conducted remotely.
Warby said that the claim was clearly inspired by a case launched against Google in 2018. The claimant, Richard Lloyd, who worked for many years in consumer protection, wants to bring a US-style (opt-out) class action on behalf of more than 4 million Apple iPhone users. It is alleged that Google secretly tracked some of their internet activity, for commercial purposes, between 9 August 2011 and 15 February 2012.
To bring his claim, Lloyd needs permission to serve proceedings on Google at its registered office in the US state of Delaware, outside the jurisdiction of the courts of England and Wales. Warby refused that permission in 2018, but his decision was overturned by the Court of Appeal. Google has been granted permission to appeal to the UK Supreme Court and a hearing is expected next year.
Update 31 December
The judiciary website, which had originally published Warby’s judgment in full, removed the judgment shortly afterwards and replaced it with the judge’s order on anonymity. However, the full judgment remains available on BAILII and there are news reports of the hearing.
I have taken the opportunity to add some of the background to the case.