Is this a TikTok that escapes Brexit?

27 January 2021


A claim has been issued against six corporate defendants said to be involved in or responsible for the operation of the social media platform TikTok and its “effective predecessor” Musical.ly. TikTok is a social video app that allows users to share short videos. It was the world's most downloaded app in 2020 and is particularly popular with ‘Gen Z’ users and social media influencers.

The claimant in this matter is a 12-year-old girl from London who alleges that the defendants have misused children’s private information and processed their personal data in breach of the duties imposed by the General Data Protection Regulation (GDPR). The remedies sought are a declaration, damages, injunctions, and orders for erasure of the data in question for all children in the UK under the age of 16. The damages claimed are for the “loss of control of personal data”.

The Children’s Commissioner for England (the “Commissioner”) has brought this claim to the court as the child’s litigation friend.

Like most social media platforms, TikTok maintains age restrictions and requires all users to be at least 13 years old to join the app. However, it is a commonly known problem that such age restrictions are routinely ignored. Online age verification is easily bypassed by any child with independent access to a smart device, and many children do so with parental knowledge and consent.

The timing of this claim was critical: it was lodged with the court on 20th December 2020 and issued by Mr Justice Warby on 30th December 2020. Having reviewed the papers, Mr Justice Warby noted that it was clear that the claimant’s representatives did not wish to press on with the case until the outcome of the appeal in Lloyd v Google became known. However, they were keen to issue the claim before the year end. This urgency “stemmed from the fact that the end of the Brexit transition period on 31 December 2020 will bring about changes in the law which are, or are at least said to be, relevant to the intended claim.”

The claimant’s skeleton argument included the following:

“Further, and crucially, if these intended proceedings are issued prior to 1 January 2021, any judgment given is enforceable in Member States without further procedures. If the proceedings are issued from 1 January 2021 onwards, local laws of each Member State will apply which could severely impact and/or prejudice to Claimant’s ability to enforce.”

While Mr Justice Warby noted that the timing of this matter was inconvenient for the court and potentially prejudicial to the defendants, he did not believe there had been undue delay and so could see no reason not to address the application forthwith.

Before issuing the claim, Mr Justice Warby first had to hear an application to issue these proceedings under a pseudonym. This protective measure was sought due to the concerns of the child’s parents that the child might be subject to harmful abuse online if her identity were made public.

The Court acknowledged that while the common law principle of open justice is a strong one, it must be balanced against the child’s right to respect for private and family life under Article 8 of the European Convention on Human Rights, as incorporated by the Human Rights Act 1998 (“HRA”). It was further noted that the general rule of open justice reflected in Article 6 of the Convention may be departed from “where the interests of juveniles or the protection of the private life of the parties so require.” However, the Court highlighted that this does not provide automatic protection for children and a “balance must always be struck, and attention must be paid to the specifics of the individual case.”

Mr Justice Warby then considered the specific factors in this case in turn.

Significant weight was given to the Commissioner’s witness statement which identified a risk of direct bullying and hostile reactions from other children, TikTok users and “social media influencers who might feel their status or earnings were under threat.” Further weight was given to the views of the child’s parents who shared similar concerns. While it was acknowledged that these events were not inevitable, it was accepted they were reasonably foreseeable and significant weight was given to the evidence due to the Commissioner’s specific expertise.

Interestingly, it was further considered that it was uncertain if the child’s parents would support the proceedings if the child’s identity was not protected. This raised broader concerns as to how this decision could impact upon children’s access to justice. Mr Justice Warby considered “that if the Court required the claimant to be named that could have a chilling effect on the bringing of claims by children to vindicate their data protection rights. On that footing, the grant of anonymity supports the legitimate and important aim of affording access to justice, and the order is necessary in order to secure the administration of justice.”

The suggestion that the child could mitigate these negative effects by staying off social media was quickly rejected, as the court acknowledged the importance of accessing these platforms for social and educational purposes. It was noted that access to these platforms was particularly significant for children due to the increased use of online learning as a result of the ongoing COVID-19 pandemic.

Finally, it was considered that the child’s identity was not required in order for damages to be specified and calculated as the alleged damage of ‘loss of control’ was not specific to the individual. As a result, Mr Justice Warby approved the application and this matter has been issued under the pseudonym “SMO (a child) by their litigation friend, Anne Longfield”.

While the immediate impact of this decision will be of interest to many, it may be that the more significant impacts of this case are yet to come. On 13th January 2021, TikTok announced that it would make changes for its users in the United States of America to make the experience safer for younger users. All accounts for users aged 13 to 15 are now set to private by default, and tighter controls will be added for all users under 18. This follows the $5.7 million fine issued against the company by the US Federal Trade Commission for violation of US children’s privacy laws. With increased global scrutiny of the protection of personal data and children’s increased independent access to online content, all online service providers across the EU should pay special attention to this case as it progresses.

*This information is for guidance purposes only and does not constitute, nor should be regarded, as a substitute for taking legal advice that is tailored to your circumstances.
