On 4 April 2023, the UK's Information Commissioner's Office (ICO) issued TikTok Inc and TikTok Information Technologies UK Limited (together, TikTok) with a Penalty Notice of £12.7 million.

The Penalty Notice, publication of which was delayed until 15 May 2023 to allow TikTok and the ICO to agree on redactions, was levied in response to breaches of the UK GDPR requirements to process data fairly, lawfully and transparently (Article 5(1)(a)), to provide appropriate information to individuals about that processing (Articles 12 and 13) and to satisfy the conditions for obtaining consent on behalf of children (Article 8).

According to the ICO, TikTok's UK GDPR breaches were "substantive and very serious" and "amongst the most serious breaches of information rights obligations". The Penalty Notice gives us a detailed insight into the ICO's decision-making and how it justified imposing one of its largest ever penalties.

So what did TikTok do to warrant such a penalty from the ICO, and what can other organisations learn from the ICO's approach?

What kind of personal data was involved?

As with many social media apps, users provide (and TikTok processes) extensive personal data, including users' names, dates of birth, contact details, content (including photos, videos, comments and 'likes'), as well as internal user profiles detailing the user's preferences, interests, gender and other factors.

Between 25 May 2018 and 28 July 2020 (the Relevant Period), TikTok's terms of service stated that, in the UK, the platform was offered solely to users aged 13 and over. To create an account, users provided their date of birth, and in doing so 'self-certified' their age and purported to 'consent' to TikTok processing their personal data.

However, TikTok had no measures in place to prevent underage users from making a false declaration of their age and no processes in place for obtaining consent or other authorisation from those with parental responsibility for underage users.

While TikTok was unable to provide the ICO with an actual or estimated number of users under 13 years of age, the ICO estimated that approximately 1.1-1.4 million under 13s were account holders. This represents 11-14% of the total number of under 13s in the UK in 2020.

Consent: breaches of Article 8 UK GDPR

TikTok's 'age gateway' failed to account for underage users falsely self-certifying as being 13 years old or over, so underage users were able to create accounts and submit personal data to TikTok for processing. TikTok failed to take any steps to obtain the required parental consent for those underage users, putting it in breach of Article 8 of the UK GDPR.

In its written representations to the ICO, TikTok submitted that it had taken steps to remove underage users from the platform as part of its underage account banning policy. It had compiled lists of words or word combinations that an underage user might include in their profile, or in content uploaded to the platform, to help identify potentially underage users. However, the ICO noted that the word lists were not implemented until January 2019, and TikTok provided no reason why equivalent processes were not in place from the introduction of the UK GDPR in May 2018.

TikTok argued that it relied on contractual necessity as an alternative legal basis for processing the personal data of users under 13. However, the ICO disagreed – TikTok was unable to make a valid contractual offer to underage users, nor did underage users have the legal capacity to accept a contractual offer.

Privacy Notices: breaches of Articles 12 and 13

The ICO held that TikTok's privacy policies failed to convey to users the information required under Article 13 UK GDPR in a concise and accessible manner. In particular, it highlighted that the privacy policy did not communicate that information in plain language that a younger user would be able to understand.

Further, TikTok's privacy policies did not clearly specify to which overseas jurisdictions UK users' personal data would be transferred. Notably, in spite of TikTok's public statements to the contrary, UK users' personal data was in fact processed in China during the Relevant Period.

Ultimately, UK users (in particular, children) could not make informed choices about whether to provide their data to TikTok for processing.

Fair and lawful processing: breaches of Article 5(1)(a)

The breaches of Articles 8, 12 and 13 meant the ICO was satisfied that TikTok had failed to process users' data fairly and lawfully during the Relevant Period, in breach of Article 5(1)(a).

Calculating the penalty

The ICO considered that TikTok's breaches of the UK GDPR were of "sufficient seriousness to warrant issuing a penalty, and indeed amongst the most serious breaches of information rights obligations". It set the £12.7 million penalty after following the five-step process in its Regulatory Action Policy (RAP), as follows:

Step 1 – removing any financial gain

  • While TikTok was likely to have made a financial gain from the breaches, there was insufficient evidence to allow the ICO to make a calculation under Step 1. However, the ICO had regard to likely financial gain under Step 2.

Step 2 – considering scale and severity of the breach

  • The ICO noted, among other things, that TikTok's breach of the UK GDPR was substantive, very serious and spanned more than two years.
  • TikTok acted negligently: its efforts to identify underage users were inadequate and the steps taken were not commensurate with its size, resources or the seriousness of the issue.

As for the remaining steps, the ICO held that there were no aggravating factors (Step 3), considered that Step 2 addressed any 'deterrent effect' (Step 4) and that there were no mitigating factors not already accounted for (Step 5). In particular, the ICO held that the £12.7 million penalty was unlikely to cause financial hardship to TikTok: the two TikTok entities had a combined turnover of $702 million (£547 million) in 2020, with the parent company's global turnover reaching $35 billion.

TikTok's penalty is one of the largest ever issued by the ICO, after the £20 million and £18.4 million penalties issued to British Airways and Marriott, respectively, in 2020.

Regulatory action worldwide

The ICO's Penalty Notice follows regulatory action taken or initiated in other jurisdictions against TikTok:

  • South Korea – in July 2020 the Korea Communications Commission fined TikTok ₩186 million (£123,000) for mishandling children's data.
  • Netherlands – in July 2021 the Dutch Data Protection Authority noted widespread use of TikTok by 6-18 year olds and fined TikTok €750,000 (£640,000) for failing to adequately explain to users its processing of their personal data.
  • Republic of Ireland – in September 2021 the Data Protection Commission commenced inquiries into TikTok's compliance with the GDPR, including in the context of processing under 18s' personal data. The matter is currently with the European Data Protection Board for consideration, with a decision awaited in the coming months.
  • Canada – in February 2023 the Office of the Privacy Commissioner of Canada and its counterparts in Quebec, British Columbia and Alberta announced a joint investigation into whether TikTok is operating in compliance with Canadian privacy laws, with a particular focus on younger users.

Takeaways

TikTok is a global phenomenon, with monthly users exceeding 1 billion as at September 2021. In the UK alone, its monthly active users rose from 2 million to over 17 million during the Relevant Period.

However, this Penalty Notice still contains key takeaways for organisations of all sizes, in particular those that process children's personal data, special category data or otherwise handle large volumes of personal data.

These include:

  • Conducting spot-checks and random sampling on your data – TikTok failed to conduct random checks for underage users until August 2021. While TikTok did delete 152,978 accounts that it suspected belonged to under 13s between December 2018 and July 2020, this represented less than 1% of TikTok's total users and as little as 9% of estimated UK underage users.
  • Keeping accurate records – TikTok did not retain any data prior to 2020 about the number of accounts proactively reviewed on suspicion of belonging to underage users.
  • Listening to and investigating staff concerns – the ICO's investigation revealed that senior staff members expressed concerns about underage users' accounts not being banned. TikTok did not take adequate steps to investigate or address the problem.
  • Reviewing and updating your policies and privacy notices – the ICO held that TikTok's internal policies on the removal of underage user accounts were excessively strict and, even after staff raised concerns, remained broadly unchanged. Criticism of TikTok's privacy notices demonstrates a need to revisit these key documents from time to time to ensure they are clear, fit for purpose and suitable for the audience in mind.