This represents another significant fine by the Information Commissioner's Office (ICO), accompanied by an enforcement notice requiring Clearview to delete UK residents' data from its systems. 

It is also a stark reminder of the ICO's strong stance in relation to facial recognition technology (FRT) and how, without proper consideration and planning, its use will easily fall foul of UK data protection law.

However, in an increasingly online and globalised society, how effective can the ICO be when enforcing UK data subjects' rights against a US company that operates in cyberspace?

Who are Clearview AI Inc?

Clearview AI Inc is a US company that describes itself as the "leading facial recognition technology company". Clearview markets its facial recognition database of over 20 billion images to law enforcement agencies. Although Clearview no longer offers its services to UK agencies, during a test phase UK law enforcement carried out over 700 searches of its database.

Clearview's system acts like a search engine for faces. Its database is made up of images 'scraped' from the web, including social media platforms, but without the consent or knowledge of the individuals concerned. A user of the system, such as a law enforcement body, will upload an image of a face and search for matches in the facial recognition database. The search results also provide links to where these matching images are located, which may assist the user in identifying an individual.

While Clearview has previously stated that it only provides its services to criminal law enforcement agencies, the ICO noted that it has also provided its FRT to the Ukrainian Government for the purposes of identifying Russian soldiers and the deceased.

What does the ICO say?

The ICO's provisional view of November 2021 was that it may fine Clearview over £17m. While the ultimate £7.5m penalty notice levied in May 2022 represents a significant reduction, we note it is also accompanied by an enforcement notice requiring Clearview to:

  • stop obtaining and using personal data of UK residents that is publicly available online; and
  • delete from its systems all data of UK residents.

The ICO held that Clearview had contravened UK data protection law as follows:

  • lack of fairness and transparency;
  • no lawful basis for collecting personal data;
  • no process to prevent indefinite retention of personal data;
  • not adhering to the higher data protection standards for biometric data; and
  • potentially disincentivising people from requesting removal of their data from the database by requiring more personal information (such as photos) to first be provided.

What is the ICO's position on facial recognition technology?

FRT, when used for identification purposes, involves processing special category 'biometric' data. Among other things, it may also incorporate data about individuals' racial or ethnic origin.

In previous investigations and opinions, the ICO has made clear that use of FRT may constitute a threat to a person's rights under data protection law because:

  • it might be used on people without their knowledge;
  • it enables surveillance on a mass scale; and
  • there are concerns that it discriminates against women, disabled people and people from ethnic minority backgrounds.

An EU paper notes that the risk of discrimination in FRT, and in any data-supported algorithmic decision-making, can stem from biases incorporated at the design and testing phases. For example, where a facial recognition database is drawn from a sample of images in which white, able-bodied men are over-represented, the technology may be less accurate and work less well for black, disabled women. This is particularly the case where the underlying algorithms have not been adjusted to take account of these biases.

The onus is on the organisation or company that wants to use FRT to do so in a lawful, fair and transparent manner. As part of this, they will need to (among other considerations):

  • identify a lawful basis for processing facial recognition data;
  • meet additional conditions for processing special category and/or criminal offence data; and
  • carry out a Data Protection Impact Assessment (DPIA).

What do the Courts say?

In a case supported by the campaign group Liberty, the Court of Appeal has previously held that a 2017 FRT trial by South Wales Police (SWP) was in some respects unlawful, notably:

  • the use of FRT engages and may infringe Article 8 of the European Convention on Human Rights (right to respect for private and family life);
  • SWP was in breach of the Public Sector Equality Duty by not having done enough to satisfy itself that its FRT was not biased;
  • the DPIA failed to adequately assess the risks around the use of FRT and set out appropriate mitigation measures; and
  • the trial was not fully compliant with SWP's own policies on processing sensitive personal data.

The College of Policing has since issued new guidance on FRT for law enforcement purposes and SWP commenced new trials of the technology in 2022.

What have other regulators done about Clearview?

With over 20 billion images on its database, Clearview's operations – and the data protection concerns to which they give rise – are clearly of worldwide importance.

The ICO's penalty and enforcement notices follow its joint investigation with the Office of the Australian Information Commissioner (OAIC), which commenced in July 2020. The OAIC ordered that Clearview stop collecting data of Australian citizens and delete their data. Both the Canadian and French data protection authorities made similar orders. In March 2022, the Italian authorities fined Clearview €20m and ordered that the company appoint an EU representative as a point of contact for all EU data subjects and regulatory authorities.

In the United States, the American Civil Liberties Union (ACLU) reached a settlement with Clearview in May 2022 that permanently prevents the sale of its technology in the US to most organisations and private businesses. Clearview continues to lawfully provide its services to certain law enforcement bodies in the US.

What happens next?

The ICO has issued a monetary penalty notice against Clearview for over £7.5m. The ICO has also issued an enforcement notice requiring Clearview to delete all of its UK data.

Clearview has contested whether the ICO and other non-US regulators have jurisdiction over its operations. Clearview's position has been that, as a US-based company downloading images in the US, it is not subject to the data protection laws of other countries.

While Clearview has stated that it does not know how many images it holds of UK residents, the ICO considers it would still be able to comply with the enforcement notice. The ICO notes, in particular, that Clearview has already committed in US court proceedings to blocking photos geolocated in Illinois, creating a 'geofence' around Illinois and not collecting facial vectors from images with Illinois-associated metadata or IP addresses.

It remains to be seen whether Clearview will formally appeal the ICO's decision to the First-tier Tribunal (Information Rights). If not, will the ICO consider taking further steps to enforce compliance? Clearview is a US-based entity with no ongoing operations, revenue or employees in the UK, so any further formal enforcement action could prove to be costly or fruitless (or both) for the ICO.

In any event, the large fines and international condemnation of Clearview's practices send a clear message to others looking into facial recognition technology: failure to adhere to the ICO's requirement of 'data protection by design and by default' could prove to be a very expensive oversight.