Mishcon de Reya

Facial recognition technology – the risks and how to avoid them

Posted on 2 December 2021

The use of CCTV to protect premises, visitors and employees is now ubiquitous. Although that use in itself raises compliance and legal challenges, such systems increasingly either come equipped with, or can be adapted to include, live facial recognition (LFR) technology. For those in the real estate sector, there are some key issues to note in the recently issued Opinion from the Information Commissioner’s Office (ICO) on the use of LFR in public places. The ICO is the regulator and enforcer of data protection law (including the UK GDPR – the post-Brexit version of the EU GDPR), and the Opinion will be a key document in any action taken against those who infringe the law in this area.

The main initial point of note is that “public place” is given a remarkably wide definition in the Opinion: “any physical space outside a domestic setting, whether publicly or privately owned” (in fact, this seems to encompass literally everywhere outside people’s homes, rather than just areas to which the public might have access). Undoubtedly, it would cover most – and probably all – areas on the premises of property owners and operators across the breadth of the real estate sector. 

Following on from this, the Opinion reminds us that processing facial images for the purpose of identifying individuals will constitute “biometric processing”, and that the legal justifications for doing so are limited (because, as the courts have accepted, biometric data is of an “intrinsically private” character).

Crucially, the Opinion stresses that the automatic collection of biometric data without a clear justification, proportionate to the circumstances, will be unlawful and may result in enforcement action (and, by extension, potential liability to legal claims from data subjects). When one considers that enforcement action can take the form of fines of up to £17.5 million or 4% of global annual turnover (whichever is higher), as well as "stop notices" requiring cessation of processing, it is clear that adopting LFR without a full prior analysis carries high risks for a business. These are not merely theoretical risks: the ICO’s accompanying blogpost explains that the Opinion has been informed in part by six separate investigations into the use of LFR, and in none of those cases was the processing fully compliant with legal requirements.

When it comes to conducting a risk analysis before adopting LFR, the Opinion points to the need to undertake a Data Protection Impact Assessment, or DPIA (these are in any case mandated in certain circumstances under Article 35 of the UK GDPR). This will, the Opinion advises, allow an "assessment of risks and potential impacts on the data protection interests, rights and freedoms of individuals" and of any "direct or indirect impact on wider human rights such as freedom of expression, association and assembly". The absence of a DPIA will clearly be a significant negative, or aggravating, factor in any ICO investigation of an organisation deploying LFR.

As we now have recent and full guidance from the ICO, it is essential that companies wishing to take advantage of LFR undertake a comprehensive and robust DPIA. For some organisations, and in some circumstances, the DPIA may indicate that LFR cannot be used, or may be used only with significant additional safeguards. With regulatory attention clearly focused on this area, the risk of non-compliance is significant, and before purchasing any LFR system you should satisfy yourself that it can lawfully be used on your premises.
