Facial Recognition in the Private Sector: Unmasking the Privacy Issues 

Introduction

What’s in a face? Quite a lot it seems if you have the right technology to extrapolate information from it.

The use of facial recognition (FR) technology in law enforcement has been in the spotlight recently, following the ICO’s announcement of an investigation into Clearview AI’s use of scraped images to identify criminal suspects. There has also been a proliferation of news coverage of companies such as IBM and Amazon stepping back from the development of FR technology because of concerns about its use by law enforcement agencies.

Earlier this year, the EU was considering a blanket 3-5 year ban on FR, but eventually left it up to individual Member States to assess how and when to allow its use. In the UK, the House of Commons Science and Technology Committee has called for British police to stop using FR until regulation is in place.

Potential use cases

With many individuals spending extended periods at home, both working and studying, could valid uses nonetheless exist in the private sector? This article considers whether certain limited private sector use cases could be justified, in particular by reference to European privacy laws.

Proctoring

Proctoring involves the monitoring of exams taken remotely, whether by a human or by FR. Thanks to machine learning, FR can effectively guard against certain cheating methods – such as asking another person to take the test – and verify that the correct individuals are present during the exam. During the pandemic in particular, FR offers an alternative to cancelling exams. However, the privacy hurdles may be difficult to surmount. Some of the factors to be taken into account are set out below.

Premises access

Many companies now use fingerprint scanners to validate entry to company premises, particularly in sensitive industries with large numbers of staff. FR is a natural evolution of that approach, and feasible if companies can overcome the legal hurdles.

Covid-19 measures and homeworking

In the era of Covid-19, FR may offer tempting solutions to complex problems, such as ensuring productivity during homeworking or determining whether individuals have the virus. In most scenarios, however, the use of FR to monitor workers is likely to be disproportionate, which will make it difficult for organisations seeking to deploy such technology to comply with European privacy rules. In general, it is advisable to avoid subjecting workers to intrusive surveillance in any event, as it signals a lack of trust in those employees, reduces morale and increases staff anxiety.

That said, there may be particular security measures that are appropriate in highly regulated workplace scenarios that may be difficult to replicate in a homeworking context, e.g. measures that apply to individuals who work on the trading floors of investment banks. FR tools may be able to offer solutions in this context, again subject to careful assessment by reference to privacy rules.

Privacy implications

Any use of FR technology by private sector organisations will need to surmount a number of privacy obstacles, particularly given the risk of individuals claiming compensation for distress and the other potential penalties for GDPR breaches.

Lawfulness of processing of personal data – personal data can only be processed under one of the six legal bases listed in the GDPR. For private sector organisations using FR, the most suitable are likely to be consent, legitimate interests or possibly contractual necessity.

Consent is problematic, as the GDPR mandates that it must be freely given. This requirement is likely to pose difficulties for proctoring (the individual is effectively forced to consent in order to take the exam) and for homeworking/Covid-19 measures (employee consent is generally not considered freely given because of the imbalance of power with the employer). Consent must also be capable of being withdrawn, which further reduces the feasibility of any such measures for an employer.

Legitimate interests is an alternative option. However, reliance on that ground requires the company’s (or a third party’s) interests to be balanced against the interference with the rights of the relevant individuals, and companies should document any such reliance in a “legitimate interests” assessment. Basing processing on this ground also requires that the use of FR technology is necessary and proportionate, and that other GDPR requirements, such as accuracy, security and transparency, are met. Meeting these requirements is likely to pose significant challenges to the use of FR technology in most workplace scenarios.

As FR involves biometric data, organisations will also need to meet one of nine additional conditions for the processing to be lawful. In many scenarios, the only realistic option will be explicit consent. There are few circumstances in which it will be practical to rely on this condition, although it may be possible in some, e.g. a biometric building access point that offers a genuine choice of a non-biometric alternative.

Proportionality of data processing – processing must be proportionate and justified by reference to the purpose the organisation is seeking to achieve. Organisations must consider whether there is a less intrusive way to meet the relevant objectives: could the same objective be achieved if proctoring were done by a human invigilator? Is it feasible to continue to facilitate homeworking or, if not, to address Covid-19 concerns with less intrusive checks?

Fairness – processing should not use personal data in a way that is against the reasonable expectations of the individual.

Transparency – transparency is key and closely linked to fairness. Individuals need to be told about all the processing, particularly any unusual uses of their personal data such as FR. Recent case law has highlighted the importance of being transparent before monitoring employees.

Others – organisations hoping to use FR will face a host of other issues, such as algorithmic bias, data retention, purpose limitation and the need for a data protection impact assessment.

Conclusion

Organisations considering deploying FR solutions should proceed with caution. Such deployments will be very difficult to justify in the majority of cases, as there will generally be a less intrusive alternative means of reaching the same goal.

Source: Facial Recognition in the Private Sector: Unmasking the Privacy Issues – Artificial Lawyer
