The European Data Protection Board (“EDPB”) has published Guidelines 5/2022 on privacy issues in the use of facial recognition technology by law enforcement agencies.
Below are the most relevant points set out in the EDPB's Guidelines.
Facial recognition technology
Facial recognition represents one of the most innovative and revolutionary technological inventions of our time in the ICT field: it is a biometric technique used to uniquely identify a subject on the basis of his or her facial features.
Facial recognition technology (“FRT”) can have various applications: it can be used to (i) apply “filters” on some popular social networks and enrich faces with augmented reality elements; (ii) unlock devices, using one’s face as a password; and (iii) identify subjects for security purposes, through algorithms that can draw on a vast database of faces.
However, this technology is particularly intrusive and thus poses significant issues under privacy regulations.
The EDPB adopted Guidelines 5/2022 on facial recognition in order to explore the issues that FRTs can pose from a technological point of view, as they are tools that can be deployed in both the public and private sectors. Indeed, FRT is a billion-dollar business: it is estimated that by 2024 this technology will be present on 1.3 billion devices. The greatest concern, however, is over how public authorities could (and can) use FRTs for public security and counterterrorism purposes.
The EDPB notes that the use of FRTs can pose serious risks to the rights of data subjects under the law on the processing of personal data, and that the public use of such technologies by member states could give rise to several possible violations of the Charter of Fundamental Rights of the European Union:
- Articles 1, 10, 11, and 12 of the Charter: the collection of biometric data could result in discrimination against individuals, since it would amount to cataloguing by ethnic, racial, or religious category;
- Articles 52 and 53 of the Charter: limitations on the exercise of rights must be provided for by law (Art. 52 of the Charter), and there must be correspondence between the rights enshrined in the Charter and those guaranteed by the European Convention on Human Rights (Art. 53 of the Charter). This means that the use of FRTs may be provided for by law only when strictly necessary, and under the vigilant control of the supervisory authorities.
The EDPB recognizes the importance of such tools for security purposes, but calls for an assessment of the requirements of necessity and proportionality, as provided for in Article 52(1) of the Charter, to be carried out prior to any use of FRTs. In addition, the EDPB calls for a general ban on such tools in public settings where they may discriminate among subjects, infer their emotions, or result in large-scale recognition.
However, in Annex III, the EDPB allows for the use of FRTs in the following cases:
- facial recognition at borders, accompanied by human verification;
- child abduction scenarios, to recover abducted children;
- violent demonstrations, to identify the subjects involved;
- judicial searches for a person suspected of committing a crime (in certain cases).
Although limited, this shows an openness on the part of the EDPB toward such tools, which could bring Europe closer to the model adopted by states with regimes different from the European one. It will therefore be up to political forces to determine how far public authorities can go, within the limits set out by the EDPB and national data protection regulators.