Written by Tambiama Madiega and Hendrik Mildebrath.
Artificial intelligence (AI) powers the use of biometric technologies, including facial recognition applications, which are used for verification, identification and categorisation purposes by private or public actors. While facial recognition markets are poised to grow substantially in the coming years, the increasing use of facial recognition technologies (FRTs) has emerged as a salient issue in the worldwide public debate on biometric surveillance.
While there are real benefits to using facial recognition systems for public safety and security, their pervasiveness and intrusiveness, as well as their susceptibility to error, give rise to a number of fundamental rights concerns, for instance regarding discrimination against certain segments of the population and violations of the rights to data protection and privacy. To address such effects, the EU has already put strict rules in place under the Charter of Fundamental Rights, the General Data Protection Regulation, the Law Enforcement Directive and the EU framework on non-discrimination, all of which also apply to FRT-related processes and activities. However, various actors question whether the current EU framework adequately addresses the fundamental rights concerns raised by FRTs. Even if courts attempted to close gaps in protection through an extensive interpretation of the pre-existing legal framework, legal uncertainties and complexities would remain.
Against this backdrop, the draft EU artificial intelligence act, unveiled in April 2021, aims to limit the use of biometric identification systems, including facial recognition, that could lead to ubiquitous surveillance. In addition to the existing applicable legislation (e.g. on data protection and non-discrimination), the draft AI act proposes to introduce new rules governing the use of FRTs in the EU and to differentiate them according to their ‘high-risk’ or ‘low-risk’ usage characteristics. A large number of FRTs would be considered ‘high-risk’ systems, which would either be prohibited or have to comply with strict requirements. The use of real-time facial recognition systems in publicly accessible spaces for the purpose of law enforcement would be prohibited, unless Member States choose to authorise them for important public security reasons and the appropriate judicial or administrative authorisations are granted. A wide range of facial recognition technologies used for purposes other than law enforcement (e.g. border control, marketplaces, public transport and even schools) could be permitted, subject to a conformity assessment and compliance with certain safety requirements before entering the EU market. Conversely, facial recognition systems used for categorisation purposes would be considered ‘low-risk’ systems, subject only to limited transparency and information requirements. While stakeholders, researchers and regulators seem to agree on the need for regulation, some critics question the proposed distinction between low-risk and high-risk biometric systems, and warn that the proposed legislation would enable a system of standardisation and self-regulation without proper public oversight. They call for amendments to the draft text, including with regard to the Member States’ leeway in implementing the new rules. Some strongly support stricter rules, including an outright ban on such technologies.
Looking beyond the EU, there is a global surge in the use of facial recognition technologies, while concerns about state surveillance are mounting, amplified by the fact that, so far, very few legally binding rules apply to FRTs even in major jurisdictions such as the United States of America (USA) and China. Policy- and law-makers around the globe have the opportunity to discuss, in a multilateral and possibly in a bilateral context, how to put in place adequate controls on the use of facial recognition systems.
Read the complete in-depth analysis on ‘Regulating facial recognition in the EU’ in the Think Tank pages of the European Parliament.