
AI Facial Recognition – EDPB Expresses Bold View


If you are out and about and doing nothing wrong, should you have anything to fear from automated AI recognition of your features, such as your face or gait? This is yet another instance of the question of whether the end (prevention of crime) justifies the means (invasion of your privacy).

In a bold move, the European Data Protection Board (EDPB) has urged the banning of the technology in the EU as part of the Commission’s proposed Regulation on a European Approach for Artificial Intelligence (AIR). In the EDPB’s view, the use of AI for these purposes, and (amongst others) to categorise a person’s ethnicity, gender, or political or sexual orientation according to biometrics, infringes our fundamental rights and freedoms enshrined in EU law (see Statement on the Digital Services Package and Data Strategy | European Data Protection Board (europa.eu)).

At present, the proposed AIR allows the real-time use of remote biometric systems in publicly accessible places for law enforcement. The British Academy reports (https://www.thebritishacademy.ac.uk/projects/uk-international-challenges-18-facial-recognition-criminal-justice-system/) that China and Australia are introducing facial recognition schemes, and some UK police forces, including the Met, use the technology. This is taking place despite reports of algorithms being racially biased, with divergent error rates across demographic groups. The poorest accuracy is reportedly found in subjects who are female, Black, and 18-30 years old (https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/). Concerns over facial recognition technology prompted IBM, Amazon, and Microsoft to stop selling the kit to US police (https://qz.com/1876072/the-lobbyists-set-to-fight-a-us-ban-on-police-facial-recognition/).

The issue of whether the technology should be banned or regulated may depend on the question asked. Doubtless few would deprecate the use of the technology had it prevented the Manchester nail bombing, but the answer is far less clear where the wrong person is identified and mistakenly shot by police, however well-intentioned the officers and however carefully precautions are taken. This brings into play the question of whether the answer turns on the quality of the output or whether, as the EDPB contends, no rule of reason applies and the use case should be banned per se.

Some may say that the technology is but a small extension of surveillance systems such as CCTV, which are an indispensable tool for the police and security services. The counter, of course, is that human judgment can be tested and cross-examined; not so with black-box AI algorithms.

Article 8 of the European Convention on Human Rights, which is given effect in the UK by the Human Rights Act 1998, provides that interference with respect for a person’s private and family life, home and correspondence may take place only in accordance with the law and where it is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others. So the exception exists, and the question is whether the case for it is made out. It seems not, for the time being at least, and it is submitted that the most likely outcome will be a form of regulation designed to ensure transparency and proportionality, rather than an outright ban.

November 2021
