Thursday, August 21, 2025
Police use of facial recognition could fall foul of law
The Guardian reports on claims by the equalities regulator that Scotland Yard’s plan to widen the use of live facial recognition technology is unlawful because it is incompatible with the European convention on human rights.
The paper says that as the UK’s biggest force prepares to use instant face-matching cameras at this weekend’s Notting Hill carnival, the Equality and Human Rights Commission (EHRC) said its use was intrusive and could have a “chilling effect” on individuals’ rights:
The development will be a blow to Mark Rowley, the Metropolitan police commissioner, who has backed the use of the technology at mass events such as this weekend’s carnival, when 2 million people are expected to descend upon west London.
The EHRC has been given permission to intervene in a judicial review launched last month by the anti-knife crime campaigner Shaun Thompson. Thompson, a Black British man, was wrongly identified by live facial recognition (LFR) as a criminal, held by police, then faced demands from officers for his fingerprints.
Data seen by the EHRC shows that the number of black men triggering an “alert” by the technology is higher than would be expected proportionally when compared with the population of London, it said.
A letter last week from 11 anti-racist and civil liberty organisations, disclosed in the Guardian, urged the Met to scrap the use of the technology over concerns of racial bias and the impending legal challenge.
LFR technology captures and analyses the faces of individuals passing in front of real-time CCTV cameras. It extracts unique biometric data from each face and compares it against a “watchlist” of thousands of people sought by the police.
There is at present no specific domestic legislation regulating police use of LFR, with police using common law powers instead. The Met insists that the Equality Act 2010 places legal obligations upon them to eliminate discrimination.
The EHRC said that the claim brought forward by Thompson “raises issues of significant public importance” and will provide submissions “on the intrusive nature of LFR technology” which focus on the way in which the technology has been used by the police.
The Met’s policy on LFR technology was unlawful because it was incompatible with articles 8 (right to privacy), 10 (freedom of expression), and 11 (freedom of assembly and association) of the European convention on human rights, the watchdog said.
Rebecca Vincent, the interim director of Big Brother Watch, said the EHRC’s intervention was “hugely welcome”.
She added: “The rapid proliferation of invasive live facial recognition technology without any legislation governing its use is one of the most pressing human rights concerns in the UK today. Live facial recognition surveillance turns our faces into barcodes and makes us a nation of suspects who, as we’ve seen in Shaun’s case, can be falsely accused, grossly mistreated and forced to prove our innocence to authorities.”
John Kirkpatrick, the chief executive of the EHRC, said: “There must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan police’s current policy falls short of this standard.”
If this technology is to be used, there needs to be a common regulatory regime across the UK that ensures it is used fairly and is subject to proper scrutiny. Without that, the system will not command public confidence and doubts will persist about how it is being used.