Friday, March 20, 2026

Facial recognition cameras showing racial bias

The Guardian reports that Essex police have paused the use of live facial recognition (LFR) technology after a study found cameras were significantly more likely to target black people than people of other ethnicities.

The paper says that the move to suspend use of the AI-enabled systems was revealed by the Information Commissioner’s Office (ICO), which regulates the use of the technology deployed so far by at least 13 police forces in London, south and north Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex:

The ICO said Essex police had paused LFR deployments “after identifying potential accuracy and bias risks” and warned other forces to have mitigations in place. LFR systems are either mounted to fixed locations or deployed in vans. In January, the home secretary, Shabana Mahmood, announced the number of LFR vans would increase five-fold, with 50 available to every police force in England and Wales.

Essex commissioned University of Cambridge academics to conduct a study, which involved 188 actors walking past cameras actively deployed from marked police vans in Chelmsford. The results, published last week, showed that about half of the people on a watchlist were correctly identified and that incorrect identifications were extremely rare, but the system was more likely to correctly identify men than women and was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

This “raises questions about fairness that require continued monitoring”, the report concluded. One of its authors, Dr Matt Bland, a criminologist, told the Guardian and Liberty Investigates: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation.”

The problem differs from the more common public concern about the technology, which is that it identifies innocent people. Last month it emerged that police arrested a man for a burglary in a city 100 miles away that he had never visited, after retrospective face-scanning software confused him with another person of south Asian heritage.

Possible reasons for the latest issue with LFR include overtraining of the algorithm on the faces of black people. Experts believe it could be rectified by adjusting system settings. A separate study of the same technology by the government’s National Physical Laboratory found black men were most likely to be correctly matched by the system and white men least likely, but the effect was not statistically significant.

In the light of this study I would expect other police forces to suspend their use of these cameras too, and the rollout of more LFR vans to be put on hold. After all, we did tell them this might happen.
