Thursday, March 12, 2020
Is facial recognition technology discriminatory?
Those of us who have been concerned by the increasing use of facial recognition technology by police forces will welcome today's intervention by the Equality and Human Rights Commission in the debate on this issue.
As the Guardian reports, the equalities watchdog has said that mass screening of the public by police officers using facial recognition software, whether at shopping centres or at events such as pop concerts, must be halted because it could amplify racial discrimination and stifle free expression.
They want future use of the technology to be suspended until its impact has been independently scrutinised and laws governing its application improved:
Police in London and south Wales have been at the forefront of using automated facial recognition (AFR) technology, which uses cameras to capture images of faces and checks them against databases of wanted suspects.
Scotland Yard has this year deployed cameras to scan shoppers in Stratford, east London, and at Oxford Circus in London, while South Wales police used the technology at a Slipknot concert at Cardiff City football club's stadium in January, as well as to monitor football fans.
The Oxford Circus deployment on 27 February scanned 8,600 faces to see if any matched a watchlist of more than 7,000 individuals. During the session, police wrongly stopped five people and correctly stopped one.
Prof Peter Fussey, an expert on surveillance from Essex University who conducted the only independent review of the Metropolitan police’s public trials on behalf of the force, has found it was verifiably accurate in just 19% of cases.
But last September, the high court rejected a judicial review challenge to South Wales police's use of the technology. Judges ruled that although it amounted to interference with privacy rights, there was a lawful basis for it and the legal framework used by the police was proportionate.
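To put those Oxford Circus figures in perspective, here is a minimal Python sketch; it is purely illustrative arithmetic on the numbers quoted above, with no assumptions beyond the reported counts:

# Purely illustrative arithmetic using the figures quoted above from the
# Guardian's report on the 27 February Oxford Circus deployment.
faces_scanned = 8_600   # faces scanned against the watchlist
wrong_stops = 5         # people stopped who turned out not to be wanted
correct_stops = 1       # person stopped who was genuinely on the watchlist

total_stops = wrong_stops + correct_stops
stop_precision = correct_stops / total_stops   # share of stops that were correct
flag_rate = total_stops / faces_scanned        # share of scanned faces flagged

print(f"Stops that were correct: {stop_precision:.0%}")     # 17%
print(f"Scanned faces leading to a stop: {flag_rate:.2%}")  # 0.07%

In other words, only around one stop in six was of someone actually wanted, which sits uneasily with any suggestion that the technology is ready for routine deployment.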
Rebecca Hilsenrath, chief executive of the EHRC, the statutory non-departmental public body for England and Wales established under the Equality Act 2006, is absolutely right when she says that the use of this technology should be “suspended until robust, independent impact assessments and consultations can be carried out, and so that we know exactly how this technology is being used and are reassured that our rights are being respected.”
Comments:
Is not the main fault with current facial recognition technology a failure to discriminate where shades of skin colour are concerned?