- Stats released by the UK’s biggest police force show it used live facial recognition to scan the faces of 8,600 people in a busy shopping area in London last week.
- The system threw up eight alerts from the Metropolitan Police database, but only one of these was a correct identification.
- The Met Police announced early this year that it would start rolling out facial recognition in London despite pushback from advocacy groups.
British police used facial recognition to scan more than 8,000 people’s faces in one of London’s busiest shopping districts, and the system produced far more false positives than correct identifications.
As spotted by British privacy advocacy group Big Brother Watch, statistics released by London’s Metropolitan Police show it deployed Live Facial Recognition (LFR) technology on February 27 at Oxford Circus, a busy area in the center of London frequented by tourists and shoppers.
The figures show the police scanned roughly 8,600 people, and the technology threw up eight matches from its database of faces. Seven of these were false positives, however, resulting in five incorrect “engagements.” A spokesperson for the Met told Business Insider that in the case of these five “engagements,” officers spoke to the individuals flagged and ascertained that they were not wanted by the police.
One of the faces flagged by the system did result in the arrest of a 35-year-old woman, who was later charged with three counts of assault on police.
Big Brother Watch wrote on Twitter that this meant 86% of the alerts the system threw up were false, and 71% of these misidentifications resulted in the police stopping and questioning someone.
Big Brother Watch added: "This blows apart the Met's defence that facial recognition surveillance is in any way proportionate or that the staggering inaccuracy is mitigated by human checks.
"This is a disaster for human rights, a breach of our most basic liberties and an embarrassment for our capital city."
The Met spokesperson said the area the police were scanning was clearly marked. "LFR deployments are accompanied by clear signage alerting members of the public to the fact that LFR is in operation. We want the public to know that we are there and want to provide reassurance that we are working to make London safer," they said.
The Met announced in January this year that it would start rolling out live facial recognition technology in London, and last month Commissioner Cressida Dick pushed back against criticisms by advocacy groups that the technology poses threats to privacy and civil liberties.
Criticisms of police use of facial recognition center on its inaccuracy and invasion of privacy. In particular, AI experts have pointed to racial and gender bias in facial recognition systems, which more frequently misidentify women and people of color and could contribute to overpolicing of those groups. The Met Police has said its system has been proven not to have any ethnic bias.
Some US cities, including San Francisco and Oakland in California and Portland, Oregon, have passed laws banning police use of facial recognition.