The new audit tool was developed by a team from the University of Cambridge’s Minderoo Centre for Technology and Democracy to assess “compliance with the legislation and national advice” on concerns such as privacy, equality, and freedom of expression and assembly.
Based on the findings of their recent paper, the researchers have joined calls for a ban on police use of facial recognition technology in public spaces.
“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployments of the technology,” said Evani Radiya-Dixit, a visiting fellow at Cambridge’s Minderoo Centre and lead author of the research.
“We are working to protect human rights and improve accountability in how technology is used.”
The researchers built the audit tool on existing legal standards, including the UK’s Data Protection and Equality Acts, as well as the outcomes of UK court cases and feedback from civil society organisations and the Information Commissioner’s Office.
They applied their ethical and legal standards to three uses of facial recognition technology (FRT) by UK police. The first was the Bridges court case, in which a Cardiff-based civil liberties campaigner challenged South Wales Police’s use of automated FRT to live-scan crowds and compare faces against a criminal “watch list.”
The researchers also examined the Metropolitan Police’s trials of similar live FRT, as well as a further South Wales Police case study in which officers used FRT apps on their smartphones to scan crowds and identify wanted individuals.
In all three cases, the researchers found that key information about police use of FRT was “hidden from view,” including the scant demographic data published on arrests or other outcomes, making it difficult to assess whether the tools “perpetuate racial profiling.”
In addition to this lack of transparency, the researchers found a lack of accountability, with no clear recourse for people or communities harmed by police use, or misuse, of the technology. “Police forces are not necessarily answerable or held responsible for harms caused by facial recognition technology,” Radiya-Dixit said.
According to the researchers, several of the FRT deployments lacked regular oversight from an independent ethics committee or the public, and did not do enough to ensure there was a reliable “human in the loop.”
Even the “watch list” in the South Wales Police smartphone app trial included images of people who were innocent under UK law – those previously arrested but not convicted – despite the fact that retention of such images is unlawful.
“Based on our research on police use of facial recognition, we found that all three of these deployments fail to meet minimum ethical and legal standards,” Radiya-Dixit said.
“Over the last five years, police forces around the world, including in England and Wales, have adopted facial recognition technologies,” said Prof Gina Neff, Executive Director of the Minderoo Centre for Technology and Democracy. “Our goal was to assess whether these deployments followed accepted practices for the safe and ethical use of these technologies.”
“By developing a one-of-a-kind audit system, we were able to examine the issues of privacy, equality, accountability, and oversight that should accompany any use of such technologies by the police,” Neff explained.
According to the study, officers are increasingly under-resourced and overwhelmed, and FRT is seen as a quick, effective, and low-cost way of locating persons of interest.