IBM has reentered the facial recognition market, a surprising reversal of its stance three years ago, when it announced it was withdrawing from the technology over concerns about racial profiling, mass surveillance, and other human rights violations.
In June 2020, during the wave of Black Lives Matter protests in the United States following the tragic murder of George Floyd, IBM’s CEO, Arvind Krishna, penned a letter to Congress declaring the company’s decision to cease offering “general-purpose” facial recognition technology. He emphasized the urgency of the fight against racism, stating, “IBM firmly opposes and will not condone the use of any technology, including facial recognition technology from other providers, for purposes such as mass surveillance, racial profiling, violations of fundamental human rights and freedoms, or any other application inconsistent with our core values and Principles of Trust and Transparency.” Later that same year, IBM reinforced its commitment by advocating for U.S. export controls to address concerns that facial recognition could potentially be employed abroad “to suppress dissent, infringe upon the rights of minority populations, or undermine basic expectations of privacy.”
IBM’s Recent £54.7 Million Contract with the British Government
Despite these prior declarations, IBM signed a £54.7 million contract with the British government last month to develop a national biometrics platform that will provide a facial recognition capability to immigration and law enforcement personnel. The Verge and Liberty Investigates, a UK investigative journalism unit, scrutinized the details of this contract.
According to the contract notice for the Home Office Biometrics Matcher Platform, the project’s initial phase focuses on creating a fingerprint-matching capability. Subsequent stages will introduce facial recognition for immigration purposes, termed “an enabler for strategic facial matching for law enforcement.” The ultimate phase of the project involves the delivery of a “facial matching for law enforcement use-case.”
The platform will enable the comparison of photos of individuals with images stored in a database, sometimes referred to as a “one-to-many” matching system. In September 2020, IBM characterized such “one-to-many” matching systems as the kind of facial recognition technology most susceptible to being employed for widespread surveillance, racial profiling, or other breaches of human rights.
IBM spokesperson Imtiaz Mufti denied that the company’s involvement in the contract contradicted its 2020 commitments. He stated, “IBM no longer offers general-purpose facial recognition and, consistent with our 2020 commitment, does not support using facial recognition for mass surveillance, racial profiling, or other human rights violations.”
“The Home Office Biometrics Matcher Platform and associated Services contract is not used in mass surveillance. It supports police and immigration services in identifying suspects against a database of fingerprint and photo data. It is not capable of video ingest, which would typically be needed to support face-in-a-crowd biometric usage.”
Human Rights Concerns Over IBM’s Return to Facial Recognition
Human rights advocates, however, argue that IBM’s involvement in the project contradicts the commitments it made in 2020. According to Kojo Kyerewaa of Black Lives Matter UK, “IBM has demonstrated its willingness to prioritize pursuing a Home Office contract over respecting the memory of George Floyd. This will not be forgotten.”
Matt Mahmoudi, PhD, a tech researcher at Amnesty International, stated, “The research across the globe is clear; there is no application of one-to-many facial recognition that is compatible with human rights law, and companies — including IBM — must therefore cease its sale, and honour their earlier statements to sunset these tools, even and especially in the context of law and immigration enforcement where the rights implications are compounding.”
The use of facial recognition technology by law enforcement agencies in the United States has been associated with wrongful arrests and has faced legal challenges in the United Kingdom. In 2019, an independent report on the London Metropolitan Police Service’s implementation of live facial recognition technology revealed that there was no “explicit legal basis” for the force’s utilization of this technology, raising concerns about potential violations of human rights law. In August of the following year, the UK’s Court of Appeal ruled that the use of facial recognition technology by South Wales Police infringed on privacy rights and violated equality laws. Following this verdict, the police force temporarily suspended its use of facial recognition but has since resumed its implementation.
Furthermore, several technology companies have placed partial restrictions on law enforcement use of their facial recognition services. After IBM announced its departure from the facial recognition sector, Amazon and Microsoft declared moratoriums on selling their facial recognition services to police departments in the United States.
Tech Giants Take Action on Facial Recognition for Law Enforcement
Amazon initially declared a one-year moratorium on police use of its Rekognition software in June 2020 and extended it indefinitely the following year. A company spokesperson confirmed that this moratorium explicitly prohibits “the use of Amazon Rekognition’s face comparison feature by police departments in connection with criminal investigations.”
Similarly, in June 2020, Microsoft stated that it would not sell facial recognition software to US police departments until a national law regulates the technology’s use. When contacted for comments by The Verge and Liberty Investigates, a Microsoft spokesperson referred to the company’s official website, which explicitly states that using the Azure AI Face service “by or for state or local police in the US is prohibited by Microsoft policy.”