Amazon’s Face Recognition Tool Confused 28 Lawmakers With Arrestees

The ACLU compared members of Congress with 25,000 mug shots using the same software Amazon is pitching to law enforcement.

Amazon’s facial recognition tool incorrectly matched the faces of 28 lawmakers with people in mug shots and disproportionately misidentified people of color in a test by the ACLU.
The civil rights organization, which has called on Amazon to exit the facial recognition industry, compared images of members of Congress with a database of 25,000 mug shots. Of the 28 misidentified lawmakers, 39 percent were people of color, including Reps. John Lewis (D-Ga.), Lacy Clay (D-Mo.) and Luis Gutiérrez (D-Ill.).
Customers can upload images and video to the publicly available Rekognition tool for analysis of people’s faces, objects and text. The ACLU said it used the tool’s default settings, but did not immediately respond to a request for additional details about its test.
The group cautioned law enforcement against using Amazon Rekognition, citing the possibility of bias.
“It’s not hard to imagine a police officer getting a ‘match’ indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins,” the report says. “People of color are already disproportionately harmed by police practices, and it’s easy to see how Rekognition could exacerbate that.”
Given the threats that facial recognition can pose to protesters, immigrants and minorities, the ACLU called on Congress to enact a moratorium on law enforcement using the technology.
Amazon defended Rekognition by saying that the tool has been used to prevent human trafficking, find missing children and deter package theft. The company also said that the ACLU’s test results could have been improved by using a higher confidence threshold, the minimum probability at which the system counts a prediction as a match. The ACLU report did not say what confidence threshold it used.
“When using facial recognition for law enforcement activities, we guide customers to set a higher threshold of at least 95% or higher,” Amazon said.
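The threshold dispute comes down to a simple filter: Rekognition scores each candidate match with a similarity percentage, and the caller discards anything below a chosen cutoff (in the real boto3 API, the `FaceMatchThreshold` parameter of `search_faces_by_image`, or `SimilarityThreshold` in `compare_faces`). The sketch below uses hypothetical match data and scores to show why the cutoff matters; the 80 percent figure is the service's documented default, and 95 percent is the level Amazon says it recommends for law enforcement.

```python
# Minimal sketch of confidence-threshold filtering, as Rekognition applies it.
# The match records and similarity scores below are hypothetical; a real call
# would pass the cutoff as FaceMatchThreshold to search_faces_by_image.

def filter_matches(matches, threshold):
    """Keep only candidate matches whose similarity meets the threshold."""
    return [m for m in matches if m["Similarity"] >= threshold]

# Hypothetical candidates returned for a single probe image.
candidate_matches = [
    {"Face": "mugshot_0412", "Similarity": 81.3},
    {"Face": "mugshot_1187", "Similarity": 88.6},
    {"Face": "mugshot_2954", "Similarity": 96.2},
]

# At the service's default 80% threshold, all three are reported as matches.
print(filter_matches(candidate_matches, 80))

# At the 95% level Amazon recommends for law enforcement, only one remains.
print(filter_matches(candidate_matches, 95))
```

The same probe image can thus yield three "hits" or one depending solely on the threshold the customer chooses, which is why both Amazon's defense and the ACLU's criticism hinge on what setting was actually used.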
Evidence of bias in facial recognition technology is not new. A February study by researchers from the Massachusetts Institute of Technology and Stanford University found that the technology had higher error rates when analyzing darker-skinned people and women.
In May, the Congressional Black Caucus wrote a letter to Amazon CEO Jeff Bezos expressing concern about the company selling Rekognition to law enforcement agencies. The caucus said that communities of color are more aggressively policed and it fears that implementing Rekognition without further research could exacerbate problematic policing practices.
“We are worried deployment of technology like the one you have developed has a high propensity for misuse,” the letter says. “Surveillance of perfectly legitimate and constitutionally protected activity will only further erode the public’s trust in law enforcement.”
Bezos and Amazon did not respond to the letter.
Amazon has already pitched the facial recognition service to law enforcement agencies, including in Oregon and Orlando, Fla. Orlando ended a pilot program with Rekognition in June after several civil rights groups spoke out against its use. The police department told Floridapolitics.com that it wanted to uphold privacy laws and protect people’s rights.