Amazon is instituting a one-year moratorium on police use of Rekognition, its facial recognition software, the company announced on Wednesday.
“We’ve advocated that governments should put in place stronger regulations to govern the ethical use of facial recognition technology,” Amazon wrote in its blog post announcing the change. “Congress appears ready to take on this challenge. We hope this one-year moratorium might give Congress enough time to implement appropriate rules.”
Amazon says that groups like the International Center for Missing and Exploited Children will continue to have access to the technology.
Rekognition is cloud-based computer vision software that enables customers to match photos based on visual similarities. It can recognize objects like dogs, chairs, and beaches. It can also be used to compare human faces.
In a typical law enforcement use, a police department might upload thousands of mug shots to Amazon’s servers. Then, when the police need to identify someone in a new photograph—perhaps taken from a surveillance camera—they can use Rekognition to scan the mug shot database and find the closest match. Rekognition assigns each candidate photo a similarity score indicating how confident the software is in that match.
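To illustrate the workflow, here is a minimal Python sketch. The mug shot IDs and similarity scores are invented for the example; in the real service, a call like Rekognition’s SearchFacesByImage returns candidate faces with similarity percentages, and the client keeps the best-scoring candidate above its chosen threshold.

```python
# Illustrative sketch of similarity-based matching (not Amazon's actual code).
# Each entry pairs a hypothetical mug shot ID with a made-up similarity score
# (0-100) of the kind a face-matching service might return for one probe photo.

def best_match(candidates, threshold):
    """Return the (mugshot_id, score) pair with the highest score at or
    above the threshold, or None if no candidate clears it."""
    above = [c for c in candidates if c[1] >= threshold]
    return max(above, key=lambda c: c[1]) if above else None

scores = [
    ("mugshot_0412", 72.5),
    ("mugshot_1187", 91.3),  # closest match in this made-up database
    ("mugshot_0033", 85.0),
]

print(best_match(scores, threshold=80.0))  # ('mugshot_1187', 91.3)
print(best_match(scores, threshold=95.0))  # None: nothing is similar enough
```

The key point is that the system never declares a definitive identification; it only ranks candidates by similarity, leaving the threshold choice to the customer.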
Amazon launched the product in 2016 and has been pitching the technology to law enforcement for at least two years. It’s not clear how many police departments have adopted the technology. Orlando’s police department, for example, experimented with the software and then abandoned its use.
The company has faced sustained criticism for pitching the software to police departments.
In July 2018, the ACLU uploaded 25,000 mugshots to Amazon’s servers and then used Rekognition to compare them to photos of the 535 members of Congress. Hilariously, they found that 28 members of Congress—including six black legislators—were incorrectly matched to mug shots. That got the attention of several members of Congress, who pressed Amazon for more details about the technology.
Amazon argued that the ACLU’s test was unfair. Amazon lets users set a minimum similarity threshold for counting a photo as a match. The ACLU left this value at the default of 80 percent, which produced the 28 false positives. Amazon argued that this threshold was too low for law enforcement use, and a few days after the ACLU’s study, the company recommended that law enforcement agencies set the similarity threshold at 99 percent. At that level, Amazon said, no members of Congress were falsely matched against a large mug shot database.
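To make the threshold dispute concrete, here is a hedged sketch with invented similarity scores. The distribution is made up for illustration, but it shows why raising the cutoff from 80 to 99 percent eliminates the kind of weak matches the ACLU surfaced:

```python
# Invented similarity scores for one probe photo against a mug shot set;
# the numbers are illustrative, not real Rekognition output.
candidate_scores = [81.2, 84.7, 79.9, 99.4, 88.1, 80.3]

def matches_at(scores, threshold):
    """Count candidates whose similarity meets or exceeds the threshold."""
    return sum(1 for s in scores if s >= threshold)

print(matches_at(candidate_scores, 80.0))  # 5 matches at the 80 percent default
print(matches_at(candidate_scores, 99.0))  # 1 match at the recommended setting
```

A cluster of scores just above 80 becomes a pile of false positives at the default setting, while a 99 percent cutoff admits only near-certain matches: the tradeoff at the heart of the disagreement between Amazon and the ACLU.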
Racial bias has been a common criticism of facial recognition systems, including Amazon’s software. Facial recognition tends to be more accurate on white faces than on those of other racial groups, which may reflect training data that overrepresents white faces.
IBM announced Tuesday that it was exiting the facial recognition business altogether, citing privacy and racial equity concerns.
A few jurisdictions have taken steps to restrict the use of facial recognition technology by law enforcement. The Bay Area has been a leader in this respect, with San Francisco, Oakland, and Berkeley all banning police use of facial recognition software. The ongoing Black Lives Matter protests are likely to give efforts to limit police use of facial recognition momentum in other areas—and perhaps nationally.
This week, House Democrats introduced legislation that would require federal law enforcement agents to wear body cameras. The bill prohibits the application of facial recognition software to body camera footage without a warrant.