Amazon 'In Denial' That Its Facial 'Rekognition' Software Is Racially Biased
Amazon’s facial recognition program Rekognition remains under fire for racial bias following the American Civil Liberties Union’s (ACLU) testing of the software, which could potentially be used by law enforcement to identify suspects.
When the ACLU checked photos of members of Congress against a public mugshot database, Black and Latino lawmakers disproportionately matched with the mugshots — even the photo of legendary civil rights leader Rep. John Lewis (D-Ga.) was matched with a photo of a criminal.
The ACLU downloaded 25,000 mugshots from a “public source,” according to Jake Snow, an attorney with the organization. The group ran photos of all 535 members of Congress through Rekognition last week, asking it to match them against the mugshots. The faces of 28 members matched — roughly a five percent false-match rate.
Snow told Ars Technica the ACLU “used the default level of confidence that Amazon uses, its 80 [percent] similarity score.”
But Matt Wood, Amazon’s general manager for deep learning and artificial intelligence, claims that default setting skewed the results the ACLU received.
“The 80 percent confidence threshold used by the ACLU is far too low to ensure the accurate identification of individuals; we would expect to see false positives at this level of confidence,” Wood wrote in a blog post.
“We recommend 99 percent for use cases where highly accurate face similarity matches are important (as indicated in our public documentation).”
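The dispute over thresholds can be illustrated with a toy sketch. This is not the Rekognition API and the similarity scores below are invented for demonstration; it only shows how raising a match threshold from 80 to 99 shrinks the set of reported “matches” from the same underlying scores.

```python
# Toy illustration of confidence thresholds in face matching.
# The scores are hypothetical similarity values (0-100) between
# a lawmaker's photo and entries in a mugshot database.

def matches(scores, threshold):
    """Return only the scores at or above the similarity threshold."""
    return [s for s in scores if s >= threshold]

scores = [99.2, 93.5, 87.1, 84.0, 81.3, 76.8, 62.4]

low_bar = matches(scores, 80)    # the 80 default the ACLU used
high_bar = matches(scores, 99)   # the 99 Amazon recommends for law enforcement

print(len(low_bar))   # 5 candidate "matches" clear the 80 bar
print(len(high_bar))  # only 1 clears the 99 bar
```

The point each side draws from this differs: Amazon argues the ACLU should have used the stricter bar, while the ACLU notes that 80 is the bar the product ships with.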
He also reached for a cooking analogy, arguing the tool shouldn’t be discarded over a misconfiguration.

“While being concerned it’s applied correctly, we should not throw away the oven because the temperature could be set wrong and burn the pizza…”
In response to Wood’s comments about thresholds, Snow said Amazon is “grasping at straws” and is in “denial.”
“In a matter of 48 hours, Amazon has gone from its own system default of an 80 percent match rate to saying yesterday it should be 95 percent, and then saying today it should be 99 percent,” Snow said.
“At no time has Amazon taken any responsibility for the very grave impact that their face surveillance product has on real people.”
Months before the ACLU’s test, the Congressional Black Caucus (CBC) had been vocal about the racial bias of Rekognition.
In May, the CBC wrote a letter to Amazon CEO Jeff Bezos stating the lawmakers are “troubled by the profound negative unintended consequences this form of artificial intelligence could have for African Americans, undocumented immigrants, and protestors.”
In June, Brian Brackeen, a Black chief executive of a software company developing facial recognition, wrote that law enforcement’s use of Rekognition in the identification of suspects would negatively affect people of color.
“To deny this fact would be a lie,” he wrote.
For Snow, the fight is ultimately about civil rights.
“Amazon is grasping at straws in an attempt to distract from critical civil rights issues,” Snow said.
“Amazon should take steps to fix the damage its ill-advised face surveillance product may have already caused and to prevent further harm.”