Amazon Facial Recognition Software Used By Law Enforcement Has Racial Bias, Per Report

Facial recognition software from Amazon used by some law enforcement agencies has shown inaccuracies, particularly with women of color. The technology sometimes mistakes dark-skinned women for men.

The system worked well when identifying men. Identifying women is where the software ran into a glaring problem.

An MIT study found that Amazon's Rekognition misidentified women as men 19 percent of the time, and misidentified darker-skinned women as men 31 percent of the time.

In response, Matt Wood, general manager of artificial intelligence for Amazon Web Services, said the test results are based on facial analysis, not facial recognition. Facial analysis, Wood said, assigns generic attributes to a face, such as whether a person is wearing glasses, has a mustache or is of a particular gender.

"It's not possible to draw a conclusion on the accuracy of facial recognition for any use case — including law enforcement — based on results obtained using facial analysis," said Wood, who added that the study did not use Amazon's latest version of Rekognition and that Amazon's own testing found no false positive matches. "The results in the paper also do not use the latest version of Rekognition and do not represent how a customer would use the service today."

This isn't the only facial recognition software from a tech giant found to be inaccurate. Last year, a pair of researchers from MIT and the University of Toronto uncovered similar flaws in other companies' facial recognition software; Microsoft and IBM eventually had to update their software after consistently high error rates.

Joy Buolamwini of MIT said Amazon must address the software's flaws before selling it more widely.

"In light of this research, it is irresponsible for the company to continue selling this technology to law enforcement or government agencies," Buolamwini wrote in a blog post.

"If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free," Buolamwini wrote.

As Amazon continues to market the software to law enforcement agencies, its own employees and civil rights groups alike have spoken out against it, warning that the software could one day power mass surveillance.

"This technology is being implemented in ways that materially benefit society, and we have received no indications of misuse," a statement from Amazon read.


About the writer


Scott McDonald is a Newsweek deputy night editor based in Cape Coral, Florida. His focus is assigning and writing stories ... Read more

