Published in AI

Rekognition claimed 24 Congress members were criminals

27 July 2018


So that works then

The US Congress was up in arms after Amazon's face recognition software, called Rekognition, identified more than two dozen members of Congress as people arrested for crimes.

The press is calling them "false identifications", and they came when the ACLU of Northern California tasked Rekognition with matching photos of all 535 members of Congress against 25,000 publicly available mugshot photos.

The test cost the ACLU just $12.33 to perform.

In total, Rekognition misidentified 28 members of Congress, listing them as a "match" for someone in the mugshot photos. Eleven of the misidentified members were people of color, a highly alarming disparity: tests have repeatedly shown that face recognition is less accurate on darker-skinned people and on women. For Congress as a whole, the error rate was about five percent, but people of color accounted for 39 percent of the false matches despite making up only around 20 percent of Congress. Strangely, no one considered the possibility that Amazon might have got it right.
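The arithmetic behind those rates is straightforward. A quick sanity check (figures taken from the ACLU test as reported above; the variable names are ours):

```python
# Numbers reported from the ACLU's Rekognition test.
total_members = 535                  # photos of all members of Congress
false_matches = 28                   # members wrongly "matched" to a mugshot
false_matches_people_of_color = 11   # of those 28, how many were people of color

# Overall false-match rate across all of Congress.
overall_rate = false_matches / total_members          # ~0.052, i.e. about 5%

# Share of the errors that fell on people of color.
poc_share_of_errors = false_matches_people_of_color / false_matches  # ~0.393

print(f"Overall false-match rate: {overall_rate:.1%}")
print(f"People of color as share of errors: {poc_share_of_errors:.1%}")
```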

Among those misidentified were six members of the Congressional Black Caucus, which wrote an open letter to Amazon CEO Jeff Bezos in June after the ACLU released internal documents concerning the use of Rekognition by police.

Amazon has encouraged law enforcement agencies to adopt Rekognition, with police in Orlando recently opting to continue their pilot of the tech. In a blog post, the ACLU of Northern California explained how misidentification in the hands of law enforcement could be deadly, particularly for black people, who are disproportionately likely to be shot by police.

"If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins."

In a statement, Amazon suggested ACLU’s results with the software “could probably be improved by following best practices,” saying the company recommends law enforcement agencies set a confidence threshold of at least 95 percent when deciding if there’s a match. Of course, there’s no legal requirement for police to do so. In the UK, police in South Wales recently defended using face recognition software with a staggering 90 percent false positive rate.
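A minimal sketch of what that threshold recommendation means in practice (hypothetical data and field names, not Amazon's actual API response shape): a candidate only counts as a "match" if its confidence score clears the bar. Amazon reportedly noted the ACLU ran its test at the service's default 80 percent setting.

```python
# Amazon's recommended minimum confidence for law enforcement use.
CONFIDENCE_THRESHOLD = 95.0

def filter_matches(matches, threshold=CONFIDENCE_THRESHOLD):
    """Keep only candidate matches at or above the confidence threshold."""
    return [m for m in matches if m["confidence"] >= threshold]

# Hypothetical raw results; Rekognition reports confidence as a percentage.
raw_matches = [
    {"name": "mugshot_0001", "confidence": 97.2},
    {"name": "mugshot_0002", "confidence": 80.1},  # would pass at the 80% default
    {"name": "mugshot_0003", "confidence": 94.9},  # just under the 95% bar
]

print(filter_matches(raw_matches))
```

At the 95 percent bar only the first candidate survives; at the 80 percent default, all three would be reported as matches, which is the gap Amazon's statement leans on.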
