Study Finds Bias in Facial-Recognition Technology

A recent study of the facial-recognition technology that officials plan to deploy at more airports and other ports of entry in the United States found higher rates of false identification for African-American and Asian users.

According to The Washington Post, the National Institute of Standards and Technology (NIST) found that many of the systems tested exhibited bias, with African-American and Asian people being falsely identified between 10 and 100 times more often than Caucasians.

The data revealed Native Americans were misidentified more than any other group. The NIST study also found that women were falsely identified more often than men, and that senior citizens were misidentified more than 10 times as often as middle-aged adults.

As part of the testing process, NIST ran 189 facial-recognition algorithms from 99 developers against more than 18 million photos of about 8.5 million people from the U.S. The agency did not test systems developed by Amazon, Apple, Facebook and Google, as those companies did not submit their algorithms for the study.

The Massachusetts Institute of Technology (MIT) was one of the first groups to study bias in facial-recognition technology and found that several companies' systems had lower accuracy rates on female and darker-skinned faces.

The report has alarmed civil rights groups like the American Civil Liberties Union (ACLU), which warn the technology could restrict freedom of movement and speech and increase unjust surveillance.

“One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse,” ACLU analyst Jay Stanley told The Washington Post. “Government agencies including the F.B.I., Customs and Border Protection and local law enforcement must immediately halt the deployment of this dystopian technology.”