Florida Tech Seeking Answers on Facial Recognition Bias

Recently, the use of automated face recognition technology has increased, with applications ranging from unlocking a cell phone to crossing a country’s border. The technology has garnered considerable attention due to reports of reduced accuracy when recognizing people with darker skin.

A study involving Florida Tech researchers seeks to better illuminate why variations in face recognition accuracy occur and what might be done to mitigate them.

Michael King, Associate Professor in Florida Tech’s Department of Computer Engineering and Sciences, and doctoral students Krishnapriya Kottakkal Sugathan and Kushal Vangara are co-authors of the research paper, “Characterizing the Variability in Face Recognition Accuracy Relative to Race.” The paper, presented at the Second International Workshop on Bias Estimation in Face Analytics on June 17 in Long Beach, Calif., also includes Professor Kevin Bowyer and doctoral student Vitor Albiero of the University of Notre Dame as co-authors.

Using MORPH, the largest publicly available longitudinal face database, the researchers computed and analyzed over a billion scores representing the degree of similarity between faces appearing in two different images. The analysis covered scores produced by four face recognition algorithms and focused on images of people labeled as either African American or Caucasian. They found the score distributions for genuine comparisons (i.e., photos of a person compared with other images of that person) and impostor comparisons (i.e., images of a person compared with photos of others) to be statistically significantly different between the two demographic groups. When a decision threshold was applied, the algorithms tended to falsely match images of African Americans at a higher rate than images of Caucasians, while falsely rejecting genuine African-American image pairs at a lower rate.
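To make the underlying measurement concrete, the following is a minimal sketch, using synthetic similarity scores rather than the study’s data or algorithms, of how genuine and impostor score distributions combine with a decision threshold to produce the two error rates the paper compares across groups:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical similarity scores (higher = more similar). Genuine scores
    # compare two images of the same person; impostor scores compare images
    # of different people.
    genuine_scores = rng.normal(loc=0.80, scale=0.08, size=100_000)
    impostor_scores = rng.normal(loc=0.30, scale=0.10, size=1_000_000)

    # A decision threshold turns a similarity score into a match / non-match call.
    threshold = 0.55

    # False match rate (FMR): impostor pairs wrongly accepted as matches.
    fmr = np.mean(impostor_scores >= threshold)
    # False non-match rate (FNMR): genuine pairs wrongly rejected.
    fnmr = np.mean(genuine_scores < threshold)

    print(f"FMR = {fmr:.4%}   FNMR = {fnmr:.4%}")

If one group’s scores sit uniformly higher, more of its impostor scores cross the threshold and more of its genuine scores clear it, producing exactly the pattern described above.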

King notes, “For the dataset used in this study, we observed the ability to separate the impostor distribution from that of the genuine distribution was nearly the same for both demographic groups. The increase in the number of false matches is therefore attributed to the fact that both the genuine and impostor scores for African Americans tend to be higher, corresponding to greater similarity.”
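King’s observation, that separability was nearly equal even though one group’s scores ran uniformly higher, can be illustrated with a short sketch. The d-prime statistic below is one common measure of distribution separability in biometrics and may not be the exact measure the paper uses; the synthetic scores and group labels are illustrative only:

    import numpy as np

    rng = np.random.default_rng(0)

    def d_prime(genuine, impostor):
        # Distance between the genuine and impostor score means, in units of
        # their pooled standard deviation (higher = better separated).
        return abs(genuine.mean() - impostor.mean()) / np.sqrt(
            0.5 * (genuine.var() + impostor.var()))

    # Synthetic scores for two hypothetical groups. Group B's scores are shifted
    # uniformly higher, but the separation between its distributions is unchanged.
    gen_a = rng.normal(0.75, 0.08, 100_000)
    imp_a = rng.normal(0.25, 0.10, 1_000_000)
    gen_b = gen_a + 0.10
    imp_b = imp_a + 0.10

    print(f"d' group A = {d_prime(gen_a, imp_a):.2f}")  # same separability...
    print(f"d' group B = {d_prime(gen_b, imp_b):.2f}")

    # ...yet at a single shared threshold, the higher-scoring group sees
    # more false matches.
    threshold = 0.50
    print(f"FMR group A = {np.mean(imp_a >= threshold):.4%}")
    print(f"FMR group B = {np.mean(imp_b >= threshold):.4%}")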

The team also investigated images that were compliant with the International Civil Aviation Organization (ICAO) guidelines for face images used in travel documents, examining whether processing only photos that passed the ICAO image quality check would affect accuracy. The results showed a smaller difference in error rates between photos of African Americans and Caucasians.

For King and other researchers, there are more questions as to why variations in accuracy occur than concrete answers.

“Right now, the longstanding principle of setting decision thresholds relative to the subject population and environment where the technology will be deployed is one of the major factors – and perhaps the main factor – in minimizing the number of false matches that may occur,” he said.
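One way to picture that principle is to calibrate the threshold against impostor scores drawn from the deployment population itself, so that every deployment operates at the same target false match rate. The sketch below is a hypothetical illustration with synthetic scores; operational calibration procedures vary:

    import numpy as np

    rng = np.random.default_rng(0)

    def threshold_for_fmr(impostor_scores, target_fmr):
        # The threshold whose false match rate equals target_fmr is the
        # (1 - target_fmr) quantile of the impostor score distribution.
        return float(np.quantile(impostor_scores, 1.0 - target_fmr))

    # Synthetic impostor scores for two hypothetical deployment populations,
    # one of which scores uniformly higher.
    imp_a = rng.normal(0.25, 0.10, 1_000_000)
    imp_b = imp_a + 0.10

    target = 1e-3  # e.g., one false match per thousand impostor comparisons
    print(f"threshold A = {threshold_for_fmr(imp_a, target):.3f}")
    print(f"threshold B = {threshold_for_fmr(imp_b, target):.3f}")  # higher cutoff

The population whose impostor scores run uniformly higher requires a higher cutoff to hold the same false match rate, which is precisely the adjustment a single global threshold does not make.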

The researchers are also examining the possibility that matching differences among people with darker skin tones could be attributed to the quality of the images themselves; in images with lower match rates, dim lighting on faces may have been a contributing factor.

Even as work continues to determine the causes of demographic bias in biometric technologies such as face recognition, King believes the use of the technology will expand.

King also notes that “due to observed variations in accuracy linked to race and gender, some use cases of automatic face recognition technology related to national security and public safety have become contentious. For those applications, it is critical to ensure sound practices are used for the end-to-end identification process, instead of focusing solely on the recognition accuracy of an algorithm.”

The paper can be found at https://arxiv.org/ftp/arxiv/papers/1904/1904.07325.pdf.

###
