Facial recognition technology has progressed to the point where it can now interpret emotions from facial expressions.
This type of analysis is increasingly used in daily life. For example, companies can use facial recognition software to inform hiring decisions.
Other programs scan the faces in crowds to identify threats to public safety.
Unfortunately, this technology struggles to interpret the emotions of black faces. My new study, published last month, shows that emotional analysis technology assigns more negative emotions to black men's faces than to white men's faces.
This isn't the first time that facial recognition programs have been shown to be biased. Google's image-labeling software tagged photos of black people as gorillas.
Cameras flagged Asian faces as blinking. Facial recognition programs struggled to correctly identify the gender of people with darker skin.
My work contributes to a growing call to better understand the hidden bias in artificial intelligence software.
Posted: Friday, January 18, 2019 at 6:29 am