
Emotion-reading tech fails the racial bias test


Facial recognition technology has progressed to the point where it now interprets emotions in facial expressions. This type of analysis is increasingly used in daily life. For example, companies can use facial recognition software to help with hiring decisions. Other programs scan the faces in crowds to identify threats to public safety.

Unfortunately, this technology struggles to interpret the emotions of black faces. My new study, published last month, shows that emotional analysis technology assigns more negative emotions to black men’s faces than white men’s faces.

This isn’t the first time that facial recognition programs have been shown to be biased. Google labeled black faces as gorillas. Cameras identified Asian faces as blinking. Facial recognition programs struggled to correctly identify gender for people with darker skin.

My work contributes to a growing call to better understand the hidden bias in artificial intelligence software.

Measuring bias

To examine the bias in the facial recognition systems that analyze people's emotions, I used a data set of 400 NBA player photos from the 2016-2017 season, because the players are similar in clothing, athleticism, age and gender. Also, since these are professional portraits, the players all face the camera directly.

I ran the images through two well-known types of emotional recognition software. Both assigned black players more negative emotional scores on average, no matter how much they smiled.

For example, consider the official NBA pictures of Darren Collison and Gordon Hayward. Both players are smiling, and, according to the facial recognition and analysis program Face++, Darren Collison and Gordon Hayward have similar smile scores – 48.7 and 48.1 out of 100, respectively.

Basketball players Darren Collison (left) and Gordon Hayward (right).
basketball-reference.com

However, Face++ rates Hayward’s expression as 59.7 percent happy and 0.13 percent angry and Collison’s expression as 39.2 percent happy and 27 percent angry. Collison is viewed as nearly as angry as he is happy and far angrier than Hayward – despite the facial recognition program itself recognizing that both players are smiling.
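The gap between the two players is easier to see as a ratio. The following minimal Python sketch uses only the Face++ scores quoted above (the API calls themselves are not shown) to compare each player's happiness score against his anger score:

```python
# Emotion scores quoted above, as reported by Face++ for the two portraits.
scores = {
    "Collison": {"happy": 39.2, "angry": 27.0},
    "Hayward":  {"happy": 59.7, "angry": 0.13},
}

for player, s in scores.items():
    ratio = s["happy"] / s["angry"]
    print(f"{player}: happy/angry ratio = {ratio:.1f}")
```

Collison's happiness score barely exceeds his anger score (a ratio of about 1.5), while Hayward's exceeds his by a factor of more than 400 despite nearly identical smile scores.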

In contrast, Microsoft's Face API viewed both men as happy. Still, Collison is viewed as less happy than Hayward, with happiness scores of 93 and 98 percent, respectively. Despite his smile, Collison is even scored with a small amount of contempt, whereas Hayward has none.

Across all the NBA pictures, the same pattern emerges. On average, Face++ rates black faces as twice as angry as white faces. Face API scores black faces as three times more contemptuous than white faces. After matching players based on their smiles, both facial analysis programs are still more likely to assign the negative emotions of anger or contempt to black faces.
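The matching step described above can be sketched in a few lines of Python. This is an illustrative sketch only, with made-up records rather than the study's actual data: players are binned by smile score so that anger scores are compared across races at similar smile levels.

```python
# Hypothetical sketch of a matched comparison; the records below are
# illustrative, not the study's data set.
from collections import defaultdict

def mean_anger_by_race(records, smile_bin_width=10):
    """Bin players by smile score, then average anger scores by race
    within each bin, so races are compared at similar smile levels."""
    bins = defaultdict(lambda: defaultdict(list))
    for r in records:
        b = int(r["smile"] // smile_bin_width)
        bins[b][r["race"]].append(r["anger"])
    return {
        b: {race: sum(v) / len(v) for race, v in by_race.items()}
        for b, by_race in bins.items()
    }

# Illustrative records only:
players = [
    {"race": "black", "smile": 48.7, "anger": 27.0},
    {"race": "white", "smile": 48.1, "anger": 0.13},
    {"race": "black", "smile": 72.0, "anger": 12.0},
    {"race": "white", "smile": 71.5, "anger": 1.0},
]

print(mean_anger_by_race(players))
```

Comparing within smile bins, rather than across the whole sample, rules out the possibility that one group simply smiles less in its portraits.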

Stereotyped by AI

My study shows that facial recognition programs exhibit two distinct types of bias.

First, black faces were consistently scored as angrier than white faces at every smile level; Face++ showed this type of bias. Second, black faces were scored as angrier whenever there was any ambiguity in their facial expression; Face API displayed this type of disparity. Even when black faces were partially smiling, my analysis showed that the systems assumed more negative emotions than for white counterparts with similar expressions. With Face API, the average emotional scores were much closer across races, but noticeable differences between black and white faces remained.
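The distinction between the two bias patterns can be made concrete with a toy model. The functions below are not fitted to the study's data; they simply illustrate, with invented numbers, a constant anger offset at every smile level versus a gap that appears only when the expression is ambiguous:

```python
# Toy illustration (invented numbers, not the study's results) of the
# two bias patterns described above.

def anger_constant_offset(smile, is_black):
    """Pattern 1 (Face++-like): a fixed anger penalty at every smile level."""
    base = max(0.0, 50.0 - 0.5 * smile)
    return base + (10.0 if is_black else 0.0)

def anger_under_ambiguity(smile, is_black):
    """Pattern 2 (Face API-like): a penalty only when the expression
    is ambiguous (here, a low smile score)."""
    base = max(0.0, 50.0 - 0.5 * smile)
    if is_black and smile < 40:  # ambiguous expression
        return base + 15.0
    return base

for smile in (20, 80):
    gap1 = anger_constant_offset(smile, True) - anger_constant_offset(smile, False)
    gap2 = anger_under_ambiguity(smile, True) - anger_under_ambiguity(smile, False)
    print(f"smile={smile}: constant-offset gap={gap1}, ambiguity gap={gap2}")
```

Under the first pattern the racial gap persists even for broad smiles; under the second it vanishes once the smile is unambiguous, which matches the description of the two systems above.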

This observation aligns with other research, which suggests that black professionals must amplify positive emotions to receive parity in their workplace performance evaluations. Studies show that people perceive black men as more physically threatening than white men, even when they are the same size.

Some researchers argue that facial recognition technology is more objective than humans. But my study suggests that facial recognition reflects the same biases that people have. Black men’s facial expressions are scored with emotions associated with threatening behaviors more often than white men, even when they are smiling. There is good reason to believe that the use of facial recognition could formalize preexisting stereotypes into algorithms, automatically embedding them into everyday life.

Until facial recognition assesses black and white faces similarly, black people may need to exaggerate their positive facial expressions – essentially smile more – to reduce ambiguity and potentially negative interpretations by the technology.

Although innovative, artificial intelligence can perpetuate and exacerbate existing power dynamics, leading to disparate impacts across racial and ethnic groups. Some societal accountability is necessary to ensure fairness to all groups, because facial recognition, like most artificial intelligence, is often invisible to the people most affected by its decisions.

Lauren Rhue, Assistant Professor of Information Systems and Analytics, Wake Forest University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

