Racial discrimination against airline customers on Twitter

BY GINA MANTICA

Videos of United States (U.S.) airline staff dragging passengers off planes suggest some airlines have less-than-great customer service. Airlines’ reputations precede them on Twitter: customer service representatives respond to only about half of all online inquiries or complaints. New research from Boston University, the University of Pittsburgh, and the University of Rochester suggests that racial bias could lead to inequities in those responses.

Avi Seidmann, a Research Fellow at the Hariri Institute and a Professor of Information Systems at Boston University’s Questrom School of Business, worked with two collaborators: Priyanga Gunarathne, Assistant Professor of Business Administration at the University of Pittsburgh’s Katz Graduate School of Business, and Huaxia Rui, Xerox Professor of Computers and Information Systems at the University of Rochester. The team applied advanced econometrics and artificial intelligence (AI) to uncover racial discrimination against airline customers on Twitter. They found that complaints from Black customers are less likely to receive a response from airlines on Twitter than those from similar white customers. This is the first quantitative evidence of business-to-customer (B2C) racial bias on a digital platform, in which the perpetrators are individual employees acting on behalf of a company and the victims are customers. The results were published recently in Information Systems Research.

To investigate how companies respond to customers of different races on social media, the research team compiled more than 57,000 complaints to seven major U.S. airlines on Twitter from September 2014 to May 2015. The team determined the race of customers from their Twitter profile pictures using a facial recognition AI algorithm. They found that Black customers are 12% less likely than similar white customers to receive a response from airline representatives when they complain. The researchers did not find any differences in the airlines’ responsiveness to Asian or Hispanic customers when compared with similar white customers.
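As a rough, purely illustrative sketch of the kind of econometric comparison the study describes (not the authors’ actual code, data, or model; the file name and column names below are hypothetical), a response-likelihood gap like the 12% figure could be estimated with a logistic regression of airline responses on race indicators plus controls:

```python
# Illustrative sketch only: a simplified stand-in for the kind of econometric
# comparison described above. The data file and column names are hypothetical;
# the study's actual models use richer controls and comparisons of similar customers.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per complaint tweet.
#   responded  - 1 if the airline replied, 0 otherwise
#   race       - race inferred from the profile picture ("white", "black", "asian", "hispanic")
#   followers  - follower count of the complaining customer (example control)
#   airline    - which of the seven airlines was contacted
complaints = pd.read_csv("complaints.csv")

# Logistic regression of response likelihood on race indicators plus controls,
# with white customers as the reference category.
model = smf.logit(
    "responded ~ C(race, Treatment(reference='white')) + followers + C(airline)",
    data=complaints,
).fit()

print(model.summary())
# A negative, statistically significant coefficient on the Black indicator would
# correspond to the lower response likelihood reported in the study.
```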


The researchers also found that racial bias is absent when profile pictures do not reveal customers’ racial identities, further supporting the idea that U.S. airline representatives discriminate against people based on the color of their skin in their profile pictures. “We have validated our results using rigorous falsification tests with deep learning models that were trained on the last 2,000 tweets posted by each user,” added Seidmann.
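As a schematic illustration of the kind of check described above (again with hypothetical column names, including an assumed race_visible flag, and not the authors’ actual falsification tests), one could rerun the same comparison on the subset of complaints whose profile pictures do not reveal race:

```python
# Schematic illustration only, not the authors' actual tests. Assumes the same
# hypothetical file as the earlier sketch plus a made-up 'race_visible' flag
# marking whether the profile picture reveals the customer's racial identity
# (race would then have to be inferred another way, e.g., from tweet history).
import pandas as pd
import statsmodels.formula.api as smf

complaints = pd.read_csv("complaints.csv")
hidden = complaints[complaints["race_visible"] == 0]  # pictures that do not reveal race

check = smf.logit(
    "responded ~ C(race, Treatment(reference='white')) + followers + C(airline)",
    data=hidden,
).fit()

print(check.summary())
# If the bias operates through what representatives can see, the race coefficients
# should be close to zero and statistically insignificant in this subsample.
```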

Seidmann and colleagues were surprised by their findings, but hope that their pioneering investigation can inspire legal and corporate responsibility actions to correct and prevent such harmful customer service practices on digital platforms. “Many U.S. companies today vow to fight systemic racism, but implicit human biases could be difficult to eradicate,” said Gunarathne. To minimize discriminatory practices, the team recommends that businesses conceal customer profile pictures from the service employees who respond to inquiries on Twitter.

Many customers claim that they are discriminated against, but companies often do not act unless they are provided with hard evidence. The researchers developed a scientific methodology for measuring individual discrimination by businesses on social media. Policymakers and government regulators can use this tool to ensure that companies follow best practices for eliminating inequalities in customer service. “We provide the building blocks. It is not a complete solution,” said Rui. “We want to make sure that we maintain a level playing field by reducing conscious and unconscious bias.”

Seidmann’s Boston University research team is now planning several extensions to the study and is looking for businesses that are interested in research collaborations in this space. “At the regulatory level, the current study can dramatically raise the enforcement of racial bias elimination in commercial online business-to-customer settings, which can hopefully eliminate such biases altogether,” said Seidmann.
