Social-media users may not be nearly as free from racial bias as they believe, researchers in the Department of Marketing of the College of Business at the University of Nevada, Reno found in an analysis of consumer reactions to more than 70,000 Instagram posts by 10 major brands.
In fact, marketing experts themselves may be misled about the degree of racial bias among social-media users, said Jae-eun Namkoong, an associate professor of marketing and one of the authors of the study recently published in the Journal of Business Ethics.
“At a surface level, it may appear that featuring more Black faces in posts, and observing greater engagement as a result, is simply a positive indication that consumers are not biased against racial minorities,” Namkoong said. “However, our in-depth analyses reveal that racial bias often manifests in subtle, yet consistent ways.”
The researchers relied on a machine-learning algorithm that sorted through the Instagram posts of the 10 big brands and categorized the nearly 50,000 faces it found by race, gender, age, beauty, whether the pictured individuals were smiling, and their positions in the posts (front and center, say, or standing off to the side).
Next, the researchers examined the number of likes generated by each post.
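The study's exact tooling is not named here, but a minimal sketch of this kind of face-attribute tagging might look like the following, using the open-source DeepFace library as an illustrative stand-in (an assumption, not the researchers' actual pipeline). Beauty scoring and the full positional analysis described in the study would require additional models beyond what is shown.

```python
# A minimal sketch (not the authors' actual pipeline) of tagging faces in a post
# image with demographic attributes and a simple prominence measure, using the
# open-source DeepFace library as an illustrative stand-in.
from deepface import DeepFace
import cv2

def tag_faces(image_path):
    """Detect faces in one post image and return per-face attributes."""
    img = cv2.imread(image_path)
    height, width = img.shape[:2]

    # DeepFace.analyze returns one dict per detected face, including an
    # estimated age, dominant gender/race, dominant emotion, and bounding box.
    faces = DeepFace.analyze(
        img_path=image_path,
        actions=["age", "gender", "race", "emotion"],
        enforce_detection=False,  # do not raise on images with no detectable face
    )

    tagged = []
    for face in faces:
        box = face["region"]  # {'x', 'y', 'w', 'h'}
        face_center_x = box["x"] + box["w"] / 2
        # Crude prominence proxy: horizontal distance of the face from the image
        # center, normalized by image width (0 = dead center, 0.5 = at the edge).
        centrality = abs(face_center_x - width / 2) / width
        tagged.append({
            "age": face["age"],
            "gender": face["dominant_gender"],
            "race": face["dominant_race"],
            "smiling": face["dominant_emotion"] == "happy",
            "centrality": centrality,
        })
    return tagged
```

Per-face tags produced this way could then be joined with each post's like count to examine how engagement varies with the attributes the study describes.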
Namkoong said the analysis discovered that Instagram posts featuring more Black individuals garner greater engagement when the pictured individuals display low-threat features, such as smiling, appearing older or appearing more feminine. This is important, Namkoong said, because threat-related biases against Black people, especially young Black men, remain persistent among many consumers.
Greater engagement results, too, when Black individuals pictured in the posts conform more closely to mainstream, Caucasian-centric beauty standards.
Further, the researchers found that the engagement of social-media users is sensitive to how prominently Black individuals are pictured in a post. Featuring a face in the center of an Instagram post often leads to lower engagement, and the effect is especially pronounced for Black faces, Namkoong said.
“It’s reminiscent of shows or advertisements with diverse casts, where racial minorities are present in number but often confined to supporting roles or portrayed as ‘sidekicks,’” she said.
Often, social-media users themselves aren’t aware of their biases, said James Leonhardt, a professor of marketing and co-author of the study.
“We are generally averse to holding explicitly biased beliefs,” Leonhardt said. “Instagram users, who skew young, liberal and more diverse, are unlikely to believe their platform behavior is racially biased. Yet our findings show biases persist and have real consequences.”
The biases touch consumers throughout their online activity, said Namkoong.
“When I book a pet sitter on Rover, reserve a room on Airbnb, or search for a healthcare professional on Google, one of the first things I see is the face of the service provider,” she said. “Even if I try my best to be fair, can I completely free myself from ageism, sexism, racism and a host of other biases? Most likely not, because the truth is, many of these biases exist and operate implicitly.”
Leonhardt said the researchers’ findings can provide insights to marketing professionals and consumers alike.
“A takeaway for everyone is to be suspicious of our belief-behavior alignment and our ability to independently control our own beliefs,” he said. “Awareness makes the implicit explicit, allowing for better alignment in the beliefs and behaviors of brands, marketers, and consumers.”
Along with Namkoong and Leonhardt, authors of the study included Eunsoo Choi of the School of Psychology at Korea University in Seoul and Joon H. Ro of San Francisco.