Losing The Race

Can algorithms be programmed to be racist? As the Head of Video at kind magazine, and as a minority woman, I make an effort to ensure the decisions I make reflect a diverse and inclusive culture. Still, some creators believe the machine-learning systems behind social media platforms are programmed against them. Behind the screens, creators and academics argue that unmonitored technology enables—and amplifies—racist behaviour.
Systemic racism is embedded in the publishing world’s processes, where value is often measured against a creator’s online influence. Likes, shares, views, and comments are fed into an equation that spits out the creator’s engagement rate, a number that helps advertisers, distributors, and publishers determine credibility and compensation.
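The exact formulas are proprietary, but a common public approximation divides total interactions by audience size. A minimal sketch in Python, with invented numbers, shows why reach matters so much: if the feed quietly surfaces one creator to fewer people, the metric brands them less valuable.

```python
def engagement_rate(likes, comments, shares, followers):
    """Common public approximation: total interactions / audience size."""
    return (likes + comments + shares) / followers * 100

# Two hypothetical creators with the same audience size: the one whose
# posts the feed actually surfaces records far more interactions.
print(f"{engagement_rate(1200, 150, 90, 24000):.1f}%")  # 6.0%
print(f"{engagement_rate(300, 40, 20, 24000):.1f}%")    # 1.5%
```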
I’ve watched Black creators ask their followers to turn on notifications because they believe their content isn’t reaching them organically. I’ve also seen white creators claim they’ve been shadow-banned. If advertisers and publishers pay based on views and engagement, the algorithm has influenced that equation, ultimately deciding whose content gets seen, who becomes popular, and who makes money.
Let’s level-set on what a social media algorithm is. According to the Digital Marketing Institute: “It’s a mathematical set of rules specifying how a group of data behaves. In social media, algorithms help maintain order and assist in ranking search results and advertisements.”
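In practice, “maintaining order” usually means scoring every candidate post and sorting the feed by that score. The real signals and weights are proprietary; this toy ranker, with invented weights, only illustrates the shape of the mechanism.

```python
# Toy feed ranker: score each post on a few engagement signals, then sort.
# Real platforms weigh thousands of signals; these weights are made up.
posts = [
    {"id": "a", "likes": 50, "comments": 5, "recency": 0.9},
    {"id": "b", "likes": 10, "comments": 20, "recency": 0.5},
    {"id": "c", "likes": 80, "comments": 1, "recency": 0.1},
]

def score(post):
    return post["likes"] + 3 * post["comments"] + 40 * post["recency"]

feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # ['a', 'b', 'c']
```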
In her Wall Street Journal article “Social-Media Algorithms Rule How We See The World. Good Luck Trying to Stop Them,” Joanna Stern writes, “It’s hard to pinpoint exactly when we lost control of what we see, read—and even think—to the biggest social media companies.”
In the article, she quotes Hany Farid, a computer science professor at the University of California, Berkeley: “There are bad people doing bad things on the internet—QAnon, white supremacists—it’s not that Facebook, YouTube, and other social media sites allow it on their platform. It’s that they amplify it.” According to Farid, the problem is that “computers are in charge of what we see and they’re operating without transparency.”
It seems like the computers aren’t promoting the work—or opinions—of people of colour.
“Serendipity has been replaced by curated content,” write Theodora Lau and Uday Akkaraju in the Harvard Business Review. “What if the internet becomes a guarded space where only a select group of individuals get heard?”
South Africa-based comedian Yaaseen Barnes echoes the sentiment in a tweet: “Instagram will make it seem like you follow 4 accounts only… the way they keep pushing them every day. WHERE ARE THE OTHER ACCOUNTS I FOLLOW? I followed them for a reason, show me their posts.”
Last year, advertisers boycotted Facebook to protest how the social media company monitors its platforms.
In response, Facebook announced it was creating teams dedicated to examining how Black and minority users in the United States are affected by the algorithms on Instagram and Facebook. According to the Wall Street Journal, the equity and inclusion team’s mandate is to analyze how the company’s algorithms and machine learning affect minority creators compared to white creators.
About a month before this announcement, a group of Black creators filed a lawsuit against YouTube, claiming that their content was being systematically removed without explanation. According to the Washington Post, the lawsuit alleges that the platform “rig[s] the game, by using their power to restrict and block Plaintiffs and other similarly situated competitors, based on racial identity or viewpoint discrimination for profit.”
In the suit, YouTube creator Catherine Jones, who runs the channel Carmen Caboom, says the platform removed her content for containing nudity and hate speech. She maintains that her content contains neither. Is her content being falsely reported by bad actors?
“These algorithms try to predict whether content uploaded to YouTube is in violation of their terms of service. With 20,000 hours of video uploaded to YouTube every hour, it’s simply not possible to have humans reviewing every second for these violations,” says Anthony Niblett, associate professor and Canada Research Chair in Law, Economics & Innovation at the University of Toronto. “Some violations will not be caught, and other content, that doesn’t violate, will be tagged as violations.”
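A hedged sketch of the trade-off Niblett describes: an automated classifier assigns each upload a violation score, and wherever the removal threshold sits, some violations slip under it while some legitimate videos land above it. The scores and labels below are invented.

```python
# Threshold-based moderation sketch: any cutoff produces both error types.
videos = [
    ("actual_violation", 0.92),  # caught
    ("actual_violation", 0.40),  # missed: a false negative
    ("clean_tutorial",   0.75),  # wrongly removed: a false positive
    ("clean_vlog",       0.10),  # correctly left up
]
THRESHOLD = 0.6  # hypothetical removal cutoff

for label, violation_score in videos:
    action = "remove" if violation_score >= THRESHOLD else "keep"
    print(f"{label}: score={violation_score:.2f} -> {action}")
```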
The question, he says, is whether or not these errors are biased against various groups.
Content creator and model Risa Newman thinks they are. “The reason why I hate TikTok is because of the shadow banning it does on Black creatives,” she says. “It’s also happening on Instagram—when you type a name in the search bar that you want to find, you can’t find them.”
HubSpot defines shadow banning as “the act of blocking a user’s content on social media sites, in such a way that the user doesn’t know it’s happening. If you’re shadow banned on Instagram, your content won’t appear on anyone’s feed unless they already follow you.” Many cannabis companies in Canada say this has also happened to them.
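Mechanically, a shadow ban is easy to picture: the post is never deleted, a visibility flag simply keeps it out of non-followers’ feeds and search. The sketch below is speculative; Instagram has not published any such mechanism.

```python
# Speculative sketch of a shadow ban: content stays up but is filtered
# out of discovery, so only existing followers ever see it.
def visible_to(viewer_follows, author, shadow_banned):
    if author not in shadow_banned:
        return True
    return author in viewer_follows  # followers only

banned = {"creator_x"}
print(visible_to({"creator_x"}, "creator_x", banned))  # True: a follower still sees it
print(visible_to(set(), "creator_x", banned))          # False: hidden from everyone else
```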
Shadow banning would be a way for Instagram to filter out accounts that don’t comply with its terms. Instagram, meanwhile, has never admitted to the practice.
“It’s so hard seeing all these creators struggling to get views when white folks get the spotlight,” Newman says.
Niblett weighs in: “Even if the algorithms are not expressly designed to identify race or ethnicity of creators, the algorithms can still have a discriminatory effect if race or ethnicity is correlated with features that the algorithms sort by.”
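His point fits in a few lines of code: a ranker that never sees race can still bury one group if it sorts on a feature that correlates with race. The data below is fabricated purely to illustrate the mechanism.

```python
# Fabricated illustration of proxy bias: the ranker sorts only on
# "reports_per_post", but if one group is mass-reported by bad actors,
# that group sinks without race ever appearing in the code.
creators = [
    {"group": "A", "reports_per_post": 0.2},
    {"group": "A", "reports_per_post": 0.3},
    {"group": "B", "reports_per_post": 1.1},  # heavily reported, not worse
    {"group": "B", "reports_per_post": 0.9},
]

ranked = sorted(creators, key=lambda c: c["reports_per_post"])
print([c["group"] for c in ranked])  # ['A', 'A', 'B', 'B']: group B buried
```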
Instagram’s parent company only took action against its internal biases after an outcry about race from Facebook’s own employees.
In the Wall Street Journal, Instagram’s head of product said, “The racial justice movement is a moment of real significance [...] any bias in our systems and policies runs counter to providing a platform for everyone to express themselves.”
The article says that before the initiative was launched, employees at Facebook were barred from studying the platform’s racial impacts without permission from the most senior team of executives. An internal analysis found that users whose activity suggested they were Black were 50% more likely to have their accounts disabled than other users. The Wall Street Journal reported that after the algorithm’s criteria were tweaked to reduce this disparity, Facebook executives prohibited further research.
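That 50% figure is a simple disparate-impact ratio: one group’s disable rate divided by everyone else’s. The counts below are hypothetical; only the arithmetic is the point.

```python
# Hypothetical counts: a group disabled at 1.5x the baseline rate is
# "50% more likely" to lose its accounts.
disabled_group, total_group = 150, 10_000  # users inferred to be Black (made up)
disabled_other, total_other = 100, 10_000  # everyone else (made up)

rate_group = disabled_group / total_group  # 0.015
rate_other = disabled_other / total_other  # 0.010
print(f"ratio: {rate_group / rate_other:.2f}x")  # 1.50x
```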
Jade Owhadi, an activist, educator, and humanitarian, does not identify as a minority but frequently speaks out about minority issues. Her Instagram account, which has 11,300 followers, was disabled. “I’d been posting about white supremacy and going extra hard with everything that’s going on in the States,” she says. “They said it was a violation of their terms and conditions, then switched it up a few hours later. They said I was pretending to be someone else.”
When asked if she thought that people who disagreed with her views reported her posts, she says, “Definitely. I think that people were reporting my page, comments, pictures, over and over again in an effort to shut me down, and eventually, Instagram took the page down.” Owhadi says she grew up in a “very racist” town in Texas, one she says is still influenced by the KKK. “That’s where a lot of my followers are,” she says.
She uses her account as a means of communication. “A lot of my former students are on my Instagram, that’s how I keep up with them and their families.” Owhadi works with underprivileged children in developing countries and says her greatest passion is speaking for those who don’t have a voice.
Weighing in on her own experiences, Newman says: “Black creators need to start creating their own platforms and supporting each other that way.”
Shortly after this interview, CNBC published an article about Clubhouse. The article, titled “How Black users are saving Clubhouse from becoming a drab hangout for tech bros,” explores how app users from diverse backgrounds are changing the user experience for the better. It references Aniyia Williams, founder of Black & Brown Founders, an organization that supports Black and Latinx entrepreneurs, saying, “This sudden burst of innovation in Clubhouse exemplifies the role Black people often play in America as culture makers and trendsetters.”
When Clubhouse first launched, the majority of its users were from Silicon Valley. There were conversations on the future of AI or Bitcoin, but now you’re more likely to find cultural debates or people looking for love in one of the many dating rooms.
In the story, Williams says, “Ingenuity is the other side of being oppressed. At the end of the day, that’s the thing that unites Black people [...] being a have-not forces you to think and see the world differently.”
Ingenuity is the other side of being oppressed.
Are Black people being silenced on the platforms that they made cool?
Jeremy Green, a content creator who grew up in a biracial family in Orange County, California, says his father empowered him to speak up for Black rights. “It could be systemic racism, but it could be systemic classism, and that takes priority over racism or sexism,” he says, adding that Instagram has become a place where “people become very consumed very quickly when you’re seeing people who make millions off their looks alone.”
He says audiences would rather escape into superficial content than engage with informative content about the real issues facing the world.
Owhadi agrees. “Instagram tends to show selfies a whole lot more than when you’re posting something educational,” she says, adding that she posts a selfie before sharing her humanitarian aid work so that more people will see the post.
Still, some professionals insist that bias cannot be deliberately written into these systems. “There is a 0% chance that biases can be programmed into an algorithm,” says independent data and analytics consultant Kumar Latchman.
“Academically speaking, algorithms look for patterns of behaviour,” Latchman says. The purpose of these platforms, he explains, is to monetize our attention: they want to keep us as engaged as they can for as long as possible. That alone is their main motivator.
“Algorithms actively give us back what they think we like,” he adds. “If creators of any subject are being disproportionately targeted in the form of reports, the algorithm will respond to that.”
Algorithms actively give us back what they think we like.
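What Latchman describes is a feedback loop: if reports feed the ranking signal, coordinated reporting alone can sink a creator, and the resulting drop in engagement compounds the demotion. This simulation is entirely invented; the constants stand in for unknown platform internals.

```python
# Invented feedback-loop simulation: mass reports cut the visibility score,
# lower visibility means less engagement, and low engagement drags the
# score down further, even though the content never changed.
score = 1.0
for week, reports in enumerate([0, 5, 5, 5], start=1):
    score -= 0.05 * reports             # reports push visibility down
    engagement = 100 * score            # fewer impressions, less engagement
    score *= 0.9 + 0.001 * engagement   # low engagement compounds the drop
    print(f"week {week}: visibility score = {score:.2f}")
```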
When asked about potential solutions, Latchman says, “You’re always going to have unintended outcomes; the human intervention to solve it is, coincidentally, the bias.”
How to regulate social media companies will continue to be hotly debated. “Companies should be more transparent about the algorithms they use and how these algorithms make predictions,” says Niblett. “I would be uncomfortable with the world where a government is telling private companies what content they must publish,” he adds.
“If Facebook is setting up internal systems to identify whether errors they make are disproportionately affecting particular subgroups of society and if they are identifying ways to ameliorate and minimize these biases, then that would be beneficial,” Niblett says.
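At its simplest, the internal system Niblett describes would be an audit comparing moderation error rates across groups. The numbers here are fabricated; the comparison is the point.

```python
# Fabricated audit sketch: compare how often clean posts are wrongly
# removed for each subgroup; a persistent gap signals biased errors.
outcomes = {
    # group: (clean posts wrongly removed, clean posts reviewed)
    "group_A": (30, 1000),
    "group_B": (75, 1000),
}

for group, (wrongly_removed, reviewed) in outcomes.items():
    print(f"{group}: false-removal rate {wrongly_removed / reviewed:.1%}")
```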
Green says, “Look at @chakabars, he’s been running [humanitarian] programs and he’s won awards at the BET level, but he’s not verified, that speaks loudly.” Chaka Zulu, who goes by @chakabars on Instagram, has one million followers and posts about racial issues. Green says there are accounts just like his that post the same type of information—but face no recourse.
On the Facebook solution, Green says, “It helps if we have people from our community sitting in executive positions at these social media companies; they will be able to empathize with our community.”
Facebook, Google, and TikTok were contacted via email for comment on this article. At press time, kind Magazine had not received a response from any of the platforms.