Defeating Social Media Algorithms in the Name of Social Justice

I recently watched a video on TikTok from user mikaila simone; the video is a spoken word poem about the trauma Black Americans face due to institutional racism, and how that trauma continues even after their deaths. It’s an incredibly affecting video, and I opened the comments hoping to see how others were reacting to simone’s work. What I found, to my deep confusion, were supportive comments absolutely laden with emojis, alongside a series of completely incomprehensible all-caps comments from multiple users. One comment read “A WHILE AGO I WAS LIKE OH YEAH YEAH OH YEAH NO NO REASON WHY NOT I JUST DON’T WANT TO ME GO TO SLEEP AND THEN GO EAT AND THEN GO TO SLEEP AND THEN GO.” These aren’t even song lyrics; these words are, quite literally, gibberish.


And they’re helping advance social justice.

For those who aren’t familiar, TikTok’s For You Page is an algorithmically curated, endlessly scrolling feed of videos from across the platform. As far as I can tell, it serves three primary types of video: videos from people you have followed or liked, AKA known wins; videos “associated” with the people you’ve followed and liked, because other users have liked both those creators and these other videos; and a mix of completely unknown as well as objectively popular videos, weighted by the first two categories but seemingly more random, designed to help the app continually refine the first two categories.
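
To make that structure concrete, here is a minimal sketch in Python of how a feed might blend those three sources. The function name, the pools, and the weights are all invented for illustration; TikTok’s actual ranking system is proprietary and far more complex:

```python
import random

def sample_feed(known_wins, associated, exploratory, n=10):
    """Blend three candidate pools into one For You-style feed.

    known_wins:  videos from creators the user follows or has liked
    associated:  videos liked by users whose tastes overlap with yours
    exploratory: random and broadly popular videos, used to refine
                 the other two pools as the user reacts to them
    """
    pools = [known_wins, associated, exploratory]
    weights = [0.5, 0.3, 0.2]  # invented weights: mostly safe bets, some exploration
    feed = []
    for _ in range(n):
        pool = random.choices(pools, weights=weights, k=1)[0]
        feed.append(random.choice(pool))
    return feed

# Hypothetical usage: feed = sample_feed(follows, lookalikes, trending)
```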

Looking at YouTube without being logged in can give good insight into how these algorithms function.

Most social media platforms function by showing you content you’ve chosen to “follow” or “subscribe” to in some fashion; when a platform suggests new content, the suggestions are largely weighted by popularity and by the hashtags on the content you already like. You can see this baseline algorithm at work in how YouTube’s front page looks when you’re not logged in. It suggests popular videos from key content categories; when I checked on March 14th, 2021, the suggested videos included a prank video, a movie clip, a movie clip with humorous edits, a Minecraft video, a dog, and a viral video of a reporter sinking an impressive basketball shot. As you click on these videos, YouTube suggests more videos like them, having received evidence that you like them; click on a prank video, and you’ll get suggestions for more prank videos.
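
A toy model of that feedback loop might look like the sketch below, assuming a simple per-category interest score. The real systems are learned models with far richer signals; everything here is invented for illustration:

```python
from collections import Counter

# Toy model of recommendation feedback: every click nudges the
# weight of that video's category upward, so future suggestions
# skew further toward what you've already watched.
interests = Counter()

def record_click(category, boost=1.0):
    interests[category] += boost

def suggest(candidates, k=5):
    """Rank candidate (category, video) pairs by learned interest."""
    ranked = sorted(candidates, key=lambda pair: interests[pair[0]], reverse=True)
    return [video for _, video in ranked[:k]]

# Click three prank videos, and prank videos float to the top of
# every future suggestion list.
for _ in range(3):
    record_click("pranks")
```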

Plenty of people understand the algorithmic nature of social media: it tries to show you things you might want to see. More and more, however, the relevant discussion is not how we affect the algorithm, but how the algorithm affects us. One of the better discussions of this is this New York Times article, outlining how YouTube’s suggested-videos sidebar can help radicalize people and contribute to a “siloing” effect, where people develop extreme opinions because they’re surrounded by people who agree with them. Just as watching prank videos begets more prank videos, watching far-right political content begets more and more far-right content, and can leave a viewer with the false belief that their views are widely held, or that the creators they’re watching are making common-sense, good-faith arguments. Algorithms tend to funnel people down increasingly niche paths of interest, which is fine when the niche is, say, videos about modified Game Boy Color consoles, and much less so when that interest involves devaluing minorities.

The algorithm on social media platforms often works against social justice for this reason. If you’re not actively choosing to engage with political, especially left-leaning, content, then it can miss you entirely. Even if you are, the nature of political belief means there are a hundred splinter groups whose niche content you could fall into, and the beliefs of those groups may be harmless, or may not. Even if you’re aware of this dynamic, avoiding being misled takes real time and energy; worse, that very self-awareness can make you vulnerable to groups that trade on “criticism” as the cornerstone of their ideology. The New York Times article linked above details how enjoying “critical analysis” videos can quickly lead to buying into far-right ideology, and Trans-Exclusionary Radical Feminists (TERFs) claim to be performing “gender criticism.”

So being political on social media is fraught, and can seriously distract from or derail social justice; and being non-political means opting out of social justice, which often means implicitly supporting ongoing injustices, simply because, unfortunately, oppression of minorities is baked into most modern civilizations. As anti-racist educators have long argued, the only way to fight racism is to be actively anti-racist, and the same is true of transphobia, homophobia, ableism, and misogyny. In the same way that a puppy will pee in the house if it isn’t trained, making the world more equal requires active effort.

Sometimes social justice can work with the algorithm; this is the point of hashtags like #MeToo and #BlackLivesMatter, and why it is crucially important to use and spread them. By making equity a viral topic, these hashtags help the conversation spread and reach many more people. Similarly, this is why it’s often considered key to push high-profile figures on social media to sign onto these movements and make statements advocating for them. A Twitter user who otherwise doesn’t follow politics will notice if Taco Bell says Black Lives Matter, and they will certainly notice if it’s not only Taco Bell but Tony Hawk, the Oscars, and a good percentage of the YouTube Minecraft players they follow. Oppressive beliefs, as mentioned above, often flourish in spaces where they are normalized, because humans are inclined to think something is simply “how things are” if everyone they know says it’s so. By pushing instead for anti-racist messages to be widespread, we can shift perceptions of “how things are,” or at least of what is acceptable in polite society. Few people want to look bad in comparison to Taco Bell.


However, the incomprehensible comment I saw on a TikTok spoken word poem was not the algorithm working as intended; it was users breaking the algorithm for a good cause. Another comment on the same video gave users advice on how to make their comments more effective: use all caps, leave a long comment, and avoid words and phrases like “boost.” These are all, according to users, methods of driving engagement, of spreading the video further so that mikaila simone’s poem reaches more people.

Whether these tricks work is debatable. This press release is about as explicit as TikTok has ever gotten about what precisely drives its algorithm, and it only mentions that comments as a whole boost a video’s rate of recommendation, not whether the content of those comments has any effect. I couldn’t find any third-party reporting on this phenomenon, either. It is known, though, that TikTok extensively monitors content. The app has an artificially intelligent filter that combs text, while human moderators comb through more popular videos and respond to user-submitted reports. This is why TikTok has so many “accountants”: some users can get away with referring to “seggs,” but sex workers have gone further and coined the “accountant” euphemism, because human moderators seem reluctant to remove videos when the user is only implying something. (I once saw a video about custom-made BDSM fetish paddles in which the creator referred to them as “artisanal cheese boards,” and indeed displayed some of them covered in cheese.)
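
A crude illustration of why euphemisms like “seggs” and “accountant” slip past keyword-based filtering: the sketch below is a deliberately naive blocklist filter written for this post, not TikTok’s actual moderation system, which reportedly combines machine learning with human review. The blocked terms are taken from the user advice above:

```python
# A deliberately naive keyword filter, purely for illustration;
# TikTok's real text filter is machine-learned and not public.
BLOCKLIST = {"sex", "boost"}

def flags_text(text: str) -> bool:
    """Return True if any blocklisted word appears verbatim."""
    words = text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

print(flags_text("please boost this video"))  # True: exact match
print(flags_text("ask your accountant"))      # False: euphemism slips by
print(flags_text("seggs"))                    # False: misspelling slips by
```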

There’s also evidence that TikTok suppresses anything that looks like unapproved branding or false engagement. So TikTok clearly has tools to monitor engagement, and it’s not unreasonable to suppose that it might use those same tools as part of its video-boosting algorithm. Mikaila simone’s video, at time of writing, has over 400,000 views, over 220,000 likes, and close to 5,000 comments; that’s definitely successful. However, users may simply be noticing traits a popular TikTok video would naturally have anyway: a more popular piece of content is more likely to inspire long, passionate (all-caps) comments, and less likely to have people calling to “boost” it in the comments, since it’s already popular. It’s a chicken-and-egg situation: do these kinds of comments make a popular video, or do popular videos inspire these kinds of comments? With comments themselves known to drive engagement, it’s murkier still to determine whether the content of those comments matters in any real way.
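
Put another way: per that press release, the measurable signal is the volume of engagement. A hypothetical scoring function consistent with what TikTok has disclosed (the weights and the formula itself are entirely invented) would count comments without ever reading them:

```python
def engagement_score(likes: int, comments: int, shares: int,
                     completions: int) -> float:
    # Invented weights: TikTok has disclosed *that* these signals
    # matter, not how they are combined. Note that the comment
    # text never appears here; only the count does.
    return 1.0 * likes + 2.0 * comments + 3.0 * shares + 4.0 * completions
```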

However, the intent here is singularly kind-hearted: to use the power of users to raise up mikaila simone’s voice and give her a wider platform to speak on an important issue. One of the oft-discussed tenets of anti-racist action is ensuring that people of color have a platform to speak their truths, specifically making room for them over white voices. Several commenters on mikaila simone’s video left variations on the phrase “i don’t understand, but i stand with you/hear you/grieve with you.” The intent, in boosting her platform and in vocalizing that her voice matters, is to ensure that her voice is heard, because society by default is less inclined to listen to people of color when they’re open about their experiences.

This is not the first time that social media algorithms have been purposefully gamed by users to advance social justice. In 2020, fans of the Korean pop group BTS became famous for using their powers of fandom for good, disrupting the organizing and recruitment efforts of QAnon followers and other racist movements. BTS as a band has contributed billions of dollars to the South Korean economy, and they have an “army” of followers several times larger than most actual armies; East Asian music fandoms purposefully build highly organized fan communities to drive the success of their favorite idols and recruit new fans. International fans, especially those of BTS, have taken this model and adapted it not only for the betterment of BTS as a band, but for charitable works, especially in the world of social justice. BTS fans raised one million dollars for the Black Lives Matter movement in June 2020 in a mere 25 hours. BTS fans have disrupted attempts by American police departments to monitor their citizens, and gamed Trump rally ticket systems to empty out the stands.

The BTS fans’ most common weapon is the fancam: a short edited clip, set to music, of their favorite member or favorite performance, often with filters applied to make the video sparkly or covered in hearts. When racist conspiracy groups attempted to use the #WhiteLivesMatter and #QAnon hashtags on Twitter to spread their message, BTS fans spammed those hashtags with fancams and calls for other fans to join the assault on the tags. This approach arose in large part because advocates for equality have increasingly called for deplatforming, i.e., not giving bigotry a space on platforms. Facebook and Twitter have famously dragged their heels on this issue, so the solution K-pop stans came up with was a necessary alternative: if white supremacy is allowed on Twitter, then fighting back means disrupting the hashtags white supremacists use to organize and recruit.

In the wake of the January 6th insurrection at the Capitol, it’s clearer than ever that denying a platform to bigotry is vital, and that it’s, if anything, even more important to ensure that minorities have a platform to speak freely on the issues that affect them. Social media too often encourages the worst tendencies in human behavior: to be blind to injustice, to uphold the voices of majorities over minorities, and to be susceptible to the bad reasoning and biases that lead us astray. Social justice activism on social media is increasingly focused not just on saying that, for example, Black Lives Matter, but on ensuring that the message isn’t lost in the feed.
