TikTok is one of the leading social media platforms, with an enormous following among younger people, many of them between the ages of 16 and 24. To keep users engaged in a competitive and increasingly crowded space, the platform uses sophisticated algorithms to curate content that reflects each user's interests. When it comes to ableism, that is, discrimination against people with disabilities, the TikTok algorithm suggests that white people are firmly opposed to it, whereas Black people are more interested in calling out instances where white people fall short in their efforts to challenge this discrimination.

These four videos were the first to appear after I used “ableism” as the only search term.
The two videos featuring African Americans criticize white people who fail to challenge ableism or who unknowingly promote it. In the video on the top left, the young Black man, adopting a stereotypical white British accent, mocks white people who claim that it is difficult to function in a "neurotypical society." In the video on the top right, the Black woman argues that "white people weaponize safe spaces" and use them to harm people with disabilities. The two videos featuring white people with disabilities, by contrast, both center on the ways that society advances ableist practices, undermining the commitment that many have made to building a more inclusive society for all people.
The content is clearly different, and yet the same search term produced these disheartening results. As an Arab Muslim woman living with a disability of my own, I have no interest in seeing anti-ableism rhetoric in a racialized light. I am therefore left to conclude that the TikTok algorithm does not know me all that well, even though I have been on the platform for several years. I am also left wondering whether the algorithm really knows any of its users.

I am also led to wonder about the biases coded into the TikTok algorithm. Programmers are just people, with explicit and implicit biases of their own, yet they play an outsized role in today's technological society because they can embed those biases in the products they develop. Along with programmer biases, TikTok also relies on collaborative filtering, in which the platform recommends videos based on what similar users, including a user's followers, have watched and liked. None of my followers, I would like to believe, hold this racialized sense of ableism. But, then again, I have followers whom I do not know and who do not even have profile names beyond the default "user" handle followed by a long string of digits.
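TikTok has never published the details of its recommender system, but the basic logic of collaborative filtering is easy to sketch. The short Python example below is a hypothetical illustration only; the users, video labels, and scores are all invented, and the real system is proprietary and far more complex. It recommends videos to a user based on what the users most similar to that user have liked.

```python
# A minimal sketch of user-based collaborative filtering.
# All users, videos, and "likes" here are hypothetical; TikTok's
# actual recommender is proprietary and far more elaborate.

from collections import defaultdict
import math

# Each user's history: video -> implicit rating (1 = liked/watched).
likes = {
    "me":       {"ableism_talk": 1, "disability_rights": 1},
    "user8841": {"ableism_talk": 1, "race_commentary": 1},
    "user203":  {"disability_rights": 1, "race_commentary": 1},
}

def cosine_similarity(a, b):
    """Cosine similarity between two users' like-vectors."""
    shared = set(a) & set(b)
    dot = sum(a[v] * b[v] for v in shared)
    norm_a = math.sqrt(sum(x * x for x in a.values()))
    norm_b = math.sqrt(sum(x * x for x in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(user, k=5):
    """Score unseen videos by the similarity of the users who liked them."""
    scores = defaultdict(float)
    for other, their_likes in likes.items():
        if other == user:
            continue
        sim = cosine_similarity(likes[user], their_likes)
        for video in their_likes:
            if video not in likes[user]:
                scores[video] += sim  # similar users pull their videos up
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("me"))  # -> ['race_commentary']
```

Notice what happens even in this toy example: "race_commentary" is recommended to "me" not because of anything I watched, but because of what the users who overlap with my history watched. That is exactly the dynamic I suspect is shaping my search results.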
Whether TikTok is interested in changing how it recommends videos to its users, I am not sure. Given the platform's enormous popularity around the world, I doubt it is. But I am sure that if it does not change its algorithm, I will continue to see important social issues divided into racial categories, and so will countless other users. While I am aware enough to recognize this situation and to avoid racializing these issues myself, not everyone is. And this, I know, is a serious problem, one that can jeopardize how potential allies see themselves, and one another, in the fight against ableism and other forms of discrimination.