Algorithms of Oppression and the White Beauty Standard

When I started using social media, the first change I noticed was in how I perceived beauty. I was constantly subjected to the white beauty standard, with slender, blonde-haired girls receiving the most airtime on my feed.

Even now, as a young adult who feels more secure, it's sometimes challenging not to compare myself to what I see online. With more than half the world using social media, and 732 million people on TikTok alone, it is vital to consider the pop culture we consume and how the perpetuation of harmful stereotypes impacts women of colour. This brings us to today's topic: algorithmic bias.

Source: Unsplash

How Are Algorithms Biased?

In Algorithms of Oppression, Safiya Noble investigates how Google searches sexualise and demonise women of colour. She argues that whiteness as the default is not a “glitch” but a byproduct of the existing white power structures that create these technologies. Similarly, Ruha Benjamin describes a “New Jim Code”: the reproduction of twentieth-century discrimination in present-day technology.

In a way, this reflects an ideology imposed by racial capitalism, the process whereby “racism helps capitalism expand while capitalism, in turn, keeps racial hierarchies in place”. Historically, whiteness has consistently sat at the top of the racial order, relegating everything outside it to an inferior status. This Eurocentric logic persists in society’s perceptions of beauty, which social media clearly reinforces through its algorithms.

Observations of My Own

I tailor my TikTok feed to make it more diverse by following more creators of colour and blocking content that makes me uncomfortable. However, that doesn’t mean TikTok is exempt from bias, as any program that learns from users’ behaviour almost “invariably introduces some kind of unintended bias”. It is essential to consider how algorithms create echo chambers, where individuals are only exposed to content that confirms their existing biases. So my feed may look entirely different from someone else’s, as the algorithm caters to our subconscious prejudices, an effect reinforced by TikTok’s highly visual format.
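
To make that feedback loop concrete, here is a minimal, purely illustrative Python sketch. The categories, starting weights, and boost value are all hypothetical, not anything TikTok has disclosed; the point is simply how a recommender that rewards engagement can compound a small initial skew into a near-uniform feed:

```python
import random
from collections import Counter

# Hypothetical content categories and starting weights; the uneven start stands in
# for a pre-existing bias in what the platform surfaces most.
weights = {"white_creator": 0.6, "creator_of_colour": 0.4}

def recommend(current):
    """Pick a video category in proportion to the current weights."""
    return random.choices(list(current), weights=list(current.values()))[0]

def simulate(steps=1000, boost=0.01):
    """Each 'watch' slightly boosts the watched category, so the early skew compounds."""
    shown = Counter()
    for _ in range(steps):
        category = recommend(weights)
        shown[category] += 1
        weights[category] += boost  # engagement feeds straight back into the weights
    return shown

print(simulate())  # the initially favoured category ends up dominating the feed
```

Even with a perfectly “neutral” update rule, the category that starts out ahead keeps getting picked, and therefore keeps getting boosted, which is exactly the rich-get-richer dynamic behind echo chambers.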

For instance, while scrolling through TikTok, most of the videos that appeared featured white women. Even the women of colour I did find adhered to aspects of the white beauty norm: they were tall, skinny, doe-eyed, or light-skinned.

Searching for “makeup tutorials” or “fashion inspiration” yielded similar results, with mostly white faces and particularly thin bodies appearing. The makeup tutorials I found from Asian women specifically offered tips on how to create bigger eyes or featured extremely fair skin, both typical Western features.

Lastly, the more I followed white creators and watched their content, the more the algorithm recommended similar profiles, demonstrating how collaborative filtering creates bubbles that can strengthen prejudices about beauty. Supporting this, a study by Marc Faddoul of TikTok’s “beauty algorithm” shows how the platform employs facial recognition technology to rate users’ faces on attractiveness, a rating that mainly adheres to white features. These observations strengthened my original dislike for the app and heightened my awareness of the content I consume daily.
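
For readers curious about the mechanics of collaborative filtering, the sketch below is a toy illustration with entirely made-up users and creators: an account gets recommended simply because it shares followers with accounts you already follow, so the bubble reproduces whatever the existing audience already likes.

```python
# Toy item-item collaborative filtering: creators are recommended because they
# share followers with creators you already follow. All data here is invented.
follows = {
    "user_a": {"creator_1", "creator_2"},
    "user_b": {"creator_1", "creator_2", "creator_3"},
    "user_c": {"creator_3", "creator_4"},
}

def similar_creators(creator):
    """Rank other creators by how often they are co-followed with `creator`."""
    scores = {}
    for followed in follows.values():
        if creator in followed:
            for other in followed - {creator}:
                scores[other] = scores.get(other, 0) + 1
    return sorted(scores, key=scores.get, reverse=True)

# Following creator_1 mostly surfaces creator_2, its most frequent co-follow,
# so a feed built this way keeps recommending more of the same.
print(similar_creators("creator_1"))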

The Future of Social Media Bias

Under-representation and a lack of diversity remain pressing issues on social media, and they can negatively impact impressionable teens if algorithms continue to promote a narrow image of beauty. The promotion of unrealistic, racialised standards over time can damage anyone’s self-esteem, as it endorses the message that beauty only exists in proximity to whiteness. Although challenging, it is crucial to diversify the creators and the teams behind social media AI in order to deconstruct the hierarchy of white superiority that still dominates our perceptions.
