An AI Program Imagined Barbies From Around the World and Wouldn’t You Know It, They Turned Out Super Racist
As Barbie is all the rage right now (thank you, Greta Gerwig!), a lot of brands are capitalizing on the hype with new products, merchandise, or … offensive posts?
Recently, BuzzFeed posted an article in which they used an AI program to create imagined versions of Barbie from countries around the world. While the idea is harmless in theory, people were quick to realize that the Barbies were incredibly offensive as they played into racial, xenophobic stereotypes.
While a lot of outlets view AI as a cheap source of easy content, AI is rooted in human biases. It’s a bit odd that no one thought to double-check the Barbies to make sure they weren’t upholding some pretty ugly views of their home countries. And people online were not afraid to let BuzzFeed know that they really messed up.
AI Barbies uphold wildly racist stereotypes
BuzzFeed’s post (which has since been taken down due to the intense backlash) featured what “Barbie would look like in every country in the world.” According to BuzzFeed, the images were generated by Midjourney, a generative AI model that converts text prompts into images.
In the opening paragraphs of the post, BuzzFeed wrote that the Barbies “reveal biases and stereotypes that currently exist within AI models,” noting that they are “not meant to be seen as accurate or full depictions of human experience.”
However, that disclaimer did not stop people from absolutely ripping into the dolls and criticizing the outlet for creating something that’s so harmful to the people they’re trying to portray.
One of the worst was the Barbie representing Sudan, who is shown holding a gun.
This is offensive for many reasons, one of which is that the country is often reduced to a war-torn stereotype and its people portrayed as violent. While Sudan is currently embroiled in a war between the Sudanese Armed Forces (SAF) and the paramilitary Rapid Support Forces (RSF), not everyone from Sudan is involved in the conflict, and to suggest otherwise is incredibly racist.
Another issue raised was that many of the Barbies were whitewashed versions of the countries they were meant to represent. For example, the Thailand Barbie has pale skin and blonde hair. While the people of Thailand are incredibly diverse, it’s a Southeast Asian country where dark hair and tanned skin are far more common.
These weren’t the only complaints people had about these offensive Barbies. Below are some more of Twitter’s reactions:
This isn’t the first time an AI program has shown a propensity for creating racially insensitive images, but it’s very disappointing to see a major outlet disregard the impact this kind of content can have on the people it’s trying to represent. I would like to say this won’t happen again, but we all know that the more outlets lean on unchecked AI, the more offensive content will be made.
This piece was written during the 2023 WGA and SAG-AFTRA strikes. Without the labor of the writers and actors currently on strike, the work being covered here wouldn’t exist.
(featured image: Warner Bros. Pictures)