Xochitl Gomez posing on the red carpet at The Flash premiere

Teen Marvel Star Says X Won’t Remove Sexually Explicit Deepfakes of Her

Teen Marvel star Xochitl Gomez is the latest victim of deepfake porn. However, after finding the nonconsensual sexually explicit deepfakes circulating on X, formerly Twitter, she was confounded when the platform failed to take action and remove them.


Many problems have arisen alongside the rise of artificial intelligence, but deepfake porn is one of the most concerning. While deepfakes have been around for quite some time, advancements in technology have made them more accessible and convincing. As a result, many individuals, especially women, have found their likenesses taken or recreated without their consent and used in pornographic content. It has become a massive problem not only for female celebrities but also for ordinary women, whose likenesses have been used in sexually explicit content for revenge porn, harassment, and bullying. There is also a major concern about deepfakes being used to depict child abuse.

Unfortunately, there are currently no federal laws against nonconsensual deepfake porn, and only a few states have legislation protecting citizens from it. Many women who have become victims of deepfake pornography have found themselves with no way to fight it due to the lack of regulation and the lawlessness of the internet. Most don't have the financial resources to take legal action. Even celebrities like Scarlett Johansson have declared it "useless" to try to fight the deepfakes.

One small layer of protection women do have comes from the social media platforms that have banned deepfake and AI-generated porn. If these sites enforced such bans, circulating this harmful content would be far more difficult. However, Gomez's experience shows that social media is not enforcing its own rules.

Xochitl Gomez opens up about her experience with deepfake porn

Xochitl Gomez at the premiere of Shotgun Wedding
(Axelle/Bauer-Griffin / Getty)

While appearing on Taylor Lautner and his wife's podcast, The Squeeze, Xochitl Gomez (a.k.a. America Chavez) opened up about her experience as a victim of deepfake pornography. The 17-year-old revealed she found the sexually explicit material circulating on X, formerly Twitter, but has had an extremely difficult time getting it removed. X banned AI-generated porn back in 2018 and has policies against sexually explicit deepfake images and videos, as well as misleading media and nonconsensual nudity. Considering that the images of Gomez clearly violate X's rules and that the content features a minor, one would have thought the platform would immediately take action.

However, Gomez was shocked when she found that her team had already tried to take action against the content to no avail. NBC News reported that as recently as January 19, 2024, the photos could still be found on X, which did not respond to requests for comment. Gomez explained the content left her “weirded out” and desperately wanting it taken down as it “wasn’t a good look” for her. She questioned, “Why is it so hard to take down? That was my whole thought on it, was, ‘Why is this allowed?'”

Ultimately, though, she resigned herself to accepting that the content simply couldn't be taken down, revealing that she just tries not to think about it and puts her phone away to cope. Still, the questions she asked remain very relevant. Why is it so hard for this content to be removed? It's morally reprehensible and disgusting when deepfake porn is made of any nonconsenting individual. It's downright horrific when it involves a minor. Because Gomez is a minor, this is actual child pornography circulating on X, and the platform is, bizarrely, doing nothing about it, even though this should be its most pressing issue.

It hardly sounds like a complex situation when an individual reports sexually explicit material and notifies the platform that they did not consent to it. The material violates multiple rules on the website. What else is there to review or determine? It's already traumatizing enough for an individual to find this kind of content on social media; the very least these platforms could do is honor requests to have it removed. No individual, especially not a minor, should just have to accept that the content is there because X and other social media platforms persistently and perplexingly refuse to protect their users.

(featured image: Axelle/Bauer-Griffin/FilmMagic / Getty)


Rachel Ulatowski
Rachel Ulatowski is a Staff Writer for The Mary Sue, who frequently covers DC, Marvel, Star Wars, literature, and celebrity news. She has over three years of experience in the digital media and entertainment industry, and her work can also be found on Screen Rant, JustWatch, and Tell-Tale TV. She enjoys running, reading, snarking on YouTube personalities, and working on her future novel when she's not writing professionally. You can find more of her writing on Twitter at @RachelUlatowski.