***CONTENT WARNING: This post discusses true crime, child abuse, and murder.***
You’re scrolling through TikTok when a dead-eyed woman appears on your For You page, describing in an eerie monotone the gruesome way she murdered her baby. Scroll a little further and another woman with the same uncannily wrong eyes appears, this time outlining the horrific events that led up to her own murder. Deepfakes have made it to True Crime TikTok, and the results are as repulsively disrespectful as you can imagine.
A lot has already been said about the ethics of for-profit true crime productions. From the proliferation of pseudo-science that targets and pathologizes marginalized people to survivors watching their trauma turned into entertainment, the genre is fraught with ethical issues and questions about the line between journalism and exploitation. While some true-crime productions can and have done good work, from helping exonerate the wrongly accused to forcing attention onto the epidemic of missing and murdered Indigenous women (MMIW), many creators are just out to monetize human suffering and people’s ghoulish desire to stare at it. This brings us to the adoption of AI deepfake technology to appropriate victims’ voices and images for profit.
One of the big issues with generative AI is the way it violates both intellectual property and people’s right to control their own image. We’ve seen this with the creation of AI porn, which made big news when it happened to Taylor Swift but has been affecting women who aren’t famous (and even children) for years. Unlike the pornographic AI images used to abuse and humiliate women and girls, these true crime deepfakes aren’t trying to be realistic, because they don’t need to be. The point isn’t to convince you that you’re watching a murder victim speak from beyond the grave. Instead, these videos use a combination of shock value and a false sense of intimacy between the viewer and the victim (or the perpetrator) to keep people watching until the full, monetizable 60 seconds have played out.
Creating what amounts to a digital sock puppet of a murder victim to profit off their horrific torture, rape, and death is immensely disrespectful and harmful to their surviving family and friends. There’s no excuse for this, and no ethical explanation that could possibly justify using a real person’s image this way. It’s unequivocally horrific behavior.
Sometimes it seems like true crime fans and content creators forget that at the heart of their stories are real people with real lives, people who have experienced unimaginable suffering. They are not just entertaining cautionary tales that listeners can use to self-soothe at night. The proliferation of murder victim deepfakes being used to “tell their stories” by people completely unconnected to the victim or their family is in many ways the natural conclusion of this dehumanization. If the victim ceases to be a real person to you, then why would it bother you to see their digital corpse puppeted about in this way? It’s shocking, sure, but the victim isn’t really “real”, so there’s no outrage behind that shock.
We need to seriously think about how we use AI, how we interact with other people’s tragedies, and the ease with which we dehumanize others in the search for entertainment. Otherwise, this is only going to get worse.
(featured image: iStock/Getty Images Plus)
Published: Feb 25, 2024 04:23 pm