The documentary Roadrunner: A Film About Anthony Bourdain is in the spotlight, and it’s not only because celebrity chef and No Reservations TV presenter Bourdain was immensely loved. It’s got people talking because the documentary includes an AI-generated “deepfake” version of the icon’s voice speaking words he never said in life.
Morgan Neville, the director of the film, admitted in an interview with The New Yorker that he had “created an A.I. model of his voice.” He apparently really wanted Bourdain’s voice for the three quotes in question, and the move has raised eyebrows over its integrity and whether it is ethical at all.
When I wrote my review I was not aware that the filmmakers had used an A.I. to deepfake Bourdain’s voice for portions of the narration. I feel like this tells you all you need to know about the ethics of the people behind this project. https://t.co/7s1mdDOfzl pic.twitter.com/zv2pEvtTim
— Sean Burns (@SeanMBurns) July 15, 2021
For me, and for many others, this feels like the height of entitlement. Who is the director to pick and choose from what Bourdain left behind and generate new words in his voice? No one should be able to mess with that. Artificially creating what we perceive to be part of someone’s legacy feels like overreach, even if Bourdain’s estate reportedly approved of the AI readings.
Anthony Bourdain was a celebrity death that hit me hard. I hate the cavalier attitude the director has towards his use of AI to deep fake Bourdain’s narration. This review is also illuminating. I won’t be seeing this film. https://t.co/xEi3MFzUBd
— SecretAgentRomancingThane🏳️🌈 (@secretagentr) July 16, 2021
And now Bourdain’s widow Ottavia Bourdain has disputed Neville’s claim that he approached her for permission, according to a new report from The Washington Post.
I certainly was NOT the one who said Tony would have been cool with that. https://t.co/CypDvc1sBP
— Ottavia (@OttaviaBourdain) July 16, 2021
Others openly questioned whether Bourdain would even have allowed something like this. Yes, “he expressed and encouraged empathy” wherever he went, and his hallmark was openness and honesty, but would he have allowed his own voice to be recreated reading his private emails just because a director felt it was within his rights to do so? Especially considering the words are those of someone struggling with mental health issues, with the AI voice reading this line from one of his emails: “…and my life is sort of shit now. You are successful, and I am successful, and I’m wondering: Are you happy?”
The thought of someone deepfaking my unauthorized letters to serve their own narration of who I was is truly a special kind of hell to me. I think about bourdain all the time. This is such a bummer, my god
— arabelle sicardi 英梅 (@arabellesicardi) July 15, 2021
Combine this with the narrative Neville apparently plays with in the film, placing responsibility for Bourdain’s death on Asia Argento, whom he was in a relationship with when he died, and you’ve got an even messier documentary rooted in making women the enemy, something we can be fairly confident Bourdain wouldn’t have approved of.
In his review of the documentary for WBUR, Sean Burns writes, “Other talking heads have conjured a catty Yoko Ono narrative about Argento, who conveniently was not interviewed for the film. The last half-hour of the picture is tabloid trash, pointing fingers in a fashion that will sound sickly familiar to anyone who remembers the ugliness leveled at Courtney Love in the wake of Kurt Cobain’s suicide.”
yes the AI situation is unethical and deeply questionable but truly “asia argento’s problems resulted in anthony bourdain’s suicide” and acting like “she was his last addiction” is even more gross and fucked up so basically fuck morgan neville
— the morally corrupt juan barquin (@woahitsjuanito) July 15, 2021
If Neville could do this with a beloved icon like Bourdain, what’s stopping other creators from following in his footsteps and using AI voices as a means of making money or shaping their narrative in the alleged voice of the subject? We’re likely going to be seeing a lot more of this kind of thing going forward (though the backlash to Neville might give others pause).
For many of us watching from the sidelines, this is a step into creepy territory, where the passing of someone cherished by many is disrespected as a means of telling a posthumous story and profiting from it. And it feeds into a growing problem: how new technology is used to exploit people’s images after their deaths.
Between this, the Whitney/Tupac hologram, & NFTs, even non-“famous” people are now gonna have to come up with elaborate & ironclad image & likeness clauses in their wills all because YT people don’t know how to leave you tf alone—even when you’re dead 🤨https://t.co/0QjXN4qe42
— Clarkisha Kent (@IWriteAllDay_) July 15, 2021
On top of it all, the documentary did itself no favors in how carelessly it handled the ethics of the situation. It contains no mention of the use of the AI voice, and the director even told The New Yorker that the effect is so convincing that viewers would never know which lines had been generated:
Throughout the film, Neville and his team used stitched-together clips of Bourdain’s narration pulled from TV, radio, podcasts, and audiobooks. “But there were three quotes there I wanted his voice for that there were no recordings of,” Neville explained. So he got in touch with a software company, gave it about a dozen hours of recordings, and, he said, “I created an A.I. model of his voice.” In a world of computer simulations and deepfakes, a dead man’s voice speaking his own words of despair is hardly the most dystopian application of the technology. But the seamlessness of the effect is eerie. “If you watch the film, other than that line you mentioned, you probably don’t know what the other lines are that were spoken by the A.I., and you’re not going to know,” Neville said. “We can have a documentary-ethics panel about it later.”
David Leslie, ethics lead at the Alan Turing Institute, told the BBC that this demonstrates the importance of at least warning audiences when AI is being used for a purpose like this, to avoid giving viewers the sense that they are being intentionally deceived. Without Neville’s interview, audiences may never have known it was happening.
At the end of the day, the death of Bourdain hit those who loved him hard, and there were millions of people worldwide who had read and watched him for years and loved him, too. There is something so unsettling about an AI version of his voice being conjured up that many will skip the documentary because of it.
And who can blame them? I wouldn’t either, especially when it feels like the beginning of directors setting a precedent where their vision is more important than any ethical issues a movie like Roadrunner: A Film About Anthony Bourdain might bring up, shrugging those issues off as something we can discuss later.
(image: Craig Barritt/Getty Images for The New Yorker)
Published: Jul 16, 2021 02:04 pm