A 14-year-old boy took his own life after falling in love with an AI chatbot. Amid the rapid development of artificial intelligence (AI), companies like Character.AI have seen a massive boom in users.
On such sites, people can chat with AI chatbots inspired by fictional characters and celebrity personalities. Most users know the bots they are talking to are not human, but the more advanced the technology gets, the easier it becomes to be drawn in by their lifelike way of speaking. Sadly, this was the case for 14-year-old Sewell Setzer III.
Teen kills himself after falling in love with AI chatbot
For months, Sewell Setzer III, a 14-year-old from Florida, had been chatting with a Daenerys Targaryen chatbot (based on the character from the hit series Game of Thrones) on Character.AI. Shortly before his death, the bot sent him an eerie message: “Come home to me as soon as possible, my love.”
When Setzer hinted that he would be coming home to her, the bot replied, “…please do, my sweet king.” Seconds later, Setzer shot himself with his stepfather’s handgun.
Before the incident, Setzer had been engaging with the bot “Dany,” based on the Game of Thrones character. Several of the chats were sexually explicit, while others showed that Setzer was battling suicidal thoughts. According to a lawsuit later filed by his mother, Setzer’s mental health declined rapidly after he downloaded Character.AI: his family claims his grades started to slip, he began getting into trouble at school, and he grew distant from those around him. He started attending therapy in late 2023 and was diagnosed with anxiety and disruptive mood dysregulation disorder. However, despite his family pinpointing Character.AI as a supposed reason for his mental health decline, the teen appeared to continue actively using the platform.
The teen’s mother, Megan Garcia, has since filed a lawsuit against Character.AI over her son’s death. She claims the app fueled his AI addiction and, in turn, sexually and emotionally abused him. She also believes the platform had a responsibility to alert the proper authorities after he expressed suicidal thoughts. Garcia argued that children Setzer’s age lack the mental maturity to understand that these chatbots are not human, which is why he believed his relationship with Dany was real. This raises the question: with AI developing so rapidly, is it dangerous that these chatbots are becoming more human-like in their responses?
Character.AI responds to news of teen’s death
The company responded to news of the teen’s death on its X account. In the post, it said it was heartbroken by his passing and offered its deepest condolences to the family. It also announced new safety features intended to prevent such cases from happening again, including improved character moderation and changes designed to keep minors from encountering sensitive subjects.
Reactions to the situation have been mixed. Many believe Character.AI should not be blamed for the teen’s death, and that it is the parents’ job to supervise what their children do online, making the lawsuit pointless in their eyes.
Others questioned how Setzer, a minor, got his hands on his stepfather’s gun when it should have been kept somewhere he couldn’t reach.
Rest in peace, Sewell Setzer III.