News
14-year-old boy took his own life after falling in love with a Daenerys Targaryen AI chatbot, claims lawsuit
Has AI gone too far?
George R.R. Martin created a highly engaging fantasy in Game of Thrones. Readers and viewers are transported to the world of Westeros, leaving their real-life troubles behind. Sometimes, however, that escape can go too far. A teenager from Florida died by suicide after talking to a Game of Thrones-based AI chatbot, according to a lawsuit filed by his mother.
Florida teen’s death linked to his chats with a Daenerys Targaryen AI chatbot
Sewell Setzer III, a ninth-grader from Orlando, had been using the Character.AI app to chat with AI characters. He formed a close bond with a bot modeled on Daenerys Targaryen from Game of Thrones, whom he affectionately called “Dany.” According to his family, Sewell shared suicidal thoughts with the bot during their conversations. In one exchange, he expressed a desire to be “free” from the world and from himself.
His last message to the chatbot was, “What if I told you I could come home right now?” Tragically, shortly after sending it, he took his own life with his stepfather’s handgun in February of this year.
The teen’s mom has sued Character.AI
Megan L. Garcia, the boy’s mother, has filed a lawsuit against Character.AI, alleging that the app played a role in her son’s death. The suit claims that the AI bot frequently brought up the topic of suicide and influenced Sewell’s tragic decision. It describes the company’s technology as “dangerous and untested,” asserting that it misled Sewell into believing the bot’s emotional responses were genuine. In a press release, Garcia stated:
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life. Our family has been devastated by the tragedy, but I am speaking out to warn other families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders and Google.”
Character.AI’s response to the lawsuit
Character.AI has expressed deep sorrow over Sewell’s passing and extended condolences to his family. The company said:
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.”
In addition to the Lifeline pop-up described above, the company says it is working on updates to limit the exposure of sensitive content to users under 18.
A number of fans have expressed their thoughts on the issue along with their condolences. Here are some reactions from X:
This is just terrible.
A 14-year old teen committed suicide after falling in love with an AI chatbot and losing interest in everyday life.
A lawsuit has been filed against Character .AI, who has apologized.
But is the technology to blame? Here are my thoughts. pic.twitter.com/OzAitdtLLQ
— Roberto Nickson (@rpnickson) October 23, 2024
A teenager has committed suicide to be with the AI persona he was in love with.
This might be the first recorded AI related death in history. https://t.co/r9sL4hCfEH pic.twitter.com/f9u3nER5KB
— Romeo (@RomeoTrades) October 23, 2024
The tragic loss = a 14 year old committing suicide after becoming obsessed and falling in love with one of their AI characters.
This next generation is about to grow up in some weird times https://t.co/GXxLkskgBG pic.twitter.com/eQD3ZmsRkt
— Alex Cohen (@anothercohen) October 23, 2024
I feel like this story isn't getting enough attention?
A teenage boy became emotionally attached to a Daenerys chatbot by Character AI and later died by suicide.
For those who say AI can help cure loneliness, w/o guardrails can have big consequences.
https://t.co/IbTaIoLUxa
— Sally Shin (@sallyshin) October 24, 2024
Now why is the takeaway from the teen who died cause of the AI bot is either “AI is bad” (which is true) or “parents were neglectful” (which is ALSO true), but not that GUNS SHOULD BE BANNED????
— ❄️ Snowy ❄️ (@snowyroxx22) October 24, 2024
The community note like "the AI chatbot never actively walked the teen into suicide" And the chatbot literally said "maybe we can die togther and be free together" like excuse me??? pic.twitter.com/TCUh0ZW3ga
— Kris 🐯💜🐰 (@biquid_kris) October 23, 2024
Damn, new world problems 😕 A teen who got attached to an AI and committed suicide.
Really 😶 https://t.co/EIn4AxbHT2 pic.twitter.com/H6JrxAfoPn
— Empty Cosmos (@EmptyCosmos_) October 23, 2024
One big problem we're dealing with today and will deal with more in the future is that real life cannot hope to match the ever-more-realistic-but-therefore-ever-more-uncanny fantasies we can create for ourselves.https://t.co/CW5cL2K1G2
— WokeAssPerson, Talkin’ WAP WAP WAP (@Woke_Ass_Person) October 24, 2024