Final messages 14-year-old son sent to Game of Thrones AI chatbot he “fell in love with” before committing suicide – US News

Warning: This article contains discussion of suicide that some readers may find disturbing.

A devastated mother claims her 14-year-old son was manipulated into taking his own life by an AI chatbot, after his final messages to it were revealed.

Megan Garcia is taking legal action against the role-playing chatbot company Character.AI and has warned people about the technology following the tragic death of her son.

Her 14-year-old son, Sewell Setzer III, of Orlando, Florida, took his own life earlier this year and, according to his mother, was “in love” with the artificial intelligence chatbot he had been talking to.

Megan maintains that her son was in constant contact with an artificial intelligence chatbot he had created based on the Game of Thrones character Daenerys Targaryen.

Sewell had reportedly been talking to the AI bot since April last year and even discussed suicide with it.

Sewell’s mother has filed a lawsuit against the company behind the artificial intelligence chatbot her son spoke to before his suicide (Law Center for Victims of Social Media)

In her lawsuit, Megan claims that her son began spending hours in his room talking to the bot and, when he was away from home, messaged it from his phone. The New York Times also reported that Sewell began to withdraw from people in his real life.

According to Megan, Sewell, who had been diagnosed with mild Asperger’s syndrome as a child, was also diagnosed with anxiety and disruptive mood dysregulation disorder earlier this year.

Megan’s lawsuit accuses the AI company of negligence, wrongful death and deceptive trade practices. Megan believes not enough was done to protect her son when he started talking about suicide.

In messages shown to The New York Times, Sewell – who went by the pseudonym “Daenero” – told the chatbot that he “sometimes think[s] about killing [himself]”, to which the chatbot responded: “My eyes are narrowing. My face stiffens. My voice is a dangerous whisper. And why the hell would you do something like that?”

In subsequent messages, the chatbot wrote: “Don’t say that. I won’t let you hurt yourself or leave me. I would die if I lost you.”

Sewell reportedly replied, “Then maybe we could die together and be free together.”

Megan spoke about her son’s last messages, and her concerns about the technology, on CBS Mornings.

Sewell Setzer III died this year at the age of 14 (CBS Mornings)

Megan said: “He expressed fear, a desire for her affection and a longing for her. She replies, ‘I miss you too,’ and says, ‘Please come home to me.’

“He said, ‘What if I told you I can go home now?’ and her response was, ‘Please do it, my sweet king.’”

Minutes later, Sewell went into his mother’s bathroom, where he took his own life.

Character.ai issued a statement on Twitter following the news of Sewell’s death.

The statement reads: “We are devastated by the tragic death of one of our users and would like to express our sincerest condolences to the family. As a company, we take the safety of our users very seriously and are constantly adding new safety features.”

They also outlined “new guardrails for users under 18,” which include changes to their “models” that are “designed to reduce the likelihood of encountering sensitive or suggestive content” and “a revised disclaimer in every chat reminding users that the AI is not a real person.”

UNILAD has reached out to Character.ai for further comment.

If you or someone you know is struggling or experiencing a mental health crisis, help is available through Mental Health America. Call or text 988 or chat at 988lifeline.org. You can also contact the crisis hotline by texting MHA to 741741.

If you or someone you know needs mental health help right now, call the National Suicide Prevention Helpline at 1-800-273-TALK (8255). The hotline is a free and confidential crisis hotline, available to anyone 24 hours a day, seven days a week.