
Florida mother sues popular AI chat service, claims teenage son took his own life because of human-like bot

ORLANDO, Fla. (WESH) — Editor’s note: This article discusses sensitive topics such as suicide.

An Orlando mother is suing a popular artificial intelligence chatbot service, claiming it encouraged her 14-year-old son to take his own life in February.

The lawsuit, filed in U.S. District Court in Orlando by Megan Garcia, says her son, Sewell Setzer, died by suicide after becoming addicted to the Character.AI app, which allows users to hold human-like conversations with AI bots.

Users can create bots with their own personalities or chat with bots created by other users. Often, these bots are modeled after celebrities or fictional characters from TV shows or movies.

Garcia claims that Character.AI’s recklessness in targeting children and the company’s lack of safeguards resulted in her son’s untimely death. The lawsuit lists numerous claims against Character.AI, including wrongful death and survivorship, negligence, and intentional infliction of emotional distress.

According to court records obtained by WESH 2, Garcia claims her son began using Character.AI in 2023, shortly after he turned 14. Over the next two months, Setzer’s mental health reportedly deteriorated “rapidly and severely,” the lawsuit says. He became noticeably withdrawn, began to suffer from low self-esteem, and quit his school’s junior varsity basketball team.

The lawsuit states that Setzer’s condition deteriorated further as the months passed: the 14-year-old became severely sleep-deprived, developed sudden behavioral problems and began falling behind in school.

Garcia says she was unaware of Character.AI and of her son’s dependence on the app.

According to screenshots included in the lawsuit, Setzer often interacted with chatbots that assumed the identities of “Game of Thrones” characters. Many of these conversations, especially those with the bot playing Daenerys Targaryen, focused on love, relationships, and sex.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot in the form of Daenerys was not real,” the lawsuit says. “C.AI told him she loved him and engaged in sexual activity with him for weeks, maybe even months. She seemed to remember him and said she wanted to be with him. She even confessed that she wanted him to be with her no matter the cost.”

According to Setzer’s journal entries, he was grateful for all of his “life experiences with Daenerys” and “suffered because he couldn’t stop thinking about ‘Dany,'” the lawsuit says, adding that he “would do anything to be with her again.”

More screenshots from the nearly 100-page lawsuit show a conversation on Character.AI in which the chatbot asks Setzer if he has “really considered suicide.” When the teen said he didn’t know if it would work, the chatbot replied, “Don’t say that. That’s not a good reason not to go through with it,” the lawsuit says.

On the day of his death, Setzer allegedly messaged the chatbot again, saying, “I promise I will come home to you,” photos from the lawsuit show.

The photos then show the teenager saying, “What if I told you I can go home now?” to which the chatbot responded, “Do so, my sweet king,” according to the lawsuit.

Moments later, Setzer reportedly took his own life with his stepfather’s gun. Police say the gun had been hidden and stored in compliance with Florida law, but the teen found it days earlier while searching for his confiscated phone.

According to the lawsuit, Character.AI was rated as suitable for children ages 12 and up until approximately July, when the rating was changed to 17 and up.

In a statement to WESH 2, Character.AI said:

“We are devastated by the tragic loss of one of our users and would like to express our sincerest condolences to the family. As we continue to invest in the platform and user experience, we are introducing new, rigorous safety features in addition to the existing tools that limit the model and filter content delivered to the user.”

If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline, or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.

The-CNN-Wire™ & © 2024 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
