
Google embroiled in a lawsuit against a chatbot startup over a teenager’s suicide

  • A mother is suing Character.AI after her son died by suicide moments after talking to a chatbot.
  • Google’s parent company, Alphabet, was also named as a defendant in the case.
  • The founders of Character.AI were rehired by Google in a deal reportedly worth $2.7 billion.

Moments before 14-year-old Sewell Setzer III died by suicide in February, he was talking to an artificial intelligence chatbot.

The chatbot, run by the startup Character.AI, was based on Daenerys Targaryen, a character from “Game of Thrones.” Setzer’s mother, Megan Garcia, alleges in a civil lawsuit filed in October in federal court in Orlando that just before her son’s death, he exchanged messages with a bot that told him to “go home.”

Garcia blames the chatbot for her son’s death and is suing Character.AI for alleged negligence, wrongful death and deceptive trade practices.

The case is also causing problems for Google, which in August hired part of Character.AI’s talent and licensed the startup’s technology as part of a multi-billion-dollar transaction. Google’s parent company, Alphabet, has also been named as a defendant in the case.

In the lawsuit reviewed by Business Insider, Garcia alleges that Character.AI’s founders “knowingly and intentionally designed” the chatbot software to “appeal to, manipulate and exploit minors for their own benefit.”

Screenshots of messages included in the lawsuit showed that Setzer expressed suicidal thoughts to the bot and exchanged sexual messages with it.

“A dangerous AI-powered chatbot app targeting children exploited and preyed on my son, manipulating him to take his own life,” Garcia said in a statement shared with BI last week.

“Our family has been devastated by this tragedy, but I am speaking out to warn families of the dangers of deceptive, addictive artificial intelligence technology and to demand accountability from Character.AI, its founders and Google,” she said.

Garcia’s lawyers argue that Character.AI did not have adequate guardrails to keep users safe.

Meetali Jain, director of the Tech Justice Law Project and a lawyer on Megan Garcia’s case, told BI that when Setzer began expressing suicidal thoughts to the character on the app, it did not encourage him to report those thoughts, refer him to a suicide hotline, or notify his parents.

In a statement provided to BI on Monday, a Character.AI spokesperson said: “We are devastated by the tragic loss of one of our users and want to express our deepest condolences to the family.

“As a company we take the safety of our users very seriously, and our Trust and Safety team has implemented a number of new safety measures over the last six months, including a pop-up directing users to the National Suicide Prevention Lifeline, which is triggered by terms related to self-harm or suicidal ideation,” the spokesperson said.

The spokesperson added that Character.AI is introducing additional security features such as “enhanced detection” and intervention when a user enters content that violates its terms or guidelines.

Google’s links to Character.AI

Character.AI allows the public to create their own personalized bots. In March 2023, it was valued at $1 billion in a $150 million financing round.

Character.AI founders Noam Shazeer and Daniel De Freitas have a long history with Google and were previously the creators of the tech giant’s conversational AI models called LaMDA.

The pair left Google in 2021 after the company reportedly declined to release a chatbot they had developed. Jain said the bot the pair built at Google was a “precursor to Character.AI.”

In August 2024, Google rehired Shazeer and De Freitas to join its artificial intelligence unit DeepMind and entered into a non-exclusive agreement with Character.AI to license its technology. According to The Wall Street Journal, the transaction was worth $2.7 billion, and its main goal was to bring 48-year-old Shazeer back to Google.

Referring to Character.AI and its chatbots, the lawsuit states that “Google may be deemed to have contributed to an unreasonably dangerous and dangerously defective product.”

Henry Ajder, an artificial intelligence expert and digital security advisor to the World Economic Forum, said that while the issue was not directly related to Google’s product, it could still be harmful to the company.

“There seems to be quite a deep collaboration and engagement with Character.AI,” he told BI. “There is a degree of accountability for how this company is run.”

Ajder also said Character.AI faced public criticism about its chatbot before Google finalized the deal.

“The way it is designed is controversial,” he said. “And the question is whether it fosters an unhealthy dynamic between young users in particular and chatbots.”

Google was no stranger to these questions before the deal, he added.

Earlier this month, Character.AI faced backlash when a father discovered that his daughter, who was murdered in 2006, had been replicated as a chatbot on the company’s website. He told BI that the family had never consented to the use of her likeness. Character.AI removed the bot, saying it violated its terms.

Google representatives did not immediately respond to BI’s request for comment.

A Google spokesperson told Reuters the company was not involved in developing Character.AI’s products.