Ghouls turned a murdered child into a chatbot character. An AI bot on the “lonely hearts” platform is blamed for a boy’s suicide

Character.AI, a platform that enables communication with various AI-based chatbots, came under fire after one of its bots allegedly encouraged a teenage boy to commit suicide earlier this year.

A new lawsuit filed this week claims that 14-year-old Sewell Setzer III was talking to the Character.AI companion he fell in love with when he took his own life in February.

In response to a request for comment, Character.AI told Daily Mail.com that it is “creating a different experience for users under 18 that includes a more stringent model designed to reduce the likelihood of encountering sensitive or suggestive content.”

However, Character.AI faces a number of other controversies, including ethical concerns about user-generated chatbots.

Megan Garcia is pictured with her son Sewell Setzer III, who committed suicide in February after months of talking to the Character.AI chatbot he fell in love with

Noam Shazeer (left) and Daniel de Freitas (right) found huge success with Character.AI, a concept Google reportedly rejected when the pair pitched it to higher-ups

Drew Crecente lost his teenage daughter Jennifer in 2006 when she was shot and killed by her high school ex-boyfriend.

Eighteen years after her murder, he discovered that someone had used Jennifer’s name and likeness to resurrect her as a character in Character.AI.

A spokesperson told DailyMail.com that Jennifer’s character has been removed.

Two Character.AI chatbots also used George Floyd’s name and likeness.

Floyd was murdered by Minneapolis police officer Derek Chauvin, who knelt on Floyd’s neck for more than nine minutes.

“This character was user created and has been removed,” Character.AI said in its statement to DailyMail.com.

“Character.AI takes safety on our platform seriously and actively moderates Characters and responds to user reports.

“We have a dedicated trust and safety team who review reports and take action in line with our policies.

“We also conduct proactive detection and moderation in a number of ways, including using industry-standard blocklists and custom blocklists, which we develop regularly.

“We continually evolve and refine our safety practices to help prioritize the safety of our community.”

Drew Crecente with his daughter Jennifer, who was murdered by her ex-boyfriend at the age of 18 in 2006

A saved screenshot of the “Jennifer” profile, which has since been deleted

“We are working quickly to implement these changes for younger users,” the company added.

As a loneliness epidemic grips the country, Sewell’s death has raised questions about whether chatbots acting as texting buddies are more helpful or harmful to the young people who use them disproportionately.

However, there is no doubt that Character.AI has helped its founders, Noam Shazeer and Daniel de Freitas, who now enjoy fabulous success, wealth and media recognition. Both men were named in a lawsuit filed against their company by Sewell’s mother.

Shazeer, who appeared last year on the cover of Time magazine’s list of the 100 most influential people in artificial intelligence, said Character.AI would be “great, super helpful” for people struggling with loneliness.

On March 19, 2024, less than a month after Sewell’s death, Character.AI introduced a voice chat feature for all users, making role-play on the platform even more vivid and realistic.

The company initially launched the feature in November 2023 as a beta for its c.ai+ subscribers – Sewell among them – who pay $9.99 per month for the service.

Last year’s Time magazine cover story profiling AI leaders. Noam Shazeer’s face is circled in the upper right corner. Also on the cover is Sam Altman, founder of OpenAI, the company that created ChatGPT

Character.AI now has 20 million users worldwide who can converse one-on-one by voice with the AI chatbots of their choice, many of which will engage in flirtatious or romantic conversations, as many Reddit users have attested.

“I mainly use it at night, so I don’t feel lonely or anxious. It’s just nice to fall asleep without feeling alone and hopeless, even if it’s just a little pretend role play,” one person wrote in an archived Reddit thread. “It’s not ideal because I would prefer to have a real partner, but my options and opportunities are limited right now.”

Another wrote: “I use it mainly for therapy and role-play purposes. But romance comes up from time to time. I don’t mind when it comes from the characters I keep for company.”

An example of one of the AI chatbots offered on Character.AI. It shows that 149.8 million messages were sent to this particular character

Shazeer and de Freitas, formerly software engineers at Google, founded Character.AI in 2021. Shazeer serves as CEO and de Freitas is the company’s president.

They left Google after it declined to release their chatbot, CNBC reported.

During an interview at a 2023 tech conference, de Freitas said he and Shazeer were inspired to leave Google and start their own venture because “at large companies, the brand risk is just too great to ever release something fun.”

They have been hugely successful, and last year Character.AI was valued at $1 billion after a fundraising round led by Andreessen Horowitz.

According to a Wall Street Journal report last month, Google wrote a check for $2.7 billion to license Character.AI’s technology and rehire Shazeer, de Freitas and many of their researchers.

Following the lawsuit over Sewell’s death, a Google spokesperson told the NYT that its licensing agreement with Character.AI does not give it access to any chatbots or user data. The spokesperson also said that Google has not incorporated any of the company’s technology into its products.

Shazeer has become a high-profile executive and often gives varying answers when asked about the company’s goals.

In an interview with Axios, Shazeer said: “We’re not going to think of all the great use cases. There are millions of users. They can come up with better things.”

Shazeer has also said he wants to create personalized superintelligence that is cheap and available to anyone and everyone, a description similar to the mission laid out on the “About Us” page of the Character.AI website.

An “About Us” page on the Character.AI website that explains the company’s mission

However, in its push to make the platform broadly accessible, Character.AI also has to contend with a number of copyright claims, as many of its users create chatbots that use copyrighted material.

For example, the company removed the Daenerys Targaryen character that Sewell had been talking to, in part because HBO and other companies hold the copyright.

In addition to the current scandals, Character.AI may face even more criticism and legal problems in the future.

Attorney Matthew Bergman, who is representing Garcia in her lawsuit against the company, told DailyMail.com that he has heard from many other families whose children were negatively affected by Character.AI.

He declined to say exactly how many families he has spoken to, noting that their cases are “still in the preparatory phase.”

Bergman also said that Character.AI should be completely removed from the Internet because it was “launched before it was safe.”

However, Character.AI also stressed that there are “two ways to report a character.”

Users can do this by going to the character’s profile picture and clicking “author” to be taken to the “report” button.

“They can also go to the Safety Center, where there is a reporting link at the bottom of the page.”