AI company is being sued after a boy took his own life; understand the details


A 14-year-old boy took his own life after chatting with the Character.AI chatbot. Now, the family is suing the company.

Teenager Sewell Setzer III took his own life after he began exchanging messages, in 2023, with a Character.AI chatbot modeled on a character from Game of Thrones.

Now the boy's mother, Megan Garcia, is suing the company, alleging that it launched a product designed for social interaction without adequate protections for vulnerable users.

Character.AI has filed a motion to dismiss the case, arguing that the chatbot's output is protected by the First Amendment, which guarantees freedom of speech in the United States.

However, Judge Anne Conway, a federal judge in Florida (USA), ruled that the company had not shown why the language generated by the chatbot qualifies as protected speech under the Constitution.

Noam Shazeer and Daniel De Freitas, the founders of Character.AI, were named in the lawsuit, along with Google, which is accused of having technical ties to the startup. Google has denied any direct involvement with the app and said it will appeal the decision.

The case could set a precedent for the legal liability of major technology companies that develop AI systems.

Depending on the ruling, courts may impose stricter limits on language-based systems, a debate that has accompanied generative AI since these tools first entered widespread use.

