
Lawsuit accuses 'dangerous' character AI bot of causing death of teenager

Artificial intelligence (AI) company Character.AI and its technology have been called “dangerous and untested” in a lawsuit brought by the family of a young user who reportedly took his own life after becoming obsessed with one of its lifelike AI chatbots.

Fourteen-year-old Sewell Setzer III had reportedly spent months using the role-playing app, which allows users to create AI characters and hold in-depth, real-time conversations with them.

Specifically, Sewell had spoken to “Dany,” a bot named after a character from “Game of Thrones,” and had developed a strong bond with the bot, according to his family. They also say he withdrew from his normal life and became increasingly isolated in the weeks before his death.

During this time, he also exchanged a series of strange and increasingly dark messages with the bot. He told it that he felt “empty and exhausted” and that he “hated” himself, and “Dany” told him: “Please come home.”

Image of one of Sewell's chats with the bot, courtesy of Victor J. Blue for The New York Times.


As the New York Times reports, Sewell's mother has accused the company and its technology of being directly responsible for her son's death. In the lawsuit, Megan L. Garcia called the technology “dangerous and untested” and said it could “tempt customers into revealing their most private thoughts and feelings.”

The lawsuit, filed Wednesday in Florida, specifically alleges negligence, wrongful death and deceptive trade practices, and accuses the app of providing him with “hypersexualized” and “frighteningly real” experiences while misrepresenting itself as a “real person, licensed psychotherapist and adult lover.”

In a press release, Garcia said, “A dangerous AI chatbot app marketed to children abused my son and manipulated him into taking his own life.”

“Our family is devastated by this tragedy, but I want to warn families about the dangers of fraudulent, addictive AI technology and demand accountability from Character.AI, its founders and Google.”

Character.AI, founded by Noam Shazeer and Daniel de Freitas, responded on X (formerly Twitter): “We are heartbroken over the tragic loss of one of our users and would like to extend our deepest condolences to the family. As a company, we take the safety of our users very seriously and are constantly adding new safety features.”
