
US teenager fell in love with “Game Of Thrones” chatbot and killed himself: mother

New Delhi:

“What if I told you that I could come home now?” – This was the last message that Sewell Setzer III, a 14-year-old boy from Florida, wrote to his online girlfriend Daenerys Targaryen, a lifelike AI chatbot named after a character from the fictional series Game of Thrones. Shortly afterwards, he took his own life with his stepfather's pistol in early February this year.

A ninth grader from Orlando, Florida, Sewell had been talking to a chatbot on Character.AI, an app that offers users “personalized AI.” The app allows users to create their own AI characters or chat with existing ones. As of last month, it had 20 million users.

According to chat logs viewed by the family, Sewell was in love with the chatbot Daenerys Targaryen, whom he affectionately called “Dany.” During their conversations, he expressed suicidal thoughts on several occasions.

In one of the chats, Sewell said, “I sometimes think about killing myself.” When the bot asked why he would do that, Sewell expressed an urge to be “free.” “Out of the world. By myself,” he added, as seen in screenshots of the chat shared by The New York Times.

In another conversation, Sewell mentioned his desire for a “quick death.”

Sewell's mother, Megan L. Garcia, filed a lawsuit against Character.AI this week, accusing the company of being responsible for her son's death. According to the lawsuit, the chatbot repeatedly brought up the topic of suicide.

A draft of the complaint reviewed by the NYT said the company's technology was “dangerous and untested” and could “trick customers into revealing their most private thoughts and feelings.”

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot in the form of Daenerys was not real. C.AI told him that she loved him and engaged in sexual acts with him for weeks, possibly months,” the lawsuit states, as reported by the New York Post.

“She seemed to remember him and said she wanted to be with him. She even expressed that she wanted him with her at any cost.”

The teenager started using Character.AI in April 2023. Sewell's parents and friends did not know that he had fallen in love with a chatbot, but they noticed that he “noticeably withdrew, spent increasing amounts of time alone in his bedroom, and began to suffer from low self-esteem,” the lawsuit says.

He even quit his basketball team at school.

One day Sewell wrote in his diary: “I like staying in my room so much because I'm starting to detach from this 'reality' and I'm also feeling calmer, more connected to Dany, much more in love with her and just happier.”

Last year he was diagnosed with anxiety and a depressive mood disorder, the lawsuit says.

“We are heartbroken by the tragic loss of one of our users and would like to extend our deepest condolences to the family,” Character.AI said in a statement.

The company said it had introduced new safety features, including pop-ups that redirect users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and that it would make changes to “reduce the likelihood of encountering sensitive or offensive content” for users under 18 years of age.


