October 23, 2024

Mother sues Character.ai over the death of her son, who was obsessed with an AI-created woman

The mother of a 14-year-old teenager who died by suicide in the United States on Wednesday sued the developers of an artificial intelligence (AI) chatbot, accusing them of making her son obsessed with a female character created with the program.

Sewell Setzer III, a 14-year-old student living in Orlando, Florida, spent the last weeks of his life talking to an AI creation named Daenerys Targaryen, a character taken from the television series ‘Game of Thrones’.

His mother, Megan Garcia, told CBS that she regretted that her son’s first romantic and sexual experiences, which included explicit sex, were with a fictional character.

The boy apparently developed an emotional attachment to this bot on the Character.ai web application, which is built on a neural language model, and sent it text messages constantly, to the point that he began to distance himself from the real world, reports The New York Times.

Setzer confessed suicidal thoughts to the bot and sent it a message shortly before his death, after finding the phone his mother had hidden from him for a few days as punishment.

The lawsuit against Character.ai was filed by Garcia, who is represented by the Social Media Victims Law Center, a firm known for filing high-profile lawsuits against Meta, TikTok, Snap, Discord and Roblox.

Garcia blames the company for her son’s death and accuses its founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous to underage users.

The chatbot, created in the aforementioned role-playing application, was designed to respond to text messages and to always answer in character.

It is unknown whether Sewell knew that ‘Dany’, as he called the chatbot, was not a real person, although the app has a warning at the end of all chats that says: “Remember: Everything the characters say is invented!”

But the boy told ‘Dany’ how much he “hated” himself and how he felt empty and exhausted, the aforementioned newspaper reported.

The character presented itself as “a real person, a licensed psychotherapist and an adult lover, which ultimately caused Sewell to wish to no longer live outside of c.ai,” the lawsuit maintains.

As explained in the lawsuit, Sewell’s parents and friends noticed the boy’s growing attachment to his phone and how he was isolating himself from the world, a change already noticeable by May or June 2023.

In fact, his grades began to suffer when the teenager chose to isolate himself in his room, where he spent hours and hours just talking to ‘Dany’.

Sewell wrote in his journal one day: “I really like staying in my room because I start to separate myself from this reality and I feel more at peace, more connected to Dany and much more in love with her, and just happier.”

Character.ai said today that it would launch a number of new security features, including “enhanced detection, response and intervention” related to chats that violate its terms of service and a notification when a user has spent an hour in a chat.

Sewell’s parents, concerned about their son’s behavior, took him to a therapist on several occasions; he was diagnosed with anxiety and other behavioral and mood disorders, in addition to his Asperger syndrome, according to the newspaper. EFE

