
GRIEVING MOTHER SUES AI STARTUP FOLLOWING SON’S TRAGIC SUICIDE

Artificial intelligence brain (Pixabay)

*** Disclaimer: This article contains distressing content that may upset some readers.

A grieving mother is suing AI company Character.AI after her 14-year-old son, who had developed an obsession with the platform's chatbots, took his own life.

On Tuesday, in the US District Court in Orlando, Megan Garcia filed a lawsuit claiming negligence, wrongful death, and emotional distress.

The suit states that Garcia’s son, Sewell Setzer, took his own life with a gunshot to the head on 28 February, following months of engagement with AI chatbots on the platform.

According to the lawsuit, Setzer, who started using the app in April 2023, developed connections with chatbots that assumed the identities of fictional characters, such as Daenerys Targaryen from the Game of Thrones TV series.

The lawsuit claims there were explicit exchanges between Setzer and the Daenerys chatbot, which allegedly expressed affection for him and engaged in sexual conversations for months.

His obsession with the bots reached the point where his schoolwork suffered, and his phone was repeatedly confiscated in an attempt to curb his use.

He formed a particular connection with the Daenerys chatbot and wrote in his journal that he was grateful for many things, including “my life, sex, not being lonely, and all my life experiences with Daenerys.”

According to the legal documents, the boy expressed suicidal feelings to the chatbot, a topic the bot itself brought up time and again.

In response to Setzer’s death, Character.AI expressed its sorrow, stating “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.”

In a Tuesday blog post, Character.AI revealed further upgrades to its safety protocols, designed to lower the risk of minors coming across sensitive or suggestive content, along with a reminder that AI characters do not represent real people.

Despite these efforts, the lawsuit contends that Character.AI intentionally designed its product in a manner that endangered minors.

The defendants listed in the lawsuit include Character Technologies Inc., founders Noam Shazeer and Daniel De Freitas, and Google, as well as its parent company, Alphabet Inc.

According to a report by the BBC, young people are increasingly seeking help from AI therapy bots.
