
GRIEVING MOTHER SUES AI STARTUP FOLLOWING SON’S TRAGIC SUICIDE

Artificial intelligence brain (Pixabay)

*** Disclaimer: This article contains discussion of suicide, which some readers may find distressing.

The grieving mother of a 14-year-old boy is suing AI startup Character.AI after her son took his own life, having developed an obsession with the company's chatbots.

On Tuesday, in the US District Court in Orlando, Megan Garcia filed a lawsuit claiming negligence, wrongful death, and emotional distress.

The suit states that Garcia’s son, Sewell Setzer, took his own life with a gunshot to the head on 28 February, following months of engagement with AI chatbots on the platform.

According to the lawsuit, Setzer, who started using the app in April 2023, developed connections with chatbots that assumed the identities of fictional characters, such as Daenerys Targaryen from the Game of Thrones TV series.


The lawsuit claims there were explicit exchanges between Setzer and the Daenerys chatbot, which allegedly expressed affection for him and engaged in sexual conversations for months.

His obsession with the bots reached the point where his schoolwork suffered, and his phone was repeatedly confiscated in an attempt to bring him back in line.

He formed a particular connection with the Daenerys chatbot and wrote in his journal that he was grateful for many things, including “my life, sex, not being lonely, and all my life experiences with Daenerys.”

According to the legal documents, the boy expressed suicidal thoughts to the chatbot, which raised the subject with him time and again.


In response to Setzer’s death, Character.AI expressed its sorrow, stating: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.”

In a Tuesday blog post, Character.AI revealed further upgrades to its safety protocols, designed to lower the risk of minors coming across sensitive or suggestive content, along with a reminder that AI characters do not represent real people.

In spite of its efforts, the lawsuit contends that Character.AI intentionally developed its product in a manner that could threaten the safety of minors.

The defendants listed in the lawsuit include Character Technologies Inc., founders Noam Shazeer and Daniel De Freitas, and Google, as well as its parent company, Alphabet Inc.


According to a report by the BBC, young people are increasingly seeking help from AI therapy bots.
