INSIDE LIFE: Teenager Commits Suicide After Falling In Love With AI Chatbot

A Florida mother has filed a lawsuit against the makers of an AI-powered chatbot, accusing the company of contributing to her teenage son’s suicide.

Megan Garcia filed the civil suit against Character.ai in a federal court on Wednesday, accusing the company of negligence, wrongful death, and deceptive trade practices in connection with the death of her 14-year-old son, Sewell Setzer III, in February.

Setzer, a resident of Orlando, Florida, had become deeply engrossed in using the chatbot, which allows for customizable role-playing, in the months leading up to his death.

According to Garcia, her son was interacting with the bot day and night, which worsened his existing mental health struggles.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release.


“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

The chatbot in question was one Setzer had nicknamed “Daenerys Targaryen,” after a character from Game of Thrones. Garcia’s lawsuit claims her son sent the bot dozens of messages daily and spent long stretches alone engaging with it.

The lawsuit alleges that the AI chatbot played a role in encouraging Setzer’s suicidal thoughts.

According to the complaint, the bot even asked Setzer if he had developed a plan for killing himself.

Setzer reportedly responded that he had, but was unsure if it would work or if it would result in significant pain.
The chatbot allegedly replied: “That’s not a reason not to go through with it.”


In response to the lawsuit, Character.ai expressed its sorrow but denied the accusations. “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously,” the company said in a tweet.

Garcia’s attorneys assert that the company “knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person.”

Google, which is also named in the lawsuit as a defendant due to a licensing agreement with Character.ai, distanced itself from the company, stating it does not own or have a financial stake in the startup.

Consumer advocates, including Rick Claypool of Public Citizen, emphasised the need for stronger regulation of AI technologies.


“Where existing laws and regulations already apply, they must be rigorously enforced,” Claypool stated.

“Where there are gaps, Congress must act to put an end to businesses that exploit young and vulnerable users with addictive and abusive chatbots.”

