Silicon Valley AI company sued over 14-year-old's suicide

By Tara Campbell
Thursday, October 24, 2024
The mother of a 14-year-old Florida boy is suing Silicon Valley-based Character.AI, saying its chatbot is connected to his death by suicide.

MENLO PARK, Calif. (KGO) -- The mother of a 14-year-old Florida boy is suing a Silicon Valley AI company, saying its chatbot is connected to his death by suicide.

Sewell Setzer III had been chatting for months with a chatbot he called "Daenerys Targaryen," after the Game of Thrones character.

His mother says that although he knew he was not chatting with a real person, he became emotionally attached to the bot and sank into isolation and depression before taking his own life.

His mother is suing Menlo Park-based Character Technologies, Inc. -- the company behind the custom chatbot service Character.AI.

SUICIDE PREVENTION: Local resources for those in crisis

The lawsuit claims Character Technologies was reckless by offering minors access to lifelike companions without proper safeguards.

Character AI issued a statement saying in part, "As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months...including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."

Copyright © 2024 KGO-TV. All Rights Reserved.