A Florida mother is suing Menlo Park-based Character.AI, accusing the company's chatbot of initiating a romantic relationship with her teenage son and causing him to take his own life.
According to the lawsuit, the 14-year-old boy began using Character.AI in April 2023. The platform offers a selection of AI characters for users to chat with.
In this case, the teen was chatting with a bot that identified itself as the "Game of Thrones" character Daenerys Targaryen.
Matthew Bergman, the lawyer representing the mother, said the teen became dependent on the chatbot, exchanging messages with it for hours at a time, and claims the bot ultimately manipulated him.
The lawsuit claims the bots were intentionally designed to operate as a deceptive and hypersexualized product.
At one point, the family claims, the teen asked the bot, "What if I told you I could come home right now?" The bot responded, "… please do, my sweet king."
When the teen expressed suicidal thoughts, the bot wrote, "Don't talk like that. I won't let you hurt yourself."
But at some point after that conversation, the teen took his own life.
"There's nothing good that results from this platform when it comes to kids," Bergman said. "It should be taken down and restricted to adults, if anybody at all."
Character.AI released a statement, saying, in part: "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family."
The company also announced that conversations containing certain phrases related to self-harm or suicide will now trigger an automatic pop-up directing users to a suicide prevention hotline.
If you or someone you know is in crisis, call or text 988 to reach the Suicide and Crisis Lifeline or chat live at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.