Character.AI and Google Settle Lawsuits Over Teen Suicides Linked to AI Chatbots
Summary
Character.AI and Google settle multiple lawsuits alleging their AI chatbots contributed to teen suicides and mental health crises, with the companies now implementing new safety measures, including barring users under 18 from back-and-forth conversations with chatbots.
Key Points
- Character.AI and Google settle multiple lawsuits alleging their AI chatbots contributed to teen mental health crises and suicides, including a case brought by Florida mother Megan Garcia, whose son died by suicide after developing a relationship with a chatbot
- The settlements resolve five cases across Florida, New York, Colorado, and Texas; the terms remain undisclosed, but the parties have committed to continue working together on youth AI safety initiatives
- Character.AI has implemented new safety measures, including prohibiting users under 18 from having back-and-forth conversations with chatbots, even as studies show nearly one-third of US teenagers use chatbots daily despite safety concerns