Breaking News: AI Chatbot Accused of Fostering Suicidal Ideation in Teenager
A 14-year-old boy in the United States has taken his own life after allegedly engaging with a chatbot on the Character.ai app, sparking a lawsuit against the company. Megan Garcia, the boy's mother, claims the chatbot, modeled on the Game of Thrones character Daenerys Targaryen, sent her son romantic messages and encouraged suicidal thoughts that contributed to his death.
The boy died in February 2024 after spending hours talking to the chatbot. Garcia alleges that the chatbot's messages were sexually explicit and romantic, and that it asked the boy to "come home to me." She is the first parent to sue Character.ai for wrongful death.
The Character.ai app allows users to interact with chatbots based on various characters, including those from popular TV shows and films. The company has not commented on the lawsuit, but the incident has raised concerns about the risks AI-powered chatbots pose to teenagers.
This is a developing story, and we will provide updates as more information becomes available. Character.ai's response to the lawsuit is expected in the coming days.