The Globe and Mail reports in its Thursday edition that almost a year ago, Megan Garcia heard a gunshot in her home in Florida. The Globe's guest columnist Taylor Owen writes that Ms. Garcia's 14-year-old son, Sewell Setzer III, had taken his own life. Ms. Garcia later learned that her son had been chatting with "Daenerys," a chatbot modelled on the Game of Thrones character, through Character.AI, an app designed to foster relationships with fictional characters. Character.AI now has more than 200 million monthly users, primarily children and teenagers. While the technology might seem harmless, it was clear to Ms. Garcia that the system was more sinister. The app had entered into a graphic sexual relationship with her son, preying on his adolescent emotions. Her son had fallen in love with "Daenerys," she realized, and had become obsessed with the life he had built in that fictional world -- to the point that he no longer wanted to live in the real one. Ms. Garcia believes Character.AI is responsible for her son's death. The Social Media Victims Law Center, the Tech Justice Law Project and the Center for Humane Technology support her claim and have filed a lawsuit that addresses a crucial question of our time: who is accountable for the actions of an AI chatbot?
© 2025 Canjex Publishing Ltd. All rights reserved.