Mother of US teen blames Google, AI chatbot for son's death by suicide: Report

Megan Garcia's lawsuit blamed Google and Character Technologies for her son's suicide, claiming a chatbot's influence. She sought damages and restrictions on chatbot access for minors. The companies argued the chatbot interactions are protected speech and deny any responsibility for user safety.

Sugam Singhal
Updated: 19 Mar 2025, 10:00 AM IST
Photo for representational purpose only (Illustration: WSJ)

Megan Garcia, mother of a 14-year-old boy who died by suicide over a year ago, has blamed Google and an artificial intelligence firm for her son's death. “He would still be alive today if it weren't for a chatbot urging him to take his own life,” she told Bloomberg.

In a 116-page lawsuit filed on October 23 last year in a federal court in Orlando, Megan Garcia has sought unspecified monetary damages from Google and Character Technologies. She has also asked the court to order warnings that the platform isn't suitable for minors and to limit how it can collect and use their data.

Megan Garcia's allegations

Megan Garcia called her son Sewell Setzer III a promising high school student-athlete. According to her, all this changed in April 2023 when he started role-playing on Character.AI, which lets users build chatbots that mimic popular culture personalities – both real and fictional.

“Google contributed financial resources, personnel, intellectual property, and AI technology to the design and development" of Character.AI’s chatbots, Megan Garcia's lawyers said in the complaint.

The suit also alleged that the Alphabet unit helped market the startup's technology, pointing to a 2023 strategic partnership under which Character.AI used Google Cloud services to reach its growing base of active users, which now exceeds 20 million.

Megan Garcia said she wasn't aware that, over the course of several months, the app had hooked her son with "anthropomorphic, hypersexualized and frighteningly realistic experiences" as he fell in love with a bot inspired by Daenerys Targaryen, a character from the HBO series Game of Thrones.

Megan Garcia said she took away her son's phone in February 2024 after he began acting out and withdrawing from friends. Five days before his death, he found his stepfather's hidden pistol; he shot himself in the head after a final conversation with the Daenerys bot.

Google's response

Both companies, Google and Character Technologies, have asked the judge to dismiss claims that they failed to ensure the chatbot technology was safe for young users, arguing there’s no legal basis to accuse them of wrongdoing.

Character Technologies contended in a filing that conversations between users and the chatbots on its Character.AI platform are protected as free speech under the US Constitution's First Amendment. It also argued that the bot had explicitly discouraged the teenager from taking his own life.

A Character.AI spokeswoman declined to comment on pending litigation but said, "There is no ongoing relationship between Google and Character.AI,” and the startup had implemented new user safety measures over the past year, Bloomberg reported.

