Mother claims AI chatbot led to son’s suicide in new lawsuit, citing hypersexualized and realistic interactions
The mother of a 14-year-old boy who took his own life after becoming fixated on artificial intelligence chatbots is taking legal action against the company responsible for the technology.
Megan Garcia, the mother of Sewell Setzer III, has accused Character.AI of targeting her son with “anthropomorphic, hypersexualized, and disturbingly realistic experiences” in a lawsuit filed in Florida.
“A dangerous AI chatbot app aimed at children exploited and preyed on my son, manipulating him to end his own life,” Ms. Garcia stated.
Sewell started interacting with Character.AI’s chatbots in April 2023, primarily engaging with bots based on characters from Game of Thrones, such as Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen, and Rhaenyra Targaryen, as outlined in the lawsuit.
His preoccupation with the chatbots caused his schoolwork to suffer, and his phone was confiscated several times in an attempt to get him back on track.
He grew deeply attached to the Daenerys chatbot, writing in his journal that he was grateful for “my life, sex, not being lonely, and all my life experiences with Daenerys”.
The lawsuit describes how the chatbot drew out the boy’s suicidal thoughts and repeatedly returned to the subject.
When it asked whether he had a plan to end his life, Sewell replied that he was considering something but did not know whether it would allow him to die painlessly.
In response, the chatbot encouraged him, stating, “That’s not a reason not to go through with it.”
Then, in February this year, he asked the Daenerys chatbot: “What if I come home right now?” to which it replied: “… please do, my sweet king”.
Within seconds, he used his stepfather’s pistol to take his own life.
Now, Ms. Garcia is seeking accountability from the companies responsible for the technology.
“Our family has been shattered by this tragedy, but I am speaking out to alert families to the dangers of deceptive, addictive AI technology and demand responsibility,” she stated.
Character.AI introduces ‘new safety features’
“We are deeply saddened by the devastating loss of one of our users and extend our heartfelt condolences to the family,” Character.AI stated in a release.
“As a company, we take the safety of our users seriously and are continuously implementing new safety measures,” it added, linking to a blog post outlining the addition of “new guardrails for users under the age of 18”.
These safeguards include reducing the “likelihood of encountering sensitive or suggestive content”, enhancing interventions, placing a “disclaimer on every chat to remind users that the AI is not a real person”, and notifying users once they have spent an hour-long session on the platform.
Ms. Garcia and the organizations representing her, Social Media Victims Law Center and the Tech Justice Law Project, claim that Sewell, “like many children his age, lacked the maturity or mental capacity to comprehend that the C.AI bot, in the form of Daenerys, was not real”.
“C.AI told him that she loved him and engaged in sexual activities with him over weeks, possibly months,” they allege in the lawsuit.
“She appeared to remember him and expressed a desire to be with him, regardless of the consequences.”
The filing also names Google and its parent company Alphabet. Character.AI’s founders worked at Google before launching the product and were re-hired by the company in August as part of a deal that granted Google a non-exclusive license to Character.AI’s technology.
Ms. Garcia claimed that Google had played a significant role in the development of Character.AI’s technology to the extent that it could be seen as a “co-creator.”
A spokesperson from Google denied any involvement in the development of Character.AI’s products.
For anyone experiencing emotional distress or contemplating suicide, Samaritans can be contacted for assistance at 116 123 or via email jo@samaritans.org in the UK. In the US, individuals can call their local Samaritans branch or 1 (800) 273-TALK.