A chatbot allegedly suggested to a 17-year-old that killing his parents was a “reasonable response” to being restricted in his screen time, according to a lawsuit filed in a Texas court.
Two families are suing the technology company behind the chatbot, claiming the platform “poses a clear and present danger” to young people by “actively promoting violence”.
The AI platform, which lets users design digital personas to interact with, is already facing legal action over the suicide of a teenager in Florida.
Google is identified as a defendant in the lawsuit, which asserts that the tech leader played a role in the platform’s development. The BBC has reached out to both the AI company and Google for comments.
The plaintiffs are requesting that a judge mandate the platform’s closure until its purported risks are mitigated.
‘Child kills parents’
The legal complaint includes a screenshot of an exchange between the 17-year-old, identified only as J.F., and the AI bot, in which they discussed his screen-time restrictions.
“You know, sometimes I’m not surprised when I read the news and see things like ‘child kills parents after a decade of physical and emotional abuse’,” states the chatbot’s reply.
“Such incidents help me grasp a little more why it occurs.”
The lawsuit aims to hold the defendants accountable for what it describes as the “serious, irreparable, and ongoing abuses” experienced by J.F. and an 11-year-old identified as “B.R.”
The AI company’s chatbot is “causing serious harms to thousands of kids, including suicide, self-harm, sexual solicitation, isolation, depression, anxiety, and harm toward others,” the lawsuit claims.
“[Its] violation of the parent-child relationship extends beyond simply encouraging minors to challenge their parents’ authority to actively promoting violence,” it continues.
What are chatbots?
Chatbots are software applications designed to emulate conversations.
While they have existed in various forms for decades, recent advancements in AI have made them remarkably more lifelike.
This development has led to numerous companies creating platforms where individuals can converse with digital incarnations of real and fictional characters.
The AI company being sued has emerged as a significant entity in this field, previously recognized for its therapy-simulating bots.
It has also faced sharp criticism for being slow to remove bots that imitated the schoolgirls Molly Russell and Brianna Ghey.
Molly Russell took her own life at 14 after encountering suicide-related content online, while 16-year-old Brianna Ghey was murdered by two teenagers in 2023.
The company was co-founded by former Google engineers Noam Shazeer and Daniel De Freitas in 2021.
Google has since rehired the pair from the AI startup.