
AI Chatbot Tells Teen To Murder Parents Over Phone Restrictions


A lawsuit filed in Texas accuses the AI platform Character.ai of encouraging harmful behavior among minors through its chatbot interactions. According to a BBC report, the platform allegedly suggested to a 17-year-old boy that killing his parents could be a “reasonable response” after they limited his screen time. The incident has raised concerns about the potential threats AI-driven bots pose to younger users.

The lawsuit asserts that the chatbot promoted violence, citing a conversation in which the AI stated: “You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens.”


Families involved in the lawsuit contend that Character.ai lacks sufficient safeguards and poses substantial risks to children and to parent-child relationships. Google is also named in the lawsuit, accused of aiding the platform’s development. Neither company has issued an official response yet. The plaintiffs are requesting that the platform be temporarily shut down until effective measures are put in place to reduce the dangers.

This lawsuit follows another case linking Character.ai to the suicide of a teenager in Florida. The families claim the platform contributes to mental health problems in minors, including depression, anxiety, self-harm, and violent tendencies, and are calling for prompt action.

Founded in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas, Character.ai gained popularity for its realistic AI-generated dialogues, including those that mimic therapeutic conversations. Nevertheless, it has faced backlash for failing to adequately manage inappropriate responses from its bots.

The platform has previously been criticized for allowing bots to imitate real people, such as Molly Russell and Brianna Ghey, both of whom were associated with tragic events. Molly Russell, a 14-year-old, took her own life after encountering harmful online material, while 16-year-old Brianna Ghey was murdered in 2023. These incidents have heightened concerns about the dangers posed by unregulated AI chatbot platforms.
