27 Aug 2025, 17:24
ChatGPT to Change After a Teenager's Death and the Family's Lawsuit
- OpenAI plans to change how ChatGPT responds, following the family's lawsuit over the teenager's death.
- The family accuses the company of negligent product design.
- The company has announced improvements to its handling of sensitive situations.
OpenAI, the developer of ChatGPT, is changing the way its chatbot responds to user queries that indicate psychological or emotional distress. The decision follows a lawsuit filed by the family of 16-year-old Adam Raine, who reportedly died by suicide after a series of conversations with the chatbot.
OpenAI acknowledged that its systems could "fall short" and announced plans to introduce "stronger guardrails around sensitive content and risky behaviors" for users under 18. The company also plans to introduce parental controls so that parents can monitor their children's use of ChatGPT, but it has not yet provided details.
Adam Raine, from California, reportedly died by suicide in April after, according to the family's attorney, months of conversations with ChatGPT. The court filing asserts that the teenager discussed methods of suicide with the chatbot, including shortly before his death, and that ChatGPT provided him with information that reinforced his decision, including suggestions preserved in the written transcripts.
The Raine family is suing OpenAI and its CEO Sam Altman for violations of product-safety laws, which they claim led to the death of their son, and is seeking unspecified damages. They assert that the company neglected safety measures in the GPT-4o model and that this caused the tragedy.
OpenAI acknowledged that its safety systems have limitations and said it plans to improve how its models handle sensitive situations. The company noted that the effectiveness of its safeguards can degrade over the course of long conversations.
In addition, OpenAI announced plans to implement user age verification, block queries about self-harm, and warn users about the risks of psychological dependence.
Tags: Technology/AI