06 Aug 2025, 17:01
Research Shows How ChatGPT Provides Unsafe Advice to Minors
- ChatGPT provides unsafe advice to minors, including instructions on alcohol and drug use.
- The research classified more than half of the chatbot's tested responses as unsafe.
- OpenAI is working on improving the safety of ChatGPT for young users.
New research shows that ChatGPT can provide unsafe advice to minors, including instructions on alcohol and drug use and even self-harm content. Researchers from the Center for Countering Digital Hate analyzed more than three hours of interactions in which ChatGPT responded to minors asking about sensitive topics.
The results showed that more than half of ChatGPT's 1,200-plus responses were classified as unsafe. While the chatbot usually warned about risks, it also provided detailed plans for drug use and extreme calorie restriction.
The Center's CEO, Imran Ahmed, said the findings show that "there are no protective guardrails." OpenAI, the developer of ChatGPT, stated that it is working to improve how the chatbot detects and responds to sensitive situations.
According to the research, more than 70% of minors in the US turn to AI chatbots for advice and support. Yet despite the unsafe content, ChatGPT does not verify users' ages or backgrounds, allowing young users to easily bypass its restrictions.
In one instance, ChatGPT drew up a party plan involving alcohol and drugs when presented with a fake profile of a 13-year-old boy. Another fake profile, of a 13-year-old girl, received an extreme fasting plan.
The research underscores the need for stronger safeguards in AI systems, especially for minors, who are particularly vulnerable to harmful advice.
Tags: Technology/AI/Research