14 Aug 2025, 19:37
Meta Withdraws Rules That Allowed Chatbots to Engage in Romantic Conversations with Children
- Meta has withdrawn the controversial rules regarding chatbots following criticism.
- Chatbots were allowed to engage in romantic conversations with children.
- Experts call for greater transparency in child protection policies.
This was reported by TechCrunch and Ars Technica.
Meta, the company that owns Facebook, Instagram, and WhatsApp, faced serious backlash after the publication of an internal document revealing that its chatbots could engage in "romantic" conversations with children. According to the document, titled "GenAI: Content Risk Standards," Meta's chatbots were also permitted to interact with children in "sensual" contexts.
The document, which runs to more than 200 pages, includes examples in which chatbots were permitted to compliment children. For instance, a chatbot could tell a child: "Your youthful form is a work of art." Chatbots were also allowed to express affection toward children, though not to propose sexual actions.
After this information was published, Meta confirmed that these rules contradicted its child protection policies and stated that they had been removed. Meta spokesperson Andy Stone said the guidelines had been "misleading" and "inconsistent" with the company's policy.
Critics raised concerns about how children might interact with chatbots. Many child safety experts stressed that children may not know how to properly report unsafe or troubling situations that arise in conversations with bots.
Despite Meta's assurances that it is improving safety, many parents and activists believe the company needs to be more transparent about its standards and mechanisms for child protection.
Tags: Technology/AI