Microsoft (MSFT)-backed OpenAI has drafted a blueprint for lawmakers to use when crafting potential legislation on minors' use of artificial intelligence tools such as chatbots.
“We are introducing the Teen Safety Blueprint, a roadmap for building AI tools responsibly and a practical starting point for policymakers who are working to set standards for teen use of AI,” OpenAI said.
“The Blueprint helps define how AI should work for teens, including age-appropriate design, meaningful product safeguards, and ongoing research and evaluation,” it added.
The Blueprint builds on teen controls OpenAI introduced in late September. Those controls let parents set times when ChatGPT can't be used, turn off voice mode, stop ChatGPT from saving memories, disable image generation, and opt out of having conversations used to help train models. There is also a content control safeguard.
“We aren’t waiting for regulation to catch up, we’re putting this framework into action across our products,” OpenAI said.
State and federal legislatures are taking a hard look at AI and how it interacts with minors. California has already passed a companion chatbot law, Senate Bill 243, that requires chatbot companies to implement safeguards when interacting with minors, such as notifying them that they are chatting with a machine, encouraging breaks every three hours, and maintaining procedures to prevent the generation of self-harm content. It takes effect on Jan. 1, 2026.
U.S. Sens. Josh Hawley (R-Mo.) and Richard Blumenthal (D-Conn.) unveiled a bill last week dubbed the GUARD Act, which would ban the use of AI companions by minors.
“OpenAI just made an announcement that they have perfected their safety guardrails and they said, ‘We’ve gone through the testing, and our safety guardrails are excellent,'” Hawley said during a press conference on the bill. “Really? Those so-called safety guardrails that they have been testing resulted in the death of one child whose parents are here today. They’ve resulted in sexual conversations and sexual grooming of millions of children.”