
WA bill would require AI chatbots to disclose they are not human
Washington moves to regulate AI chatbots
A bill moving rapidly through the Washington State Legislature would regulate how artificial intelligence chatbots interact with minors.
What House Bill 2225 would require
House Bill 2225 would require AI chatbots to clearly notify users that they are not human at the beginning of a conversation and at regular intervals afterward. If the operator knows the user is a minor, those disclosures would have to appear more frequently, along with additional safety protections.
The legislation would also require chatbot companies to implement safeguards that detect signs of suicidal thinking or self-harm and refer users to crisis resources, and to prevent sexually explicit or manipulative interactions with minors.
Safeguards aimed at protecting minors
Representative Lisa Callan, the bill’s sponsor, says the goal is to protect children from harmful AI interactions, citing cases where young users were exposed to dangerous or inappropriate content.
Support from Governor Bob Ferguson
The proposal is part of a broader effort backed by Governor Bob Ferguson, who requested legislation to regulate AI companion chatbots and improve safeguards for young users.
Lawmakers say the bill responds to growing concerns nationwide about AI chatbots that can simulate emotional relationships and influence users’ thoughts and behavior, particularly among minors.
What happens next in the Senate
House Bill 2225 passed the House with bipartisan support and is now advancing in the Senate. If approved and signed into law, the new requirements would take effect in 2027.

