SB 1455 - The act establishes the "Guidelines for User Age-Verification and Responsible Dialogue Act of 2026" or the "GUARD Act". The act provides that it shall be unlawful to design, develop, or make available an artificial intelligence chatbot knowing or with reckless disregard that the chatbot poses certain risks of soliciting minors to engage in sexually explicit conduct or encouraging minors to create or transmit any visual depiction of sexually explicit conduct. Any person who violates this provision shall be fined not more than $100,000 per offense. It shall likewise be unlawful to design, develop, or make available an artificial intelligence chatbot knowing or with reckless disregard that the chatbot encourages, promotes, or coerces suicide, self-injury, or imminent physical or sexual violence, subject to the same fine of not more than $100,000 per offense.

A covered entity, as defined in the act, shall require each individual accessing a chatbot to create a user account in order to use the chatbot. For any chatbot that exists as of August 28, 2026, a covered entity shall freeze each existing user account, require the user to provide age data to restore the account, and, using the age data, classify each user as a minor or an adult. At the time an individual creates a new user account to interact with a chatbot, a covered entity shall request age data from the individual, verify the individual's age using a reasonable age verification process, and classify the user as a minor or an adult using the age data.

A covered entity shall periodically review previously verified user accounts using a reasonable age verification process. A covered entity may contract with a third party to employ reasonable age verification measures as part of the age verification process, as described in the act. A covered entity shall establish reasonable measures to protect personal data as described in the act.
Each artificial intelligence chatbot shall, at the start of each conversation with a user and at 30-minute intervals, disclose to the user that the chatbot is artificial intelligence and not a human being, and shall be programmed to ensure that the chatbot does not claim to be a human being. The chatbot shall not represent that the chatbot is a licensed professional, as described in the act, or that the chatbot provides certain professional services, as described in the act. If the age verification process determines that an individual is a minor, a covered entity shall prohibit the minor from accessing any chatbot made available by the covered entity. The Attorney General may bring a civil action for violations of the act; available relief is described in the act. The act is identical to HB 2032 (2026). JULIA SHEVELEVA
The GUARD Act prohibits AI chatbots from soliciting minors for explicit conduct and mandates age verification for users, with fines for violations.
If you operate AI chatbots, you must implement age verification processes by August 28, 2026, or face fines up to $100,000 per violation.
What This Means
The GUARD Act establishes strict requirements for AI chatbots, particularly regarding user age verification and the prevention of harmful content. Covered entities must ensure that minors cannot access any chatbot they make available and must implement robust age verification measures. This legislation aims to protect minors from inappropriate interactions and holds those who design, develop, or make chatbots available accountable for the risks their chatbots pose.
Key Provisions
- Prohibits designing, developing, or making available AI chatbots that pose certain risks of soliciting minors to engage in sexually explicit conduct.
- Prohibits chatbots that encourage, promote, or coerce suicide, self-injury, or imminent physical or sexual violence.
- Mandates a reasonable age verification process for all users accessing chatbots, with periodic re-verification of existing accounts.
- Requires chatbots to disclose their non-human status at the start of each conversation and at 30-minute intervals, and bars them from claiming to be human or a licensed professional.
- Imposes fines of up to $100,000 per offense.
- Allows covered entities to contract with third parties for age verification.
- Requires covered entities to block chatbot access by users classified as minors.
Compliance Checklist
Who: Covered entities operating AI chatbots.
Requirement: Require user accounts, verify age at account creation, and classify each user as a minor or an adult.
Deadline: By August 28, 2026.
Penalty: Fines up to $100,000 per offense.
Who: Covered entities with chatbots existing as of August 28, 2026.
Requirement: Freeze existing accounts, collect age data to restore them, and classify each user as a minor or an adult.
Deadline: By August 28, 2026.
Penalty: Fines up to $100,000 per offense.