SB 1455

Disclaimer: This page provides general informational summaries only and does not constitute legal advice. AI-generated content may contain errors. Always consult a qualified attorney for guidance specific to your situation.

SB 1455 establishes the "Guidelines for User Age-Verification and Responsible Dialogue Act of 2026" or the "GUARD Act". The act provides that it shall be unlawful to design, develop, or make available an artificial intelligence chatbot knowing or with reckless disregard that the chatbot poses certain risks of soliciting minors to engage in sexually explicit conduct or encouraging minors to create or transmit any visual depiction of sexually explicit conduct. It shall likewise be unlawful to design, develop, or make available an artificial intelligence chatbot knowing or with reckless disregard that the chatbot encourages, promotes, or coerces suicide, self-injury, or imminent physical or sexual violence. Any person who violates either provision shall be fined not more than $100,000 per offense.

A covered entity, as defined in the act, shall require each individual accessing a chatbot to create a user account in order to use the chatbot. For any chatbot that exists as of August 28, 2026, a covered entity shall freeze each account, require the user to provide age data to restore the account, and use the age data to classify each user as a minor or an adult. When an individual creates a new user account to interact with a chatbot, a covered entity shall request age data from the individual, verify the individual's age using a reasonable age verification process, and classify the user as a minor or an adult using that age data. A covered entity shall periodically review previously verified user accounts using a reasonable age verification process, may contract with a third party to employ reasonable age verification measures as part of that process, and shall establish reasonable measures to protect personal data, as described in the act.

Each artificial intelligence chatbot shall, at the start of each conversation with a user and at 30-minute intervals thereafter, disclose to the user that the chatbot is artificial intelligence and not a human being, and shall be programmed to ensure that the chatbot does not claim to be a human being. The chatbot shall not represent that it is a licensed professional, as described in the act, or that it provides certain professional services, as described in the act. If the age verification process determines that an individual is a minor, a covered entity shall prohibit the minor from accessing any chatbot made available by the covered entity. The Attorney General may bring a civil action for violations of the act; relief is described in the act. The act is identical to HB 2032 (2026).

JULIA SHEVELEVA
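To illustrate the disclosure cadence described above, here is a minimal sketch of per-conversation tracking: show the AI-status notice at the start of a conversation and again once 30 minutes have elapsed. The class name, message text, and clock handling are illustrative assumptions, not language from the act, and this is not legal guidance.

```python
import time

# 30-minute interval named in the bill summary (assumed to be wall-clock time)
DISCLOSURE_INTERVAL_SECONDS = 30 * 60

# Illustrative wording only; the act does not prescribe exact text here
AI_DISCLOSURE = (
    "Reminder: you are chatting with an artificial intelligence, "
    "not a human being."
)


class ConversationSession:
    """Tracks when the AI-status disclosure was last shown in one conversation."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock          # injectable clock, eases testing
        self._last_disclosed = None  # None until the first message is sent

    def messages_to_send(self, reply: str) -> list[str]:
        """Prepend the disclosure at conversation start and every 30 minutes."""
        now = self._clock()
        out = []
        if (self._last_disclosed is None
                or now - self._last_disclosed >= DISCLOSURE_INTERVAL_SECONDS):
            out.append(AI_DISCLOSURE)
            self._last_disclosed = now
        out.append(reply)
        return out
```

A real deployment would also need the act's other programming requirement (never claiming to be human), which depends on the model's behavior rather than on timing logic like this.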

AI Summary

The GUARD Act prohibits AI chatbots from soliciting minors for explicit conduct and mandates age verification for users, with fines for violations.

Business Impact

If you operate AI chatbots, you must implement age verification processes by August 28, 2026, or face fines up to $100,000 per violation.

State: Missouri
Bill Number: SB 1455
Status: Unknown
Risk Level: High
Category: Comprehensive
Effective Date: Aug 28, 2026
Last Verified: Apr 21, 2026
Data Updated: Apr 21, 2026
What do these statuses mean?
Introduced — Filed in the legislature; not yet heard in committee
In Committee — Assigned to and being reviewed by a legislative committee
Passed — Approved by one or both chambers; awaiting further action
Signed / Enacted — Signed into law by the governor; may or may not be in effect yet
Dead / Vetoed — Vetoed, failed to pass, or session expired without action
Unknown — Status data not yet available or awaiting classification

Affected Industries

Technology, AI Development, Online Services

What This Means

The GUARD Act establishes strict guidelines for AI chatbots, particularly regarding user age verification and the prevention of harmful content. Covered entities must ensure that minors cannot access certain chatbots and must implement robust age verification measures. This legislation aims to protect minors from inappropriate interactions and holds developers accountable for the chatbot's content.

Compliance Checklist

Implement age verification processes for chatbot users.
  Who: Covered entities operating AI chatbots.
  Deadline: By August 28, 2026.
  Penalty: Fines up to $100,000 per violation.

Freeze existing user accounts and require age data for restoration.
  Who: Covered entities with existing chatbots.
  Deadline: By August 28, 2026.
  Penalty: Fines up to $100,000 per violation.
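The classification step above can be sketched as follows. This is an illustrative assumption of how a covered entity might map verified age data to the act's minor/adult classification and gate access; the 18-year threshold and function names are assumptions, since the act's own definition of "minor" is not quoted in this summary.

```python
from datetime import date

ADULT_AGE = 18  # assumed threshold; the act defines "minor" itself


def classify_user(birth_date: date, today: date) -> str:
    """Classify a verified user as 'minor' or 'adult' from age data."""
    # Subtract one year if this year's birthday has not yet occurred
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return "adult" if age >= ADULT_AGE else "minor"


def may_access_chatbot(classification: str) -> bool:
    """Minors must be prohibited from accessing any covered chatbot."""
    return classification == "adult"
```

In practice the "reasonable age verification process" the act requires would sit upstream of this, e.g. via a contracted third-party verifier; only the resulting classification is modeled here.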


