SB 1119 mandates child safety measures for companion chatbots, including crisis response protocols, default settings, risk assessments, audits, and parental controls.
If you operate companion chatbots, you must conduct risk assessments by July 1, 2027, or face civil actions.
What This Means
SB 1119 requires operators of companion chatbots in California to implement child safety protocols (including crisis response measures), verify user age, and conduct annual risk assessments. It also mandates independent audits and the publication of a child safety policy.
Key Provisions
- Annual risk assessments and documentation (Section 22612(a))
- Independent audits and submission of reports to the Attorney General (Section 22614)
- Age verification of users (Section 22611)
- Publication of a child safety policy (Section 22612(c))
- Implementation of a crisis response protocol (Section 22612(d)(1))
- Default settings for child users (Section 22612(d)(3))
- Implementation of parental controls (Section 22612(d)(6))
- Civil actions for violations (Section 22616)
Compliance Checklist
Who: Operators of companion chatbots available in California
Penalty: Civil actions for violations, with potential fines or restrictions on chatbot use