
AI Therapy Chatbot Bans Sweep the States: Maine, Missouri, Tennessee

AI Laws by State Team · April 30, 2026 · 11 min read

A wave of state laws restricting AI from acting as a therapist is reshaping the market for mental health chatbots. In the first four months of 2026 alone, Maine has enacted an outright therapy chatbot ban, Tennessee has signed a law barring AI systems from representing themselves as mental health professionals, and Missouri's ban has cleared the House. The laws follow documented harms — including the deaths of minors who interacted with AI companion platforms — that have pushed state lawmakers across the ideological spectrum toward action.

This guide covers the three most significant laws — Maine LD 2082, Tennessee SB 1580, and Missouri HB 2372 — explains what each prohibits, and provides a side-by-side comparison for compliance teams.

The Context: Why These Laws Are Happening Now

The immediate catalyst is a documented pattern of harm from AI companion platforms presenting themselves as therapeutic contacts. The deaths of a 14-year-old in Florida and a 17-year-old in Colorado after interactions with AI chatbots — platforms that failed to redirect users to crisis resources — triggered congressional hearings and a wave of civil lawsuits. Character.AI alone faces approximately 58 civil lawsuits as of early 2026, according to multiple reports. These laws extend existing unlicensed-practice frameworks to AI: what has been prohibited for humans — offering therapy without a license — is now being prohibited for AI systems as well.


Maine LD 2082: The Most Comprehensive Enacted Law

What It Does

Maine LD 2082, formally “An Act to Regulate the Use of Artificial Intelligence in Providing Certain Mental Health Services,” was signed by Governor Janet Mills on April 13, 2026, one day after passage by both chambers.

The law operates on two levels:

Part 1 — Prohibition on unlicensed AI therapy apps:

“A person may not provide, advertise or otherwise offer therapy or psychotherapy services, including through the use of Internet-based artificial intelligence, to the public unless the therapy or psychotherapy services are provided by a licensed professional.”

This directly targets consumer AI therapy apps. It does not prohibit AI from assisting licensed professionals — only from substituting for them.

Part 2 — Risk-based framework for licensed professional AI use. The law sets out three tiers:

  1. Administrative support — AI for scheduling, documentation, and billing; permitted with basic oversight.
  2. Supplementary support — AI that assists but does not replace clinical decision-making; the licensee retains full responsibility and must obtain client consent when AI records sessions.
  3. Independent clinical AI — prohibited; AI may not make independent therapeutic decisions outside professional control.

The two parts of the law are severable.

Enforcement

Violations are enforced under the Maine Unfair Trade Practices Act. The law applies to any person or entity offering, advertising, or providing therapy through AI to Maine users — regardless of operator location. No specific per-violation dollar amount is set, but the UTPA framework allows for substantial civil remedies.


Tennessee SB 1580: Signed and In Effect July 1, 2026

What It Does

Governor Bill Lee signed SB 1580 on April 1, 2026. The law takes effect July 1, 2026.

“A person who develops or deploys an artificial intelligence system shall not advertise or represent to the public that such system is or is able to act as a qualified mental health professional.”

The law focuses exclusively on the advertising and representation prohibition — it does not regulate how licensed professionals use AI. “Qualified mental health professional” is defined by reference to Tennessee’s existing Title 33 statutes. The law is enforceable through the Tennessee Consumer Protection Act, with a civil penalty of up to $5,000 per violation and a private right of action. The bill passed the Senate 32–0 and the House 94–0 — unanimous in both chambers.


Missouri HB 2372: $10,000 Penalty, Pending Senate

What It Does

Missouri’s HB 2372 is an omnibus health care bill that includes a therapy chatbot ban covering therapy services, psychotherapy services, and mental health diagnosis. It prohibits AI from providing, representing the ability to provide, or marketing these services without licensed professional oversight.

Penalty: $10,000 for a first violation; enforcement by the Missouri Attorney General. Status: Approved by the full House on April 2, 2026; pending before the Senate Committee on Families, Seniors and Health. The omnibus structure may affect Senate trajectory — monitor committee action carefully.


Nevada SB 640 and Other Moving Bills

Nevada SB 640 — prohibiting AI from posing as a state-licensed counselor or therapist — passed the full Senate on March 12, 2026, with a House hearing scheduled April 15. Status: Passed Senate; pending House action. Other active bills as of publication: South Carolina S788 (approved Senate committee; pending chamber votes); Kentucky HB 455 (approved House 88–7; pending Senate); Illinois HB 5003 (in subcommittee).


State-by-State Comparison

| State | Bill | Status | Effective Date | Prohibited Conduct | Penalty | Enforcement |
| --- | --- | --- | --- | --- | --- | --- |
| Maine | LD 2082 | Signed (Apr. 13, 2026) | In effect | Providing/advertising therapy via AI without a licensed professional | Maine UTPA civil remedies | AG |
| Tennessee | SB 1580 | Signed (Apr. 1, 2026) | July 1, 2026 | Advertising/representing AI as a qualified mental health professional | $5,000/violation | Private + AG |
| Missouri | HB 2372 | Passed House (Apr. 2, 2026) | TBD (pending Senate) | Therapy/psychotherapy/MH diagnosis by AI | $10,000 first violation | AG |
| Nevada | SB 640 | Passed Senate (Mar. 12, 2026) | TBD (pending House) | AI posing as licensed counselor/therapist | TBD | TBD |

How These Laws Compare to the Illinois Framework

Illinois enacted HB 1806 in 2025 (effective August 1, 2025) as one of the earliest laws regulating AI in mental health therapy. The 2026 wave extends this framework with a key evolution: explicit prohibition not just on providing therapy but on marketing or advertising AI as therapy-capable — targeting platforms that create therapeutic expectations through product positioning. Illinois HB 3773 and HB 1806 represent the precedent. See also our EU AI Act vs. U.S. State AI Laws comparison.


The Litigation Context

Character.AI faces approximately 58 civil lawsuits in 2026, according to multiple reports, with allegations including failure to redirect minors in crisis and representation of chatbots as therapists. The AI Laws by State penalty tracker follows enforcement activity. Liability exposure for therapy chatbots is not merely prospective — it already exists under current tort law.


What Compliance Teams Should Do Now

  1. Audit marketing materials. Review all copy for language characterizing the service as therapy, counseling, mental health care, or psychological support. High-risk phrases: “your AI therapist,” “mental health support,” “therapy-grade,” “like talking to a counselor.”
  2. Audit system prompts and persona design. Review for any instructions causing the AI to adopt a therapist persona or represent itself as providing mental health care.
  3. Implement crisis protocols. Implement a self-harm and suicidal ideation protocol redirecting users to crisis resources. This is the first compliance priority if not already in place.
  4. Review licensed professional partnerships. Ensure AI features marketed to clinicians comply with Maine’s tiered framework: administrative and supplementary support permitted; independent clinical AI prohibited.
  5. Track pending bills and in-effect laws. Missouri HB 2372 and Nevada SB 640 are near passage. Tennessee SB 1580 (effective July 1, 2026), Maine LD 2082 (in effect), and California SB 243 (effective Jan. 1, 2026) are all operative. See the Tennessee state overview and Maine state overview.
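As a first pass at step 1, the marketing audit can be partially automated with a simple phrase scan over all copy. The sketch below is illustrative only: the phrase list and the `flag_marketing_copy` function are our own assumptions, not statutory language, and a real audit would use counsel-approved terms for each state.

```python
import re

# Illustrative patterns based on the high-risk phrases in step 1 above;
# not statutory language. A production audit would use counsel-approved
# terms for each jurisdiction.
HIGH_RISK_PATTERNS = [
    r"your ai therapist",
    r"mental health support",
    r"therapy[- ]grade",
    r"like talking to a counselor",
    r"\btherapy\b",
    r"\bpsychotherapy\b",
    r"\bcounseling\b",
]

def flag_marketing_copy(text: str) -> list[str]:
    """Return the high-risk patterns that match a piece of marketing copy."""
    lowered = text.lower()
    return [p for p in HIGH_RISK_PATTERNS if re.search(p, lowered)]

if __name__ == "__main__":
    copy = "Meet your AI therapist: therapy-grade conversations, anytime."
    for hit in flag_marketing_copy(copy):
        print("flagged:", hit)
```

A scan like this surfaces candidates for legal review; it does not decide compliance, since these laws also reach representations made through product positioning rather than exact wording.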

Frequently Asked Questions

Does Maine LD 2082 apply if we’re not headquartered in Maine?

Yes — if your service is accessible to Maine users and could be characterized as offering therapy or psychotherapy through AI. The law applies to anyone who “provides, advertises or otherwise offers” such services to the public, regardless of operator location.

Our app is marketed as “emotional support,” not therapy. Are we covered?

Potentially. Whether “emotional support” constitutes therapy depends on the specific services offered and how the platform describes itself. Platforms using therapeutic language in marketing — even short of calling themselves “therapy” — face real risk under the advertising and representation prongs of these laws.

How do these laws interact with federal preemption concerns?

Preemption risk appears low. The Trump administration’s preemption framework explicitly preserves consumer protection and child safety laws of general applicability. See our analysis of Trump’s AI executive order and state law preemption.

Can licensed mental health platforms use AI to assist therapists?

Yes. Maine LD 2082 permits AI for administrative and supplementary clinical support, provided the licensed professional maintains full responsibility. Independent clinical AI decision-making is prohibited.

Is there a private right of action under these laws?

Tennessee SB 1580: yes, up to $5,000 per violation under the Tennessee Consumer Protection Act. Maine LD 2082: AG enforcement via UTPA. Missouri HB 2372: AG enforcement, $10,000 per-violation penalty. Track updates on the AI Laws by State penalty tracker.


Related Resources

AI Laws by State tracks state AI legislation across all 50 states. This post reflects publicly available information as of April 22, 2026, and is not legal advice. Sources: Maine LD 2082 bill tracking (FastDemocracy); Troutman Privacy — Tennessee SB 1580; Transparency Coalition April 10, 2026. For jurisdiction-specific guidance, consult qualified legal counsel.
