Delaware has not enacted AI-specific legislation, but existing state law and federal rules still apply. Despite the absence of targeted AI statutes, Delaware's outsized role in American corporate law means its courts — particularly the Court of Chancery — will likely shape AI liability precedent for companies nationwide. As of April 2026, Delaware is tracking 8 AI-specific bills, but the state's existing corporate governance framework, its new personal data privacy act, and applicable federal regulations create a compliance landscape that businesses cannot afford to ignore.
Why Delaware Matters for AI
More than 65% of Fortune 500 companies and over one million business entities are incorporated in Delaware. This is not a coincidence — Delaware's General Corporation Law (DGCL), its specialized Court of Chancery, and decades of well-developed case law make it the dominant jurisdiction for corporate governance disputes in the United States.
For AI, this concentration has a critical consequence: when boards of directors make decisions about adopting, deploying, or governing AI systems, those decisions are subject to Delaware fiduciary duty law. When shareholders challenge those decisions, the cases will overwhelmingly be heard in the Court of Chancery. The precedents set in Wilmington will ripple across every state.
Court of Chancery and Fiduciary Duty
Delaware's Court of Chancery is the principal forum for corporate governance disputes. Directors owe fiduciary duties of care and loyalty to shareholders. As companies integrate AI into core business operations — from automated underwriting to algorithmic trading to AI-driven hiring — the question of whether boards are exercising adequate oversight of these systems becomes a fiduciary duty question under Delaware law.
The landmark Caremark doctrine (In re Caremark International Inc. Derivative Litigation, 1996) requires directors to implement and monitor compliance systems. Courts have increasingly signaled that this duty extends to technology risk oversight, making it plausible that a board's failure to establish AI governance frameworks could give rise to a Caremark claim.
Delaware Personal Data Privacy Act
In 2023, Delaware enacted the Delaware Personal Data Privacy Act (DPDPA), which took effect January 1, 2025. While not AI-specific, the DPDPA has direct implications for any company using AI systems that process the personal data of Delaware residents.
Key Provisions Relevant to AI
- Consent for sensitive data: Companies must obtain opt-in consent before processing sensitive personal data, which includes biometric data, geolocation data, and data revealing race, ethnicity, or health conditions — all categories commonly ingested by AI systems
- Right to opt out of profiling: Delaware residents have the right to opt out of profiling that produces legal or similarly significant effects, directly covering AI-driven automated decision-making
- Data protection assessments: Controllers must conduct data protection assessments for processing activities that present a heightened risk of harm, including targeted advertising and profiling
- Purpose limitation: Personal data may only be processed for purposes reasonably necessary and proportionate to the disclosed purpose, constraining how AI training data can be used
The DPDPA applies to entities that conduct business in Delaware or target Delaware residents and process the personal data of at least 35,000 consumers, or 10,000 consumers if more than 20% of revenue comes from selling personal data. The Delaware Department of Justice enforces the act.
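The threshold test above can be sketched in code. This is an illustrative simplification of the statutory applicability rules as summarized in this article, not a substitute for legal analysis; the function name and inputs are our own.

```python
# Hypothetical sketch of the DPDPA applicability thresholds described above.
# Threshold figures follow the article's summary of the statute; this is an
# illustration, not legal advice.

def dpdpa_applies(consumers_processed: int,
                  revenue_share_from_data_sales: float) -> bool:
    """Return True if the DPDPA's processing thresholds are met.

    consumers_processed: Delaware consumers whose personal data the
        entity controls or processes in a calendar year.
    revenue_share_from_data_sales: fraction of gross revenue derived
        from the sale of personal data (0.0 to 1.0).
    """
    if consumers_processed >= 35_000:
        return True
    # The lower threshold applies only when data sales exceed 20% of revenue.
    return consumers_processed >= 10_000 and revenue_share_from_data_sales > 0.20
```

A data broker processing 12,000 Delaware consumers' data with 25% of revenue from data sales would be covered; the same entity with 10% of revenue from data sales would not.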
Federal Rules That Apply in Delaware
In the absence of state-level AI legislation, federal regulations and enforcement actions establish the primary compliance floor for Delaware businesses. Three agencies are especially relevant given Delaware's corporate profile.
Federal Trade Commission (FTC)
The FTC has aggressively pursued AI-related enforcement under Section 5 (unfair or deceptive practices). Companies using AI for consumer-facing decisions — credit scoring, advertising, product recommendations — must ensure their systems do not produce deceptive outcomes. The FTC has signaled that AI-generated claims require the same substantiation as human-generated claims, and that algorithmic bias can constitute an unfair practice.
Equal Employment Opportunity Commission (EEOC)
The EEOC has issued guidance clarifying that employers remain liable under Title VII and the ADA when AI hiring tools produce discriminatory outcomes, even if the tools are developed by third-party vendors. Delaware-incorporated companies using AI in recruiting, screening, or performance evaluation should audit these systems for disparate impact.
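One common starting point for the disparate impact audit mentioned above is the "four-fifths" rule of thumb from the Uniform Guidelines on Employee Selection Procedures, which EEOC technical guidance references for AI selection tools. The sketch below assumes simple per-group selection counts; a failing ratio is a flag for further statistical analysis, not a legal conclusion.

```python
# Illustrative four-fifths-rule screen for an AI hiring tool's outcomes.
# The 0.8 cutoff comes from the Uniform Guidelines; variable names and the
# example data are our own assumptions.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group that the tool advanced."""
    return selected / applicants

def four_fifths_check(group_rates: dict[str, float]) -> bool:
    """Return True if every group's selection rate is at least 80% of
    the highest group's rate (i.e., no adverse-impact flag)."""
    highest = max(group_rates.values())
    return all(rate / highest >= 0.8 for rate in group_rates.values())

# Example: the screener advances 30 of 100 applicants in group A
# but only 18 of 90 in group B.
rates = {"A": selection_rate(30, 100), "B": selection_rate(18, 90)}
passes = four_fifths_check(rates)  # 0.20 / 0.30 is below 0.8, so flagged
```

Documenting runs of a check like this, alongside vendor validation studies, is the kind of audit trail the EEOC guidance contemplates.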
Securities and Exchange Commission (SEC)
The SEC's relevance to Delaware is amplified by the fact that the majority of publicly traded U.S. companies are Delaware corporations. The SEC has proposed rules requiring registered investment advisers and broker-dealers to address conflicts of interest arising from predictive data analytics and AI. Public companies must also ensure that AI-related risks are disclosed in SEC filings where material. The SEC has warned against "AI washing" — making misleading claims about AI capabilities in investor communications.
Emerging Corporate Governance Questions
Delaware corporate law is poised to address several open questions as AI adoption accelerates.
AI in Board Decision-Making
Can directors rely on AI-generated analysis when making business decisions? Under DGCL Section 141(e), directors may rely in good faith on reports from officers, employees, and experts. Whether an AI system qualifies as a source entitled to good-faith reliance remains untested. Boards that rely heavily on AI-generated recommendations without independent human review may risk losing the deference of the business judgment rule.
Caremark Duties and AI Oversight
The Caremark duty requires boards to make a good-faith effort to implement a compliance and reporting system. As AI systems make decisions with legal, financial, and safety consequences, directors who fail to establish AI risk oversight mechanisms may face derivative suits. Recent Delaware decisions in other contexts (e.g., the Delaware Supreme Court's Marchand v. Barnhill, 2019) have demonstrated that Caremark claims can survive dismissal when boards are shown to have ignored mission-critical compliance risks.
Officer Exculpation and AI Failures
In 2022, Delaware amended the DGCL to permit officer exculpation for breaches of the duty of care. However, duty of loyalty claims — including bad-faith failures to oversee AI systems — remain non-exculpable. Officers who knowingly deploy AI systems without adequate safeguards may face personal liability.
What to Watch
- Court of Chancery AI cases: Any shareholder derivative suit challenging board oversight of AI systems will set national precedent. Watch for Caremark-style claims in the AI context
- DPDPA enforcement: The Delaware Department of Justice's first enforcement actions under the Personal Data Privacy Act will signal how strictly profiling and automated decision-making provisions are interpreted
- Legislative session: Delaware's General Assembly may introduce AI-specific bills in future sessions, particularly around algorithmic discrimination, deepfakes, or government AI use
- SEC AI disclosure rules: As the SEC finalizes AI-related disclosure requirements, Delaware-incorporated public companies will be among the first to face compliance obligations
- DGCL amendments: The Delaware State Bar Association's Corporation Law Council regularly proposes DGCL amendments; AI governance provisions could be introduced as corporate AI adoption grows
Federal AI Compliance at a Glance
| Agency | Area | Key Requirement | Delaware Relevance |
|---|---|---|---|
| FTC | Consumer protection | No deceptive AI claims; algorithmic fairness | All DE businesses with consumers |
| EEOC | Employment | Audit AI hiring tools for disparate impact | All DE employers using AI in HR |
| SEC | Securities & disclosure | Material AI risk disclosure; no AI washing | Majority of U.S. public companies (DE-incorporated) |
| CFPB | Financial services | Fair lending in AI credit decisions | DE-chartered banks and fintechs |
Compliance Checklist for Delaware
- Assess DPDPA obligations — determine whether your organization meets the processing thresholds; if so, implement consent mechanisms for sensitive data, honor opt-out requests for profiling, and conduct data protection assessments for AI-driven processing
- Review board AI oversight — ensure the board of directors has established governance frameworks for AI risk, including reporting lines and escalation protocols, to satisfy Caremark duties
- Audit AI hiring tools — if you use AI in recruiting or employment decisions, conduct disparate impact analyses and document validation procedures to comply with EEOC guidance
- Evaluate SEC disclosure — for publicly traded Delaware corporations, review whether AI-related risks and capabilities are material and require disclosure in annual or quarterly filings
- Substantiate AI marketing claims — ensure all public statements about AI capabilities are accurate and supported by evidence, in line with FTC enforcement priorities
- Document AI systems inventory — maintain a centralized registry of AI tools used across the organization, their purposes, data inputs, and decision-making scope
- Monitor Delaware courts — track Court of Chancery decisions related to technology oversight, fiduciary duties, and any AI-specific litigation
- Prepare for future legislation — establish an AI governance framework now so that compliance with anticipated state-level AI laws requires incremental rather than foundational changes
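The AI systems inventory in the checklist above can be as simple as a structured record per tool. The sketch below is a minimal, hypothetical schema — the field names and the heightened-risk heuristic are our own assumptions, not a mandated format — but it captures the purposes, data inputs, and decision-making scope the checklist calls for.

```python
# Minimal sketch of a centralized AI-systems registry. The schema and the
# heightened-risk heuristic are illustrative assumptions, not a legal standard.
from dataclasses import dataclass

@dataclass
class AISystemRecord:
    name: str                     # internal tool name
    purpose: str                  # business purpose, e.g. "resume screening"
    data_inputs: list[str]        # categories of personal data ingested
    decision_scope: str           # "advisory" vs "automated" decision-making
    owner: str                    # accountable business unit or officer
    sensitive_data: bool = False  # ingests DPDPA-sensitive categories?

registry: list[AISystemRecord] = []

def needs_dpdpa_assessment(record: AISystemRecord) -> bool:
    """Flag systems whose processing may present heightened risk
    (automated decisions or sensitive data) for a data protection
    assessment and board-level reporting."""
    return record.sensitive_data or record.decision_scope == "automated"

registry.append(AISystemRecord(
    name="candidate-screener",
    purpose="resume screening",
    data_inputs=["employment history", "education"],
    decision_scope="automated",
    owner="HR",
))
```

A registry like this gives the board a concrete artifact for Caremark-style oversight: each flagged record becomes an agenda item for the assessment and escalation steps in the checklist.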
For a complete index of Delaware AI legislation, visit our Delaware AI laws tracker.
This article is for informational purposes only and does not constitute legal advice. Consult a qualified attorney for guidance specific to your situation.
— AI Laws by State Team
Subscribe to the weekly digest →