Hospital AI Compliance in California (2026)

Hospitals deploy AI across clinical decision support, billing, patient communications, and operational workflows. California's AB 3030, SB 1120, AB 2013, and AB 489 create overlapping obligations that affect nearly every enterprise-level AI deployment.

The Hospital AI Compliance Landscape

Hospitals face the broadest AI compliance surface area of any healthcare setting. A large health system may deploy dozens of distinct AI systems — from EHR-embedded clinical decision support (CDS) to revenue cycle automation to patient portal chatbots. Each category triggers different obligations under California law.

The Medical Board of California has publicly indicated that large health systems will be priority targets in early enforcement waves, particularly around AI-generated clinical communications and automated claim denial workflows.

SB 1120: AI Cannot Deny Claims Without Human Review

SB 1120, the Physicians Make Decisions Act, is the most operationally disruptive law for hospitals that operate health plans or manage utilization review functions. The law prohibits any algorithm or AI system from serving as the final decision-maker on coverage denials for medical necessity. This means:

  • Utilization management platforms (InterQual, MCG, proprietary tools) may flag a claim for denial, but a licensed, qualified physician or other appropriate clinician must make the actual determination
  • The clinician's decision must be documented independently — rubber-stamping AI output without independent review does not satisfy the requirement
  • Hospital-operated health plans, contracted IPAs, and third-party utilization management vendors all fall within scope
  • Audit trails showing the human reviewer's credentials and the basis for their determination are essential for compliance documentation
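The audit-trail requirement above can be sketched as a simple record type. This is a minimal illustration, not an implementation of any vendor platform: the field names, the `validate` rule, and the record structure are all assumptions chosen to show what "documented independent review" might look like in data, not language drawn from the statute.

```python
# Hypothetical SB 1120-style audit record for a coverage determination.
# Field names and the validation rule are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class CoverageDetermination:
    claim_id: str
    ai_recommendation: str   # what the UM tool flagged, e.g. "deny" (advisory only)
    reviewer_name: str       # licensed clinician making the final call
    reviewer_license: str    # credential captured for the audit trail
    determination: str       # the human reviewer's final decision
    rationale: str           # independent clinical basis, not a rubber stamp
    reviewed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def validate(self) -> None:
        """Reject records that look like unreviewed AI denials."""
        if self.determination == "deny" and not self.rationale.strip():
            raise ValueError(
                f"Claim {self.claim_id}: denial recorded without an "
                "independent clinical rationale (SB 1120 risk)"
            )
```

A record like this makes the "no rubber-stamping" point concrete: a denial with an empty rationale field fails validation before it ever reaches the claim file.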

Compliance Tip

Vendor contracts must be updated to require SB 1120-compliant workflows. A utilization management vendor that issues AI-driven denials autonomously exposes the contracting hospital to liability alongside the vendor.

AB 3030: Human-in-the-Loop for Patient Communications

AB 3030 targets generative AI used to produce clinical communications sent directly to patients. In a hospital context, the most common triggers are:

  • Discharge instruction drafts — AI-written post-hospitalization instructions sent without clinician review
  • AI-generated care plan summaries — automated summaries pushed to the patient portal
  • Lab and imaging result explanations — LLM-drafted interpretive text sent alongside results
  • Chronic care management outreach — automated AI messaging to high-risk patient populations

Hospitals have two compliance paths:

  • Clinician review — a licensed clinician reviews AI-generated content before it is sent to the patient, or
  • Disclosure — the message carries the AB 3030-required disclosure stating that the content was AI-generated and not reviewed by a human provider, including instructions on how to reach a clinician
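The routing logic between these two paths can be sketched in a few lines. The disclosure wording below is placeholder text of our own, not statutory language — hospitals should use counsel-approved disclosure language — and `prepare_patient_message` is a hypothetical helper name.

```python
# Illustrative routing sketch for AB 3030's two compliance paths.
# The disclosure text is a placeholder, not the statute's required wording.
AB3030_DISCLOSURE = (
    "This message was generated by artificial intelligence and has not "
    "been reviewed by a licensed healthcare provider. To speak with a "
    "clinician, use the contact option in your patient portal."
)


def prepare_patient_message(draft: str, clinician_reviewed: bool) -> str:
    """Return a sendable message: clinician-reviewed drafts go out as-is;
    unreviewed AI drafts carry the disclosure appended."""
    if clinician_reviewed:
        return draft
    return f"{draft}\n\n{AB3030_DISCLOSURE}"
```

The key design point is that the gate sits in the send pipeline itself, so an unreviewed draft cannot reach a patient without the disclosure attached.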

AB 489: Patient-Facing AI Avatars and Chatbots

Hospital patient portals and call center AI bots fall under AB 489, which prohibits any AI from implying it is a licensed healthcare professional. Every interaction with a patient-facing AI system must begin with a clear disclosure that the user is communicating with an AI — not a nurse, physician, or other licensed provider.

AI avatars in virtual care settings that wear scrubs, use "Nurse" or "Dr." identifiers, or otherwise suggest clinical authority violate AB 489 regardless of whether a human provider is available in the background.
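A session-opening check along these lines is one way to operationalize both requirements — the up-front AI disclosure and the ban on licensure-implying personas. The blocked title list, the disclosure wording, and the `open_session` helper are all illustrative assumptions, not requirements spelled out in AB 489.

```python
# Hedged sketch: start-of-session AI disclosure plus a screen for persona
# names that could imply clinical licensure. Blocked terms are assumptions.
import re

PROHIBITED_TITLES = {"dr", "doctor", "nurse", "rn", "md", "physician"}

AI_DISCLOSURE = (
    "You are chatting with an automated AI assistant, not a nurse, "
    "physician, or other licensed healthcare provider."
)


def open_session(persona_name: str) -> str:
    """Return the first message of a chat session, refusing personas
    whose names suggest clinical licensure."""
    tokens = set(re.findall(r"[a-z]+", persona_name.lower()))
    if tokens & PROHIBITED_TITLES:
        raise ValueError(f"Persona name {persona_name!r} implies licensure")
    return f"{persona_name}: {AI_DISCLOSURE}"
```

Tokenizing the name (rather than substring matching) avoids false positives — "Fern" contains the letters "rn" but is not a blocked title.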

AB 2013: Training Data Transparency for Internally Developed AI

Hospitals that develop proprietary AI tools — including those built on fine-tuned foundation models using patient records, clinical notes, or administrative data — must comply with AB 2013. The law requires public disclosure of the categories of data used to train generative AI systems. For hospitals, this means:

  • Publishing a training data disclosure for any AI system that generates patient-facing or clinical outputs
  • Documenting whether HIPAA-regulated data was used and how it was de-identified
  • Maintaining the disclosure even as models are updated or retrained
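The three items above suggest a disclosure record a hospital could maintain and republish on each retrain. This is a minimal sketch under our own assumptions — AB 2013 specifies the required disclosure content in more detail than these illustrative field names capture, and `to_public_record` is a hypothetical helper.

```python
# Hypothetical structure for an AB 2013 training data disclosure entry.
# Field names are illustrative; the statute's required content is broader.
from dataclasses import dataclass, asdict
from typing import List


@dataclass
class TrainingDataDisclosure:
    system_name: str
    data_categories: List[str]        # e.g. ["clinical notes", "claims data"]
    used_hipaa_regulated_data: bool   # was HIPAA-regulated data in training?
    deidentification_method: str      # how it was de-identified, if applicable
    last_updated: str                 # refresh on every retrain or model update


def to_public_record(disclosure: TrainingDataDisclosure) -> dict:
    """Serialize the disclosure for publication on the hospital's website."""
    return asdict(disclosure)
```

Keeping `last_updated` as an explicit field mirrors the maintenance obligation: the published record should change whenever the model is retrained.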

Recommended Compliance Checklist for Hospitals

  • Audit all utilization management and claim denial workflows for SB 1120 compliance; update vendor contracts accordingly
  • Map every AI system that generates patient-facing communications and classify each as needing human review or AB 3030 disclosures
  • Add AB 489 disclosures to all patient portal chatbots, virtual assistants, and AI call center systems
  • Review AI avatar designs for prohibited clinical camouflage (white coats, "Dr." names, nurse scrubs)
  • Publish AB 2013 training data disclosures for any internally developed AI tools
  • Implement audit logging for all AI-generated clinical communications and claim review workflows
  • Designate a compliance officer responsible for AI regulatory monitoring and documentation

Frequently Asked Questions

Does SB 1120 apply to hospital-operated health plans?

Yes. SB 1120 prohibits AI from serving as the final decision-maker on health insurance claim denials. Hospital-operated health plans and their contracted utilization management vendors must ensure a licensed, qualified clinician makes the final determination on any coverage denial — AI tools may assist but cannot issue the denial autonomously.

Which hospital AI systems require AB 3030 disclosures?

Any generative AI system used to draft patient-facing clinical communications requires an AB 3030 disclosure if a licensed clinician does not review the output before it reaches the patient. This includes AI-drafted discharge instructions, care plan summaries, post-procedure follow-up messages, and automated lab result explanations.

Does AB 2013 apply to AI tools developed in-house by a hospital?

Yes. AB 2013's disclosure obligations fall on developers of generative AI systems, and a hospital that builds proprietary AI tools using internal data is acting as a developer under the law. Hospitals that trained models on patient records or clinical notes must publish a data transparency disclosure describing the categories of training data used.

Generate Your AB 3030 Disclosure

Use our free tool to create compliant disclosure language for your hospital's AI patient communications.

Open Disclosure Generator →