SB 1120: Why AI Can No Longer Deny Health Insurance Claims in CA
The 'Physicians Make Decisions Act' is here. AI is a tool, not a judge. ⚖️
The Problem
In recent years, health insurers have used algorithms to process claims in bulk. While efficient, this practice led to reports of patients being denied necessary care by an algorithm, often with little recourse. SB 1120 was passed to stop it.
The Solution
SB 1120 mandates that any decision to deny, modify, or delay health care services based on medical necessity must be made by a licensed physician.
That physician must hold a current, valid license and be competent to evaluate the specific clinical issues involved. They cannot simply rubber-stamp the AI's output; they must review the relevant medical records themselves.
Impact on InsurTech
If your business model relies on "automated claims processing" to cut costs, you need to pivot. You can still use AI to:
- Auto-approve claims (since this benefits the patient).
- Summarize records for the reviewer.
- Flag potential issues for human review.
But you cannot use AI to issue a denial without human eyes on the file.
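The permitted uses above boil down to a simple routing rule: software may approve or escalate, but never deny. A minimal sketch of that rule in Python — the `Claim` class, field names, and `triage` function are hypothetical illustrations, not anything from the statute or a real claims system:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    claim_id: str
    ai_recommendation: str  # "approve", "deny", or "uncertain"

def triage(claim: Claim) -> str:
    """Route a claim. The AI may auto-approve; it may never auto-deny."""
    if claim.ai_recommendation == "approve":
        # Auto-approval is allowed: it benefits the patient.
        return "auto-approved"
    # Denials and uncertain cases must go to a licensed physician,
    # who reviews the relevant medical records before deciding.
    return "routed to physician review"
```

The key design point is that no code path returns a denial: the only terminal states are approval or human review, which mirrors the law's accountability requirement.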
Conclusion
Human judgment is back in the driver's seat. AI remains a powerful assistant, but in California, it is not a judge.
Frequently Asked Questions (FAQ)
Does this apply to dental or vision?
Yes, it applies to "health care services," which generally includes specialized care like dental and vision if they involve medical necessity determinations.
Can a nurse review the claim?
The law specifies a "physician" for medical necessity denials. However, other licensed professionals (like dentists or chiropractors) may review claims within their scope of practice.
What if the AI is 99.9% accurate?
It doesn't matter. The law is about accountability, not just accuracy. A human must be accountable for the denial.