Can AI Practice Medicine? California's Strict New Stance Explained
The line between 'advice' and 'diagnosis' just got a lot sharper in California. ✍️
The Definition
California Business and Professions Code Section 2052 defines the practice of medicine broadly. It covers diagnosing, treating, operating on, or prescribing for any physical or mental condition. Crucially, it is a crime for anyone (or anything) without a license to do any of these things.
The AI Role: Tool vs. Provider
California regulators view AI as a "tool," like a stethoscope or an MRI machine. A tool cannot practice medicine; only the human using the tool can.
- Allowed: AI analyzing data and presenting options to a doctor.
- Allowed: AI providing general educational information to a patient (e.g., "Here are CDC guidelines for the flu").
- Prohibited: AI telling a patient "You have the flu" or "Take this medication" without human review.
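In practice, the allowed/prohibited line above often gets enforced in product code as an output gate. Here is a minimal, hypothetical sketch of that idea: flag anything that reads like a diagnosis or treatment instruction and hold it for a licensed clinician, while letting general educational content pass through. The phrase list and routing labels are illustrative assumptions, not a complete compliance solution or legal advice.

```python
import re

# Hypothetical phrase list: language that sounds like diagnosis or
# treatment advice. A real system would need a far more robust classifier.
DIAGNOSTIC_PATTERNS = [
    r"\byou (likely )?have\b",
    r"\byou are suffering from\b",
    r"\btake this medication\b",
    r"\byour diagnosis is\b",
]

def requires_clinician_review(ai_output: str) -> bool:
    """Return True if the output sounds like a diagnosis or prescription."""
    text = ai_output.lower()
    return any(re.search(p, text) for p in DIAGNOSTIC_PATTERNS)

def route_output(ai_output: str) -> str:
    if requires_clinician_review(ai_output):
        # A human must review before the patient ever sees this.
        return "HOLD_FOR_CLINICIAN"
    # General educational information can go out directly.
    return "SEND_AS_INFORMATIONAL"
```

For example, `route_output("You likely have the flu")` is held for review, while `route_output("Here are CDC guidelines for the flu")` passes as informational. The design point is that the human, not the filter, remains the decision-maker: the gate only decides who sees the output first.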
The Risk
If your AI crosses the line into diagnosis, you risk aiding and abetting the unlicensed practice of medicine, which is a criminal offense in California. Penalties can include fines, jail time, and the shutdown of your company.
Conclusion
Keep your AI in its lane. Frame all outputs as "informational only" or "for review by a clinician." Never let the AI be the final word.
Frequently Asked Questions (FAQ)
What about "triage" bots?
Triage is a gray area. If the bot only routes patients ("Go to the ER" vs. "Make an appointment"), it is usually acceptable. If it says "You likely have appendicitis," that is a diagnosis. Be very careful with the language.
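The routing-only version of a triage bot described above can be sketched in a few lines. Everything here is a hypothetical illustration, including the keyword list and the disposition strings; the point is that the output names a destination, never a condition.

```python
# Hypothetical emergency keyword list -- illustrative only, not clinically
# validated. A real triage tool would need clinician-approved logic.
EMERGENCY_KEYWORDS = {
    "chest pain",
    "severe abdominal pain",
    "difficulty breathing",
}

def triage(symptom_description: str) -> str:
    """Route the patient by urgency without naming any condition."""
    text = symptom_description.lower()
    if any(keyword in text for keyword in EMERGENCY_KEYWORDS):
        # Routing only: no condition is named, which keeps this on the
        # "navigation" side of the line rather than diagnosis.
        return "Go to the ER"
    return "Make an appointment with your doctor"
```

Note what the function never does: it never returns "You likely have appendicitis." That single difference in output language is what separates routing from diagnosing under the framework described above.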
Can AI prescribe meds?
Absolutely not. Only a licensed prescriber with a DEA registration can prescribe controlled substances. Even for non-controlled medications, a human clinician must authorize the prescription.
Does this apply to mental health "coaching"?
Yes. "Coaching" is often a euphemism for therapy. If the AI is treating a mental health condition (like depression), it is practicing medicine/psychology.