AB 489: Can Your AI Chatbot Call Itself a 'Doctor'?
Warning: Calling your AI a 'Virtual MD' in California could now lead to massive fines. ⚖️
Protected Titles
California law strictly protects medical titles. Only licensed individuals can call themselves "Doctor," "Physician," "Nurse," or "MD." AB 489 extends this protection to the digital realm, explicitly prohibiting AI tools from using these titles or any variation that implies medical licensure.
This means you cannot name your chatbot "Dr. AI," "Nurse Bot," or "Virtual Physician." Even playful names like "DocBot" are risky and should be avoided.
Visual Cues
It's not just about words. AB 489 also targets "clinical camouflage"—the use of visual elements to imply medical authority.
- White Coats: An avatar wearing a white coat is a strong visual signal of a physician, and using one for an AI can now put your tool in violation.
- Stethoscopes: A stethoscope in your app icon or avatar design can be considered misleading.
- Scrubs: Even blue scrubs can imply a licensed medical professional, such as a nurse.
Safe Alternatives
To stay compliant, use terms that clearly describe the tool's function without claiming licensure:
- "Health Assistant"
- "Medical Support Tool"
- "Symptom Checker"
- "Care Guide"
Always include a disclaimer: "I am an AI assistant, not a doctor. I cannot provide a medical diagnosis."
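If your product wraps chatbot responses in code, one way to make that disclaimer hard to forget is to enforce it at the response layer. This is a minimal illustrative sketch, not legal advice; the function name and structure are hypothetical, and you should adapt the wording to your counsel's guidance.

```python
DISCLAIMER = "I am an AI assistant, not a doctor. I cannot provide a medical diagnosis."

def with_disclaimer(response: str) -> str:
    """Append the non-licensure disclaimer to a chatbot response if it is missing."""
    if DISCLAIMER in response:
        return response  # avoid stacking duplicate disclaimers
    return f"{response}\n\n{DISCLAIMER}"
```

Centralizing the disclaimer in one wrapper means no individual prompt or feature can silently ship without it.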
Conclusion
Marketing matters. Audit your copy, your app store listing, and your UI assets. Ensure you aren't crossing the line into misrepresentation. The goal is to be helpful, not deceptive.
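A first pass at that audit can be automated. The sketch below scans marketing or UI copy for protected titles and risky variations. The term list here is a hypothetical starting point, not the statutory list, and a regex match is a flag for human review, not a legal determination.

```python
import re

# Hypothetical watchlist of protected titles and risky variations.
# The actual protected terms under California law are broader; review
# flagged copy (and borderline misses) with counsel.
PROTECTED_TERMS = [
    r"\bdoctor\b", r"\bdr\.?\b", r"\bphysician\b", r"\bnurse\b",
    r"\bmd\b", r"\bdocbot\b", r"\bvirtual\s+(?:md|physician|doctor)\b",
]
PATTERN = re.compile("|".join(PROTECTED_TERMS), re.IGNORECASE)

def flag_copy(text: str) -> list[str]:
    """Return any protected-title matches found in a piece of copy."""
    return [m.group(0) for m in PATTERN.finditer(text)]
```

Run this over app store listings, onboarding screens, and bot names before release; an empty result means only that this particular watchlist found nothing.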
Frequently Asked Questions (FAQ)
Can I use a cartoon doctor as an avatar?
It's risky. While a clearly stylized cartoon is better than a photorealistic human, if it wears a white coat and stethoscope, it could still be seen as implying medical authority. It's safer to use a robot or abstract avatar.
What if a real doctor supervises the AI?
Even if a doctor supervises the AI, the AI itself is not a doctor. You can say "Supervised by Dr. Smith," but you cannot call the AI "Dr. Smith's Bot" in a way that suggests the bot is the doctor.
Does this apply to internal tools used by clinicians?
The risk is lower for professional-facing tools, as doctors are less likely to be misled than patients. However, accurate labeling is still a best practice to avoid confusion about the tool's capabilities.