Technology

Hallucination

Definition

When an AI model generates confident, fluent output that is factually incorrect or fabricated.

Compliance Relevance

Hallucination risk is a primary driver of California's strict AB 3030 regulations, which limit AI autonomy in clinical advice.

Why It Matters

Operationalize routine 'hallucination checks' in your workflows, and never trust zero-shot generative AI output for clinical calculations.

Expert Risk Analysis

EXTREME. In a medical context, a hallucination is not a typo; it is a potential medical error. The law treats the deploying provider as the originator of the false information.

Frequently Asked Questions

How can I prevent AI hallucinations?

Retrieval-Augmented Generation (RAG), conservative decoding settings (temperature 0.0), and mandatory human review are the industry-standard mitigations.
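One piece of such a pipeline, the "hallucination check" itself, can be sketched as a grounding filter: any generated sentence that cannot be matched back to the retrieved source passages is flagged for human review. This is a minimal, illustrative sketch using simple lexical overlap; the function names and threshold are assumptions, not a specific vendor API, and production systems typically use stronger semantic matching.

```python
def is_grounded(sentence: str, passages: list[str], threshold: float = 0.6) -> bool:
    """Crude lexical-overlap check: the share of content words in the
    sentence that also appear in a retrieved passage must exceed the
    threshold. (Hypothetical heuristic for illustration only.)"""
    words = {w.lower().strip(".,") for w in sentence.split() if len(w) > 3}
    if not words:
        return True
    passage_words = {w.lower().strip(".,") for p in passages for w in p.split()}
    return len(words & passage_words) / len(words) >= threshold

def review_output(generated: str, passages: list[str]) -> list[str]:
    """Return the sentences that fail the grounding check and should be
    routed to a human reviewer before reaching a patient."""
    sentences = [s.strip() for s in generated.split(".") if s.strip()]
    return [s for s in sentences if not is_grounded(s, passages)]

# Example: the second sentence is unsupported by the retrieved source.
passages = ["The recommended adult dose of drug X is 500 mg twice daily."]
output = ("The recommended adult dose of drug X is 500 mg twice daily. "
          "Drug X also cures insomnia overnight.")
flagged = review_output(output, passages)
```

Here `flagged` would contain only the unsupported insomnia claim, which a reviewer must confirm or strike; the grounded dosage sentence passes. The threshold is a tuning knob: too low lets fabrications through, too high drowns reviewers in false alarms.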

Is your AI deployment compliant?

California's new regulations are strict. Use our automated checker to see whether your handling of hallucination risk meets the requirements, along with other critical standards.

Check your Score