Navigating SB 942: Is Your AI Detection Tool Legally Compliant?
SB 942, California's AI Transparency Act, introduces a unique requirement for "Covered Providers" of generative AI: you must not only label your content but also give the public a tool to verify it.
The "Free Detection Tool" Mandate
If your system generates synthetic content (image, video, or audio), you are required to provide a publicly available, free-of-charge tool that allows users to determine whether a piece of content was created or altered by your system.
Requirements for the Tool:
- Accessibility: It must be easy to find on your website.
- Functionality: It must accurately identify content created or altered by your model.
- Feedback: It should provide a clear "Yes/No" or probability score.
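One common way to back a clear "Yes/No" verdict is a keyed signature applied at generation time and checked by the public tool. A minimal sketch, assuming an HMAC scheme; the key, function names, and the signature approach itself are illustrative choices, not anything SB 942 prescribes:

```python
import hmac
import hashlib

PROVIDER_KEY = b"replace-with-a-real-secret"  # hypothetical signing key

def sign_content(content: bytes) -> str:
    """Tag content at generation time so it can be recognized later."""
    return hmac.new(PROVIDER_KEY, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, tag: str) -> bool:
    """Public-tool check: does the tag prove we generated this content?"""
    return hmac.compare_digest(sign_content(content), tag)
```

In practice the tag would be extracted from the content's latent disclosure metadata rather than submitted separately, but the verification logic is the same.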
Watermarking Standards
SB 942 distinguishes between two types of disclosures: a manifest disclosure, which you must at minimum offer users the option to include, and a latent disclosure, which is mandatory in covered content.
1. Manifest Disclosure
This is a disclosure that is perceptible to the human user.
Example: A text label saying "Generated by AI" on an image, or a spoken disclaimer at the start of an audio clip.
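For text-style output, a manifest disclosure can be as simple as a label prepended to every response, plus a check that the label survived. A short sketch; the label wording and helper names are assumptions, since the statute does not dictate exact phrasing:

```python
DISCLOSURE = "Content generated by AI"  # assumed wording, not statutory

def add_manifest_disclosure(text: str) -> str:
    """Prepend a clearly perceptible disclosure line to generated output."""
    return f"{DISCLOSURE}\n\n{text}"

def has_manifest_disclosure(text: str) -> bool:
    """Quick compliance check: does the output still carry the label?"""
    return text.startswith(DISCLOSURE)
```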
2. Latent Disclosure (Watermarking)
This is metadata embedded directly into the content itself (or its generation data). It must be:
- Permanent: Permanent, or extraordinarily difficult to remove, to the extent technically feasible.
- Machine-Readable: Detectable by your public tool and other standard verification systems.
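A latent disclosure is typically a small, structured payload. The sketch below assembles one with fields (provider name, system name and version, creation timestamp, unique identifier) that track the categories of information commonly associated with SB 942 latent disclosures; the exact schema and field names here are assumptions:

```python
import json
import uuid
from datetime import datetime, timezone

def build_latent_disclosure(provider: str, system: str, version: str) -> str:
    """Serialize the machine-readable fields a latent disclosure carries."""
    payload = {
        "provider": provider,            # name of the covered provider
        "system": system,                # name of the GenAI system
        "version": version,             # system version number
        "created": datetime.now(timezone.utc).isoformat(),  # time of creation
        "content_id": str(uuid.uuid4()),  # unique identifier for this output
    }
    return json.dumps(payload, sort_keys=True)
```

Embedding this payload durably in the file (e.g., via a provenance standard such as C2PA) is the hard part; the payload itself is straightforward.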
The 96-Hour Rule for Licensees
If you license your model to third parties, you have strict enforcement obligations. If you discover that a licensee has modified or deployed your system so that it no longer applies the required watermarking/disclosures, you must revoke their license within 96 hours of discovery.
Risk Alert
Failing to revoke a non-compliant licensee's access makes YOU liable for their violations. Automated monitoring of your API usage is highly recommended.
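The recommended monitoring can be as simple as timestamping each violation at discovery and tracking the statutory window. A minimal sketch; the function names are illustrative:

```python
from datetime import datetime, timedelta, timezone

REVOCATION_WINDOW = timedelta(hours=96)  # SB 942's revocation deadline

def revocation_deadline(discovered_at: datetime) -> datetime:
    """Latest moment the licensee's access may remain active."""
    return discovered_at + REVOCATION_WINDOW

def is_overdue(discovered_at: datetime, now: datetime) -> bool:
    """True once the 96-hour window has lapsed without revocation."""
    return now > revocation_deadline(discovered_at)
```

In production this check would run against a violations table on a schedule, paging an operator (or auto-revoking API keys) well before the deadline.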
Conclusion
SB 942 forces AI companies to take ownership of their output. Building a robust, public-facing detection tool is no longer a "nice-to-have" feature—it is a legal necessity for doing business in California.