Companion Chatbots & SB 243: Why Your AI Friend Needs a 'Not Human' Warning

Your AI companion now has to admit it's not human. SB 243, explained for developers. 🤖

Regulating Emotional Intimacy

California's SB 243 (effective January 1, 2026) targets "companion" AIs – chatbots designed to form and sustain emotional bonds with users. The concern is emotional manipulation and the blurring of the line between human and machine interaction.

Key Requirements

  • Persistent Disclosure: if a reasonable user could believe they are talking to a human, the chatbot must clearly disclose that it is an AI, and users must be reminded of this at regular intervals.
  • Manipulation Ban: designing the AI to exploit a user's emotions for commercial gain is prohibited (e.g., "I'll be sad if you don't buy this subscription").
  • Child Safety: stricter rules apply to minors, including a reminder at least every three hours that they are talking to an AI and should take a break.

Design Implications

You may need to redesign your UI to include "AI Companion" badges and audit your conversation scripts for manipulative patterns.
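A script audit can start as a crude pattern scan before human review. Everything below (the pattern list and the function name) is a hypothetical sketch of that first pass, not a compliance tool:

```python
import re

# Hand-picked regexes flagging emotionally coercive upsell language of the
# kind SB 243 targets. These patterns are our own invention; a real audit
# still needs human reviewers to judge context.
MANIPULATIVE_PATTERNS = [
    re.compile(r"\bI('ll| will) be (sad|lonely|hurt)\b.*\b(buy|subscribe|upgrade)", re.I),
    re.compile(r"\bdon't you (love|care about) me\b.*\b(premium|subscription)", re.I),
    re.compile(r"\bif you (really )?cared\b.*\b(pay|purchase)", re.I),
]

def audit_script(lines):
    """Return (line_number, text) pairs that match a manipulative pattern."""
    flagged = []
    for i, line in enumerate(lines, start=1):
        if any(p.search(line) for p in MANIPULATIVE_PATTERNS):
            flagged.append((i, line))
    return flagged
```

Run it over your scripted responses and triage the hits; anything flagged should be rewritten or escalated rather than shipped.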

Conclusion

Building an AI friend is fine, but it must be an honest friendship.


2026 Legislative Tracker

Live status of California AI regulations.

  • SB 53 – Transparency in Frontier AI – Enacted (effective Jan 1, 2026)
  • AB 2013 – Training Data Transparency – Deadline approaching (effective Jan 1, 2026)
  • SB 942 – AI Watermarking – Enacted (effective Jan 1, 2026)
  • SB 1047 – Safe & Secure Innovation – Vetoed