California Bill Targets AI Chatbots, Awaits Governor Gavin Newsom's Decision

Key Takeaways

  • SB 243, a California bill regulating AI chatbots designed for companionship, passed the state legislature and awaits Governor Newsom's approval;
  • If signed, the law would take effect on January 1, 2026, requiring companies to alert users, especially minors, that AI chatbots are not real people;
  • The bill includes annual reporting rules and gives individuals the right to sue companies that do not follow the safety requirements.

A proposal in California that would regulate artificial intelligence (AI) chatbots designed for personal interaction has passed the state legislature and awaits approval from Governor Gavin Newsom.

Known as Senate Bill 243, the legislation received backing from both Democratic and Republican lawmakers. Newsom must decide whether to approve or reject it by October 12.

If enacted, the law would take effect on January 1, 2026. This would mark the first instance of a US state requiring companies that develop or run AI companions to follow specific safety practices.

The bill outlines several new responsibilities for companies offering AI companions, programs that simulate human-like responses to fulfill users’ social or emotional needs.

One key requirement is that these systems must frequently notify users, especially minors, that they are communicating with a machine. For users under 18, these reminders would appear every three hours, along with prompts to take breaks.

Additionally, companies would need to report annually on how their systems are being used. These reports, required starting in July 2027, would need to include information on how often users are directed to mental health or emergency services.

Under the proposed law, individuals who feel they have been harmed due to a company’s failure to follow the rules would be allowed to sue. They could seek court-ordered changes, compensation (up to $1,000 per violation), and legal costs.

Recently, the US Federal Trade Commission (FTC) initiated a formal review into the potential impact of AI chatbots on children and teenagers.

