What California’s First AI Safety Law Means for OpenAI, Meta, and Google
Governor Gavin Newsom signs SB 53, the first U.S. law requiring AI companies to disclose safety protocols.

By Indrani Priyadarshini

September 30, 2025

California Governor Gavin Newsom has signed Senate Bill 53 (SB 53), establishing the first U.S. state-level law that mandates transparency and safety requirements for large artificial intelligence companies. The groundbreaking legislation, passed by the state legislature two weeks earlier, directly affects leading AI labs such as OpenAI, Anthropic, Google DeepMind, and Meta.

AI Companies Face New Safety Reporting Rules

Under SB 53, major AI firms are now required to disclose their safety protocols and provide whistleblower protections for employees. The legislation also introduces a reporting mechanism through California’s Office of Emergency Services, enabling both companies and the public to flag critical AI safety incidents. These include cyberattacks carried out by AI models without human involvement and cases of deceptive model behaviour, categories that go beyond the requirements outlined in the EU AI Act.

Mixed Reactions From the AI Industry

The bill has divided the AI sector. While Anthropic voiced support, OpenAI and Meta lobbied strongly against it, warning that state-level policies could create a “patchwork of regulation” that stifles innovation. OpenAI even issued an open letter to Governor Newsom urging him not to sign the legislation. Tech leaders in Silicon Valley have, meanwhile, funnelled significant funds into pro-AI super PACs backing candidates who favour light-touch regulation.

California’s Move Could Set National Precedent

Other states are closely watching California’s actions. In New York, lawmakers have already passed a similar bill that now awaits Governor Kathy Hochul’s decision. California’s push may inspire other states to adopt comparable AI safety measures, making SB 53 a potential model for national standards.

“California has proven that we can establish regulations to protect our communities while also ensuring that the growing AI industry continues to thrive,” Governor Newsom said. He emphasised that the bill represents a balance between innovation and public trust as artificial intelligence rapidly evolves.

Another AI Bill on Newsom’s Desk

Governor Newsom is also reviewing Senate Bill 243, which would regulate AI companion chatbots. If enacted, SB 243 would require operators to implement strict safety protocols and hold them legally responsible for failures. The proposal, which passed both chambers with bipartisan support, is expected to generate further debate in the AI industry.

A Revised Approach After Past Veto

SB 53 marks Senator Scott Wiener’s second attempt at AI regulation. His earlier proposal, SB 1047, was vetoed by Newsom last year following intense pushback from AI companies. This time, Wiener worked directly with leading AI firms to adjust the bill’s provisions and gain broader acceptance.

What Does the Law Mean for AI Firms?

SB 53 is a game-changer for companies like OpenAI, Meta, and Google DeepMind. By requiring AI labs to disclose their safety practices and setting up a system to report critical incidents, the law raises the bar for accountability in the sector. For OpenAI and Meta, both of which opposed the bill, this means increased scrutiny of how their models are trained, deployed, and monitored, especially when it comes to risks like cyberattacks or deceptive AI behaviour.

The added whistleblower protections could also bring internal safety concerns into the spotlight more often. While firms like Anthropic see the law as a positive step, others fear it could slow innovation or lead to conflicting regulations across states. Either way, California’s move is likely to influence national debates, forcing Big Tech to navigate stricter compliance standards and reshaping how AI safety is handled in the U.S.