California governor Gavin Newsom vetoes AI safety bill
In this post: California’s governor has blocked a contentious AI safety bill. The bill’s author, Democratic state Sen. Scott Wiener, called the veto “a setback.” The proposed bill targeted systems that require high computing power and more than $100 million to build.
California Governor Gavin Newsom has vetoed a contentious artificial intelligence (AI) bill, stating that it would stifle innovation and fail to safeguard the public from the technology’s “real” concerns. Supporters of the bill noted that it would have established some of the first regulations on large-scale AI models in the nation and opened the door for similar AI safety laws nationwide.
On September 30, Newsom rejected SB 1047, also known as the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act, after strong opposition from Silicon Valley. The bill would have mandated safety testing for AI models along with other safeguards, which tech giants feared would hinder innovation.
Newsom argued the bill focused too much on regulating existing top AI corporations while failing to protect the public from the “real” hazards posed by the new technology.
He said:
Instead, the bill applies stringent standards to even the most basic functions — so long as a large system deploys it. I do not believe this is the best approach to protecting the public from real threats posed by the technology.
– Gavin Newsom
Senator criticizes the veto, calls it a setback for public safety
Democratic state Sen. Scott Wiener, the bill’s author, described the rejection as a setback for everyone who believes in oversight of the massive corporations making critical decisions that affect the public’s safety and welfare and the planet’s future.
SB 1047 would have required developers, including major players like OpenAI, Meta, and Google, to implement a “kill switch” for their artificial intelligence models and to develop plans for mitigating extreme risks. Additionally, the bill would have allowed the state attorney general to sue AI developers whose models posed ongoing threats, such as an AI takeover of the power grid.
The bill specifically targeted systems that cost more than $100 million to develop and require large amounts of computing power. No current artificial intelligence models have hit that threshold, but some experts said that could change within the next year.
Newsom advocates for science-based AI regulations amid safety bill controversy
Newsom stated that he had consulted with leading AI safety experts to help California create effective regulations that prioritize a science-based analysis of potential risks from AI development. He emphasized that his administration would continue to push for robust safety protocols, insisting that regulators must not wait for a major catastrophe to act.
Despite vetoing SB 1047, Newsom highlighted that his administration had signed more than 18 AI-related bills in the past month.
Before Newsom’s decision, the bill had drawn opposition from politicians, consultants, and major technology companies. Former House Speaker Nancy Pelosi and companies such as OpenAI argued that it would greatly impede the advancement of artificial intelligence.
Neil Chilson, head of AI policy at the Abundance Institute, cautioned that although the bill primarily targeted models costing more than $100 million, its scope could easily be expanded to crack down on smaller developers as well.
The bill was not without supporters, however. Billionaire Elon Musk, who is building his own artificial intelligence model, “Grok,” was one of the few tech heavyweights to back the measure and broader AI laws. In an August 26 post on X, Musk said California should probably pass the SB 1047 AI safety bill, while admitting that supporting it was a “tough decision.”