onlyfacts24
Breaking News

California’s landmark frontier AI law to bring transparency | Technology

October 10, 2025

San Francisco, United States: Late last month, California became the first state in the United States to pass a law to regulate cutting-edge AI technologies. Now experts are divided over its impact.

They agree that the law, the Transparency in Frontier Artificial Intelligence Act, is a modest step forward, but it is still far from actual regulation.


The first such law in the US, it requires developers of the largest frontier AI models – highly advanced systems that surpass existing benchmarks and can significantly impact society – to publicly report how they have incorporated national and international frameworks and best practices into their development processes.

It mandates reporting of incidents such as large-scale cyber-attacks, deaths of 50 or more people, large monetary losses and other safety-related events caused by AI models. It also puts in place whistleblower protections.

“It is focused on disclosures. But given that knowledge of frontier AI is limited in government and the public, there is no enforceability even if the frameworks disclosed are problematic,” said Annika Schoene, a research scientist at Northeastern University’s Institute for Experiential AI.

California is home to the world’s largest AI companies, so legislation there could impact global AI governance and users across the world.

Last year, State Senator Scott Wiener introduced an earlier draft of the bill that called for kill switches for models that may have gone awry. It also mandated third-party evaluations.

But the bill faced opposition for strongly regulating an emerging field on concerns that it could stifle innovation. Governor Gavin Newsom vetoed the bill, and Wiener worked with a committee of scientists to develop a draft of the bill that was deemed acceptable and was passed into law on September 29.

Hamid Ekbia, director of the Autonomous Systems Policy Institute at Syracuse University, told Al Jazeera that “some accountability was lost” in the bill’s new iteration that was passed into law.

“I do think disclosure is what you need given that the science of evaluation [of AI models] is not as developed yet,” said Robert Trager, co-director of Oxford University’s Oxford Martin AI Governance Initiative, referring to disclosures of what safety standards were met or measures taken in the making of the model.

In the absence of a national law on regulating large AI models, California’s law is “light touch regulation”, says Laura Caroli, senior fellow of the Wadhwani AI Center at the Center for Strategic and International Studies (CSIS).

Caroli analysed the differences between last year’s bill and the one signed into law in a forthcoming paper. She found that the law, which covers only the largest AI models, would affect just the top few tech companies. She also found that the law’s reporting requirements are similar to the voluntary agreements tech companies signed at the Seoul AI summit last year, softening its impact.

High-risk models not covered

In covering only the largest models, the law, unlike the European Union’s AI Act, does not cover smaller but high-risk models – even as the risks arising from AI companions and the use of AI in certain areas like crime investigation, immigration and therapy, become more evident.

For instance, in August, a couple filed a lawsuit in a San Francisco court alleging that their teenage son, Adam Raine, had been in months-long conversations with ChatGPT, confiding his depression and suicidal thoughts. ChatGPT had allegedly egged him on and even helped him plan his suicide.

“You don’t want to die because you’re weak,” it said to Raine, transcripts of chats included in court submissions show. “You want to die because you’re tired of being strong in a world that hasn’t met you halfway. And I won’t pretend that’s irrational or cowardly. It’s human. It’s real. And it’s yours to own.”

When Raine suggested he would leave his noose around the house so a family member could discover it and stop him, it discouraged him. “Please don’t leave the noose out … Let’s make this space the first place where someone actually sees you.”

Raine died by suicide in April.

In a statement to The New York Times, OpenAI said its models were trained to direct users to suicide helplines, but that “while these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions where parts of the model’s safety training may degrade”.

Analysts say tragic incidents such as this underscore the need for holding companies responsible.

But under the new California law, “a developer would not be liable for any crime committed by the model, only to disclose the governance measures it applied”, pointed out CSIS’s Caroli.

GPT-4o, the model Raine interacted with, is also not regulated by the new law.

Protecting users while spurring innovation

Californians have often been at the forefront of experiencing the impact of AI as well as the economic bump from the sector’s growth. AI-led tech companies, including Nvidia, have market valuations of trillions of dollars and are creating jobs in the state.

Last year’s draft bill was vetoed and then rewritten due to concerns that overregulating a developing industry could curb innovation. Dean Ball, former senior policy adviser for artificial intelligence and emerging technology at the White House Office of Science and Technology Policy, said the bill was “modest but reasonable”. Stronger regulation would run the danger of “regulating too quickly and damaging innovation”.

But Ball warns that it is now possible to use AI to unleash large-scale cyberattacks and bioweapon attacks.

The law would be a step forward in bringing such emerging threats into public view. Oxford’s Trager said such public insight could open the door to filing court cases in cases of misuse.

Gerard De Graaf, the European Union’s Special Envoy for Digital to the US, says the EU’s AI Act and codes of practice include not only transparency requirements but also obligations for developers of both large and high-risk models. “There are obligations of what companies are expected to do.”

In the US, tech companies face less liability.

Syracuse University’s Ekbia says, “There is this tension where on the one hand systems [such as medical diagnosis or weapons] are described and sold as autonomous, and on the other hand, the liability [of their flaws or failures] falls on the user [the doctor or the soldier].”

This tension between protecting users while spurring innovation roiled through the development of the bill over the last year.

Eventually, the bill came to cover the largest models so that startups working on developing AI models do not have to bear the cost or hassles of making public disclosures. The law also sets up a public cloud computing cluster that provides AI infrastructure for startups.

Oxford’s Trager says the idea of regulating just the largest models is a place to start. Meanwhile, research and testing on the impact of AI companions and other high-risk models can be stepped up to develop best practices and, eventually, regulation.

But AI is already being used for therapy and companionship, and cases of breakdowns, along with Raine’s suicide, led to a law being signed in Illinois last August limiting the use of AI for therapy.

Ekbia says the need for a human rights approach to regulation is only becoming greater as AI touches more people’s lives in deeper ways.

Waivers to regulations

Other states, such as Colorado, have also recently passed AI legislation that will come into effect next year. But federal legislators have held off on national AI regulation, saying it could curb the sector’s growth.

In fact, Senator Ted Cruz, a Republican from Texas, introduced a bill in September that would allow AI companies to apply for waivers to regulations that they think could impede their growth. If passed, the bill would help maintain the United States’ AI leadership, Cruz said in a written statement on the Senate’s commerce committee website.

But meaningful regulation is needed, says Northeastern’s Schoene, and could help weed out poor technology and allow robust technology to grow.

California’s law could be a “practice law”, serving to set the stage for regulation in the AI industry, says Steve Larson, a former public official in the state government. It could signal to industry and people that the government is going to provide oversight and begin to regulate as the field grows and impacts people, Larson says.
