Science

More brainlike computers could change AI for the better

February 28, 2025

The tiny worm Caenorhabditis elegans has a brain just about the width of a human hair. Yet this animal’s itty-bitty organ coordinates and computes complex movements as the worm forages for food. “When I look at [C. elegans] and consider its brain, I’m really struck by the profound elegance and efficiency,” says Daniela Rus, a computer scientist at MIT. Rus is so enamored with the worm’s brain that she cofounded a company, Liquid AI, to build a new type of artificial intelligence inspired by it.

Rus is part of a wave of researchers who think that making traditional AI more brainlike could create leaner, nimbler and perhaps smarter technology. “To improve AI truly, we need to … incorporate insights from neuroscience,” says Kanaka Rajan, a computational neuroscientist at Harvard University.

Such “neuromorphic” technology probably won’t completely replace regular computers or traditional AI models, says Mike Davies, who directs the Neuromorphic Computing Lab at Intel in Santa Clara, Calif. Rather, he sees a future in which many types of systems coexist.

The tiny worm C. elegans is the inspiration for a new type of artificial intelligence. Hakan Kvarnstrom/Science Source

Imitating brains isn’t a new idea. In the 1950s, neurobiologist Frank Rosenblatt devised the perceptron. The machine was a highly simplified model of the way a brain’s nerve cells communicate, with a single layer of interconnected artificial neurons, each performing a single mathematical function.
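The perceptron’s rule is simple enough to state in a few lines of code. Below is a minimal sketch of the standard textbook formulation (a software illustration, not Rosenblatt’s original machine): weighted inputs are summed and compared against a threshold.

```python
# Minimal sketch of a perceptron-style artificial neuron: weighted inputs
# are summed and compared against a threshold. (Textbook formulation for
# illustration, not Rosenblatt's original hardware.)

def perceptron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs reaches the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With hand-picked weights, two inputs behave like a logical AND gate.
print(perceptron([1, 1], [0.6, 0.6], threshold=1.0))  # -> 1
print(perceptron([1, 0], [0.6, 0.6], threshold=1.0))  # -> 0
```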

Decades later, the perceptron’s basic design helped inspire deep learning, a computing technique that recognizes complex patterns in data using layer upon layer of nested artificial neurons. These neurons pass input data along, manipulating it to produce an output. But this approach can’t match a brain’s ability to adapt nimbly to new situations or learn from a single experience. Instead, most of today’s AI models devour massive amounts of data and energy to learn to perform impressive tasks, such as guiding a self-driving car.

“It’s just bigger, bigger, bigger,” says Subutai Ahmad, chief technology officer of Numenta, a company looking to human brain networks for efficiency. Traditional AI models are “so brute force and inefficient.”

In January, the Trump administration announced Stargate, a plan to funnel $500 billion into new data centers to support energy-hungry AI models. But a model released by the Chinese company DeepSeek is bucking that trend, duplicating chatbots’ capabilities with less data and energy. Whether brute force or efficiency will win out is unclear.

Meanwhile, neuromorphic computing experts have been making hardware, architecture and algorithms ever more brainlike. “People are bringing out new concepts and new hardware implementations all the time,” says computer scientist Catherine Schuman of the University of Tennessee, Knoxville. These advances mainly help with biological brain research and sensor development and haven’t been a part of mainstream AI. At least, not yet.

Here are four neuromorphic systems that hold potential for improving AI.

Making artificial neurons more lifelike

Real neurons are complex living cells with many parts. They constantly receive signals from the environment, their electric charge fluctuating until it crosses a specific threshold and the cell fires. This activity sends an electrical impulse across the cell and on to neighboring neurons. Neuromorphic computing engineers have managed to mimic this pattern in artificial neurons. These neurons, part of spiking neural networks, simulate the signals of an actual brain, creating discrete spikes that carry information through the network. Such a network may be modeled in software or built in hardware.

Spikes are not modeled in traditional AI’s deep learning networks. Instead, in those models, each artificial neuron is “a little ball with one type of information processing,” says Mihai Petrovici, a neuromorphic computing researcher at the University of Bern in Switzerland. Each of these “little balls” links to the others through connections called parameters. Usually, every input into the network triggers every parameter to activate at once, which is inefficient. DeepSeek divides traditional AI’s deep learning network into smaller sections that can activate separately, which is more efficient.
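That divide-into-sections idea can be sketched in a few lines. The general technique is commonly called a mixture of experts; the toy routing below illustrates only the principle and is not DeepSeek’s actual architecture. A gate scores each section for a given input, and only the top-scoring sections do any work.

```python
import numpy as np

# Toy sketch of a network split into sections ("experts") that activate
# separately -- the general mixture-of-experts idea. Illustrative only;
# DeepSeek's actual architecture is far more elaborate.

rng = np.random.default_rng(0)
n_experts, dim, top_k = 8, 16, 2
experts = [rng.normal(size=(dim, dim)) for _ in range(n_experts)]  # small sub-networks
gate = rng.normal(size=(dim, n_experts))                           # scores experts per input

def forward(x):
    scores = x @ gate                     # how relevant is each section to this input?
    chosen = np.argsort(scores)[-top_k:]  # activate only the top-scoring sections
    return sum(x @ experts[i] for i in chosen) / top_k  # the other six stay idle

y = forward(rng.normal(size=dim))  # only 2 of the 8 sections did any work
```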

But real brains and artificial spiking networks achieve efficiency a bit differently. Each neuron is not connected to every other one. Also, only if electrical signals reach a specific threshold does a neuron fire and send information to its connections. The network activates sparsely rather than all at once.
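A standard way to capture both properties in software is a leaky integrate-and-fire neuron, sketched below. This is the common textbook spiking model, not the specific circuitry of any chip described here: charge builds with each input, leaks away over time, and the neuron stays silent until its threshold is crossed.

```python
# Sketch of a leaky integrate-and-fire neuron, the common textbook spiking
# model (not the specific circuit of any chip described here). Charge builds
# with input and leaks over time; the neuron fires only when its threshold
# is crossed, so most time steps produce no activity at all.

def simulate(inputs, leak=0.9, threshold=1.0):
    charge, spikes = 0.0, []
    for current in inputs:
        charge = leak * charge + current  # integrate the input, leak a little
        if charge >= threshold:
            spikes.append(1)              # threshold crossed: emit a spike...
            charge = 0.0                  # ...and reset
        else:
            spikes.append(0)              # otherwise stay silent
    return spikes

print(simulate([0.3, 0.3, 0.3, 0.6, 0.0, 0.9]))  # -> [0, 0, 0, 1, 0, 0]
```

The sparse output is the point: downstream neurons receive, and compute on, only the occasional spike.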

Comparing networks

Typical deep learning networks are dense, with interconnections among all their identical “neurons.” Brain networks are sparse, and their neurons can take on different roles. Neuroscientists are still working out how complex brain networks are actually organized. 

An illustration of an artificial network and brain networks. J.D. Monaco, K. Rajan and G.M. Hwang

Importantly, brains and spiking networks combine memory and processing. The connections “that represent the memory are also the elements that do the computation,” Petrovici says. Mainstream computer hardware — which runs most AI — separates memory and processing. AI processing usually happens in a graphics processing unit, or GPU. A different hardware component, such as random access memory, or RAM, handles storage. This makes for simpler computer architecture. But zipping data back and forth among these components eats up energy and slows down computation.

The neuromorphic computer chip BrainScaleS-2 combines these efficient features. It contains sparsely connected spiking neurons physically built into hardware, and the neural connections store memories and perform computation.

BrainScaleS-2 was developed as part of the Human Brain Project, a 10-year effort to understand the human brain by modeling it in a computer. But some researchers looked at how the tech developed from the project might make AI more efficient. For example, Petrovici trained different AIs to play the video game “Pong.” A spiking network running on the BrainScaleS-2 hardware used a thousandth of the energy of a simulation of the same network running on a CPU. But the real test was to compare the neuromorphic setup with a deep learning network running on a GPU. Training the spiking system to recognize handwriting used a hundredth the energy of the typical system, the team found.

For spiking neural network hardware to be a real player in the AI realm, it has to be scaled up and distributed. Then, it could be “useful to computation more broadly,” Schuman says.

Connecting billions of spiking neurons

The academic teams working on BrainScaleS-2 currently have no plans to scale up the chip, but some of the world’s biggest tech companies, like Intel and IBM, do.

In 2023, IBM introduced its NorthPole neuromorphic chip, which combines memory and processing to save energy. And in 2024, Intel announced the launch of Hala Point, “the largest neuromorphic system in the world right now,” says computer scientist Craig Vineyard of Sandia National Laboratories in New Mexico.

Despite that impressive superlative, there’s nothing about the system that visually stands out, Vineyard says. Hala Point fits into a luggage-sized box. Yet it contains 1,152 of Intel’s Loihi 2 neuromorphic chips for a record-setting total of 1.15 billion electronic neurons — roughly the same number of neurons as in an owl brain.

Like BrainScaleS-2, each Loihi 2 chip contains a hardware version of a spiking neural network. The physical spiking network also uses sparsity and combines memory and processing. This neuromorphic computer has “fundamentally different computational characteristics” than a regular digital machine, Schuman says.

This BrainScaleS-2 computer chip was built to work like a brain. It contains 512 simulated neurons connected with up to 212,000 synapses. Heidelberg Univ.

These features improve Hala Point’s efficiency compared with that of typical computer hardware. “The realized efficiency we get is definitely significantly beyond what you can achieve with GPU technology,” Davies says.

In 2024, Davies and a team of researchers showed that the Loihi 2 hardware can save energy even while running typical deep learning algorithms. The researchers took several audio and video processing tasks and modified their deep learning algorithms so they could run on the new spiking hardware. This process “introduces sparsity in the activity of the network,” Davies says.

A deep learning network running on a regular digital computer processes every single frame of audio or video as something completely new. But spiking hardware maintains “some knowledge of what it saw before,” Davies says. When part of the audio or video stream stays the same from one frame to the next, the system doesn’t have to start over from scratch. It can “keep the network idle as much as possible when nothing interesting is changing.” On one video task the team tested, a Loihi 2 chip running a “sparsified” version of a deep learning algorithm used 1/150th the energy of a GPU running the regular version of the algorithm.
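The keep-idle idea is easy to see in a sketch. The code below illustrates the general principle of processing only what changed between frames; it is not Intel’s actual Loihi 2 pipeline.

```python
import numpy as np

# Sketch of the "process only what changed" principle behind the sparsified
# video results. Illustrative only, not Intel's actual Loihi 2 pipeline.

def work_per_frame(frames, tol=1e-3):
    """Yield the fraction of pixels in each frame that actually need processing."""
    prev = np.zeros_like(frames[0])
    for frame in frames:
        changed = np.abs(frame - prev) > tol  # what's new since the last frame
        yield changed.mean()                  # the rest of the network stays idle
        prev = frame

rng = np.random.default_rng(0)
f1 = rng.random((64, 64))                     # first frame: everything is new
f2 = f1.copy()
f2[0, :8] += 0.5                              # second frame: a tiny region changes
for frac in work_per_frame([f1, f2, f2]):     # third frame repeats the second
    print(f"{frac:.1%} of pixels need work")  # ~100%, then ~0.2%, then 0.0%
```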

The audio and video test showed that one type of architecture can do a good job running a deep learning algorithm. But developers can reconfigure the spiking neural networks within Loihi 2 and BrainScaleS-2 in numerous ways, coming up with new architectures that use the hardware differently. They can also implement different kinds of algorithms using these architectures.

It’s not yet clear what algorithms and architectures would make the best use of this hardware or offer the highest energy savings. But researchers are making headway. A January 2025 paper introduced a new way to model neurons in a spiking network, including both the shape of a spike and its timing. This approach makes it possible for an energy-efficient spiking system to use one of the learning techniques that has made mainstream AI so successful.

Neuromorphic hardware may be best suited to algorithms that haven’t even been invented yet. “That’s actually the most exciting thing,” says neuroscientist James Aimone, also of Sandia National Labs. The technology has a lot of potential, he says. It could make the future of computing “energy efficient and more capable.”

Designing an adaptable ‘brain’

Neuroscientists agree that one of the most important features of a living brain is the ability to learn on the go. And it doesn’t take a large brain to do this. C. elegans, one of the first animals to have its brain completely mapped, has 302 neurons and around 7,000 synapses that allow it to learn continuously and efficiently as it explores its world.

Ramin Hasani studied how C. elegans learns as part of his graduate work in 2017 and was working to model what scientists knew about the worms’ brains in computer software. Rus found out about this work while out for a run with Hasani’s adviser at an academic conference. At the time, she was training AI models with hundreds of thousands of artificial neurons and half a million parameters to operate self-driving cars.

A C. elegans brain (its neurons are colored by type in this reconstruction) learns constantly and is a model for building more efficient AI. D. Witvliet et al/bioRxiv.org 2020

If a worm doesn’t need a huge network to learn, Rus realized, maybe AI models could make do with smaller ones, too.

She invited Hasani and one of his colleagues to move to MIT. Together, the researchers worked on a series of projects to give self-driving cars and drones more wormlike “brains” — ones that are small and adaptable. The end result was an AI algorithm that the team calls a liquid neural network.

“You can think of this like a new flavor of AI,” says Rajan, the Harvard neuroscientist.

Standard deep learning networks, despite their impressive size, learn only during a training phase of development. When training is complete, the network’s parameters can’t change. “The model stays frozen,” Rus says. Liquid neural networks, as the name suggests, are more fluid. Though they incorporate many of the same techniques as standard deep learning, these new networks can shift and change their parameters over time. Rus says that they “learn and adapt … based on the inputs they see, much like biological systems.”

To design this new algorithm, Hasani and his team wrote mathematical equations that mimic how a worm’s neurons activate in response to information that changes over time. These equations govern the liquid neural network’s behavior.

Such equations are notoriously difficult to solve, but the team found a way to approximate a solution, making it possible to run the network in real time. This solution is “remarkable,” Rajan says.
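Hasani and colleagues’ published liquid time-constant formulation gives a flavor of these equations. The single-neuron sketch below is heavily simplified (real liquid networks couple many neurons and learn their parameters from data): the input feeds a gate that changes both how fast the state moves and where it settles, so the effective time constant shifts from moment to moment.

```python
import math

# Heavily simplified single-neuron sketch in the spirit of the published
# liquid time-constant formulation: dx/dt = -(1/tau + f) * x + f * A, where
# the gate f depends on the current input. Real liquid networks couple many
# such neurons and learn the parameters; this is illustrative only.

def gate(x, inp, w=2.0, b=0.0):
    """Input-dependent gate controlling how strongly the input drives the state."""
    return 1.0 / (1.0 + math.exp(-(w * inp + b - x)))

def step(x, inp, dt=0.1, tau=1.0, A=1.0):
    """One fused semi-implicit Euler step -- the kind of cheap approximate
    solve that makes running these equations in real time practical."""
    f = gate(x, inp)
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

x = 0.0
for t, inp in enumerate([0.0, 0.0, 1.0, 1.0, 0.0]):
    x = step(x, inp)
    print(f"t={t}  input={inp}  state={x:.3f}")  # state rises when input arrives
```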

In 2023, Rus, Hasani and their colleagues showed that liquid neural networks could adapt to new situations better than much larger typical AI models. The team trained two types of liquid neural networks and four types of typical deep learning networks to pilot a drone toward different objects in the woods. When training was complete, they put one of the training objects — a red chair — into completely different environments, including a patio and a lawn beside a building. The smallest liquid network, containing just 34 artificial neurons and around 12,000 parameters, outperformed the largest standard AI network they tested, which contained around 250,000 parameters.

The team started the company Liquid AI around the same time and has worked with the U.S. military’s Defense Advanced Research Projects Agency to test their model flying an actual aircraft.

The company has also scaled up its models to compete directly with regular deep learning. In January, it announced LFM-7B, a 7-billion-parameter liquid neural network that generates answers to prompts. The team reports that the network outperforms typical language models of the same size.

“I’m excited about Liquid AI because I believe it could transform the future of AI and computing,” Rus says.

This approach won’t necessarily use less energy than mainstream AI. Its constant adaptation makes it “computationally intensive,” Rajan says. But the approach “represents a significant step towards more realistic AI” that more closely mimics the brain.


Building on human brain structure

While Rus is working off the blueprint of the worm brain, others are taking inspiration from a very specific region of the human brain — the neocortex, a wrinkly sheet of tissue that covers the brain’s surface.

“The neocortex is the brain’s powerhouse for higher-order thinking,” Rajan says. “It’s where sensory information, decision-making and abstract reasoning converge.”

This part of the brain contains six thin horizontal layers of cells, organized into tens of thousands of vertical structures called cortical columns. Each column contains around 50,000 to 100,000 neurons arranged in several hundred vertical minicolumns.

These minicolumns are the primary drivers of intelligence, neuroscientist and computer scientist Jeff Hawkins argues. In other parts of the brain, grid and place cells help an animal sense its position in space. Hawkins theorizes that these cells exist in minicolumns where they track and model all our sensations and ideas. For example, as a fingertip moves, he says, these columns make a model of what it’s touching. It’s the same with our eyes and what we see, Hawkins explains in his 2021 book A Thousand Brains.

“It’s a bold idea,” Rajan says. Current neuroscience holds that intelligence involves the interaction of many different brain systems, not just these mapping cells, she says.

Though Hawkins’ theory hasn’t reached widespread acceptance in the neuroscience community, “it’s generating a lot of interest,” she says. That includes excitement about its potential uses for neuromorphic computing.

Hawkins developed his theory at Numenta, a company he cofounded in 2005. The company’s Thousand Brains Project, announced in 2024, is a plan for pairing computing architecture with new algorithms.

In some early testing for the project a few years ago, the team described an architecture that included seven cortical columns and hundreds of minicolumns but spanned just three layers rather than the six in the human neocortex. The team also developed a new AI algorithm that uses the column structure to analyze input data. Simulations showed that each column could learn to recognize hundreds of complex objects.

The practical effectiveness of this system still needs to be tested. But the idea is that it will be capable of learning about the world in real time, similar to the algorithms of Liquid AI.

For now, Numenta, based in Redwood City, Calif., is using regular digital computer hardware to test these ideas. But in the future, custom hardware could implement physical versions of spiking neurons organized into cortical columns, Ahmad says.

Using hardware designed for this architecture could make the whole system more efficient and effective. “How the hardware works is going to influence how your algorithm works,” Schuman says. “It requires this codesign process.”

A new idea in computing can take off only with the right combination of algorithm, architecture and hardware. For example, DeepSeek’s engineers noted that they achieved their gains in efficiency by codesigning “algorithms, frameworks and hardware.”

When one of these isn’t ready or isn’t available, a good idea could languish, notes Sara Hooker, a computer scientist at the research lab Cohere in San Francisco and author of an influential 2021 paper titled “The Hardware Lottery.” This already happened with deep learning — the algorithms to do it were developed back in the 1980s, but the technology didn’t find success until computer scientists began using GPU hardware for AI processing in the early 2010s.

Too often “success depends on luck,” Hooker said in a 2021 Association for Computing Machinery video. But if researchers spend more time considering new combinations of neuromorphic hardware, architectures and algorithms, they could open up new and intriguing possibilities for both AI and computing.
