The Hunt For The Laws Of Physics Behind Memory And Thought

The massive networks of neurons in our brains produce complex behaviors, like actions and thought. Now physicists want to understand the laws that govern these emergent phenomena.


One of the curious features of the laws of physics is that many of them seem to be the result of the bulk behavior of vast numbers of much smaller components. The atoms and molecules in a gas, for example, move at a huge range of velocities. When confined in a container, these particles continually strike its walls, creating a force.

But it is not necessary to know the velocities of all the particles to determine this force. Instead, their influence averages out into a predictable and measurable bulk property called pressure.
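That averaging is easy to demonstrate. In the sketch below (a toy calculation with illustrative numbers, not drawn from the article), each particle bouncing along a one-dimensional box of length L strikes a given wall every 2L/|vx| seconds and delivers an impulse of 2m|vx|, so its time-averaged force on the wall is m·vx²/L. Summing over thousands of particles with random velocities reproduces the ideal-gas prediction NkT/L without tracking any individual trajectory:

```python
import numpy as np

# Toy kinetic-theory check (illustrative numbers, not from the article).
# N particles bounce elastically in a 1-D box of length L at temperature T.
# A particle with velocity vx hits a given wall every 2L/|vx| seconds and
# delivers impulse 2*m*|vx|, so its time-averaged force is m * vx**2 / L.
rng = np.random.default_rng(0)

N, L, m, kT = 100_000, 1.0, 1.0, 1.0
vx = rng.normal(0.0, np.sqrt(kT / m), size=N)  # Maxwell-Boltzmann velocities

force = np.sum(m * vx**2) / L                  # summed time-averaged wall force
ideal = N * kT / L                             # 1-D ideal-gas law prediction
print(f"measured force: {force:.0f}   ideal-gas prediction: {ideal:.0f}")
# The microscopic details average into one bulk number: the pressure.
```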

This and other bulk properties, such as temperature, density and elasticity, are hugely useful because of the laws of physics that govern them. Over a hundred years ago, physicists such as Willard Gibbs worked out the mathematical character of these laws and the statistical shorthand that physicists and engineers now use routinely in everything from laboratory experiments to large-scale industrial processes.

The success of so-called statistical physics raises the possibility that other systems that consist of enormous numbers of similar entities might also have their own “laws of physics”. In particular, physicists have long hoped that the bulk properties of neurons might be amenable to this kind of approach.

Neural Physics

The behavior of single neurons is well understood. But put them together into networks and far more sophisticated behaviors emerge, such as sensory perception, memories and thought. The hope is that a statistical or mathematical approach to these systems could reveal the laws of neural physics that describe the bulk behavior of nervous systems and brains.

“It is an old dream of the physics community to provide a statistical mechanics description for these and other emergent phenomena of life,” say Leenoy Meshulam at the University of Washington and William Bialek at Princeton University, who have reviewed progress in this area.

“These aspirations appear in a new light because of developments in our ability to measure the electrical activity of the brain, sampling thousands of individual neurons simultaneously over hours or days.”

The nature of these laws is, of course, fundamentally different from the nature of conventional statistical physics. At the heart of the difference is that neurons link together to form complex networks in which the behavior of one neuron can be closely correlated with the behavior of its neighbors.

It is relatively straightforward to formulate a set of equations that capture this behavior. But it quickly becomes apparent that these equations cannot be easily solved in anything other than trivial circumstances.

Instead, physicists must consider the correlations between all possible pairs of neurons and then use experimental evidence to constrain what correlations are possible.
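A standard way to do this, and the approach at the heart of the review, is a maximum entropy model: the least-structured probability distribution over activity patterns that matches the measured mean activities and pairwise correlations, which turns out to be mathematically identical to the Ising model of magnetism. The sketch below is our own minimal illustration, with synthetic data standing in for real recordings and every parameter value assumed; for a handful of neurons the model can be solved by brute-force enumeration, and the final lines run the kind of population-activity test described below.

```python
import itertools
import numpy as np

# Minimal sketch of a pairwise maximum-entropy ("Ising") fit.
# Synthetic data stand in for real recordings; every parameter value
# here is an illustrative assumption, not taken from the article.
rng = np.random.default_rng(1)
N = 5                                    # small enough to enumerate all 2**N states
states = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)

def model_stats(h, J):
    """Exact means <s_i> and pairwise moments <s_i s_j> under P(s) ~ exp(h.s + s.J.s)."""
    energy = states @ h + np.einsum("ki,ij,kj->k", states, J, states)
    p = np.exp(energy - energy.max())
    p /= p.sum()
    return p @ states, states.T @ (states * p[:, None]), p

# A "true" network generates the synthetic recordings
h_true = rng.normal(0.0, 0.5, N)
J_true = np.triu(rng.normal(0.0, 0.5, (N, N)), k=1)
_, _, p_true = model_stats(h_true, J_true)
data = states[rng.choice(len(states), size=20_000, p=p_true)]
mean_data, corr_data = data.mean(axis=0), data.T @ data / len(data)

# Fit by gradient ascent: nudge the model's moments toward the data's moments
h, J = np.zeros(N), np.zeros((N, N))
for _ in range(2_000):
    mean_model, corr_model, p_fit = model_stats(h, J)
    h += 0.5 * (mean_data - mean_model)
    J += 0.5 * np.triu(corr_data - corr_model, k=1)

# Population-activity test: how often are K of the N neurons active at once?
K = states.sum(axis=1).astype(int)
predicted = np.bincount(K, weights=p_fit, minlength=N + 1)
observed = np.bincount(data.sum(axis=1).astype(int), minlength=N + 1) / len(data)
print(*(f"K={k}: model {predicted[k]:.3f} / data {observed[k]:.3f}" for k in range(N + 1)), sep="\n")
```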

The problem, of course, is that the number of pairs grows quadratically with the number of neurons, while the number of possible activity patterns grows exponentially. That raises the question of how much more data must be gathered to constrain the model as the number of neurons increases.

One standard system in which this has been well measured is the retina. It consists of a network of light-sensitive neurons in which the activity of neighbors is known to be correlated: if one neuron is activated, there is a strong possibility that its neighbor will be too. (This is the reason for the gently evolving, coral-like patterns in vision that people sometimes notice when they first wake up.)

Experiments in this area began by monitoring the behavior of a handful of neurons, then a few dozen, a few hundred, and now approach thousands (but not millions). It turns out that these data constrain the models to the point where they give remarkably accurate predictions of neural behavior, for example when asked to predict how many neurons out of any given set are active at the same time.

That suggests the system of equations accurately captures the behavior of retinal networks. In other words, “the models really are the solutions to the mathematical problem that we set out to solve,” say Meshulam and Bialek.

Of course, the retina is a highly specialized part of the nervous system, so an important question is whether similar techniques can generalize to the higher cognitive tasks that take place in other parts of the brain.

Emergent Behavior

One challenge here is that networks can demonstrate emergent behavior. This is not the result of random or even weak correlations. Instead, the correlations can be remarkably strong, and activity can spread through a network like an avalanche.

Networks that demonstrate this property are said to be in a state of criticality and are connected in a special way that allows this behavior. Criticality turns out to be common in nature, which suggests that networks can tune themselves toward it.

“Self-organized criticality” has been widely studied in the last two decades and there has been some success in describing it mathematically. But exactly how this self-tuning works is the focus of much ongoing research.
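A common toy model of such avalanches, offered here as our own illustration rather than anything from the review, is a branching process: each active neuron excites, on average, σ others. Below the critical branching ratio σ = 1, avalanches fizzle out quickly; at σ = 1 their sizes become scale-free, with rare events spanning the whole network:

```python
import numpy as np

# Toy branching-process picture of neural avalanches (illustrative only).
# Each active neuron excites a Poisson-distributed number of others with
# mean sigma, the branching ratio. sigma = 1 is the critical point.
rng = np.random.default_rng(2)

def avalanche_size(sigma, cap=100_000):
    active, size = 1, 1
    while active and size < cap:
        active = rng.poisson(sigma * active)   # next generation of activity
        size += active
    return size

for sigma in (0.7, 0.9, 1.0):
    sizes = np.array([avalanche_size(sigma) for _ in range(20_000)])
    # A mean far above the median signals the heavy, scale-free tail
    print(f"sigma={sigma}: median {np.median(sizes):.0f}, "
          f"mean {sizes.mean():.1f}, max {sizes.max()}")
```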

Just how powerful these approaches will become is not yet clear. Meshulam and Bialek take heart from the observation that some natural behaviors are amenable to the kind of analysis that physicists are good at. “All the birds in a flock agreeing to fly in the same direction is like the alignment of spins in a magnet,” they say.

The fact that this is merely a metaphor concerns them: metaphors can aid understanding, but the real behavior of these systems is often much more complex and subtle.

But there are reasons to think that mathematical models can go further. “The explosion of data on networks of real neurons offers the opportunity to move beyond metaphor,” they say, adding that the data from millions of neurons should soon help to inform this debate.

“Our experimentalist friends will continue to move the frontier, combining tools from physics and biology to make more and more of the brain accessible in this way,” conclude Meshulam and Bialek. “The outlook for theory is bright.”


Ref: Statistical mechanics for networks of real neurons. arxiv.org/abs/2409.00412
