What is Non-Equilibrium Statistical Physics?



Post by linford86 on March 8th, 2010, 2:31 pm

I've been asked on occasion what it is that I do, and I often find myself just answering with a bunch of seemingly incoherent jargon. For that reason, I decided to write a short piece on what exactly non-equilibrium statistical physics is and why it's important. There will be a bias here towards aspects of the discipline that I have spent some time thinking about, but I hope that you, the reader, will let that slide.

Equilibrium Statistical Physics

First, I need to explain what equilibrium statistical physics is and give a short tidbit about its history. In the 18th and 19th centuries, much attention was paid to problems dealing with things like steam engines, gases, and the foundations of chemistry. Largely, these are questions dealing with the structure of matter. Classical thermodynamics dealt with the large-scale, macroscopic properties of material systems. These macroscopic thermodynamic properties are things like temperature, heat capacity, pressure, and volume (it's also possible to define entropy macroscopically, but I'd rather not explore that here.) Molecular kinetic theory was an early attempt to derive macroscopic thermodynamic variables in terms of microscopic properties. Equilibrium statistical physics grew out of classical molecular kinetic theory and classical thermodynamics.

In statistical physics, one tries to derive macroscopic thermodynamic principles in terms of probability distributions over microscopic processes. In classical statistical physics, the actual microscopic properties are not really random. The microscopic properties in such systems are governed by Newton's laws and, given a full set of initial conditions for all of the positions and momenta of all of the particles, one can, in principle, predict the entire time evolution of the system. Unfortunately, this is impossible in practice. Most actual thermodynamic systems are so large that the full set of initial conditions cannot be specified even on a computer. A mole of gas, for example, contains about 6*10^23 particles, so one would have to specify roughly 3.6*10^24 initial conditions (three position components and three momentum components per particle). Moreover, even if one could specify the full set of initial conditions, one would not be able to compute the full trajectories of all of the constituent particles. Therefore, one circumvents this problem altogether and imagines assigning probability distributions over microscopic properties. Macroscopic properties are then produced by some appropriate averaging procedure (I'll make this more clear in a moment.)

But wait, you might say: if the system isn't really random, then why can we assign probability distributions? Well, just consider what happens when you flip a coin. Flipping a coin is a grade-school example of a supposedly random process, yet a coin is a macroscopic object. The coin is, for all practical purposes, not subject to quantum mechanical effects and therefore cannot be intrinsically random. Moreover, if we fully specified the state of the coin -- i.e. we fully specified its environment, whether or not it was weighted and if so by how much, and the coin's initial position and momentum -- then we could predict the coin's trajectory without error. However, when we flip coins, we don't usually do this. Usually, when flipping coins, we have next to complete ignorance about the coin's environment, the flipping process, its initial momentum, where precisely it was dropped from, etc. Living in this state of ignorance, the coin's end state -- whether it is heads or tails -- appears to be completely random. And, in fact, this is one way of thinking about probability distributions. Probability distributions reflect our state of knowledge (or of ignorance) about a system. You might say that a probability distribution is a way of characterizing what we know about a system. In fact, given additional information about a system, there are well-specified rules (Bayes' theorem, in particular) for updating the relevant probability distribution.
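As an aside, here is a minimal sketch of that updating rule in action -- my own illustration, not anything from a particular textbook. We treat the coin's unknown heads-probability as a Beta distribution and update it with Bayes' theorem as flips are observed; the 70% bias and the uniform prior are just illustrative choices.

```python
import numpy as np

# A minimal sketch of Bayesian updating: our belief about a coin's
# heads-probability p is a Beta(a, b) distribution, and each observed
# flip updates it via Bayes' theorem.

def update_beta(a, b, flips):
    """Return updated Beta parameters after observing a list of flips.

    flips: iterable of 1 (heads) and 0 (tails). For a Beta prior and a
    Bernoulli likelihood, Bayes' theorem reduces to simple counting.
    """
    heads = sum(flips)
    tails = len(flips) - heads
    return a + heads, b + tails

# Start from total ignorance about the bias: Beta(1, 1) is uniform on [0, 1].
a, b = 1.0, 1.0
rng = np.random.default_rng(0)
flips = (rng.random(50) < 0.7).astype(int)  # a coin secretly biased to 70% heads
a, b = update_beta(a, b, flips)
print(f"posterior mean for P(heads): {a / (a + b):.3f}")  # drifts toward 0.7
```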

Since probability distributions reflect our state of knowledge (or of ignorance) about a system, and since we do not know the microscopic states of thermodynamic systems, we can invent probability distributions that connect the microscopic states to the macroscopic states. To do this, we imagine every possible microscopic state which is consistent with the observed macroscopic thermodynamic state. The easiest such distribution to produce is the so-called Boltzmann distribution. The Boltzmann distribution is the probability distribution for the microscopic state to have some particular energy (given a specified macroscopic temperature):

$$
P_n = \frac{e^{-\beta E_n}}{Z}
$$

In this expression, $Z = \sum_n e^{-\beta E_n}$ is a normalization factor called the partition function, $E_n$ is the energy of the nth state (technically, the Hamiltonian of the nth state), and $\beta$ is the inverse of the temperature (i.e. $\beta = 1/T$. The advanced reader will note that I used units here such that Boltzmann's constant is 1, or, equivalently, such that temperature has units of energy. This is typical in physics, but not in other disciplines.)
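To make this concrete, here is a small sketch -- my own illustration, with made-up energy levels and temperature -- that computes the Boltzmann probabilities, the partition function, and an ensemble-average energy for a toy three-level system:

```python
import numpy as np

# A toy illustration (energies and temperature are invented): compute the
# Boltzmann distribution and an ensemble average for a three-level system,
# in units where Boltzmann's constant is 1.

energies = np.array([0.0, 1.0, 2.0])  # energies E_n of the microstates
T = 1.5                                # temperature (energy units)
beta = 1.0 / T

weights = np.exp(-beta * energies)     # unnormalized Boltzmann weights
Z = weights.sum()                      # partition function
probs = weights / Z                    # P_n = exp(-beta * E_n) / Z

# Macroscopic observables are averages over the distribution:
mean_energy = (probs * energies).sum()
print("probabilities:", probs)
print("average energy:", mean_energy)
```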

From the Boltzmann distribution and the partition function, one can produce all of the macroscopic thermodynamic variables. In fact, it's quite easy to produce the ideal gas law, the equation that characterizes an ideal gas in classical thermodynamics, by taking the appropriate derivatives of the partition function.
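For the curious, here is a sketch of that standard textbook derivation, in units where Boltzmann's constant is 1. The momentum integrals contribute a volume-independent factor (the thermal wavelength), so only the V-dependence of the partition function matters for the pressure:

```latex
% Standard textbook sketch (units with k_B = 1). For N non-interacting
% particles in a box of volume V, the momentum integrals give a
% V-independent thermal wavelength \lambda, so only V^N matters:
\[
    Z = \frac{V^N}{N!\,\lambda^{3N}}, \qquad
    P = T \frac{\partial \ln Z}{\partial V}
      = T \frac{\partial}{\partial V}\left( N \ln V \right)
      = \frac{N T}{V},
\]
% i.e. PV = NT -- the ideal gas law (restoring Boltzmann's constant,
% PV = N k_B T).
```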

Problems with Equilibrium Statistical Physics

This picture is all very nice, but what's wrong with it? Well, it assumes that the system is at best only very slowly changing over time. Classical thermodynamics, along with equilibrium statistical physics, paints a picture of systems that never change all that much. Wait, you protest: my physics teacher taught us all about engines using thermodynamics! Those engines seem to be changing over time! And, indeed, you would be right. But if you look closely, you'll find the fine print. What we think of as change over time in classical thermodynamics isn't really change at all. What one does in those theories is to find all of the equilibrium states and then assume that one can pass continuously (quasi-statically) from one equilibrium state to another. For many systems, that's an excellent approximation. For other systems, however, the predictions can be just dead wrong.

Imagine a piston with a gas inside. If one slowly pulls on the piston, the gas will slowly fill the container in such a way that there is never any empty space. But now imagine that one pulls really hard on the piston, moving it faster than the highest speed of any of the molecules in the gas. Now, the gas is left on one side of the vessel, then there's empty space, and finally the piston. If we only ever consider equilibrium thermodynamics, or the underlying statistical theory, then we cannot answer questions about how this obviously unstable system approaches equilibrium (the equilibrium system, in this case, would be one in which the gas completely fills the container with no empty spaces.) Of course, it approaches equilibrium quite quickly, but there are other systems which do not.

Imagine a pile of dry dirt, where each grain can freely slide on the other grains. The individual grains are all in mechanical equilibrium with one another, all just sitting there. However, if I increase the load on the top of the pile, I can increase the internal stresses. Eventually, the entire pile disassembles itself -- avalanche! -- and the system finds some other precariously stable state and sits there. Of course, if I keep doing this, I can get the system to keep internally re-arranging itself to support any particular load that I would like. Eventually, the pile is completely destroyed, but the point is that there are many such states. Since I can always find a state with lower energy by re-arranging the pile, and since the pile re-arranges itself upon minute perturbations, we say that these states are non-equilibrium steady states (eventually I'll make this notion clearer.) We say that the system has many "metastable states" -- metastable states are the states that the system likes to sit in until we perturb it again. Systems which get stuck in a metastable state until a sufficient force knocks them out of it exhibit "stick-slip phenomena", which are responsible for avalanches, earthquakes, and many other things. A toy simulation of this kind of dynamics is sketched below.
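To get a feel for avalanche dynamics, here is a minimal sketch of the Bak-Tang-Wiesenfeld sandpile model -- my own illustrative choice, not a model the discussion above commits to. Grains drop one at a time, and any site holding 4 or more grains topples, sending one grain to each neighbor:

```python
import numpy as np

# A minimal sketch of the Bak-Tang-Wiesenfeld sandpile -- an illustrative
# toy model of stick-slip/avalanche dynamics. Grains drop one at a time;
# any site with 4 or more grains topples, giving one grain to each
# neighbor (grains fall off the edges of the grid).

def drop_grain(pile, rng):
    """Add one grain at a random site, relax the pile, return avalanche size."""
    L = pile.shape[0]
    i, j = rng.integers(0, L, size=2)
    pile[i, j] += 1
    topples = 0
    while True:
        unstable = np.argwhere(pile >= 4)
        if len(unstable) == 0:
            return topples
        for i, j in unstable:
            pile[i, j] -= 4
            topples += 1
            for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
                if 0 <= ni < L and 0 <= nj < L:
                    pile[ni, nj] += 1

rng = np.random.default_rng(0)
pile = np.zeros((20, 20), dtype=int)
sizes = [drop_grain(pile, rng) for _ in range(5000)]
# Most drops cause no toppling at all, but occasionally a huge avalanche
# sweeps the pile -- a broad distribution of event sizes.
print("largest avalanche:", max(sizes))
```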

The Non-Equilibrium Picture

Okay, so we need a new formalism, one that can support systems far away from thermodynamic equilibrium. But how can we produce one? Well, we know that there is going to be some set of microscopic states and some set of transitions between those states. Furthermore, we want to retain this trick of assuming things are random to avoid having our models get overly complicated. We therefore assume that there are random (we call them stochastic) microscopic processes, carrying the system back and forth between micro-states. Almost immediately, one produces the master equation:

$$
\frac{dP_k}{dt} = \sum_{l} \left[ W_{kl} P_l - W_{lk} P_k \right]
$$

Here $P_k$ is the probability to be in state $k$, $W_{kl}$ is the probability per unit time (the rate) to transition from state l to state k, and so forth. For those readers who are mathematically adept, this equation simply reflects conservation of probability.
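To see the master equation in action, here is a sketch that integrates it numerically for a three-state system; the rate matrix W is invented purely for illustration:

```python
import numpy as np

# A sketch (with a made-up rate matrix) of integrating the master equation
#   dP_k/dt = sum_l [ W[k,l] * P[l] - W[l,k] * P[k] ]
# for a three-state system, using simple Euler time steps.

W = np.array([[0.0, 1.0, 0.5],    # W[k, l] = rate from state l to state k
              [0.2, 0.0, 1.0],    # (diagonal entries are unused)
              [0.3, 0.4, 0.0]])

P = np.array([1.0, 0.0, 0.0])     # start with all probability in state 0
dt = 0.01

for _ in range(10000):
    gain = W @ P                   # flux of probability into each state
    loss = W.sum(axis=0) * P       # flux of probability out of each state
    P = P + dt * (gain - loss)

print("steady state:", P, "  total probability:", P.sum())
```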

I said earlier that I would make the notion of non-equilibrium steady states more precise. I will do that now. One might naively think that equilibrium statistical physics is produced if one sets $dP_k/dt = 0$. However, this actually is not sufficient. We need a much stronger condition -- one called "detailed balance" -- to recover equilibrium statistical physics.

Looking at the master equation, there are two ways that $dP_k/dt = 0$: one is if each term in the sum vanishes individually -- that is, $W_{kl} P_l = W_{lk} P_k$ for every pair of states -- and the other is if the terms in the summation cancel in some other way. The first of these ways is called detailed balance. Roughly speaking, it means that the flux of probability from any state to any other state is exactly balanced by the reverse flux between those same two states. Systems whose microscopic dynamics violate detailed balance are said to be out of equilibrium even if they are in a steady state. Such systems can still have steady states -- i.e. they can still fail to change over time -- so long as the left-hand side of the master equation is zero. We call such states non-equilibrium steady states in order to differentiate them from thermodynamic equilibrium states.
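To make the distinction concrete, here is a sketch (again, the rate matrix is invented) that finds the steady state of a master equation and then checks detailed balance pair by pair. The cyclic bias in the rates (0 -> 1 -> 2 -> 0 is faster than the reverse) is what produces a genuine non-equilibrium steady state:

```python
import numpy as np

# A sketch: find the steady state of a master equation and test detailed
# balance. The rate matrix below is invented; its cyclic bias breaks
# detailed balance while still admitting a steady state.

W = np.array([[0.0, 0.1, 1.0],   # W[k, l] = rate from state l to state k
              [1.0, 0.0, 0.1],
              [0.1, 1.0, 0.0]])

# Generator matrix: M[k,l] = W[k,l] minus (total rate out of l) on the
# diagonal; the steady state is its null vector (eigenvalue 0).
M = W - np.diag(W.sum(axis=0))
eigvals, eigvecs = np.linalg.eig(M)
P = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
P = P / P.sum()                  # normalize to a probability distribution

print("steady state:", P)
for k in range(3):
    for l in range(k + 1, 3):
        forward, backward = W[k, l] * P[l], W[l, k] * P[k]
        print(f"flux {l}->{k}: {forward:.3f}   flux {k}->{l}: {backward:.3f}")
# The pairwise fluxes don't match: a steady state, but not equilibrium.
```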

There are many systems which call for non-equilibrium statistical physics. Some of these I've already mentioned -- like granular systems, avalanches, earthquakes, and rapidly expanding gases. Still others include biological processes (where there are often non-equilibrium steady states), the stock market (this falls under the domain of econophysics), statistically large populations of animals (i.e. generalizations of the Lotka-Volterra equations), vehicular traffic, and so on.
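To close the loop on that last example: stochastic population dynamics is itself a master-equation system. Here is a sketch, with invented rate constants, of a stochastic Lotka-Volterra process simulated with the Gillespie algorithm; the deterministic Lotka-Volterra equations emerge as the large-population limit of exactly this kind of stochastic process:

```python
import numpy as np

# A sketch (rate constants invented) of a stochastic Lotka-Volterra
# process, simulated exactly with the Gillespie algorithm. The state
# (x, y) = (prey, predators) performs jumps whose rates define a
# master equation over population states.

rng = np.random.default_rng(1)
x, y, t = 100, 20, 0.0                      # initial prey, predators, time

birth, predation, death = 1.0, 0.005, 0.6   # made-up rate constants

while t < 50.0 and x > 0 and y > 0:
    rates = np.array([birth * x,            # prey reproduces: x -> x + 1
                      predation * x * y,    # predator eats prey: x-1, y+1
                      death * y])           # predator dies: y -> y - 1
    total = rates.sum()
    t += rng.exponential(1.0 / total)       # waiting time to the next event
    event = rng.choice(3, p=rates / total)  # which event occurs
    if event == 0:
        x += 1
    elif event == 1:
        x, y = x - 1, y + 1
    else:
        y -= 1

print(f"t = {t:.1f}: {x} prey, {y} predators")
```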