Statistical Mechanics

Entropy as a Measure of Possibilities

On the previous page, I gave a general description of entropy: it is a quantity that increases as a system evolves toward equilibrium. But it will be worthwhile to explain more carefully what this mysterious quantity really is. We can gain a better understanding of entropy by distinguishing between two ways of describing the state (physical condition) of a system at a particular time.

Imagine a room full of nothing but air. The physical state of the room at a given time could be described very precisely, at least in principle, by specifying the exact position and velocity of each air molecule in the room. (Of course it’s not possible to give such a description in real life.) The precise state of a system at a given time, in terms of the positions and velocities of its microscopic particles, is called the system’s microstate.
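
To make the idea of a microstate concrete, here is a minimal sketch of what such a description might look like in code, with one position vector and one velocity vector per molecule. The molecule count and speed scale are placeholder values chosen purely for illustration.

```python
# A minimal sketch of a microstate: one position vector and one velocity vector
# per molecule. The numbers below are placeholders; a real room holds far more
# molecules than any computer could list.
import numpy as np

rng = np.random.default_rng(0)
n_molecules = 1_000
room_size = (5.0, 4.0, 3.0)   # room dimensions in meters (illustrative)

positions = rng.uniform(low=(0.0, 0.0, 0.0), high=room_size, size=(n_molecules, 3))  # meters
velocities = rng.normal(loc=0.0, scale=500.0, size=(n_molecules, 3))                 # m/s, rough thermal scale

print(positions.shape, velocities.shape)   # (1000, 3) (1000, 3)
```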

On the other hand, we could describe the state of the room less precisely, in terms of more easily observable properties. For example, we could describe how the temperature and pressure of the air vary throughout the room, without specifying the exact positions and velocities of all the molecules. The state of a system described in terms of volume, temperature, and pressure is called the system’s thermodynamic condition.

Thermodynamic Conditions

This simulation allows you to experiment with the thermodynamic condition of a gas. In the “Explore” experiment, try pumping some gas molecules into the box, then adjust the temperature of the system and/or change the volume of the box to see how pressure, temperature, and volume are related. In the “Energy” experiment, notice that higher temperature corresponds to higher average kinetic energy of the molecules. In the “Diffusion” experiment, add some molecules to one side of the box, then remove the divider. The gas will expand to fill the whole box and reach a new equilibrium condition. (This process, called “free expansion”—or “Joule expansion”—is an example of an irreversible thermodynamic process, as we’ll see on the next page.)
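
The quantitative relationship behind the “Explore” experiment is, for an ideal gas, the ideal gas law: pressure times volume is proportional to temperature. A minimal sketch of that relationship, with illustrative numbers:

```python
# For an ideal gas, pressure, temperature, and volume are related by P V = N k_B T.
k_B = 1.380649e-23   # Boltzmann constant, J/K

def pressure(n_molecules, temperature_K, volume_m3):
    """Ideal-gas pressure in pascals."""
    return n_molecules * k_B * temperature_K / volume_m3

N, T, V = 1e23, 300.0, 0.1    # illustrative values: molecule count, kelvin, cubic meters

print(pressure(N, T, V))        # baseline pressure
print(pressure(N, 2 * T, V))    # doubling the temperature doubles the pressure
print(pressure(N, T, V / 2))    # halving the volume doubles the pressure
```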

Many different microstates have the same volume, temperature, and pressure. For example, each molecule could be located at a slightly different position or be travelling in a different direction, without changing the volume, temperature, or pressure of the air in the room. Thus, any given thermodynamic condition corresponds to many possible microstates. The system is in exactly one microstate at a time, of course, but that microstate is constantly changing as the molecules bounce around. Even so, the system can remain in the same thermodynamic condition for a while, because many different microstates correspond to that one condition.
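
A toy model makes this many-to-one relationship easy to see. Suppose there are just four molecules, each of which can sit in either the left or the right half of a box, and take “how many molecules are on the left” as the coarse, observable description:

```python
# A toy model: four labeled molecules, each in the left ("L") or right ("R") half
# of a box. Take "number of molecules on the left" as the coarse description,
# and list every exact arrangement (microstate) compatible with it.
from itertools import product
from collections import defaultdict

microstates_by_condition = defaultdict(list)
for microstate in product("LR", repeat=4):          # all 2**4 = 16 microstates
    n_left = microstate.count("L")
    microstates_by_condition[n_left].append(microstate)

for n_left, states in sorted(microstates_by_condition.items()):
    print(f"{n_left} on the left: {len(states)} microstates")
# 0 on the left: 1 microstate; 1: 4; 2: 6; 3: 4; 4: 1
# -- several distinct microstates share each coarse condition
```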

We can think of entropy as a measure of how many possible microstates correspond to a system’s thermodynamic condition. The higher the entropy of a system, the more possible microstates correspond to its thermodynamic condition. That’s still not an exact definition of entropy. But it captures the main idea behind the definition of entropy used in statistical mechanics—a branch of physics that relates the laws of thermodynamics to the laws governing motion and energy at the molecular level. For a more detailed explanation of the statistical mechanical definition of entropy, see appendix C [link coming soon].
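
For readers who want a formula now: the definition alluded to here is Boltzmann’s, S = k_B ln W, where W is the number of microstates compatible with the thermodynamic condition (appendix C fills in the details). A minimal sketch, applied to the toy four-molecule model above:

```python
# Boltzmann's formula: S = k_B * ln(W), where W is the number of microstates
# compatible with the system's thermodynamic condition.
import math

k_B = 1.380649e-23   # J/K

def boltzmann_entropy(n_microstates):
    return k_B * math.log(n_microstates)

# Applied to the toy four-molecule model above:
print(boltzmann_entropy(1))   # all four molecules on the left: W = 1, entropy 0
print(boltzmann_entropy(6))   # two on each side: W = 6, the maximum-entropy condition
```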

Actually, several different types of entropy are defined in statistical mechanics. The type of entropy just described is Boltzmann entropy, which is supposed to mimic thermodynamic entropy for some systems (namely, ideal gases). Other types include Gibbs entropy (used in classical statistical mechanics) and von Neumann entropy (used in quantum statistical mechanics). There are also types of entropy that have little to do with physics. For example, Shannon entropy is a concept used in information theory, and Kolmogorov-Sinai entropy is used in the study of dynamical systems (ergodic theory). All of these varieties of entropy differ from thermodynamic entropy in important ways; but Boltzmann entropy is closely analogous to thermodynamic entropy in that it increases as a system evolves toward equilibrium, and reaches a maximum value when the system is in equilibrium. Boltzmannian statistical mechanics also provides an intuitive explanation for the fact that entropy tends to increase toward equilibrium, as I will explain in what follows.
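
For reference, the standard textbook definitions of these quantities are as follows; the differences among them are easiest to see side by side:

```latex
% Standard textbook definitions of the entropies named above:
\begin{align*}
  S_{\text{Boltzmann}}   &= k_B \ln W
      && W = \text{number of microstates in the macrostate} \\
  S_{\text{Gibbs}}       &= -k_B \sum_i p_i \ln p_i
      && p_i = \text{probability of microstate } i \\
  S_{\text{von Neumann}} &= -k_B \, \mathrm{Tr}(\rho \ln \rho)
      && \rho = \text{density operator of the quantum state} \\
  H_{\text{Shannon}}     &= -\sum_i p_i \log_2 p_i
      && p_i = \text{probability of symbol } i
\end{align*}
```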

The higher the entropy of a thermodynamic condition, the more microstates correspond to it. A system’s equilibrium condition is the condition of maximum entropy, and corresponds to more possible microstates than any other thermodynamic condition. The equilibrium condition of air in a room—the condition in which the air has the same temperature and pressure throughout the room—corresponds to more microstates than a condition in which the pressure is much greater at the left side of the room, for instance. There are more possible ways for the air molecules to be distributed evenly throughout the whole room than for most of the molecules to be clustered near the left side. Technically, there are infinitely many ways for the molecules to be distributed throughout the whole room, and also infinitely many ways for them to be clustered near the left side. Nonetheless, there is a definite mathematical sense in which there are more ways for the molecules to be distributed throughout the whole room. See appendix C for an explanation.
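
One way to make the comparison concrete, setting aside the measure-theoretic subtleties that appendix C addresses, is to coarse-grain each molecule’s position to just “left half” or “right half” of the room and count arrangements. With only 100 molecules, the even split already wins by an enormous margin:

```python
# Coarse-grain each molecule's position to "left half" or "right half" of the room.
# The number of ways to have exactly n_left of the N molecules on the left is the
# binomial coefficient C(N, n_left).
from math import comb

N = 100                            # toy number of molecules

ways_all_left = comb(N, N)         # every molecule crowded into the left half
ways_even_split = comb(N, N // 2)  # molecules evenly spread between the halves

print(ways_all_left)     # 1
print(ways_even_split)   # 100891344545564193334812497256  (about 10**29)
```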