How Entropy Is Related to Probability

There is a direct correlation between entropy and physical probability, or chance. The higher the entropy of a thermodynamic condition, the more likely it is that a system will evolve into that condition by chance. To see why, let’s return to the example of air in a room. Suppose the air molecules have been freely bouncing around in the room for some time, undisturbed by anything except their own collisions with each other and with the walls of the room. Even if some parts of the room were initially hotter or colder than others, the faster and slower molecules will quickly disperse throughout the room, and collisions between them will slow down the fast ones and speed up the slow ones. Eventually, most of the molecules will be moving at similar speeds, and the temperature (average kinetic energy) will be the same throughout the room.

Similarly, suppose the air pressure was initially higher in some part of the room—the left side, say—because more molecules were located there. If the molecules are just bouncing around randomly, many of those extra molecules will soon find their way to the other side of the room. In fact, if enough time elapses, the amount of time each molecule spends in any particular part of the room will be proportional to the volume of that region. For example, each molecule will spend about half of the time in the left half of the room, a third of the time in the left third of the room, and so on. Therefore, a condition in which the air pressure is mostly uniform throughout the whole room is far more probable—likely to occur—than a condition in which the air pressure is uneven at various places in the room.
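Here is a minimal Python sketch of that idea, treating each molecule as if it lands at an independent, uniformly random position in the room (a deliberate simplification of the real dynamics, with toy numbers throughout):

```python
import random

# Sketch, assuming each molecule's position is independent and uniformly random.
random.seed(0)
N_MOLECULES = 200      # toy number of molecules
N_SNAPSHOTS = 5000     # toy number of moments at which we check positions

lopsided_snapshots = 0          # snapshots with a very uneven left/right split
total_left = 0
for _ in range(N_SNAPSHOTS):
    # count how many molecules happen to be in the left half at this moment
    in_left = sum(1 for _ in range(N_MOLECULES) if random.random() < 0.5)
    total_left += in_left
    if in_left < 0.25 * N_MOLECULES or in_left > 0.75 * N_MOLECULES:
        lopsided_snapshots += 1

print("average fraction in left half:", total_left / (N_SNAPSHOTS * N_MOLECULES))
print("snapshots with a very uneven split:", lopsided_snapshots)
# The average fraction comes out very close to 0.5, and with 200 molecules
# a 75/25 (or worse) split essentially never shows up.
```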

To make this point clearer, let’s use a simple analogy. My two daughters each have their own piggy bank. Now suppose they find some loose change lying around, and they decide to distribute it between their two piggy banks. Instead of just splitting the money evenly between the two piggies, they decide to distribute it by playing the following game. They flip each penny, and if it lands “heads” it goes in the older girl’s piggy bank; if it lands “tails” it goes to the younger. (I would never encourage my daughters to gamble, but this hypothetical scenario provides a convenient illustration.)

There are many possible outcomes to this game. One piggy might end up significantly wealthier than the other. But it’s much more likely or probable that the two piggies will contain roughly the same number of coins when the game is over, especially if there are a lot of coins. To see why, consider the following examples.

What is the chance of one piggy ending up with significantly more coins than the other? Well, if there are only four coins, there is a good chance that one piggy will get three or four of those coins: the probability is 0.625 (a 62.5% chance), to be exact.

Here’s the math: For 4 tosses, there are 2⁴ = 16 possible outcomes. Of those, 4 include 3 heads, and 1 includes 4 heads. Similarly, 4 include 3 tails, and 1 includes 4 tails. So a total of 10 out of 16 possible outcomes are ones in which one piggy gets at least 3 of the 4 coins.
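If you’d like to check that count, here is a quick Python sketch that simply lists all sixteen outcomes:

```python
from itertools import product

# Sketch: list all 2^4 = 16 ways four coin flips can come out, and count the
# ones in which a single piggy gets at least 3 of the 4 coins.
outcomes = list(product("HT", repeat=4))
lopsided = [o for o in outcomes if o.count("H") >= 3 or o.count("T") >= 3]

print(len(outcomes))                  # 16
print(len(lopsided))                  # 10
print(len(lopsided) / len(outcomes))  # 0.625
```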

On the other hand, if there are 100 coins, then the chance of a piggy getting a similarly disproportionate share of the money (i.e. the chance of getting ¾ or more of the coins) is extremely low: the probability is only 0.00000056 (a 0.000056% chance).

Here’s the math: For 100 tosses, there are 2¹⁰⁰ = 1.27 × 10³⁰ possible outcomes. Of those outcomes, how many include at least 75 heads? That’s (100 choose 100) + (100 choose 99) + (100 choose 98) + … + (100 choose 75) = 3.57 × 10²³ outcomes with at least 75 heads. The same number of outcomes include at least 75 tails, for a total of 7.14 × 10²³ outcomes in which one piggy gets ¾ or more of the coins. So the probability of one piggy getting ¾ or more of the coins is (7.14 × 10²³)/(1.27 × 10³⁰) = 5.6 × 10⁻⁷.
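Listing all 2¹⁰⁰ outcomes one by one isn’t practical, but the same counting can be done with binomial coefficients. Here is a short Python sketch of the calculation above:

```python
from math import comb

# Sketch: count the outcomes in which one piggy gets at least 75 of 100 coins.
total = 2 ** 100                                        # all possible outcomes
at_least_75_heads = sum(comb(100, k) for k in range(75, 101))
lopsided = 2 * at_least_75_heads                        # at least 75 heads, or at least 75 tails

print(f"total outcomes:    {total:.3e}")                # about 1.27e+30
print(f"lopsided outcomes: {lopsided:.3e}")             # about 7.14e+23
print(f"probability:       {lopsided / total:.2e}")     # about 5.6e-07
```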

The piggy bank analogy illustrates why high-entropy conditions are far more likely to occur by chance than low-entropy ones. The possible outcomes of the coin game are analogous to possible microstates. Each coin has a 50% chance of ending up in either piggy bank, just as each molecule has a 50% chance of being in either half of the room. And there are many more microstates with uniform temperature and pressure throughout a room, just as there are many more coin-toss outcomes in which the two piggies get roughly the same number of coins.

Thus, higher-entropy conditions are far more likely to occur than lower-entropy ones, assuming that a system’s microstates are determined at random, so that any given microstate is as likely to occur as any other. Equilibrium is the most probable condition of all, because an overwhelming majority of the possible microstates correspond to the equilibrium condition. That explains why systems tend to evolve toward equilibrium, and why they stay in equilibrium unless disturbed by other systems.
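To see how overwhelmingly the near-even outcomes come to dominate as the number of coins grows, here is a small Python sketch (with arbitrarily chosen coin counts) that computes the fraction of all outcomes in which the split is within five percentage points of perfectly even:

```python
from math import comb

# Sketch (toy numbers): for n coins, what fraction of all 2^n equally likely
# outcomes leave each piggy within 5 percentage points of an even split?
for n in (100, 1000, 10000):
    lo, hi = 45 * n // 100, 55 * n // 100
    near_even = sum(comb(n, k) for k in range(lo, hi + 1))
    print(n, near_even / 2 ** n)
# 100   -> about 0.73
# 1000  -> about 0.999
# 10000 -> so close to 1 that it prints as 1.0
```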

The piggy bank analogy also illustrates a rather surprising fact: it is possible for the second law to be violated! Technically, there is a small chance that the air molecules in a room could all bounce to the left side at the same time, just as there is a small chance that one piggy could get all the money. Of course, the probability of that happening is unfathomably low. Increasing the number of particles (or, by analogy, the number of coins) drastically reduces the probability: the more particles there are, the less likely such a fluctuation becomes. Macroscopic systems contain billions upon billions of molecules, so the probability of a low-entropy condition occurring by chance is so close to zero that it just doesn’t happen. But it is possible nonetheless.
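To get a feel for just how low, note that by the coin analogy the chance of finding all of the molecules on the left side at once is ½ multiplied by itself once per molecule. Here is a quick Python sketch of how fast that number shrinks (the last value is only a rough stand-in for a macroscopic number of molecules):

```python
from math import log10

# Sketch: the chance that every one of N independent "coin-flip" molecules is
# in the left half of the room at the same moment is (1/2)^N = 10^(-N*log10(2)).
for n in (4, 100, 10 ** 20):       # 10**20 is a rough stand-in for a macroscopic count
    exponent = n * log10(2)        # (1/2)^n = 10^(-exponent)
    print(f"N = {n}: probability is about 10^(-{exponent:.3g})")
# N = 4   -> about 10^(-1.2), i.e. 1 in 16
# N = 100 -> about 10^(-30)
# N = 10**20 -> about 10^(-30,000,000,000,000,000,000)
```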