
13.24 Entropy

Imagine your favorite movie played backward. Could the events possibly ever occur that way? The people would seem to walk backward, which is silly but not impossible. But what about the smashed wine glass that leaps off the ground and assembles itself whole into a woman's hand? Or the man who flies out of the swimming pool and lands dry on the diving board? We would never see these things happen in real life. Yet none of the major laws of macroscopic physics mention time at all. Newton's laws are reversible, and so are the conservation laws of energy, momentum, and angular momentum. Nonetheless, there is some intrinsic characteristic of the universe that makes reality largely (but not wholly) irreversible. Part of the solution may lie within the concept of entropy and the second law of thermodynamics. While this law cannot explain why we sense time as an arrow that moves in only one direction, we can use it to explain why nature has a directionality that tends to increase entropy. This is possible because we can define entropy in terms of a microscopic description of order and disorder.

It is easy to see that the atoms in a crystal have more order than the atoms in a gas, but how do scientists actually quantify entropy? Let's make a simple analogy using coins. An atom has many properties — energy, momentum, position, electric charge, spin — but a coin has only one — it can either be heads or tails. Suppose we are tossing four coins at the same time. Each coin can land two ways and the coins are all independent, so the number of possible results or "states" is 2 × 2 × 2 × 2 = 2⁴ = 16. The most orderly result would be four heads or four tails, which occurs 2 out of 16 times, or 13% of the time. The most disorderly or mixed-up result is two heads and two tails, which occurs six different ways (check it!), or 38% of the time. Disorder is three times more likely than order.
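
To make the counting concrete, here is a minimal Python sketch (not part of the original text; the variable names are purely illustrative) that enumerates all 16 outcomes of tossing four coins and tallies how many ways each number of heads can occur.

```python
from itertools import product
from collections import Counter

# Enumerate all 2^4 = 16 outcomes of tossing four coins (H = heads, T = tails).
outcomes = list(product("HT", repeat=4))

# Tally how many outcomes produce each possible number of heads.
ways = Counter(outcome.count("H") for outcome in outcomes)

for n_heads in sorted(ways):
    count = ways[n_heads]
    print(f"{n_heads} heads: {count} of {len(outcomes)} outcomes "
          f"({count / len(outcomes):.0%})")
```

The tally confirms the counts above: two heads and two tails shows up 6 ways out of 16, while all heads or all tails each occur only once.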

The effect that ordered states are less likely than disordered states becomes more dramatic as the number of particles and possible results increases. With 100 coins, there are 2¹⁰⁰ ≈ 10³⁰ possible results or states. So all heads can only occur by chance 1 in 10³⁰ times. On the other hand, the most disordered result is 50 heads and 50 tails, and this occurs about 1 in 12 times. What do these numbers mean? If you tossed 100 coins at a time every second, you would need less than a minute to see the most disordered state but 10²² years (far longer than the age of the universe) to see the most ordered state! We have related entropy to probability. In the real world, disorder is a vastly more likely outcome than order.
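
These numbers are easy to check numerically. Below is a short sketch (assuming one toss of all 100 coins per second and roughly 3.15 × 10⁷ seconds per year) that uses Python's exact binomial coefficient.

```python
from math import comb

N = 100
total_states = 2 ** N                        # about 1.27e30 possible outcomes
p_all_heads = 1 / total_states               # the single most ordered state
p_half = comb(N, N // 2) / total_states      # exactly 50 heads and 50 tails

print(f"total states:          {total_states:.2e}")
print(f"P(all heads):          {p_all_heads:.2e}")
print(f"P(50 heads, 50 tails): {p_half:.3f} (about 1 in {1 / p_half:.1f})")

# Expected waiting times at one 100-coin toss per second:
seconds_per_year = 3.15e7
print(f"wait for 50/50 split:  about {1 / p_half:.0f} seconds")
print(f"wait for all heads:    about {total_states / seconds_per_year:.0e} years")
```

At one toss per second, the 50/50 split appears within seconds, while the expected wait for all heads works out to a few times 10²² years, in line with the estimate above.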

An ordered system will always tend to become more disordered. Let's imagine a well-shuffled deck of cards. What is the probability that you could shuffle it back into its perfect ordered sequence? It is about one chance in 10⁶⁸! It will never happen. For the same reason, you can never unscramble an egg or unstir the milk from the coffee. The great Austrian physicist Ludwig Boltzmann derived the mathematical definition of entropy:

S = k ln(W)

In this equation, k is the universal Boltzmann constant and ln(W) is the natural logarithm (to the base e = 2.718 rather than to base 10) of the number of microscopic states of a system that give a certain result. Entropy is higher for a result that has many possible microscopic states (like equal heads and tails) than it is for a result that has few microscopic states (like all heads or all tails). Boltzmann was so proud of this equation and the universal truth that it represents that he had it inscribed on his tombstone.
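
As an illustration of how the formula is used (a sketch, not from the original text; the value of k is the exact SI value), we can evaluate S = k ln(W) for the 100-coin example above, treating each number-of-heads result as a macroscopic result whose W is the number of coin arrangements that produce it.

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann constant in joules per kelvin (exact SI value)

def boltzmann_entropy(W):
    """Entropy S = k ln(W) of a result that can be reached by W microscopic states."""
    return k * log(W)

# Number of microscopic states (coin arrangements) for two results of 100 tosses:
W_all_heads = comb(100, 100)   # 1 arrangement: the most ordered result
W_half_half = comb(100, 50)    # ~1e29 arrangements: the most disordered result

print(f"S(all heads)          = {boltzmann_entropy(W_all_heads):.2e} J/K")  # exactly 0
print(f"S(50 heads, 50 tails) = {boltzmann_entropy(W_half_half):.2e} J/K")
```

The fully ordered result has zero entropy (since ln 1 = 0), while the evenly mixed result has the highest entropy of any outcome, which is exactly the directionality the second law describes.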

Not everyone was as pleased as Boltzmann with the second law of thermodynamics. The idea that the universe was "running down" was badly received by English poets and philosophers in the mid-19th century. Algernon Swinburne wrote:

We thank with weak thanksgiving
Whatever gods there be
That no man lives forever
That dead men rise up never
That even the weariest river
Flows somewhere safe to sea.

The concept of entropy continues to fascinate writers and thinkers. The novelist John Updike wrote an ode to entropy that contains these words:

Entropy! Thou seal on extinction,
Thou curse on Creation
All change distributes energy,
Spills what cannot be gathered again.
A ramp has been built into probability
The universe cannot re-ascend.

While entropy may be a discomfort to some thinkers, it can also be used to humorously justify the tendency of many a desk toward clutter, and many a child's bedroom toward disorder. It is interesting to note that in nature, order does at times arise in the form of crystals, differentiated planets, and even grand design spiral galaxies. In all cases, some physical process acted to bring order out of chaos, and in many cases the order is only a transitory state. Over the long dark future of the universe, galaxies will tend toward chaotic elliptical shapes and black holes will slowly vacuum up much of the universe before evaporating into a fuzz of unstructured energy. While order is possible, it is always a temporary state when long enough timescales are considered.