The Temperature of History

The dispersion of gas in a container tells the story of entropy.

(The title is borrowed from Stephen Brush’s excellent history of 19th century thermodynamics, The Temperature of History.)

At the same 1860 meeting of the British Association for the Advancement of Science where Thomas H. Huxley and Bishop Samuel Wilberforce famously tangled over Darwin’s new theory of evolution, a Scottish physicist named James Clerk Maxwell presented a new idea in physics. In the laws expressed by Isaac Newton, processes are reversible: without friction or other dissipative forces involved, you can run time forward or backward just as easily, and the laws of physics don’t change. A frictionless pendulum will swing back and forth forever, and the planets orbit the Sun, following approximately the same paths for eternity.

Maxwell’s model, known as the kinetic theory, was an attempt to understand the processes of heat transfer—thermodynamics—by showing how they arise from the motion of tiny particles. Temperature is determined by the average kinetic energy of these particles, so higher temperature simply means the particles are, on average, moving faster. Each particle moves in accordance with Newton’s laws of motion, but with so many of them together, new kinds of behavior can arise. If you have two flasks of gas, one at a higher temperature than the other, and you bring them into contact, energy will flow from the hotter container to the cooler one, eventually bringing them both to the same temperature (a condition called thermal equilibrium). This energy in transit is known as heat.
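That energy-sharing picture can be captured in a few lines of code. The sketch below is my own toy model, not Maxwell’s actual mathematics: each flask is just a list of particle kinetic energies, and at each step one particle from each flask “collides” through the wall, with the pair’s total energy redistributed at random. The two mean energies, which stand in for the temperatures, converge on their own:

```python
import random

random.seed(1)

# Two "flasks" of particles, represented only by their kinetic energies
# (arbitrary units). Flask A starts hot, flask B starts cold.
hot = [10.0] * 200
cold = [1.0] * 200

def temperature(flask):
    """Temperature is proportional to the mean kinetic energy."""
    return sum(flask) / len(flask)

# At each step, one particle from each flask "collides" through the wall,
# and the pair's total energy is randomly redistributed between them.
# (A stand-in for real collision dynamics, chosen only for simplicity.)
for _ in range(20000):
    i = random.randrange(len(hot))
    j = random.randrange(len(cold))
    total = hot[i] + cold[j]
    share = random.random()
    hot[i], cold[j] = share * total, (1 - share) * total

print(f"hot flask: {temperature(hot):.2f}, cold flask: {temperature(cold):.2f}")
```

The total energy is conserved throughout, yet both flasks end up near the same temperature: the flow is one-way, from hot to cold, even though no rule in the code says it must be.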

How does this transfer happen? The particles in each container bang against the walls (which you can feel if you touch the container), and these collisions allow a bit of energy to pass from one flask to the other. Strictly in accordance with Newtonian physics, the heat flows preferentially in one direction: from the hot container to the colder one. By converting this flow into mechanical energy, using pistons or similar mechanisms, you can construct heat engines, including the familiar kinds in cars, steam engines, coal-fired power plants, and so forth. In trying to build the best possible heat engines, engineers in the 19th century discovered that there is not only a practical limit on efficiency but a theoretical maximum as well: in every heat engine, some energy inevitably becomes unavailable. Engineer Sadi Carnot and physicist Rudolf Clausius quantified this loss of usable energy, and in 1865 Clausius gave the concept the name by which it is known today: entropy.
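Carnot’s limit has a famously simple form: an engine running between a hot reservoir at temperature T_hot and a cold one at T_cold can convert at most a fraction 1 − T_cold/T_hot of the heat into work. The temperatures in the example below are just illustrative numbers:

```python
def carnot_efficiency(t_hot_kelvin, t_cold_kelvin):
    """Maximum possible fraction of heat convertible to work by any
    engine running between a hot and a cold reservoir (in kelvin)."""
    return 1.0 - t_cold_kelvin / t_hot_kelvin

# A boiler at 500 K exhausting to air at 300 K can never convert
# more than 40% of the heat into useful work, no matter how
# cleverly the engine is built.
print(carnot_efficiency(500.0, 300.0))
```

Notice that the only way to push the efficiency toward 100% is to make the cold reservoir colder or the hot one hotter; at any finite temperatures, some energy is always lost.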

Here is the paradox of kinetic theory, as it stood in the middle of the 19th century: the microscopic particles obey the rules of Newtonian physics, which are independent of the direction of time, but when they are present in large numbers, the same rules give rise to irreversible processes. Heat doesn’t spontaneously flow from a cold object to a hot one (or else refrigeration would be much cheaper!), so time has a direction in thermodynamics: reversing past and future produces a physical picture that makes no sense.

Top: heat energy flows spontaneously from the container with high-temperature gas to the one with lower-temperature gas, until they both reach the same temperature. However, as the bottom row shows, if they both start at the same temperature, they’ll stay that way forever. Heat flow is not a reversible process.

While Maxwell, Clausius, and others established the kinetic theory of gases as a viable way to understand heat, it was Ludwig Boltzmann who grappled hardest with the implications of entropy. In a series of papers and letters to his critics over a number of years, he began (with difficulty) to shape the modern view of entropy; cosmologists including Sean Carroll, Stephen Hawking, and Roger Penrose continue trying to comprehend the meaning of entropy in the context of the entire universe. I’ll come back to entropy and the universe in a later post, since that’s a large subject in its own right. For now, let’s stick to small systems and try to understand entropy in that easier context.

The video below is a simplified version of a basic experiment: take a small flask of gas and open it up inside a larger container. The natural motion of the gas molecules, as they bounce off each other and the sides of the container, will tend to spread them throughout the interior. This simulation involves 100 identical gas particles (the apparent difference in their sizes is due to perspective), and the only physical interaction is when they collide. In other words, I’m not gaming the system to make entropy increase: it happens on its own. (Watch the video in full-screen mode for best results.)

Note that entropy increases over time, but eventually reaches a maximum once the particles have dispersed enough. While it might fluctuate a bit, it won’t tend to go down again. This is typical of a closed system: we aren’t allowing any energy to enter or leave the box, so the particles will reach an internal equilibrium once they’ve spread out as far as they can. (In fact, thermal equilibrium is best described using entropy rather than temperature, even though I introduced the concept using temperature for simplicity’s sake. Once entropy has reached its maximum, the system has achieved thermal equilibrium.) If you increase the number of gas particles in the box, the effect only becomes more pronounced: entropy climbs that much faster until it reaches its maximum.
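If you’d like to experiment yourself, here is a rough sketch along the same lines. It is a deliberate simplification of the video’s setup: a one-dimensional random walk stands in for real collisions, and entropy is coarse-grained over spatial bins (one standard way to define it for a simulation like this), so the numbers are illustrative rather than exact:

```python
import math
import random

random.seed(0)

N_PARTICLES = 100
N_BINS = 10          # divide the box [0, 1] into 10 spatial bins
STEP = 0.02          # random-walk step size (a stand-in for collisions)

# Start with every particle clustered near the left wall ("small flask").
positions = [random.uniform(0.0, 0.05) for _ in range(N_PARTICLES)]

def entropy(positions):
    """Coarse-grained entropy: S = -sum(p_i * ln p_i) over spatial bins,
    where p_i is the fraction of particles in bin i."""
    counts = [0] * N_BINS
    for x in positions:
        counts[min(int(x * N_BINS), N_BINS - 1)] += 1
    s = 0.0
    for c in counts:
        if c:
            p = c / len(positions)
            s -= p * math.log(p)
    return s

s_start = entropy(positions)
for _ in range(5000):
    positions = [min(max(x + random.uniform(-STEP, STEP), 0.0), 1.0)
                 for x in positions]
s_end = entropy(positions)

print(f"entropy at start: {s_start:.2f}, after dispersal: {s_end:.2f}")
```

The entropy starts at zero (every particle is in one bin) and climbs toward its maximum of ln(10) ≈ 2.30, which it reaches when the particles are spread evenly across all ten bins. After that, it only fluctuates slightly: that plateau is the thermal equilibrium described above.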

Now what happens if you start with the particles spread out through the box, as at the end of the video above? As you no doubt already know, they won’t gather in one corner: they’ll tend to stay dispersed throughout the entire volume. It’s not that they can’t collect in the corner; it’s just highly unlikely that they will. Boltzmann gradually realized that the increase of entropy isn’t absolute but probabilistic: we must think about it in terms of probabilities and statistics. That idea bugged a lot of people in the 19th century, and maybe it bothers you too. Certainly Boltzmann didn’t grasp the entire notion in a sudden flash of insight: if you read his papers and letters, he seems to change his mind several times as he struggled to understand what is actually going on.
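Boltzmann’s probabilistic point is easy to make concrete. If each particle is independently equally likely to be in either half of the box, the chance of catching all N of them in the same half at a given instant is (1/2) to the power N, which collapses astonishingly fast as N grows:

```python
# Probability that all N particles happen to be in the left half of the
# box at once, assuming each is independently equally likely to be in
# either half. Even at N = 100 (the simulation above), the odds are
# far smaller than one in a trillion trillion.
for n in (2, 10, 100):
    print(f"N = {n:3d}: probability = {0.5 ** n:.2e}")
```

A real flask holds something like 10^23 molecules, not 100, so “highly unlikely” is a spectacular understatement: the laws of motion permit the gas to huddle in a corner, but you would wait unimaginably longer than the age of the universe to see it.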

So let’s summarize what we know about entropy, as usually stated in the Second and Third Laws of Thermodynamics:

  • Entropy is a measure of how much usable energy is inevitably lost in the exchange of heat energy.
  • In a closed system (no exchange of energy with the outside world), entropy tends to increase until it reaches a maximum, and won’t tend to decrease.
  • However, the second point is a probabilistic statement: nothing in the laws of physics forbids the particles in a gas from spontaneously congregating in a corner. It’s just so unlikely that we don’t have to worry about it.
  • In addition, you can’t make entropy vanish, even if you lower the temperature of the gas to absolute zero.

There’s a lot more we can and will say about entropy, but in the interest of wrapping up this post today, I’ll leave those topics for now. Suffice it to say that we haven’t gotten very far: we still need to talk about information, the connection between entropy and disorder, the “arrow of time” (which is the subject of Sean Carroll’s book From Eternity to Here), and what all of this has to do with cosmology. Nevertheless, I hope you see that entropy isn’t entirely mysterious: it’s a natural result of the basic motions of atoms.


5 responses to “The Temperature of History”

  1. I tell my students that entropy is a substance as tangible as energy. What think you of this?

    1. I guess it depends on what you mean by “tangible”. Entropy is as real as energy.

