# 2.5: Entropy of a Monatomic Ideal Gas


So far in this chapter, we have been dealing very abstractly with a very general class of physical systems. We have made a number of assumptions that are reasonable but that we have not tested in practice. It is time to put some flesh on these formal bones. We do so by using our statistical definition of entropy to calculate the entropy of a monatomic ideal gas. (Here “monatomic” means that we approximate the atoms by point particles, and “ideal” means that those particles do not interact with each other. In addition, we assume that the gas contains only one chemical species and that classical mechanics provides an adequate description. Thus a more precise name for our system would be the “pure classical monatomic ideal gas”, but in this case we wisely prefer brevity to precision.) Working with this concrete example will show us that what we have said is sensible (at least for this system), and guide us in further general developments.

The previous pages have been remarkably free of equations for a physics book. Now is the time to remedy that situation. Before studying this section, you need to know that the volume of a \(d\)-dimensional sphere is

\[V_{d}(r)=\frac{\pi^{d / 2}}{(d / 2) !} r^{d}\label{2.12}\]

If you don’t already know this, then read appendix D, “Volume of a Sphere in \(d\) Dimensions”, before reading this section. And if you don’t know the meaning of \(x!\), where \(x\) is a half-integer, then you should read appendix C, “Clinic on the Gamma Function”, before reading appendix D. Finally, if you don’t know Stirling’s approximation for the factorial function, namely

\[\ln n ! \approx n \ln n-n \quad \text { for } n \gg 1,\]

then you should also read appendix E, “Stirling’s Approximation”, before reading further. (Do not be discouraged by this long list of prerequisites. This mathematical material is quite interesting in its own right and will be valuable throughout this book.)
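
Both prerequisites are easy to check numerically. The sketch below (the function names are ours, for illustration) verifies the sphere-volume formula against the familiar \(d = 2\) and \(d = 3\) cases, using \((d/2)! = \Gamma(d/2 + 1)\) to handle half-integer factorials, and shows that the relative error of Stirling’s approximation shrinks as \(n\) grows:

```python
import math

def sphere_volume(d, r):
    """Volume of a d-dimensional sphere of radius r:
    V_d(r) = pi^(d/2) / (d/2)! * r^d, with (d/2)! = Gamma(d/2 + 1)."""
    return math.pi ** (d / 2) / math.gamma(d / 2 + 1) * r ** d

# d = 2 gives the area of a circle, d = 3 the familiar (4/3) pi r^3.
print(sphere_volume(2, 1.0))   # pi
print(sphere_volume(3, 1.0))   # 4*pi/3

# Stirling: ln n! ~ n ln n - n; the relative error shrinks as n grows.
for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)            # ln n!, via the gamma function
    approx = n * math.log(n) - n
    print(n, (exact - approx) / exact)
```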

We consider a system of \(N\) identical, classical, non-interacting point particles, each of mass \(m\). The kinetic energy of this system is

\[\frac{1}{2 m}\left(p_{1}^{2}+p_{2}^{2}+\cdots+p_{N}^{2}\right)\label{2.14}\]

and the potential energy is

\[ \left\{\begin{array}{ll}{0} & {\text { if all particles are inside container }} \\ {\infty} & {\text { otherwise. }}\end{array}\right.\]

(One sometimes hears that the ideal gas has “no potential energy”. It is true that there is no potential energy due to atom-atom interaction, but, as the above expression makes clear, there is indeed a potential energy term due to atom-wall interaction. Because of the character we assume for that term, however, the numerical value of the potential energy is always zero. Note also that the ideal gas is not the same as the “hard-sphere gas”. In the hard-sphere model two atoms have infinite potential energy if they are separated by a distance of twice the hard-sphere radius or less. In the ideal gas model two atoms do not interact. It is permissible even for two atoms to occupy the same location. . . only in the model, of course!)

Now that the system is completely specified, it is time to begin the problem. We wish to calculate the entropy

\[S(E, \Delta E, V, N)=k_{B} \ln \frac{W(E, \Delta E, V, N)}{N ! h_{0}^{3 N}},\label{2.16}\]

where the function \(W\) represents the volume in phase space corresponding to energies from \(E\) to \(E + ∆E\) (i.e. the volume of the region \(σ(E, ∆E)\)). Before jumping into this (or any other) problem, it is a good idea to list a few properties that we expect the solution will have. . . this list might guide us in performing the calculation; it will certainly allow us to check the answer against the list to see if either our mathematics or our expectations need revision. We expect that:

- We will be able to take the limit as \(∆E \to 0\) and get sensible results.
- The entropy \(S\) will depend on only the volume of the container and not on its shape.
- If we double the size of the system, by doubling \(E, V,\) and \(N\), then we will double \(S\). (Additivity.)
- \(S\) will depend on \(h_0\) in a trivial, “sea-level” fashion.

The formal expression for the volume of the accessible region of phase space is

\[W(E, \Delta E, V, N)=\text { accessible volume in phase space }\]

\[\begin{array}{l}{=\int_{\sigma(E, \Delta E)} d \Gamma} \\[4pt] {=\int d x_{1} \int d y_{1} \int d z_{1} \cdots \int d x_{N} \int d y_{N} \int d z_{N}} \\[4pt] {\quad \times \int d p_{x, 1} \int d p_{y, 1} \int d p_{z, 1} \cdots \int d p_{x, N} \int d p_{y, N} \int d p_{z, N}}\end{array}.\]

The complexity of this integral rests entirely in the complexity of the shape of \(σ(E, ∆E)\) rather than in the complexity of the integrand, which is just 1. Fortunately the integral factorizes easily into a position part and a momentum part, and the position part factorizes into a product of integrals for each particle (see problem 2.8). For, say, particle number 5, if the particle is inside the container it contributes 0 to the energy, so the total energy might fall between \(E\) and \(E + ∆E\) (depending on other factors). But if it is outside the container, then it contributes \(\infty\) to the energy, which always exceeds the limit \(E + ∆E\). Thus the integral is just the volume of the container:

\[\int d x_{5} \int d y_{5} \int d z_{5}=V.\]

This integral depends on the volume \(V\) but is independent of the shape of the container. We will soon see that, as a consequence, the entropy depends on volume but not shape (which is in accord with our expectations).

The integrals over momentum space do not factorize, so we must consider the entire \(3N\)-dimensional momentum space rather than \(N\) separate 3-dimensional spaces. We know that the total potential energy is zero (unless it is infinite), so the energy restriction is taken up entirely by the kinetic energy. Equation \ref{2.14} tells us that the momentum space points with energy \(E\) fall on the surface of a sphere of radius \(\sqrt{2 m E}\). Thus the accessible region in momentum space is a shell with inner radius \(\sqrt{2 m E}\) and with outer radius \(\sqrt{2 m(E+\Delta E)}\). (Notice that we are counting all the microstates within the accessible region of phase space, not just “typical” microstates there. For example, one microstate to be counted has all of the particles at rest, except for one particle that has all the energy of the system and is heading due west. This is to be counted just as seriously as is the microstate in which the energy is divided up with precise equality among the several particles, and they are traveling in diverse specified directions. Indeed, the system has exactly the same probability of being in either of these two microstates.) Using Equation \ref{2.12} for the volume of a \(3N\)-dimensional sphere, the volume of that shell is

\[\frac{\pi^{3 N / 2}}{(3 N / 2) !}\left[(2 m(E+\Delta E))^{3 N / 2}-(2 m E)^{3 N / 2}\right].\]

I prefer to write this result in a form with all the dimensionful quantities lumped together, namely as

\[\frac{\pi^{3 N / 2}}{(3 N / 2) !}(2 m E)^{3 N / 2}\left[\left(1+\frac{\Delta E}{E}\right)^{3 N / 2}-1\right].\]

The quantity in square brackets is dimensionless. To find the accessible volume of the entire phase space, we multiply the above result by \(V^N\), the result of performing \(N\) separate position integrals. Thus

\[W(E, \Delta E, V, N)=\frac{\left(2 \pi m E V^{2 / 3}\right)^{3 N / 2}}{(3 N / 2) !}\left[\left(1+\frac{\Delta E}{E}\right)^{3 N / 2}-1\right].\]
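Before moving on, it is worth seeing numerically how counterintuitive high-dimensional geometry is: the bracketed factor shows that for large \(N\), even a very thin shell (\(\Delta E \ll E\)) contains nearly the entire volume of the outer ball. A short numerical sketch (the parameter values are ours, chosen purely for illustration):

```python
import math

def shell_fraction(N, ratio):
    """Fraction of the outer ball's volume lying in the shell between
    radii sqrt(2mE) and sqrt(2m(E + dE)), with ratio = dE/E.
    All dimensionful prefactors cancel, leaving
    1 - (E / (E + dE))^(3N/2)."""
    return 1.0 - (1.0 / (1.0 + ratio)) ** (1.5 * N)

# A 1% energy shell captures more and more of the ball as N grows:
for N in (1, 100, 10_000):
    print(N, shell_fraction(N, 0.01))
```

For \(N = 10{,}000\) the fraction is indistinguishable from 1: in high dimension, essentially all of a ball’s volume hugs its surface.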

As promised, \(W\) depends upon the variables \(E\), \(\Delta E\), \(V\), and \(N\). It also depends upon the “unmentioned” mechanical parameter \(m\). The arguments on page 15 have been vindicated. . . the phase space volume \(W\) depends only upon \(V\) and not upon the detailed shape of the container. At last we can find the entropy! It is

\[S=k_{B} \ln \frac{W}{N ! h_{0}^{3 N}}=k_{B} \ln \left\{\left(\frac{2 \pi m E V^{2 / 3}}{h_{0}^{2}}\right)^{3 N / 2} \frac{1}{N !(3 N / 2) !}\left[\left(1+\frac{\Delta E}{E}\right)^{3 N / 2}-1\right]\right\}\]

or

\[ \frac{S}{k_{B}}=\frac{3}{2} N \ln \left(\frac{2 \pi m E V^{2 / 3}}{h_{0}^{2}}\right)-\ln N !-\ln \left(\frac{3}{2} N\right) !+\ln \left[\left(1+\frac{\Delta E}{E}\right)^{3 N / 2}-1\right].\label{2.24}\]

How does this expression compare to our list of expectations on page 22?

- If we take the limit \(∆E \to 0\), the entropy approaches \(−\infty\), contrary to expectations.
- The entropy \(S\) depends on the volume of the container but not on its shape, in accord with expectations.
- If we double \(E, V\), and \(N\), then \(S\) will not exactly double, contrary to expectations.
- The entropy \(S\) does depend on \(h_0\) in a “sea-level” fashion, in accord with expectations.

Only two of our four expectations have been satisfied. (And it was obvious even from Equation \ref{2.16} that the fourth expectation would be correct.) How could we have gone so far wrong?
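The failure of additivity is easy to exhibit numerically. The sketch below evaluates expression \ref{2.24} directly, using \(\ln n! = \ln \Gamma(n+1)\); the unit choices (lumping \(2\pi m / h_0^2\) into an arbitrary constant, and setting \(e = v = 1\)) are ours, made only so the formula can be evaluated:

```python
import math

def S_over_kB(N, e=1.0, v=1.0, ratio=0.01, c=1.0):
    """Entropy (in units of k_B) from the finite-size formula (2.24),
    with E = e*N, V = v*N, dE/E = ratio, and the dimensionful cluster
    2*pi*m/h_0^2 lumped into an arbitrary constant c (our choice)."""
    E, V = e * N, v * N
    return (1.5 * N * math.log(c * E * V ** (2 / 3))
            - math.lgamma(N + 1)          # ln N!
            - math.lgamma(1.5 * N + 1)    # ln (3N/2)!
            + math.log((1.0 + ratio) ** (1.5 * N) - 1.0))

# Doubling the system does NOT exactly double S at finite size...
print(S_over_kB(200) / (2 * S_over_kB(100)))   # close to, but not, 1
# ...but the entropy per particle settles down as N grows:
for N in (100, 1000, 10_000):
    print(N, S_over_kB(N) / N)
```

The ratio differs from 1 by a few percent at \(N = 100\), and the per-particle values converge as \(N\) increases, which anticipates the resolution described next.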

The trouble with expression \ref{2.24} for the entropy of a monatomic ideal gas is that it attempts to hold for systems of any size. In justifying the definition of entropy (2.7) (and in writing the list of expectations on page 22) we relied upon the assumption of a “large” system, but in deriving expression \ref{2.24} we never made use of that assumption. On the strength of this revised analysis we realize that our expectations will hold only approximately for finite systems: they will hold to higher and higher accuracy for larger and larger systems, but they will hold exactly only for infinite systems.

There is, of course, a real problem in examining an infinite system. The number of particles is infinite, as is the volume, the energy, and of course the entropy too. Why do we need an equation for the entropy when we already know that it’s infinite? Once the problem is stated, the solution is clear: We need an expression not for the total entropy \(S\), but for the entropy per particle \(s = S/N\). More formally, we want to examine the system in the “thermodynamic limit”, in which

\[N \rightarrow \infty \quad \text { in such a way that } \quad \frac{E}{N} \rightarrow e, \frac{V}{N} \rightarrow v, \text { and } \frac{\Delta E}{N} \rightarrow \delta e.\]

In this limit we expect that the entropy will grow linearly with system size, i.e. that

\[S(E, \Delta E, V, N) \rightarrow N s(e, v, \delta e).\label{2.26}\]

The quantities written in lower case, such as \(e\), the energy per particle, and \(v\), the volume per particle, play the same role in statistical mechanics as “per capita” quantities do in demographics. (The gross national product of the United States is much larger than the gross national product of Kuwait, but that is just because the United States is much larger than Kuwait. The GNP per capita is higher in Kuwait than in the United States.)

Let’s take the thermodynamic limit of expression \ref{2.24} (the entropy of a finite system) to find the entropy per particle of an infinite monatomic ideal gas. The first thing to do, in preparing to take the thermodynamic limit, is to write \(V\) as \(vN\), \(E\) as \(eN\), and \(\Delta E\) as \(\delta e \, N\) so that the only size-dependent variable is \(N\). This results in

\[\frac{S}{k_{B}}=\frac{3}{2} N \ln \left(\frac{2 \pi m e v^{2 / 3} N^{5 / 3}}{h_{0}^{2}}\right)-\ln N !-\ln \left(\frac{3}{2} N\right) !+\ln \left[\left(1+\frac{\delta e}{e}\right)^{3 N / 2}-1\right].\]

Next we use Stirling’s approximation,

\[\ln n ! \approx n \ln n-n \quad \text { for } n \gg 1,\]

to simplify expressions like \(\ln \left(\frac{3}{2} N\right)!\) above. Thus for large values of \(N\) we have approximately (an approximation that becomes exact as \(N\to \infty\))

\[\begin{aligned} \frac{S}{k_{B}} & \approx \frac{3}{2} N \ln \left(\frac{2 \pi m e v^{2 / 3} N^{5 / 3}}{h_{0}^{2}}\right)-N \ln N+N-\left(\frac{3}{2} N\right) \ln \left(\frac{3}{2} N\right)+\frac{3}{2} N+\ln \left[\left(1+\frac{\delta e}{e}\right)^{3 N / 2}-1\right] \\ &=\frac{3}{2} N \ln \left(\frac{2 \pi m e v^{2 / 3}}{h_{0}^{2}}\right)+\frac{3}{2} N \ln N^{5 / 3}-N \ln N+N-\frac{3}{2} N \ln \left(\frac{3}{2} N\right)+\frac{3}{2} N+\ln \left[\left(1+\frac{\delta e}{e}\right)^{3 N / 2}-1\right]. \end{aligned}\]

The first term on the right increases linearly with \(N\). The next bunch of terms is

\[\begin{aligned} & \frac{3}{2} N \ln N^{5 / 3}-N \ln N+N-\frac{3}{2} N \ln \left(\frac{3}{2} N\right)+\frac{3}{2} N \\ =& \frac{5}{2} N \ln N-N \ln N+N-\frac{3}{2} N \ln \left(\frac{3}{2}\right)-\frac{3}{2} N \ln N+\frac{3}{2} N \\ =& N\left[\frac{5}{2}-\frac{3}{2} \ln \left(\frac{3}{2}\right)\right], \end{aligned}\]

which again increases linearly with \(N\). The final term, as \(N\) grows, is

\[\ln \left[\left(1+\frac{\delta e}{e}\right)^{3 N / 2}-1\right] \approx \ln \left(1+\frac{\delta e}{e}\right)^{3 N / 2}\]

\[=\frac{3}{2} N \ln \left(1+\frac{\delta e}{e}\right).\]

This term not only increases linearly with \(N\) in the thermodynamic limit, it also vanishes as \(δe \to 0!\) (This is a general principle: One must first take the thermodynamic limit \(N \to \infty\), and only then take the “thin phase space limit” \(δe \equiv ∆E/N \to 0\).)
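This order-of-limits principle can be seen numerically. The sketch below (with our own illustrative parameter values) evaluates the per-particle contribution of the final term: for fixed \(\delta e / e\) it converges to \(\frac{3}{2} \ln(1 + \delta e/e)\) as \(N \to \infty\), but for fixed \(N\) it plunges toward \(-\infty\) as \(\delta e \to 0\):

```python
import math

def last_term_per_particle(N, x):
    """Per-particle value of ln[(1 + x)^(3N/2) - 1], with x = delta_e/e."""
    return math.log((1.0 + x) ** (1.5 * N) - 1.0) / N

# Thermodynamic limit first: for fixed x, the term settles at (3/2) ln(1+x).
x = 1e-3
for N in (100, 1000, 10_000):
    print(N, last_term_per_particle(N, x), 1.5 * math.log1p(x))

# Limits in the wrong order: for fixed N, the per-particle term becomes
# ever more negative as x -> 0 (heading toward minus infinity).
for x in (1e-3, 1e-6, 1e-9):
    print(x, last_term_per_particle(100, x))
```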

Our expectation \ref{2.26} that in the thermodynamic limit the entropy would be proportional to the system size \(N\) has been fully vindicated. So has the expectation that we could let \(∆E \to 0\), although we have seen that we must do so carefully. The end result is that the entropy per particle of the pure classical monatomic ideal gas is

\[s(e, v)=k_{B}\left[\frac{3}{2} \ln \left(\frac{4 \pi m e v^{2 / 3}}{3 h_{0}^{2}}\right)+\frac{5}{2}\right].\]

This is called the “Sackur-Tetrode formula”.^{4} It is often written as

\[ S(E, V, N)=k_{B} N\left[\frac{3}{2} \ln \left(\frac{4 \pi m E V^{2 / 3}}{3 h_{0}^{2} N^{5 / 3}}\right)+\frac{5}{2}\right], \label{2.32}\]

with the understanding that it should be applied only to very large systems, i.e. to systems effectively at the thermodynamic limit.
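Unlike the finite-size expression \ref{2.24}, the Sackur-Tetrode form is exactly extensive, thanks to the \(N^{5/3}\) inside the logarithm: doubling \(E\), \(V\), and \(N\) exactly doubles \(S\). A quick check (again lumping the dimensionful cluster into an arbitrary constant of our choosing):

```python
import math

def sackur_tetrode(E, V, N, c=1.0):
    """S / k_B from the Sackur-Tetrode formula (2.32), with the
    dimensionful cluster 4*pi*m/(3*h_0^2) lumped into an arbitrary
    constant c (our choice of units)."""
    return N * (1.5 * math.log(c * E * V ** (2 / 3) / N ** (5 / 3)) + 2.5)

# Doubling E, V, and N doubles S:
S1 = sackur_tetrode(E=50.0, V=30.0, N=20)
S2 = sackur_tetrode(E=100.0, V=60.0, N=40)
print(S2 / S1)   # 2, up to floating-point roundoff
```

The argument of the logarithm scales as \(2 \cdot 2^{2/3} / 2^{5/3} = 1\) under doubling, so only the overall factor of \(N\) changes.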

## Problems

2.12 **Entropy of a spin system**

Consider again the ideal paramagnet of problem 2.9.

a. Write down an expression for \(\ln Ω(E, ∆E, H, N)\) as a function of \(E\). Simplify it using Stirling’s approximation for large values of \(N\). (Clue: Be careful to never take the logarithm of a number with dimensions.)

b. Find an expression for the entropy per spin \(s(e, H)\) as a function of the energy per spin e and magnetic field \(H\) in the thermodynamic limit.

c. Sketch the resulting entropy as a function of the dimensionless quantity \(u \equiv e/mH\). Does it take on the proper limits as \(e \to \pm mH\)? (In your sketch pay special attention to endpoints of the domain, and to any places where the function or its derivative suffers a discontinuity, or a kink, or goes to zero or infinity. In general, it is important that the sketch convey a correct impression of the qualitative character of the function, and less important that it be quantitatively accurate.)

2.13 **The approach to the thermodynamic limit**

For the classical monatomic ideal gas, plot entropy as a function of particle number using both the “finite size” form \ref{2.24} and the Sackur-Tetrode form \ref{2.32}. We will see in problem 4.11 that for a gas at room temperature and atmospheric pressure, it is appropriate to use

\[ E V^{2 / 3} / h_{0}^{2}=\left(1.66 \times 10^{29} \mathrm{kg}^{-1}\right) N^{5 / 3}.\]

Use the masses of argon and krypton. All other things being equal, is the thermodynamic limit approached more rapidly for atoms of high mass or for atoms of low mass?

2.14 **Other energy conventions**

In the text we found the entropy of a monatomic ideal gas by assuming that the potential energy of an atom was zero if the atom were inside the box and infinite if it were outside the box. What happens if we choose a different conventional zero of potential energy so that the potential energy is U for an atom inside the box and infinite for an atom outside the box?

2.15 **Other worlds**

Find the entropy as a function of \(E, V\), and \(N\) in the thermodynamic limit for a monatomic ideal gas in a world with arbitrary^{5} spatial dimensionality \(d\).

2.16 **Ideal gas mixtures**

Consider a large sample of classical monatomic ideal gas that is a mixture of two components: \(N_A\) particles of mass \(m_A\) and \(N_B\) particles of mass \(m_B\). If \(N \equiv N_A + N_B\), show that the entropy is

\[\begin{array}{rl} S\left(E, V, N_{A}, N_{B}\right)= & +k_{B} N_{A}\left[\frac{3}{2} \ln \left(\frac{4 \pi m_{A} E V^{2 / 3}}{3 h_{0}^{2} N^{5 / 3}}\right)+\frac{5}{2}\right] \\ & +k_{B} N_{B}\left[\frac{3}{2} \ln \left(\frac{4 \pi m_{B} E V^{2 / 3}}{3 h_{0}^{2} N^{5 / 3}}\right)+\frac{5}{2}\right] \\ & -k_{B} N\left[\left(\frac{N_{A}}{N}\right) \ln \left(\frac{N_{A}}{N}\right)+\left(\frac{N_{B}}{N}\right) \ln \left(\frac{N_{B}}{N}\right)\right] \end{array}\]

(Clue: Use the result of problem D.2.)

^{4}Otto Sackur (1880–1914) was a German physical chemist. Hugo Tetrode (1895–1931) was a Dutch theoretical physicist. Each independently uncovered this equation in 1912.

^{5}Why, you wonder, should anyone care about a world that is not three-dimensional? For three reasons: (1) There are important physical approximations of two-dimensional worlds (namely surfaces) and of one-dimensional worlds (namely polymers). (2) The more general formulation might help you in unexpected ways. For example, Ken Wilson and Michael Fisher were trying to understand an important problem concerning critical points. They found that their technique could not solve the problem in three dimensions, but it *could* solve the problem in four dimensions. Then they figured out how to use perturbation theory to slide carefully from four dimensions to three dimensions, thus making their solution relevant to real physical problems. Wilson was awarded the Nobel Prize for this work. This illustrates the third reason, namely you never can tell what will be important, and hence: (3) Knowledge is better than ignorance.