# 17.8: Entropy of Mixing, and Gibbs' Paradox


In Chapter 7, we defined the increase of entropy of a system by supposing that an infinitesimal quantity *dQ* of heat is added to it at temperature *T*, and that no irreversible work is done on the system. We then asserted that the increase of entropy of the system is *dS* = *dQ*/*T*. If some irreversible work is done, this has to be added to the *dQ*.

We also pointed out that, in an isolated system, any spontaneous transfer of heat from one part to another part was likely (*very* likely!) to be from a hot region to a cooler region, and that this was likely (*very* likely!) to result in an increase of entropy of the closed system − indeed of the Universe. We considered a box divided into two parts, with a hot gas in one and a cooler gas in the other, and we discussed what was likely (*very* likely!) to happen if the wall between the parts were to be removed. We considered also the situation in which the wall were to separate two gases consisting of red molecules and blue molecules. The two situations seem to be very similar. A flow of heat is not the flow of an “imponderable fluid” called “caloric”. Rather, it is the mixing of two groups of molecules with initially different characteristics (“fast” and “slow”, or “hot” and “cold”). In either case there is likely (*very* likely!) to be a spontaneous mixing, or increasing randomness, or increasing disorder, or increasing *entropy*. Seen thus, entropy is a measure of the degree of disorder. In this section we are going to calculate the increase of entropy when two different sorts of molecules become mixed, without any reference to the flow of heat. This concept of entropy as a measure of disorder will become increasingly apparent if you undertake a study of *statistical mechanics*.

Consider a box containing two gases, separated by a partition. The pressure and temperature are the same in both compartments. The left hand compartment contains *N*_{1} moles of gas 1, and the right hand compartment contains *N*_{2} moles of gas 2. The Gibbs function for the system is

\[G=R T\left[N_{1}\left(\ln P+\phi_{1}\right)+N_{2}\left(\ln P+\phi_{2}\right)\right].\]

Now remove the partition, and wait until the gases become completely mixed, with no change in pressure or temperature. The partial molar Gibbs function of gas 1 is

\[\mu_{1}=R T\left(\ln p_{1}+\phi_{1}\right)\]

and the partial molar Gibbs function of gas 2 is

\[\mu_{2}=R T\left(\ln p_{2}+\phi_{2}\right).\]

Here the *p*_{i} are the partial pressures of the two gases, given by *p*_{1} = *n*_{1}*P* and *p*_{2} = *n*_{2}*P*, where the *n*_{i} are the mole fractions.

The total Gibbs function is now *N*_{1}µ_{1} + *N*_{2}µ_{2}, or

\[G=R T\left[N_{1}\left(\ln n_{1}+\ln P+\phi_{1}\right)+N_{2}\left(\ln n_{2}+\ln P+\phi_{2}\right)\right].\]

The new Gibbs function minus the original Gibbs function is therefore

\[\Delta G=R T\left(N_{1} \ln n_{1}+N_{2} \ln n_{2}\right)=N R T\left(n_{1} \ln n_{1}+n_{2} \ln n_{2}\right).\]

This represents a *decrease* in the Gibbs function, because the mole fractions are less than 1.

The new entropy minus the original entropy is \(\Delta S=-\left[\frac{\partial(\Delta G)}{\partial T}\right]_{P}\), which is

\[\Delta S=-N R\left(n_{1} \ln n_{1}+n_{2} \ln n_{2}\right).\]

This is positive, because the mole fractions are less than 1.

Similar expressions will be obtained for the increase in entropy if we mix several gases.
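For several ideal gases the same derivation gives ∆S = −*NR* Σ *n*_{i} ln *n*_{i}. A minimal Python sketch of this formula (the function name `mixing_entropy` is just an illustrative choice, not from the text):

```python
import math

R = 8.314462618  # molar gas constant, J K^-1 mol^-1

def mixing_entropy(N, fractions):
    """Entropy of ideal mixing: delta-S = -N R sum(n_i ln n_i),
    with N the total number of moles and n_i the mole fractions."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "mole fractions must sum to 1"
    return -N * R * sum(n * math.log(n) for n in fractions if n > 0)

# Two gases in equal amounts: delta-S = N R ln 2
print(mixing_entropy(1.0, [0.5, 0.5]) / R)       # ln 2 ≈ 0.6931
# Three gases in equal amounts: delta-S = N R ln 3
print(mixing_entropy(1.0, [1/3, 1/3, 1/3]) / R)  # ln 3 ≈ 1.0986
```

The `n > 0` guard simply skips absent components, for which *n* ln *n* → 0.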

Here’s maybe an easier way of looking at the same thing. (Remember that, in what follows, the mixing is presumed to be ideal and the temperature and pressure are constant throughout.)

Here is the box separated by a partition:

Concentrate your attention entirely upon the left hand gas. Remove the partition. In the first nanosecond, the left hand gas expands to increase its volume by *dV*, its internal energy remaining unchanged (*dU* = 0). The entropy of the left hand gas therefore increases according to \( d S=\frac{P d V}{T}=N_{1} R \frac{d V}{V}\). By the time it has expanded from its original volume *V*_{1} to fill the whole box of volume *V*, its entropy has increased by *N*_{1}*R* ln(*V/V*_{1}). Likewise, the entropy of the right hand gas, in expanding from volume *V*_{2} to *V*, has increased by *N*_{2}*R* ln(*V/V*_{2}). Thus the entropy of the system has increased by *R*[*N*_{1} ln(*V/V*_{1}) + *N*_{2} ln(*V/V*_{2})]. Since the temperature and pressure are the same in both compartments, *V*_{1}/*V* = *n*_{1} and *V*_{2}/*V* = *n*_{2}, so this is equal to *NR*[*n*_{1} ln(1/*n*_{1}) + *n*_{2} ln(1/*n*_{2})] = −*NR*[*n*_{1} ln *n*_{1} + *n*_{2} ln *n*_{2}].
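The agreement between this expansion picture and the earlier Gibbs-function result can be checked numerically. This is only an illustrative sketch; the mole numbers chosen are arbitrary assumptions:

```python
import math

R = 8.314462618  # molar gas constant, J K^-1 mol^-1

# Illustrative (assumed) amounts: N1 moles on the left, N2 on the right.
N1, N2 = 2.0, 3.0
N = N1 + N2
n1, n2 = N1 / N, N2 / N

# At common T and P, each gas's initial volume is proportional to its
# mole number, so V1/V = n1 and V2/V = n2.
dS_expansion = N1 * R * math.log(1 / n1) + N2 * R * math.log(1 / n2)

# Mixing formula from the Gibbs-function derivation.
dS_mixing = -N * R * (n1 * math.log(n1) + n2 * math.log(n2))

print(dS_expansion, dS_mixing)  # the two routes give the same number
```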

Where there are just two gases, *n*_{2} = 1 − *n*_{1}, so we can conveniently plot a graph of the increase in the entropy versus mole fraction of gas 1, and we see, unsurprisingly, that the entropy of mixing is greatest when \(n_{1}=n_{2}=\frac{1}{2}\), when ∆S = *NR* ln 2 = 0.6931*NR*.
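That the maximum occurs at \(n_1 = \frac{1}{2}\), with ∆S = *NR* ln 2, can be checked by evaluating ∆S/(*NR*) over a grid of mole fractions (a sketch, not part of the original text):

```python
import math

def dS_over_NR(n1):
    """Entropy of mixing of two ideal gases, divided by N R."""
    n2 = 1.0 - n1
    return -(n1 * math.log(n1) + n2 * math.log(n2))

grid = [i / 1000 for i in range(1, 1000)]  # mole fractions 0.001 ... 0.999
best = max(grid, key=dS_over_NR)

print(best)                        # 0.5
print(round(dS_over_NR(best), 4))  # 0.6931, i.e. ln 2
```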

What is *n*_{1} if \(\Delta S=\frac{1}{2} N R \)? (I make it *n*_{1} = 0.199 710 or, of course, 0.800 290.)
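The quoted value can be recovered by a simple bisection on \(n_1 \ln n_1 + (1-n_1)\ln(1-n_1) = -\frac{1}{2}\) over \(0 < n_1 < \frac{1}{2}\) (a sketch; the second root follows by symmetry):

```python
import math

def f(n1):
    """delta-S/(N R) minus 1/2; its root is the sought mole fraction."""
    n2 = 1.0 - n1
    return -(n1 * math.log(n1) + n2 * math.log(n2)) - 0.5

lo, hi = 1e-9, 0.5   # f(lo) < 0 and f(hi) = ln 2 - 1/2 > 0
for _ in range(60):  # bisection; 60 halvings is ample
    mid = 0.5 * (lo + hi)
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
root = 0.5 * (lo + hi)

print(round(root, 6))  # ≈ 0.19971; the other solution is 1 minus this
```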

We initially introduced the idea of entropy in Chapter 7 by saying that if a quantity of heat *dQ* is added to a system at temperature *T*, the entropy increases by *dS* = *dQ/T*. We later modified this by pointing out that if, in addition to adding heat, we did some irreversible work on the system, that irreversible work was in any case degraded to heat, so that the increase in entropy was then *dS* = (*dQ* + *dW*_{irr})/*T*. We now see that the simple act of mixing two or more gases at constant temperature results in an increase in entropy. The same applies to mixing any substances, not just gases, although the formula −*NR*[*n*_{1} ln *n*_{1} + *n*_{2} ln *n*_{2}] applies of course just to ideal gases. We alluded to this in Chapter 7, but we have now placed it on a quantitative basis. As time progresses, two separate gases placed together will spontaneously and probably (*very* probably!) irreversibly mix, and the entropy will increase. It is most unlikely that a mixture of two gases will spontaneously separate and thus decrease the entropy.

*Gibbs’ Paradox* arises when the two gases are identical. The above analysis does nothing to distinguish between the mixing of two different gases and the mixing of two identical gases. If you have two identical gases at the same temperature and pressure in the two compartments, nothing changes when the partition is removed – so there should be no change in the entropy. Within the confines of classical thermodynamics, this remains a paradox – which is resolved in the study of *statistical mechanics*.

Now consider a reversible chemical reaction of the form Reactants ↔ Products − and it doesn’t matter which we choose to call the “reactants” and which the “products”. Let us suppose that the Gibbs function of a mixture consisting entirely of “reactants” and no “products” is less than the Gibbs function of a mixture consisting entirely of “products”. Because of the decrease in the Gibbs function that accompanies mixing, the Gibbs function of a mixture of reactants and products will be less than the Gibbs function of either reactants alone or products alone. Indeed, as we go from reactants alone to products alone, the Gibbs function will look something like this:

The left hand side shows the Gibbs function of the reactants alone. The right hand side shows the Gibbs function for the products alone. The equilibrium situation occurs where the Gibbs function is a minimum.

If the Gibbs function of the reactants were greater than that of the products, the graph would look something like: