# 7.3: Entropy

Definition: Entropy Differential

If an infinitesimal quantity of heat *dQ* is added to a system at temperature *T*, and if no irreversible work is done on the system, the increase in entropy *dS* of the system is defined by

\[ d S=\frac{d Q}{T}.\]

Exercise \(\PageIndex{1}\)

What are the SI units of entropy?

Note that, since \(dQ\) is supposed to be an infinitesimal quantity of heat, any increase in temperature is also infinitesimal. Note also that, as with internal energy, we have defined only what is meant by an *increase* in entropy, so we are not in any position to state what *the* entropy of a system is. (Much later, we shall give evidence that the molar entropy of all substances is the same at the absolute zero of temperature. It may then be convenient to define the zero of the entropy scale as the molar entropy at the absolute zero of temperature. At present, we have not yet shown that there is an absolute zero of temperature, let alone of entropy.)
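The defining relation \(dS = dQ/T\) can be illustrated numerically by adding heat in many small parcels. The sketch below (all numbers illustrative, not from the text) sums the contributions \(dQ/T\) as a body of heat capacity \(C\) is warmed slowly, and compares the result with the closed form \(C\ln(T_2/T_1)\) that the integral gives.

```python
import math

# A sketch with illustrative values: sum dS = dQ/T over many small
# heating steps, where each step adds heat dQ = C dT at the current
# temperature T.

def entropy_increase(C, T_start, T_end, steps=100_000):
    """Approximate the entropy gained heating from T_start to T_end."""
    dT = (T_end - T_start) / steps
    S = 0.0
    for i in range(steps):
        T = T_start + (i + 0.5) * dT   # midpoint temperature of this step
        S += C * dT / T                # dS = dQ/T with dQ = C dT
    return S

# Illustration: a body of heat capacity 10 J/K warmed from 300 K to 310 K.
dS = entropy_increase(10.0, 300.0, 310.0)
# Agrees with the closed form C ln(T_end/T_start) to high accuracy.
print(dS, 10.0 * math.log(310.0 / 300.0))
```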

To the question "What is meant by __entropy__?" a student will often respond with "Entropy is the state of disorder of a system." What a vague, unquantitative and close to meaningless response that is! What is meant by "disorder"? What could possibly be meant by a statement such as "The state of disorder of this system is 5 joules per kelvin"? Gosh! I would give nought marks out of ten for such a response! Now it *is* true, when we come to the subjects of statistical mechanics, and statistical thermodynamics and mixing theory, that there is a sense in which the entropy of a system is some sort of measure of the state of disorder, in the sense that the more disordered or randomly mixed a system is, the higher its entropy, and random processes do lead to more disorder and to higher entropy. Indeed, this is all connected to the second law of thermodynamics, which we haven't touched upon yet. But please, at the present stage, entropy is defined as I have stated above, and, for the time being, it means nothing less and nothing more.

It will have been noted that, in our definition of entropy so far, we specified that no irreversible work be done on the system. What if some irreversible work *is* done? Let us suppose that we do work on a gas in two ways. (I choose a gas in this discussion, because it is easier to imagine compressing a gas with \(PdV\) work than it is with a solid or a liquid, because the compressibility of a solid or a liquid is relatively low. But the same argument applies to any substance.) We compress it with the piston, but, at the same time, we also stir it with a paddle. In that case, the work done **on** the gas is *more* than \(−PdV\). (Remember that \(−PdV\) is positive.) If we didn't compress it at all, but only stirred it, \(dV\) would be zero, but we would still have done work on the gas by stirring. Let's suppose the work done on the gas is

\[dW = −PdV + dW_{irr}.\]

The part \(dW_{irr}\) is the irreversible or dissipative part of the work done **on** the gas; it is unrecoverable as work, and is *irretrievably* converted to heat. You cannot use it to turn the paddle back. Nor can you cool the gas by turning the paddle backwards.

We can now define the increase of entropy in the irreversible process by

\[TdS = dQ + dW_{irr}\]

that is,

\[ d S=\frac{dQ + dW_{\mathrm{irr}}}{T}.\]

In other words, since \(dW_{irr}\) is irreversibly converted to heat, it is just as though it were part of the addition of heat.
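As a tiny numerical sketch (the numbers are illustrative, not from the text), the relation \(TdS = dQ + dW_{irr}\) says that paddle work enters the entropy bookkeeping exactly as though it were added heat:

```python
# Sketch with illustrative numbers: irreversible (stirring) work counts
# toward the entropy increase just like heat, via T dS = dQ + dW_irr.

def entropy_step(dQ, dW_irr, T):
    """Entropy increase for a small step at temperature T."""
    return (dQ + dW_irr) / T

# 2 J of heat plus 1 J of paddle work, delivered at 300 K:
dS = entropy_step(2.0, 1.0, 300.0)   # same result as adding 3 J of heat
```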

In summary,

\[dU = dQ + dW\]

and

\[dU = TdS − PdV\]

apply whether there is reversible or irreversible work. But only if there is no irreversible (unrecoverable) work do \(dQ = TdS\) and \(dW = −PdV\) hold. If there is any irreversible work,

\[dW = −PdV + dW_{irr}\]

and

\[dQ = TdS − dW_{irr}.\]

Of course there are other forms of reversible work than \(PdV\) work; we just use the expansion of gases as a convenient example.

Note that \(P\), \(V\), and \(T\) are state variables (together, they define the state of the system) and \(U\) is a function of state. Thus the *entropy*, too, is a *function of state*. That is to say that the change in entropy as you go from one point in *PVT*-space to another point is route-independent. If you return to the same point that you started at (the same state, the same values of \(P\), \(V\) and \(T\)), there is no change in entropy, just as there is no change in internal energy.
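Route-independence can be checked numerically. The sketch below goes beyond this section by assuming one mole of a monatomic ideal gas, for which reversible legs give \(dS = C_V\,dT/T\) at constant volume and \(dS = R\,dV/V\) at constant temperature; two different routes between the same two states must then give the same total entropy change.

```python
import math

R = 8.314          # gas constant, J mol^-1 K^-1
Cv = 1.5 * R       # molar heat capacity at constant volume (monatomic ideal gas,
                   # an assumption going beyond this section)

def dS_isochoric(T1, T2):
    """Entropy change of a constant-volume leg: integral of Cv dT/T."""
    return Cv * math.log(T2 / T1)

def dS_isothermal(V1, V2):
    """Entropy change of a constant-temperature leg: integral of R dV/V."""
    return R * math.log(V2 / V1)

# From state (300 K, 1 unit of volume) to state (400 K, 2 units):
route_A = dS_isochoric(300, 400) + dS_isothermal(1, 2)  # heat first, then expand
route_B = dS_isothermal(1, 2) + dS_isochoric(300, 400)  # expand first, then heat
# Both routes give the same ΔS, as a function of state requires.
```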

Definition: Specific Heat Capacity

The *specific heat capacity* \(C\) of a substance is the quantity of heat required to raise the temperature of unit mass of it by one degree. We shall return to the subject of heat capacity in Chapter 8. For the present, we just need to know what it means, in order to do the following exercise concerning entropy.

Example \(\PageIndex{1}\)

A litre (mass = 1 kg) of water is heated from 0 °C to 100 °C. What is the increase of entropy? Assume that the specific heat capacity of water is \(C = 4184\ \mathrm{J\ kg^{-1}\ K^{-1}}\), that it does not vary significantly through the temperature range of the question, and that the water does not expand significantly, so that no significant amount of work (reversible or irreversible) is done.

**Solution**

The heat required to heat a mass \(m\) of a substance through a temperature range \(dT\) is \(mCdT\). The entropy gained then is \( \frac{mCdT}{T}\). The entropy gained over a finite temperature range is therefore

\[\begin{align*} m C \int_{T_{1}}^{T_{2}} \dfrac{d T}{T} &= m C \ln \left(\dfrac{T_{2}}{T_{1}}\right) \\[4pt] &=1 \times 4184 \times \ln \left( \dfrac{373.15}{273.15} \right) = 1305\, \mathrm{J\, K}^{-1}. \end{align*}\]
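The arithmetic of the example is easily checked in a few lines:

```python
import math

# Check of the worked example: m = 1 kg of water, C = 4184 J kg^-1 K^-1,
# heated from T1 = 273.15 K to T2 = 373.15 K.
m, C = 1.0, 4184.0
T1, T2 = 273.15, 373.15
dS = m * C * math.log(T2 / T1)   # ΔS = m C ln(T2/T1)
print(round(dS))                 # rounds to 1305 (J/K)
```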