# 8.3: Some Thermodynamics and Statistical Mechanics

Besides needing to know Stirling's approximation and the method of Lagrangian multipliers, before we can embark upon the derivation of Boltzmann's equation we also need to remind ourselves of two small results from thermodynamics and statistical mechanics. I mention these only briefly here, with barely adequate explanations. Fuller treatments are given in courses or books on thermodynamics and statistical mechanics. If you are rusty on these topics, or perhaps have never studied them thoroughly, the only consequence is that you may not fully understand the derivation of Boltzmann's equation. This will not matter a great deal and should not deter you from reading subsequent sections. It is more important to understand what Boltzmann's equation means and how to apply it, and this can be done even if you have missed some of the details of its derivation.

Most readers will either understand this section very well and will not need prolonged explanation, or will not understand it at all, and will be happy to skip over it. Therefore, for brevity's sake, I do little more than quote the results, and I do not even explain what many of the symbols mean.

Those who are familiar with thermodynamics will have no difficulty in recalling

\[dU = TdS - PdV . \tag{8.3.1} \label{8.3.1}\]

The result that we shall be needing in Section 8.4 is \(\left( \partial U / \partial S \right)_V = T\), or more likely its reciprocal:

\[\left( \frac{\partial S}{\partial U} \right)_V = \frac{1}{T} . \label{8.3.2} \tag{8.3.2}\]
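(For those who would like the intermediate step spelled out: at constant volume \(dV = 0\), so equation \ref{8.3.1} reduces to a relation between \(dU\) and \(dS\) alone,

\[dU = T\,dS \qquad (V \ \text{constant}), \]

from which \(\left( \partial U / \partial S \right)_V = T\) and its reciprocal follow at once.)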

The relation we need from statistical mechanics is Boltzmann's relation between entropy and thermodynamic probability. Suppose we have an assembly of \(N\) particles that can be distributed or "partitioned" among \(m\) distinct states. If \(X\) is the number of ways in which this partition can be achieved, Boltzmann's equation for the entropy (indeed, his conception of entropy) is

\[S = k \ln X . \tag{8.3.3} \label{8.3.3}\]

Here, \(k\) is Boltzmann's constant, \(1.38 \times 10^{-23} \ \text{J K}^{-1}\). Those who remember having seen this before might just like to be reminded of the gist of the argument leading to it. It presupposes some functional relation between \(S\) and \(X\), and it notes that, if you have several assemblies, the total "\(X\)" for the ensemble as a whole is the product of the \(X\)'s of the individual assemblies, whereas the total entropy is the sum of the individual entropies. The only well-behaved function that turns a product into a sum in this way is the logarithm, and therefore the entropy must be proportional to the logarithm of the number of possible configurations.
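The multiplicative property of \(X\) and the additive property of \(S\) can be checked numerically. The following sketch (not from the text) assumes \(N\) distinguishable particles, for which the number of ways of achieving a given set of occupation numbers \(n_1, n_2, \ldots, n_m\) is the multinomial coefficient \(N!/(n_1!\,n_2!\cdots n_m!)\); the function names are of course my own.

```python
from math import factorial, log, prod

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact in the 2019 SI)

def num_microstates(occupation):
    """Number of ways X of partitioning N distinguishable particles among
    states with the given occupation numbers: X = N! / (n1! n2! ... nm!)."""
    n_total = sum(occupation)
    return factorial(n_total) // prod(factorial(n) for n in occupation)

def entropy(occupation):
    """Boltzmann entropy S = k ln X for one assembly (equation 8.3.3)."""
    return K_B * log(num_microstates(occupation))

# Two independent assemblies:
x1 = num_microstates([2, 1, 1])   # 4!/(2! 1! 1!) = 12 ways
x2 = num_microstates([3, 2])      # 5!/(3! 2!)    = 10 ways

# For the combined ensemble, the X's multiply while the entropies add,
# because ln(x1 * x2) = ln(x1) + ln(x2):
s_combined = K_B * log(x1 * x2)
s_separate = entropy([2, 1, 1]) + entropy([3, 2])
print(x1, x2)                                  # 12 10
print(abs(s_combined - s_separate) < 1e-30)    # True
```

Only a function proportional to the logarithm has this product-to-sum property, which is the whole content of the argument sketched above.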

That was very brief, but it will do for the purposes of Section 8.4.