Every once in a while (well, actually, pretty frequently) I see a post out there in the Blagopelago which makes me feel bad about ranting so much and discussing science so little. Today’s entry in this category is Jacques Distler’s treatment of Boltzmann entropy. He explains his motivation as follows:

This semester, I’ve been teaching a Physics for non-Science majors (mostly Business School students) class.

Towards the end of the semester, we turned to Thermodynamics and, in particular, the subject of Entropy. The textbook had a discussion of ideal gases and of heat engines and whatnot. But, somewhere along the line, they made a totally mysterious leap to Boltzmann’s definition of Entropy. As important as Boltzmann’s insight is, it was presented in a fashion totally disconnected from Thermodynamics, or anything else that came before.

So, equipped with the Ideal Gas Law, and a little baby kinetic theory, I decided to see if I could present the argument leading to Boltzmann’s definition.

The “baby kinetic theory” required here is a simple model of atoms bouncing elastically about inside a container. Their average kinetic energy is given by
[tex]\frac{1}{2}mv^2_{\rm RMS} = \frac{3}{2}k_B T,[/tex]
implying that the root-mean-square speed scales as the one-half power of the temperature:
[tex]v_{\rm RMS} \propto T^{1/2}.[/tex]
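To put a number on this, here is a quick sketch; the choice of helium at room temperature is my own illustration, not from the original post:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
m_He = 6.6465e-27   # mass of a helium atom, kg

def v_rms(T, m):
    """Root-mean-square speed from (1/2) m v_RMS^2 = (3/2) k_B T."""
    return math.sqrt(3 * k_B * T / m)

# The T^(1/2) scaling: doubling the temperature multiplies v_RMS by sqrt(2).
ratio = v_rms(600, m_He) / v_rms(300, m_He)
print(v_rms(300, m_He))  # roughly 1.4 km/s for helium at 300 K
print(ratio)             # sqrt(2), about 1.414
```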
The other necessary ingredients are Clausius’s definition of the entropy,
[tex]dQ = TdS,[/tex]
where for business majors we say that the “d” just means “a small amount”: add a small amount of heat dQ to a system at temperature T, and you increase the entropy by the small amount dS. We also use the conservation of energy in its thermodynamic clothing,
[tex] dU = dQ + dW = dQ - PdV,[/tex]
and the Ideal Gas Law
[tex] PV = Nk_B T = \alpha U.[/tex]
The coefficient α is 3/2 for a monatomic ideal gas (in three dimensions).

Consider an isothermal process, i.e., one in which the gas stays at constant temperature. The Ideal Gas Law tells us that the energy U must also remain constant, which means dQ = PdV. Integrating both sides gives the total heat added, ΔQ, on the left side and the integral of PdV on the right. Solving the Ideal Gas Law for P and integrating, we find
[tex]\Delta Q = N k_B T \log\left(\frac{V_f}{V_i}\right).[/tex]
Comparing this equation with Clausius’s definition dQ = TdS shows that the entropy, S, is
[tex]S = N k_B \log(V) + f(T),[/tex]
where f(T) is some as-yet-unknown function which depends only upon the temperature (not on the volume).
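As a sanity check on that integration, here is a numerical sketch of my own (the particle number, temperature, and volumes are arbitrary illustrative values):

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
N = 1e22                # number of atoms (illustrative)
T = 300.0               # K, held fixed along the isotherm
V_i, V_f = 1e-3, 3e-3   # initial and final volumes, m^3

# Integrate P dV along the isotherm with P = N k_B T / V (midpoint rule).
steps = 100000
dV = (V_f - V_i) / steps
Q_numeric = sum(N * k_B * T / (V_i + (j + 0.5) * dV) * dV for j in range(steps))

# Closed form: Delta Q = N k_B T log(V_f / V_i)
Q_exact = N * k_B * T * math.log(V_f / V_i)
print(Q_numeric, Q_exact)  # the two agree closely
```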

Whew! I’ll leave it to the interested reader to consider an adiabatic process, in which no heat flows into or out of the gas (dQ = 0 throughout), and to show that
[tex]\alpha \frac{dT}{T} = -\frac{dV}{V},[/tex]
which (because dS = 0 in this process) we can use to find that
[tex]S = Nk_B \log(VT^\alpha) + {\rm const}.[/tex]
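Rather than give the derivation away, here is a numerical check of my own: stepping along an adiabat using the relation α dT/T = −dV/V, the combination VT^α indeed stays put.

```python
# Step along an adiabat and watch the invariant V * T**alpha.
alpha = 1.5          # 3/2 for a monatomic ideal gas
V, T = 1.0, 300.0    # illustrative starting volume and temperature
invariant_start = V * T**alpha

dV = 1e-5
for _ in range(100000):          # expand the gas from V = 1 to V = 2
    T += -T * dV / (alpha * V)   # alpha dT/T = -dV/V
    V += dV

print(V, T, V * T**alpha)  # V * T**alpha matches its starting value
```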
This is an interesting result. Because v_RMS is proportional to T^(1/2), the volume in “velocity space” within which we can expect to find any particle is proportional to T^(3/2) (remember, we’re working in three dimensions). Because we know that all the particles must exist within the volume V, the chunk of “phase space” accessible to each atom is VT^(3/2), which is just the quantity inside the logarithm in the entropy formula! What’s more, if we shift the N into the logarithm, we see that we’re raising the phase-space volume available to each atom to the power of the number of atoms — which is just the phase space available to the whole system. So, leaving off the constant,
[tex]S = k_B \log \Omega.[/tex]
We’ve gone from the Clausius entropy to the Boltzmann entropy.
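Spelling out the step of shifting the N into the logarithm (additive constants dropped throughout):
[tex]S = N k_B \log(VT^\alpha) = k_B \log\left[(VT^\alpha)^N\right] = k_B \log \Omega,[/tex]
identifying Ω = (VT^(3/2))^N as the phase-space volume available to the whole system.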

Some elaborations:

  • If we imagine the system’s phase space divided up into states, each of which has an equal probability of containing the system at any given time, then our Boltzmann formula starts to look like Shannon’s equation,
    [tex]H = -\sum_i p_i \log p_i.[/tex]
  • For extra credit, show that our Boltzmann formula in terms of V and T fails to be extensive: given two containers of the same gas at the same pressure and temperature, the entropy S of the two volumes put together is not the sum S_1 + S_2 of the individual entropies. We fix this with “correct Boltzmann counting”: because the gas particles are indistinguishable, we have to divide the phase-space volume by N!, the number of permutations.
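On the first bullet, a tiny sketch of my own: when all Ω states are equally likely, each p_i = 1/Ω, and Shannon’s H collapses to log Ω — the Boltzmann form, up to the factor of k_B.

```python
import math

Omega = 64               # number of equally likely microstates (illustrative)
p = [1 / Omega] * Omega  # uniform distribution over the states

# Shannon: H = -sum_i p_i log(p_i)
H = -sum(p_i * math.log(p_i) for p_i in p)

print(H, math.log(Omega))  # the two agree: log(64)
```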

I’ve noticed that my description is already more calculus-oriented than Distler’s. So it goes.

If you want the full apparatus of kinetic theory, you can brave Mehran Kardar’s lecture notes for the MIT graduate class in statistical physics (8.333). For the Clausius definition, see lecture 2; for grown-up kinetic theory, see lecture 7; and for the counting of identical particles, see lecture 13.

One Comment

    • manigen
    • Posted Friday, 4 May 2007 at 06:12 am

    Ahh, this takes me back. Admittedly, it’s only been a couple of years, so it doesn’t take me back that far, but it takes me back there anyway.