
2.6 ENTROPY AND PROBABILITY: A MACROSCOPIC EXAMPLE
So if the heat capacity of the system increases, the temperature fluctuations
decrease, as expected. The heat capacity of any system increases with the
number of molecules $N$ in the system. For a monoatomic gas, we have
$C = 3Nk_B/2$, so the relative size of the temperature fluctuations is
\[
\frac{\Delta T^2}{T_a^2} = \frac{4}{3N}. \tag{2.54}
\]
For macroscopic systems, spontaneous macroscopic fluctuations are vanish-
ingly small.
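To get a feel for the numbers, Eq. 2.54 can be evaluated directly. The sketch below is a minimal illustration; the mole-sized particle number and the helper name `relative_fluctuation` are choices made here, not from the text:

```python
import math

# Relative mean-square temperature fluctuation of a monoatomic gas,
# Eq. 2.54: Delta T^2 / T_a^2 = 4 / (3 N).
def relative_fluctuation(N):
    return 4.0 / (3.0 * N)

N = 6.022e23  # roughly one mole of atoms (illustrative choice)
rel = relative_fluctuation(N)
rms = math.sqrt(rel)  # root-mean-square relative fluctuation of T itself

print(f"Delta T^2 / T_a^2 = {rel:.2e}")
print(f"rms Delta T / T_a = {rms:.2e}")
```

For a mole-sized system the rms relative fluctuation is of order 10⁻¹², illustrating why spontaneous macroscopic fluctuations are never observed.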
A useful way of rewriting the above fluctuation equation is
\[
\frac{C\,\Delta T^2}{4T_a} = \frac{1}{2}\,k_B T_a. \tag{2.55}
\]
The left-hand side is the average energy required to produce a temperature
fluctuation of the size $\Delta T$ (this energy is $T_a\,\Delta S$; see Eq. 2.51). We see
that this energy is $k_B T_a/2$, which is precisely what would be expected from
the equipartition theorem: any accessible degree of freedom in the system
contains, on average, $k_B T_a/2$ of energy.
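For a sense of scale, the equipartition energy k_B T_a/2 can be evaluated at an everyday ambient temperature. The snippet below is a minimal illustration; the choice of T_a = 300 K is an assumption made here, not taken from the text:

```python
# Thermal energy scale k_B * T_a / 2 from Eq. 2.55: the average energy
# associated with one accessible degree of freedom.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T_a = 300.0         # ambient temperature in K (illustrative choice)

E_fluct = 0.5 * k_B * T_a
print(f"k_B T_a / 2 = {E_fluct:.3e} J")  # ~2.07e-21 J at room temperature
```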
Now consider a large system made up of many such subsystems. If we put
heat into the system, perhaps in the subsystems representing the boundary
of the large system, then the entropy of the large system will increase
reversibly. The system will be out of equilibrium, however, because the
temperature of the subsystems will now be different. If we allow this heat to
be redistributed equally amongst all subsystems, the entropy continues to
increase. We can generalize Eq. 2.50 to demonstrate that the entropy will
increase when the temperature difference between two subsystems decreases
through heat exchange. The final equilibrium state must have constant tem-
perature throughout. It will also be the maximum entropy state (amongst all
possible redistributions of the heat throughout the subsystems) because any
redistribution of heat that increases the temperature difference between two
subsystems would reduce the entropy.
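The argument above can be made concrete with a small numerical check. The sketch below is an illustration, not the book's Eq. 2.50 itself; it linearizes the entropy change for a small heat transfer Q, and all numbers are assumptions made here:

```python
# Linearized entropy change when a small heat Q flows out of a subsystem
# at T_hot and into one at T_cold (temperatures assumed roughly constant
# during the transfer):
#   Delta S ~= Q / T_cold - Q / T_hot
# This is positive whenever T_hot > T_cold, so equalizing temperatures
# raises the entropy, as argued in the text.
def entropy_change(Q, T_hot, T_cold):
    return Q / T_cold - Q / T_hot

Q = 1e-21  # a small parcel of heat in joules (illustrative value)
print(entropy_change(Q, T_hot=301.0, T_cold=299.0) > 0)   # hotter -> colder: entropy rises
print(entropy_change(Q, T_hot=300.0, T_cold=300.0) == 0)  # equal temperatures: no change
```

Reversing the direction of the transfer (heat flowing from cold to hot) makes the same expression negative, which is why any redistribution that increases the temperature difference reduces the entropy.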
The maximum entropy state thus defines the final equilibrium state of the
large system when there are no internal constraints that keep the heat from
redistributing over the subsystems.
We can impose extra macroscopic constraints on the system that would
prevent the heat from being redistributed equally amongst the subsystems.
But the large system would then again evolve to a maximum entropy state
consistent with the externally imposed constraints. The entropy of this con-
strained system is necessarily lower than the entropy of the unconstrained
system. In this sense, entropy measures how many system configurations are
consistent with the macroscopic constraints.^14

^14 The configuration freedom of a system can be quantified by its information entropy, as
defined by Claude Shannon. Edwin Jaynes showed how Shannon’s information entropy
maps onto the thermodynamic entropy we use here. See Shannon, C. E. (1948) Bell System
Technical Journal 27, 379–423, 623–656; Jaynes, E. T. (1957) Physical Review 106, 620–630;
Jaynes, E. T. (1965) American Journal of Physics 33, 391–398. See also Problem 4.8.