1.12 Entropy
one can give absolute values of V. As a consequence, the change of G due to a variation of P, (∂G/∂P)_{T,N}, is a well-defined quantity because it is equal to V. One may thus compare the values of G of two systems at different pressures.
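As a brief illustration, consider the Gibbs energy difference between two pressures P_1 and P_2 at constant T and N. Integrating (∂G/∂P)_{T,N} = V gives

\[
G(P_2) - G(P_1) = \int_{P_1}^{P_2} \left(\frac{\partial G}{\partial P}\right)_{T,N} \mathrm{d}P = \int_{P_1}^{P_2} V \,\mathrm{d}P ,
\]

which contains only the measurable quantity V and therefore involves no arbitrary constant.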
For internal energy or enthalpy there is no natural zero point, but in practical applications it may be convenient to choose a point of reference for numerical values. The same is true for entropy, although it is quite common to put S = 0 for a well-crystallized substance at absolute zero. That is only a convention and it does not alter the fact that the change of the Gibbs energy G due to a variation of T, (∂G/∂T)_{P,N}, cannot be given an absolute value because it is equal to −S. As a consequence, it makes no sense to compare the values of G of two systems at the same pressure but different temperatures. The interaction between such systems must be based upon kinetic considerations, not upon the difference in G values. The same is true for the Helmholtz energy F because (∂F/∂T)_{V,N} is also equal to −S.
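To see this more explicitly, compare G at two temperatures T_1 and T_2 at the same P and N. Integrating (∂G/∂T)_{P,N} = −S gives

\[
G(T_2) - G(T_1) = -\int_{T_1}^{T_2} S \,\mathrm{d}T .
\]

If the zero point of S is shifted by an arbitrary constant S_0, this difference changes by −S_0(T_2 − T_1). It thus depends on the convention chosen for S and carries no absolute meaning, in contrast to the pressure case above.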
The convention to put S = 0 at absolute zero is useful because the entropy difference between two crystalline states of a system of fixed composition goes to zero there according to Nernst's heat theorem, sometimes called the third law. It should be emphasized that the third law defined in this way only applies to states that are not frozen in a disordered arrangement.
Statistical thermodynamics can provide answers to some questions that are beyond classical thermodynamics. It is based upon the Boltzmann relation

S = k ln W,    (1.68)

where k is the Boltzmann constant (= R/N_A, where N_A is Avogadro's number) and W is the number of different ways in which one can arrange a state of given energy. 1/W is thus a measure of the probability that a system in this state will actually be arranged in a particular way. Boltzmann's relation is a very useful tool in developing thermodynamic models for various types of phases and it will be used extensively in Chapters 19–22.
It will there be applied to one physical phenomenon at a time. The contribution to the entropy from such a phenomenon will be denoted by S or, more specifically, by S_i, and we can write Boltzmann's relation as

S_i = k ln W_i,    (1.69)
where W_i and S_i are evaluated for this phenomenon alone. Such a separation of the effects of various phenomena is permitted because W = W_1 · W_2 · W_3 · . . . and thus

S = k ln W = k ln(W_1 · W_2 · W_3 · . . .) = k(ln W_1 + ln W_2 + ln W_3 + · · ·) = S_1 + S_2 + S_3 + · · ·    (1.70)
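As a simple numerical illustration of Eqs (1.68)–(1.70), the following sketch in Python evaluates S = k ln W for a random binary mixture of n_A and n_B atoms on n_A + n_B sites, for which W = (n_A + n_B)!/(n_A! n_B!), and compares it with the familiar Stirling approximation −kN(x_A ln x_A + x_B ln x_B). The particular system, particle numbers and function names are chosen here purely for illustration and are not taken from the text.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def mixing_entropy_exact(n_A: int, n_B: int) -> float:
    """S = k ln W for a random binary mixture, W = (n_A + n_B)! / (n_A! n_B!).

    lgamma(n + 1) = ln(n!) keeps the evaluation stable for large n."""
    ln_W = (math.lgamma(n_A + n_B + 1)
            - math.lgamma(n_A + 1)
            - math.lgamma(n_B + 1))
    return k_B * ln_W

def mixing_entropy_stirling(n_A: int, n_B: int) -> float:
    """Stirling approximation: S = -k N (x_A ln x_A + x_B ln x_B)."""
    N = n_A + n_B
    x_A, x_B = n_A / N, n_B / N
    return -k_B * N * (x_A * math.log(x_A) + x_B * math.log(x_B))

# Large system: the exact expression and the Stirling form agree closely.
n_A, n_B = 600_000, 400_000
print(mixing_entropy_exact(n_A, n_B))     # ~ 9.3e-18 J/K
print(mixing_entropy_stirling(n_A, n_B))  # essentially the same value

# Additivity, Eq. (1.70): two independent subsystems with W_1 and W_2
# arrangements give W = W_1 * W_2, so S = k ln(W_1 W_2) = S_1 + S_2.
W_1 = math.comb(6, 2)   # 15 arrangements in subsystem 1
W_2 = math.comb(4, 1)   # 4 arrangements in subsystem 2
S_1, S_2 = k_B * math.log(W_1), k_B * math.log(W_2)
S_total = k_B * math.log(W_1 * W_2)
print(math.isclose(S_total, S_1 + S_2))   # True
```

The use of ln(n!) through lgamma avoids overflowing the astronomically large W itself; only its logarithm, and hence the entropy, is ever needed.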
Finally, we should mention here the possibility of writing the combined law in a form which treats entropy as a characteristic state function, although this will be discussed in much more detail in Chapters 3 and 6. From Eq. (1.63) we get

−dS = −(1/T)dU − (P/T)dV + (G_m/T)dN − (D/T)dξ.    (1.71)