
CHAPTER NINE

THE DENSITY OF STATES AND ENTROPY

Introduction
In this chapter we will look again at the internal energy of a system and ask about the number of states
available to the system. In particular, we will examine the density of states, ω(E), of the system, i.e., the number
of states available to a system within a given energy range between E and E + δE. We will find that the density
of states varies rapidly as the energy of the system changes. This discussion will lead naturally to the concept of
entropy, the tendency of a system to move toward a state where there are a larger number of states available, i.e.,
toward a less ordered system.

Density of States
Consider the phase space diagram for a single free particle moving in one dimension within a fixed region
0 ≤ x ≤ L. If the particle has an energy E, the momentum of the particle is given by

    p_x = \sqrt{2mE}    (9.1)

since the total energy of a free particle is just the kinetic energy. The number of cells in phase space which
represent all possible systems with energy less than E = p_x²/2m is just the area of the phase space diagram
depicted below divided by the size of a cell, h₀ = δx δp_x. We will denote the number of cells with energy
less than E by Φ(E).

[Figure: rectangle in the (x, p) plane running from x = 0 to L and from p = −p₀ to +p₀.]

Figure 9.1 The region of phase space for a one-dimensional free particle bounded by the energy E = p²/2m
and the dimensions 0 ≤ x ≤ L.

The number of cells in phase space (the number of microstates of the system) which correspond to a one-
dimensional, free particle with an energy between E and E + δE, Ω(E), must therefore be given by

    \Omega(E) = \Phi(E+\delta E) - \Phi(E) = \frac{\partial \Phi(E)}{\partial E}\,\delta E    (9.2)

or

    \Omega(E) = \omega(E)\,\delta E    (9.3)

where ω(E) is the density of states. In this particular example, we have

    \Phi(E) = \frac{2 p_x L}{h_0} = \frac{2L}{h_0}\sqrt{2mE} = B\,E^{1/2}    (9.4)

where B is a constant given by B = (8mL²/h₀²)^{1/2}. So we see that the number of states accessible to a single,
one-dimensional particle with energy between zero and E is proportional to E^{1/2}. But this is for a single, one-
dimensional particle. In three dimensions, we can also count the number of states (phase space cells) with energy
less than E, but in this case we must consider the six-dimensional volume of phase space accessible to the single
particle. If we assume the particle is confined to a cubic box, each dimension is equivalent, and we have

    \Phi(E) \propto E^{1/2}    (9.5)

for each of the three conjugate pairs x, p_x; y, p_y; and z, p_z. This means that the number of cells with energy less
than E for a free particle in three dimensions is given by

    \Phi(E) \propto E^{3/2}    (9.6)

If we now ask about the number of states which are accessible to N free particles moving in a three-dimensional
cubic box, we would have

    \Phi(E) \propto E^{3N/2}    (9.7)

or

    \Phi(E) = \Bigl(\frac{8mL^2}{h_0^2}\Bigr)^{3N/2} E^{3N/2} = \Bigl(\frac{8m}{h_0^2}\Bigr)^{3N/2} V^N E^{3N/2}    (9.8)

provided that each of the particles is distinguishable. If the particles are not distinguishable, then we can
interchange particles without actually changing the microstate of the system. The number of different ways we
can interchange N particles is N!, so we have actually overcounted the number of microstates for this system.
Thus, for indistinguishable particles, the actual number of states with energy less than E is given by

    \Phi(E) = \frac{1}{N!}\Bigl(\frac{8m}{h_0^2}\Bigr)^{3N/2} V^N E^{3N/2}    (9.9)

We have demonstrated that the number of cells (or microstates) in phase space is proportional to the total energy
of the system raised to some power, which we can express in terms of the number of degrees of freedom, f, of the
system. In the case of N three-dimensional particles free to move in a cubic box, there are f = 3N degrees of
freedom. We can therefore write

    \Phi(E) = \frac{1}{N!}\Bigl(\frac{8m}{h_0^2}\Bigr)^{f/2} V^N E^{f/2}    (9.10)
To find the density of states ω(E), we take the derivative of Φ(E) with respect to E, and obtain

    \Omega(E) = \frac{\partial \Phi(E)}{\partial E}\,\delta E = \omega(E)\,\delta E = \Bigl(\frac{8m}{h_0^2}\Bigr)^{f/2}\frac{V^N}{N!}\,\frac{f}{2}\,E^{f/2-1}\,\delta E    (9.11)

Now since the number of degrees of freedom is proportional to the number of particles in the system, N, we
typically set f/2 − 1 ≈ f/2, so that we have

    \omega(E) \propto E^{f/2}    (9.12)
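
As a quick sanity check on this counting (a minimal sketch, not from the text: the electron mass and the
1 nm box are assumed illustrative numbers), the semiclassical count Φ(E) = (8mL²E/h₀²)^{1/2} of Eq. (9.4),
with h₀ set equal to Planck's constant, can be compared against a direct count of the quantum particle-in-a-box
levels Eₙ = n²h²/8mL²:

    import numpy as np

    # Compare the phase-space cell count Phi(E) = sqrt(8 m L^2 E / h^2)
    # (Eq. 9.4 with h0 = h) with a direct count of particle-in-a-box levels
    # E_n = n^2 h^2 / (8 m L^2) lying below the same cutoff energy E.
    h = 6.626e-34          # Planck's constant (J s)
    m = 9.109e-31          # electron mass (kg)  -- an assumed example
    L = 1.0e-9             # box length (m)      -- an assumed example
    E = 1.0e-17            # energy cutoff (J), roughly 60 eV

    phi_semiclassical = np.sqrt(8 * m * L**2 * E / h**2)

    n = np.arange(1, 100000)
    E_n = n**2 * h**2 / (8 * m * L**2)
    phi_quantum = np.count_nonzero(E_n < E)

    print(f"semiclassical Phi(E) = {phi_semiclassical:.2f}")   # ~12.9
    print(f"quantum level count  = {phi_quantum}")             # 12

The two counts agree to within a single state, which is one way of seeing why h₀ will later be identified
numerically with Planck's constant.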

The Density of States for Various Systems


Three-dimensional Simple Harmonic Oscillator. The energy for a one-dimensional simple harmonic
oscillator is given by
    E = \frac{p^2}{2m} + \frac{1}{2}kx^2    (9.13)
so that the phase space diagram for such a system is an ellipse whose semi-major and semi-minor axes are
the maximum displacement and maximum momentum of the oscillator.
Figure 9.2 The phase space diagram for a one-dimensional simple harmonic oscillator of energy
E = p²/2m + ½kx². This figure also depicts the division of phase space into equal cells of size h₀ = δx δp.

The number of cells in phase space for the one-dimensional oscillator is given by the area of the ellipse divided by
the size of a cell, h₀, or

    \Phi(E) = \frac{\pi a b}{h_0}    (9.14)

where

    a = x_{max} = \sqrt{2E/k}, \qquad b = p_{max} = \sqrt{2mE}    (9.15)

This gives

    \Phi(E) = \frac{2\pi E}{h_0\,\omega}    (9.16)

where ω = √(k/m) is the angular frequency of the oscillator, or, in the semi-classical approximation (h₀ → h),

    \Phi(E) = \frac{2\pi E}{h\,\omega} = \frac{E}{\hbar\omega} = \frac{E}{h\nu}    (9.17)

To get an idea of the number of accessible states we are talking about, consider the case where the
frequency of oscillation is ν = 1 kHz, h = 4.14 × 10⁻¹⁵ eV·s, and E is only 1.0 eV. In that case, Φ(E) ≈ 2 × 10¹¹.
Even when ν = 100 MHz, Φ(E) ≈ 2 × 10⁶.
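
These numbers are easy to reproduce; a minimal sketch (the two frequencies and the 1.0 eV energy are the
values quoted above):

    # Number of states below E for a one-dimensional oscillator, Eq. (9.17):
    # Phi(E) = E / (h nu), with h in eV s so that E can be given in eV.
    h_eV = 4.136e-15                      # Planck's constant (eV s)

    def phi_oscillator(E_eV, nu_Hz):
        return E_eV / (h_eV * nu_Hz)

    for nu in (1.0e3, 1.0e8):             # 1 kHz and 100 MHz
        print(f"nu = {nu:.0e} Hz:  Phi(1.0 eV) = {phi_oscillator(1.0, nu):.1e}")
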
Now for a three-dimensional oscillator, this would become

    \Phi(E) = \Bigl(\frac{E}{\hbar\omega}\Bigr)^{3}    (9.18)

and for N such three-dimensional oscillators we have

    \Phi(E) = \Bigl(\frac{1}{\hbar\omega}\Bigr)^{3N} E^{3N}    (9.19)

provided the oscillators are distinguishable. This might be the case if the oscillators are fixed in location, so that
individual oscillators do not move around and are therefore distinguishable by location. If the oscillators are not
distinguishable, we again must divide this last equation by N!. Either way, we see that Φ(E) is proportional to the
total energy raised to a power set by the number of degrees of freedom of the system (here there are 2 degrees of
freedom for each dimension, since the energy of the oscillator has two quadratic terms: one for potential energy
and one for kinetic energy, so f = 6N). The constant term in this last equation is different from that which we
obtained for the free particle, but the dependence of the number of cells on the total energy is the same: both vary
as the total energy raised to the f/2 power, where f is the number of degrees of freedom of the system.
The Number of States, Ω(E), for an Ideal Gas. As we examine the number of states for an ideal gas, we
need first to look at the energy of the molecules which make up the gas. If we consider the case of a gas of N
identical molecules enclosed in a container of volume V, the total energy of the system, E, is given by

    E = K + U + E_{int}    (9.20)

The total kinetic energy of the system, K, is due to the translational motion of the molecules, and is given by

    K = K(\vec p_1, \vec p_2, \ldots, \vec p_N) = \frac{1}{2m}\sum_{i=1}^{N} \vec p_i^{\,2}    (9.21)

where the p⃗ᵢ's are the translational momenta of each particle (which obviously change whenever the molecules
collide with each other or with the walls of the container). The quantity U = U(r⃗₁, r⃗₂, …, r⃗_N) represents the total
potential energy of the system and is a function of the location of each and every particle of the system. It is a
function of the relative separations of all the particles, and, therefore, is constantly changing as the molecules
move around. We might, however, be able to find an expression for the average potential energy in which the
particles move.
If we assume that the molecules are not monatomic, each molecule can also rotate and vibrate relative to its
center of mass. The energy of this rotation and vibration we designate by E_int, which is the internal energy of the
system (i.e., the sum of the internal energy of each individual molecule). An expression for E_int for diatomic
molecules might look something like

    E_{int} = \sum_{i=1}^{N}\Bigl[\,\sum_{j}\frac{L_{ij}^2}{2I_j} + \frac{1}{2}\mu\dot R_i^2 + \frac{1}{2}\kappa\,(R_i - R_0)^2\Bigr]    (9.22)

where the L_ij are the angular momenta about the principal axes with moments of inertia I_j, μ is the reduced
mass, and κ is the effective spring constant of the vibration along the internuclear separation R.
For an ideal, monatomic gas, with no interaction potential, the total energy of the system simplifies to give

    E = \sum_{i=1}^{N}\frac{1}{2m}\bigl(p_{ix}^2 + p_{iy}^2 + p_{iz}^2\bigr) = \sum_{j=1}^{3N} a_j\,p_j^2    (9.23)

where a_j = 1/2m.
We now want to determine the number of states, Ω(E), for such an ideal gas with total energy lying
between E and E + δE. To do this we again consider phase space, in which each coordinate, q, is paired with a
momentum, p, and we simply determine the number of cells in phase space with energy between E and E + δE.
This could be done by integrating over all the accessible coordinates and momenta of the system,

    \Omega(E) = \frac{1}{N!\,h_0^{3N}}\int_{E}^{E+\delta E} dp_1\,dp_2\cdots dp_{3N}\;dq_1\,dq_2\cdots dq_{3N}    (9.24)

where we have introduced N! in the denominator since the individual atoms of the ideal gas are indistinguishable.
Expressing this last equation in terms of three-dimensional momenta and coordinates,

    \Omega(E) = \frac{1}{N!\,h_0^{3N}}\int_{E}^{E+\delta E} d^3\vec p_1\,d^3\vec p_2\cdots d^3\vec p_N\;d^3\vec r_1\,d^3\vec r_2\cdots d^3\vec r_N    (9.25)

where we have written d³p⃗ᵢ = dp_ix dp_iy dp_iz and d³r⃗ᵢ = dxᵢ dyᵢ dzᵢ. This integral is difficult to perform because of
the limitation on the energy. It is much simpler to proceed as we did earlier by evaluating the integral

    \Phi(E) = \frac{1}{N!\,h_0^{3N}}\int_{0}^{E} d^3\vec p_1\cdots d^3\vec p_N\;d^3\vec r_1\cdots d^3\vec r_N    (9.26)
and then determine Ω(E) from the equation

    \Omega(E) = \frac{\partial \Phi(E)}{\partial E}\,\delta E    (9.27)
Since the potential energy of interaction between the molecules is assumed to be zero for an ideal gas, and
since we assume no external force acting on the system, the total energy is independent of the location of the
individual molecules. We can therefore integrate over all spatial coordinates immediately and obtain

    \Phi(E) = \frac{V^N}{N!\,h_0^{3N}}\,\chi(E)    (9.28)

since each particle can be anywhere within the volume V of the container. The function χ(E), given by

    \chi(E) = \int_{0}^{E} d^3\vec p_1\,d^3\vec p_2\cdots d^3\vec p_N = \int_{0}^{E} dp_1\,dp_2\,dp_3\cdots dp_{3N}    (9.29)

is a function of the total energy of the system, but independent of the volume. This integral is effectively a
volume integral in momentum space with the condition that the total energy of the system remain less than some
value E. It is extremely important to realize that this integral is not simply the integral over independent
momenta from p⃗ᵢ = 0 to p⃗ᵢ = p⃗_max. It cannot be written as

    \chi(E) \ne \Bigl[\int_0^{p_{i,max}} 4\pi p_i^2\,dp_i\Bigr]^{N} \qquad \text{(WRONG!)}    (9.30)

The χ(E) function is a function of the total energy of the system, which is shared between all the molecules of
the system. The total energy of a non-interacting ideal gas must be given by

    E = \frac{\vec p_1^{\,2}}{2m} + \frac{\vec p_2^{\,2}}{2m} + \cdots = \sum_{i=1}^{N}\frac{\vec p_i^{\,2}}{2m}    (9.31)

This means that, for a fixed value of the energy E, a change in the magnitude and direction of the momentum of
one particle must be reflected by a change in the momentum of one or more other particles. But in order to count
all the possible states accessible to the system, we must sum up all possible combinations of the different
components of the momenta of all particles which will give a total energy E!
To see how this is accomplished, consider a single particle with an energy E = p⃗²/2m. The components
of the momentum of this particle, p_x, p_y, and p_z, cannot individually exceed the magnitude |p⃗| = √(2mE), and the
magnitude of each component must combine so that

    p_x^2 + p_y^2 + p_z^2 \le 2mE    (9.32)

This situation is illustrated in the three-dimensional momentum-space diagram shown in the figure below.
[Figure: sphere in momentum space with axes p_x, p_y, p_z.]

Figure 9.3 The constant-energy sphere for a single particle in three dimensions. The momentum vector can
have any orientation in space, but must always have the same magnitude. The volume of this sphere is
proportional to the number of ways of choosing the different possible pᵢ's for the particle to have an energy
less than E = p²/2m.

In this diagram, the tip of the vector p⃗ touches a sphere which represents all points that are a distance
|p⃗| = √(2mE) from the origin. This sphere, therefore, represents the locus of all vectors which have a constant
energy E. Such a vector can be oriented in any arbitrary direction so long as the tip of the vector is on the sphere.
The number of possible momentum states which have a total energy less than E is just the volume of this sphere
of radius |p⃗| = √(2mE), which is given by the function χ(E) divided by the size of a single cell in phase space.
The volume of momentum space with energy less than E is given by

    \chi(E) = \frac{4}{3}\pi p^3 = \frac{4}{3}\pi\,(2mE)^{3/2}    (9.33)

In two dimensions, this would be

    \chi(E) = \pi p^2 = \pi\,(2mE)^{2/2}    (9.34)

For the case of n dimensions, we expect χ(E) to be of the form

    \chi(E) = C_n\,(2mE)^{n/2}    (9.35)

where C_n is a constant (see the Appendix for the derivation) given by

    C_n = \frac{\pi^{n/2}}{\frac{n}{2}\,\Gamma(\frac{n}{2})} = \frac{\pi^{n/2}}{\Gamma(\frac{n}{2}+1)}    (9.36)

and where

    \Gamma(n+1) = n\,\Gamma(n)
    \Gamma(n) = (n-1)\,\Gamma(n-1)
    \Gamma(n) = (n-1)! \quad (n\ \text{an integer})    (9.37)
    \Gamma(\tfrac{1}{2}) = \sqrt{\pi}
    \Gamma(m+\tfrac{1}{2}) = \frac{1\cdot 3\cdot 5\cdots(2m-1)}{2^m}\,\sqrt{\pi}
The total number of states in three dimensions for N monatomic atoms of an ideal gas is, therefore,
given by

    \Phi(E) = \frac{V^N}{N!\,h_0^{3N}}\,\chi(E) = \frac{V^N}{N!\,h_0^{3N}}\,C_{3N}\,(2mE)^{3N/2}    (9.38)

Now to find the number of states between E and E + δE, we differentiate Φ(E) with respect to E,

    \Omega(E) = \frac{\partial \Phi(E)}{\partial E}\,\delta E = \frac{V^N}{N!\,h_0^{3N}}\,C_{3N}\,\frac{3N}{2}\,(2mE)^{3N/2-1}\,2m\,\delta E    (9.39)

or

    \Omega(E) = \omega(E)\,\delta E = \frac{V^N}{(N-1)!\,h_0^{3N}}\,C_{3N}\,\frac{3}{2}\,(2mE)^{3N/2-1}\,2m\,\delta E    (9.40)

Since (2mE)^{3N/2−1}·2m = (2mE)^{3N/2}/E and 1/(N−1)! = N/N!, this result can be written compactly as

    \Omega(E) = \omega(E)\,\delta E = \frac{V^N}{N!\,h_0^{3N}}\,C_{3N}\,(2mE)^{3N/2}\,\frac{3N}{2}\,\frac{\delta E}{E} = \Phi(E)\,\frac{3N}{2}\,\frac{\delta E}{E}    (9.41)

This last result is characteristic of those situations where N is very large. In those cases, the only difference
between Ω(E), Φ(E), and ω(E) is a relatively small factor. For that reason, we can work equally well
with Ω(E), Φ(E), or ω(E); but we usually work with Φ(E), since it is easiest to calculate! And since N is
of the order of Avogadro's number, we see that Ω(E), Φ(E), and ω(E) are all extremely rapidly increasing
functions of the energy.
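
The size of this difference is easy to quantify; a small sketch (N = 10²² particles and δE/E = 10⁻⁶ are
assumed illustrative numbers):

    import math

    # ln Omega(E) = ln Phi(E) + ln(3N/2 * deltaE/E), per Eq. (9.41).
    # The correction term is tiny compared with ln Phi(E), which is of order N.
    N = 1e22
    dE_over_E = 1e-6

    correction = math.log(1.5 * N * dE_over_E)
    print(f"ln Omega - ln Phi = {correction:.1f}")   # ~37
    print(f"ln Phi is of order N = {N:.0e}")         # ~1e22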

The Density of States and Equilibrium


Consider two systems A₁ and A₂ which are allowed to interact with each other thermally, but not
mechanically. (We also do not allow particle diffusion, which will be taken up later.)

[Figure: subsystem A₁ beside subsystem A₂ inside an isolating wall.]

Figure 9.4 This figure depicts a large isolated system A₀ consisting of two sub-systems A₁ and A₂ which
can interact only by exchanging energy from one system to the other (i.e., thermally).

These two systems make up a larger system, A₀, which we will assume to be isolated from the rest of the
universe, so that there can be no net gain or loss of energy by the system A₀. This means that as the two
subsystems A₁ and A₂ interact, the energies E₁ and E₂ must satisfy the equation

    E_1 + E_2 = E_0    (9.42)

at all times. From what we have just learned, we know that the number of microstates accessible to system A₁ is
given by Ω₁(E₁) and is proportional to E₁^{f₁/2}, and that the number accessible to system A₂ is given by Ω₂(E₂),
which is proportional to E₂^{f₂/2}. The number of microstates accessible to the combined system A₀ must, therefore,
be given by

    \Omega_0(E_0) = \Omega_1(E_1)\,\Omega_2(E_2)    (9.43)
[This is just like the case where we have two dice, each of which has six faces that are equally likely to be "up".
The probability of a certain combination occurring is the product of the probabilities for each individual die.]
In the example which follows, you will see that there is a single macrostate of the combined system A₀
which is the most likely to occur, because it corresponds to the macrostate with the largest number of microstates.
This particular macrostate is characterized by the case where subsystem A₁ has energy Ē₁ and where subsystem
A₂ has energy Ē₂ = E₀ − Ē₁. We therefore express the maximum number of microstates of the combined
system A₀ by the equation

    \Omega_{max} = \Omega_0(\bar E_1, \bar E_2) = \Omega_1(\bar E_1)\,\Omega_2(\bar E_2)    (9.44)

In the case where the number of degrees of freedom of a system is very large, we will find that the probability of
the combined system being in any state other than the one for which E₁ = Ē₁ and E₂ = Ē₂ is extremely small.
Let us suppose that system A₁ has f₁ = 12 degrees of freedom, while system A₂ has f₂ = 20 degrees of
freedom. We will also assume that the energy of each system is quantized so that the energy is given by nε₀, and
that the total energy of the combined system is 5ε₀. We can then make a table of the possible energy distributions
between the two subsystems and determine the number of states accessible to these two subsystems and to the
system as a whole. If we assume that each microstate of the system is equally likely, we can then determine which
energy macrostate is the most likely one. Table 9.1 lists the number of microstates of the system for the various
energy macrostates (taking Ω₁ = (E₁/ε₀)^{f₁/2} and Ω₂ = (E₂/ε₀)^{f₂/2}).

I" I# H" H# H! H" H#


! & ! *(( "!' !
" % " "!& "!' "!& "!'
# $ '% &*! "!% $() "!'
$ # (#* "!# "!$ !.(%% "!'
% " %"! "!$ " !!!% "!'
& ! "&' "!% ! !
Total &&( "!'

Table 9.1 The microstates accessible to the combined system E! E"  E#

In this example, you can see that the macrostate corresponding to E₁ = 2ε₀, E₂ = 3ε₀ is by far the most
likely macrostate of the system [more than 66% of the microstates correspond to this particular macrostate], and
that the other possible macrostates of the system have a much smaller likelihood. This situation is even more
dramatic when the number of degrees of freedom in each of the subsystems is much larger. In the case where
f₁ = 120 and f₂ = 200, more than 99% of all the microstates of the system correspond to a single (most
probable) macrostate of the system. Thus, in the case where there is Avogadro's number of particles and even
more degrees of freedom, the probability of the system being in any state other than the most probable one is
practically zero! We can, therefore, say with confidence that a system with many degrees of freedom which is left
isolated from the rest of the universe will eventually be found in only one possible macrostate, the
equilibrium macrostate! (Just how long it takes for this state to be reached, i.e., for equilibrium to occur, is
another question altogether.)
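
A short script (a sketch; it assumes the simple power-law counts Ω₁ = (E₁/ε₀)⁶ and Ω₂ = (E₂/ε₀)¹⁰ used in
Table 9.1) reproduces the table and the dominance of the most probable macrostate:

    # Two subsystems with f1 = 12 and f2 = 20 degrees of freedom share a
    # total energy of 5*eps0; Omega_i ~ (E_i/eps0)^(f_i/2) as in Table 9.1.
    f1, f2, E_total = 12, 20, 5

    rows = []
    for E1 in range(E_total + 1):
        E2 = E_total - E1
        w1, w2 = E1 ** (f1 // 2), E2 ** (f2 // 2)
        rows.append((E1, E2, w1, w2, w1 * w2))

    total = sum(r[4] for r in rows)
    for E1, E2, w1, w2, w0 in rows:
        pct = 100.0 * w0 / total
        print(f"E1={E1}  E2={E2}  Omega1={w1:>6}  Omega2={w2:>8}  "
              f"Omega0={w0:>8}  {pct:5.1f}%")
    # The E1 = 2, E2 = 3 macrostate carries about 68% of all microstates.
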
We see, then, that Ω₀(E₁, E₂) is a sharply peaked function which depends upon the way the energy is
distributed between the two subsystems A₁ and A₂. Since the energy of the system must be constant,

    E_0 = E_1 + E_2    (9.45)

we see that specifying a certain energy for system A₁ also specifies the energy for system A₂,

    E_2 = E_0 - E_1    (9.46)

so that Ω₀(E₁, E₂) can be written as a function either of E₁ or of E₂. Since Ω₀(E₁, E₂) is a sharply peaked
function of the energy of the system, we can take the derivative of Ω₀ with respect to either E₁ or E₂ and set
this equal to zero to determine the most likely values Ē₁ or Ē₂:

    \frac{d\Omega_0(E_1,E_2)}{dE_1}\bigg|_{\bar E_1} = 0 \qquad \text{or} \qquad \frac{d\Omega_0(E_1,E_2)}{dE_2}\bigg|_{\bar E_2} = 0    (9.47)

The Density of States, Temperature, and Entropy


If the two subsystems are not in equilibrium with each other, the temperature of one subsystem is different
from that of the other, and the energies E₁ and E₂ will be changing toward the equilibrium values Ē₁ and Ē₂.
As we demonstrated in the last section, we expect that the equilibrium condition, the state toward which the two
subsystems will move, corresponds to that in which the number of states available to the system as a whole will
increase. In other words, we expect Ω₀(E₁, E₂) to increase with changing E₁ and E₂ until the maximum value
Ω₀(Ē₁, Ē₂) = Ω₁(Ē₁)Ω₂(Ē₂) is reached. Thus, we expect heat to flow from one system into the other until the
systems come to equilibrium. It would seem that our statistical arguments must give the observed result that heat
always flows from a hotter system to a cooler system. This preferred direction of heat flow is associated with the
entropy change of the system.
We have found that the macrostate (and its corresponding parameters) which maximizes the number of
microstates of the combined system is given by Ω₀(Ē₁, Ē₂) = Ω₁(Ē₁)Ω₂(Ē₂), where the number of
microstates of the combined system is the product of the microstates of the individual subsystems. The fact that
we must consider a product of the microstates of the two systems is in contrast to other parameters of the two
subsystems, such as the energy, the volume, and the number of particles, which are additive. To create a little
more similarity with these other parameters of the subsystems, we notice that the log of the number of microstates
of the subsystems is additive, or:

    \ln \Omega_0(\bar E_1, \bar E_2) = \ln \Omega_1(\bar E_1) + \ln \Omega_2(\bar E_2)    (9.48)

We therefore define a quantity called entropy by the equation

    S(E) = k \ln \Omega(E)    (9.49)

where k (Boltzmann's constant) is just a scaling factor. With this definition we can write

    S_0(E_1, E_2) = S_1(E_1) + S_2(E_2)    (9.50)
for the combined system A₀. Notice that a change in S₀ can be written

    dS_0(E_1,E_2) = \frac{\partial S_1(E_1)}{\partial E_1}\,dE_1 + \frac{\partial S_2(E_2)}{\partial E_2}\,dE_2    (9.51)

But the energy change dE₁ is related to the energy change dE₂ by the equation

    dE_1 + dE_2 = dE_0 = 0 \quad\Rightarrow\quad dE_2 = -dE_1    (9.52)

so that the change in S₀ can also be written as

    dS_0(E_1,E_2) = \frac{\partial S_1(E_1)}{\partial E_1}\,dE_1 - \frac{\partial S_2(E_2)}{\partial E_2}\,dE_1 = \Bigl[\frac{\partial S_1(E_1)}{\partial E_1} - \frac{\partial S_2(E_2)}{\partial E_2}\Bigr]\,dE_1    (9.53)

Now, in the case where we evaluate this change at the most likely values of E₁ and E₂ (which we designate by
Ē₁ and Ē₂), we know that Ω₀(E), and therefore S₀ = k ln[Ω₀(E)], is a maximum, giving

    dS_0 = \Bigl[\frac{\partial S_1(E_1)}{\partial E_1}\bigg|_{\bar E_1} - \frac{\partial S_2(E_2)}{\partial E_2}\bigg|_{\bar E_2}\Bigr]\,dE_1 = 0    (9.54)

The previous equation indicates that the maximum entropy of the combined system is obtained only when

    \frac{\partial S_1(E_1)}{\partial E_1}\bigg|_{\bar E_1} = \frac{\partial S_2(E_2)}{\partial E_2}\bigg|_{\bar E_2}    (9.55)
Notice that the left-hand side of this equation is only a function of system A₁, while the right-hand side is only a
function of system A₂! This last equation would seem to imply that the partial of S with respect to energy,
evaluated at the equilibrium energy, must be related to the temperature, for it is the temperature of two adjacent
systems which is equal when they come to equilibrium. In fact, you should recall that this is just the definition of
the temperature: that thermometric property which determines when two systems are in thermal equilibrium!

[Figure: S₀ plotted against E₁ (with E₂ = E₀ − E₁), showing a sharp maximum where ∂S₀/∂E₁ = 0.]

Figure 9.5 The entropy of the combined system plotted against the energy of each subsystem

To see just how the partial derivative of the entropy with respect to energy of a system is related to the
temperature, let us look back at the general expression for the number of states of the system with energy E,
which we designated Ω(E). We found that a general expression for the number of states for an ideal gas is given
by the equation

    \Omega(E) = \omega(E)\,\delta E = \Phi(E)\,\frac{3N}{2}\,\frac{\delta E}{E}    (9.56)

The entropy, then, can be written as

    S(E) = k \ln \Omega(E) = k\bigl[\ln \omega(E) + \ln \delta E\bigr] = k\Bigl[\ln \Phi(E) + \ln\Bigl(\frac{3N}{2}\,\frac{\delta E}{E}\Bigr)\Bigr]    (9.57)

Since ln Φ(E) is of order N while the remaining logarithms are vastly smaller, we see that the entropy can be
expressed as

    S(E) \cong k \ln \Omega(E) \cong k \ln \omega(E) \cong k \ln \Phi(E)    (9.58)

to within some arbitrary constant! In most cases, we will simply ignore the constant, and use whichever of these
expressions is easiest to calculate [usually the one with Φ(E)].
Thus, the entropy for a three-dimensional ideal gas is given by

    S(E) = k \ln \Phi(E) = k \ln\Bigl[\frac{V^N}{N!\,h_0^{3N}}\,\frac{\pi^{3N/2}}{\Gamma(\frac{3N}{2}+1)}\,(2mE)^{3N/2}\Bigr]    (9.59)

Using Stirling's approximation for both ln N! and ln Γ(3N/2 + 1), this becomes

    S(E) = k\Bigl[\ln\frac{V^N}{h_0^{3N}} - (N\ln N - N) + \ln \pi^{3N/2} - \Bigl(\frac{3N}{2}\ln\frac{3N}{2} - \frac{3N}{2}\Bigr) + \ln(2mE)^{3N/2}\Bigr]    (9.60)

This can be written as

    S(E) = k\Bigl[\ln\frac{V^N}{h_0^{3N}} - N\ln N + N + \frac{3N}{2}\ln \pi - \frac{3N}{2}\ln\frac{3N}{2} + \frac{3N}{2} + \ln(2mE)^{3N/2}\Bigr]    (9.61)

or, combining the middle terms, we can write

    S(E) = k\Bigl[\ln\frac{V^N}{h_0^{3N}} - N\ln N - \frac{3N}{2}\ln\frac{3N}{2\pi} + \frac{5N}{2} + \ln(2mE)^{3N/2}\Bigr]    (9.62)

In this last expression we have several logarithms of quantities raised to the Nth power. This can be written as

    S(E) = k\Bigl[N\ln\frac{V}{h_0^{3}} - N\ln N - N\ln\Bigl(\frac{3N}{2\pi}\Bigr)^{3/2} + N\ln(2mE)^{3/2} + \frac{5N}{2}\Bigr]    (9.63)

and we can factor the N out of the equation to obtain

    S(E) = N k\Bigl[\ln\frac{V}{h_0^{3}} - \ln N - \ln\Bigl(\frac{3N}{2\pi}\Bigr)^{3/2} + \ln(2mE)^{3/2} + \frac{5}{2}\Bigr]    (9.64)

This can be further simplified to give

    S(E) = N k\Bigl[\ln\Bigl(\frac{V}{N}\,\frac{1}{h_0^3}\Bigl(\frac{4\pi m E}{3N}\Bigr)^{3/2}\Bigr) + \frac{5}{2}\Bigr]    (9.65)

or, finally,

    S(E) = N k\Bigl[\ln\Bigl(\frac{V}{N}\Bigl(\frac{4\pi m E}{3N h_0^2}\Bigr)^{3/2}\Bigr) + \frac{5}{2}\Bigr]    (9.66)

which is the Sackur-Tetrode equation. This equation has been experimentally verified as the correct entropy of
an ideal gas at high temperatures, if h₀ is numerically set equal to Planck's constant. Another form of this
equation can be written by recognizing that we can write

    \frac{5}{2} = \ln e^{5/2}    (9.67)

Using this we can write the Sackur-Tetrode equation in the form

    S(E) = N k \ln\Bigl[e^{5/2}\,\frac{V}{N}\Bigl(\frac{4\pi m E}{3N h_0^2}\Bigr)^{3/2}\Bigr]    (9.68)
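
As a numerical illustration (a sketch, not from the text: helium at T = 300 K and atmospheric pressure is an
assumed example), evaluating Eq. (9.66) with h₀ = h and E = (3/2)NkT gives a molar entropy close to the
tabulated value for helium gas:

    import numpy as np

    # Sackur-Tetrode entropy per particle, Eq. (9.66), with h0 = h.
    k  = 1.380649e-23      # Boltzmann constant (J/K)
    h  = 6.62607e-34       # Planck constant (J s)
    NA = 6.02214e23        # Avogadro's number
    m  = 4.0026e-3 / NA    # helium atomic mass (kg)

    T, P = 300.0, 101325.0
    V_per_N = k * T / P            # V/N from the ideal gas law
    E_per_N = 1.5 * k * T          # E/N from equipartition

    S_per_Nk = np.log(V_per_N * (4 * np.pi * m * E_per_N / (3 * h**2))**1.5) + 2.5
    print(f"S/(N k)       = {S_per_Nk:.2f}")                    # ~15.2
    print(f"molar entropy = {S_per_Nk * k * NA:.1f} J/(mol K)")  # ~126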

Now, the partial of S with respect to energy is given by

    \frac{\partial S(E)}{\partial E}\bigg|_{V,N} = N k\,\frac{d}{dE}\bigl[\ln E^{3/2}\bigr] = \frac{3}{2}\,N k\,\frac{1}{E}    (9.69)

where we have specifically indicated that the volume and the number of particles remain constant. Solving for
the energy, we obtain

    E = \frac{3Nk}{2}\,\biggl[\frac{\partial S(E)}{\partial E}\bigg|_{V,N}\biggr]^{-1}    (9.70)

From the equipartition theorem, we know that the energy of a three-dimensional ideal gas is given by

    E = \frac{f}{2}\,kT = \frac{3N}{2}\,kT    (9.71)

so we see that the change in entropy with respect to energy must be the inverse of the absolute temperature:

    \frac{\partial S}{\partial E}\bigg|_{V,N} = \frac{1}{T}    (9.72)

Note: It is interesting to notice that the temperature of a system is proportional to the average
energy per degree of freedom (E/f) of a system, and that two systems come to equilibrium when
this average energy is equal.

Thus, Eq. (9.55) simply states that two systems A₁ and A₂ in thermodynamic equilibrium must have the
same temperature:

    \frac{1}{T_1} = \frac{1}{T_2} \qquad \text{or} \qquad T_1 = T_2    (9.73)

Entropy, Heat Flow and the Second Law of Thermodynamics


For our combined system A₀, the number of microstates accessible is given by

    \Omega_0(E_0) = \Omega_1(E_1)\,\Omega_2(E_2)    (9.74)

so that the entropy of the combined system is given by

    S_0(E_0) = S_1(E_1) + S_2(E_2)    (9.75)

since S = k ln[Ω(E)]. When the system comes to equilibrium, the combined system is in a macrostate which
corresponds to the maximum number of microstates, and, therefore, the maximum value of the entropy:

    S_{max} = S_0(\bar E_1, \bar E_2) = S_1(\bar E_1) + S_2(\bar E_2)    (9.76)
But what happens when the system is not in equilibrium? In this case we have two systems A₁ and A₂
which are both isolated and independently in a state of thermodynamic equilibrium. When we bring these two
systems in contact with one another, and allow them to interact thermally, the combined system A₀ is not initially
in its most probable configuration, and so the two systems exchange energy in such a way that the entropy of the
entire system increases. We know that this must be true because the equilibrium condition arises when the
entropy of the combined system is a maximum. Thus, when two systems are brought in thermal contact with one
another, the total entropy of the combined system must either increase or (if the two systems are already in
equilibrium with each other) remain constant:

    dS_0(E_1, E_2) \ge 0 \qquad \text{for isolated systems}    (9.77)

The fact that the entropy of an isolated system must increase or remain constant is one expression of the second
law of thermodynamics. As a natural consequence of this we find that

    dS_0 = \frac{\partial S_1}{\partial E_1}\,dE_1 + \frac{\partial S_2}{\partial E_2}\,dE_2 \ge 0    (9.78)

but the conservation of energy requires that

    dE_2 = -dE_1    (9.79)

which gives

    dS_0 = \Bigl[\frac{\partial S_1}{\partial E_1} - \frac{\partial S_2}{\partial E_2}\Bigr]\,dE_1 \ge 0    (9.80)

or

    \Bigl[\frac{1}{T_1} - \frac{1}{T_2}\Bigr]\,dE_1 \ge 0    (9.81)

This equation tells us that if dE₁ > 0, i.e., if energy is flowing into subsystem A₁, then the term in brackets must
also be greater than zero, which implies that T₂ > T₁! Similarly, if dE₁ < 0, i.e., if energy is leaving subsystem
A₁, then the term in the brackets must also be negative, which implies that T₁ > T₂! This is consistent with our
experience and is another way of stating the second law of thermodynamics: energy flows spontaneously from
hotter objects into cooler objects and not vice versa!
Notice that what we have said so far indicates that it is the entropy of the combined system which must
increase when two systems interact with each other. The entropy of an individual subsystem may either increase or
decrease. You can see that this is true by looking back at Table 9.1. The maximum number of microstates (the
maximum entropy) of the combined system does not correspond to the maximum number of microstates for either
of the subsystems. If the two systems are not initially in equilibrium, then the systems will transfer energy in such
a way as to give the maximum number of microstates for the combined system. One of the subsystems will lose
energy (decreasing the number of microstates accessible to that subsystem) while the other will gain energy
(increasing the number of microstates accessible to that subsystem).

Entropy and the Third Law


The entropy is defined by the equation S = k ln Ω, or S = k ln Φ + const, and we have shown that the
number of states accessible to a system is a rapidly increasing function of the energy. But what happens as the
energy of the system decreases? Now the temperature of a system is a measure of the average energy per degree
of freedom. As the energy of a system decreases, the number of microstates accessible to the system dramatically
decreases, and as the energy of the system decreases more and more, the energy approaches the ground state
energy of the system. In the case where the ground state is non-degenerate, there is only one state of the system
for the system's lowest energy, which means that the entropy of this state of the system is given by

    S_0 = k \ln 1 = 0    (9.82)

Even in the case where the ground state energy is degenerate, the number of states accessible to the system
becomes quite small, so that the entropy of that state of the system is extremely small in comparison to the
entropy of higher energy states. It is instructive to note that the lower limit to the entropy is purely a quantum
mechanical concept.

The Calculation of the Entropy and Changes in Entropy


The Sackur-Tetrode equation for the entropy of an ideal gas is given by

    S(E) = N k\Bigl[\ln\Bigl(\frac{V}{N}\Bigl(\frac{4\pi m E}{3N h_0^2}\Bigr)^{3/2}\Bigr) + \frac{5}{2}\Bigr]    (9.83)

where the entropy is clearly a function of the number of particles, the volume, and the energy of the system.
(Note: In particular, note that the entropy is a function of the volume per particle and the energy per particle.)
The change in entropy dS can be expressed by the equation

    dS = \frac{\partial S}{\partial E}\bigg|_{V,N} dE + \frac{\partial S}{\partial V}\bigg|_{E,N} dV + \frac{\partial S}{\partial N}\bigg|_{E,V} dN    (9.84)

Now, if we hold the number of particles and the volume of the system constant, the change in entropy of the
system is due only to an energy transfer in the form of heat (i.e., dE = δQ if the energy change is only in the
form of heat). In this case we can write

    dS = \frac{\partial S}{\partial E}\bigg|_{V,N}\,\delta Q = \frac{\delta Q}{T}    (9.85)
This last equation is valid for any infinitesimal quasi-static transfer of heat energy, no matter how large or small
the system. This means that the change in the entropy of a system can be determined from the equation

    \Delta S = \int_{eq} \frac{\delta Q}{T}    (9.86)

where we have explicitly indicated that the process must be carried out quasi-statically. It is important to notice
that although the amount of heat, δQ, added to (or removed from) a system depends upon the process, when we
divide by the absolute temperature, T, of the system at each point along the equilibrium surface of the process
(which will generally change from point to point), we obtain the exact differential of the entropy dS. The fact
that the entropy is a state variable can be easily seen in that it is related to the number of microstates accessible to
the system, which has a unique value depending upon the parameters of that system.
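
For example (a sketch; one mole of a monatomic ideal gas heated quasi-statically at constant volume from
300 K to 600 K is an assumed scenario), with δQ = C_V dT the integral gives ΔS = (3/2)Nk ln(T₂/T₁):

    import numpy as np

    # Delta S = integral of delta_Q / T for constant-volume heating, with
    # delta_Q = C_V dT and C_V = (3/2) N k for a monatomic ideal gas.
    k, NA = 1.380649e-23, 6.02214e23
    C_V = 1.5 * NA * k

    T1, T2, n = 300.0, 600.0, 100000
    dT = (T2 - T1) / n
    T_mid = T1 + (np.arange(n) + 0.5) * dT
    dS_numeric = np.sum(C_V / T_mid) * dT        # midpoint-rule integral
    dS_closed  = C_V * np.log(T2 / T1)

    print(f"numerical   Delta S = {dS_numeric:.4f} J/K")   # ~8.64 J/K
    print(f"closed form Delta S = {dS_closed:.4f} J/K")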

Fluctuations about the Equilibrium Temperature


Since the entropy of a system is based upon probabilities, and since all microstates of a combined system
are equally likely, is it not possible for the system to move away from the state of equilibrium, since such a
microstate of the system surely exists? If this is possible, then we have the situation where the entropy of an
isolated system actually decreases (and heat spontaneously flows from cold objects into hot objects)! To answer
this question, we want to examine the likelihood of fluctuations about the maximum value of the entropy of the
system. This is the same thing as looking at the dispersion of the probability distribution to determine the
likelihood of seeing occurrences other than the average. Let's examine the probability that a subsystem A₁ has an
energy E₁ close to the equilibrium energy Ē₁. Now the number of states accessible to this subsystem is related
to the entropy S₁ = k ln Ω(E₁). We will express this function in terms of a Taylor series about the point of
maximum entropy and obtain:

    \ln\Omega(E_1) = \ln\Omega(\bar E_1) + \frac{\partial \ln\Omega(E_1)}{\partial E_1}\bigg|_{\bar E_1}(E_1-\bar E_1) + \frac{1}{2!}\,\frac{\partial^2 \ln\Omega(E_1)}{\partial E_1^2}\bigg|_{\bar E_1}(E_1-\bar E_1)^2 + \cdots    (9.87)

(we have divided each term of this expansion by k). We will write this equation in the more simplified form:

    \ln\Omega(E_1) = \ln\Omega(\bar E_1) + \beta_1\,\eta - \frac{1}{2!}\,\lambda_1\,\eta^2 + \cdots    (9.88)

where η = E₁ − Ē₁, β₁ = 1/kT₁ = ∂lnΩ(E₁)/∂E₁, and λ₁ = −∂²lnΩ(E₁)/∂E₁² = −∂β₁/∂E₁, defined so
that λ > 0. Likewise, we will expand lnΩ(E₂) about Ē₂ to obtain

    \ln\Omega(E_2) = \ln\Omega(\bar E_2) + \frac{\partial \ln\Omega(E_2)}{\partial E_2}\bigg|_{\bar E_2}(E_2-\bar E_2) + \frac{1}{2}\,\frac{\partial^2 \ln\Omega(E_2)}{\partial E_2^2}\bigg|_{\bar E_2}(E_2-\bar E_2)^2 + \cdots    (9.89)

or

    \ln\Omega(E_2) = \ln\Omega(\bar E_2) + \beta_2\,(-\eta) - \frac{1}{2}\,\lambda_2\,\eta^2 + \cdots    (9.90)

Note that the energy differences are opposite in sign, since the total energy of the system must remain fixed, so
that η₁ = E₁ − Ē₁ = η, and η₂ = E₂ − Ē₂ = −η.
The number of states for the combined system is found by adding these two equations together to get:

    \ln\bigl[\Omega(E_1)\,\Omega(E_2)\bigr] = \ln\bigl[\Omega(\bar E_1)\,\Omega(\bar E_2)\bigr] + (\beta_1-\beta_2)\,\eta - \frac{1}{2}(\lambda_1+\lambda_2)\,\eta^2 + \cdots    (9.91)

Now, at the point where E₁ = Ē₁ and E₂ = Ē₂, we see that β₁ = β₂, so that the term linear in η goes to zero.
Since the probability P(E) of finding subsystem A₁ with energy E is proportional to Ω(E₁)Ω(E₂), this leaves

    \ln P(E) = \ln P(\bar E) - \frac{1}{2}\,\lambda_0\,(E-\bar E)^2    (9.92)

or

    P(E) = P(\bar E)\,\exp\Bigl[-\frac{1}{2}\,\lambda_0\,(E-\bar E)^2\Bigr]    (9.93)

where we have defined λ₀ = λ₁ + λ₂, and where P(E) is the probability of finding one of the subsystems (in this
particular calculation, A₁) with an arbitrary energy E. We notice that the probability of measuring a particular
energy E is a Gaussian function which is peaked about the average energy, Ē, and that the standard deviation of
this Gaussian is given by σ² = 1/λ₀ = 1/(λ₁ + λ₂). Central to this conclusion is the requirement that the λᵢ are
positive. To demonstrate this, remember that Ω ∝ E^{f/2}. This gives

    \beta = \frac{\partial \ln\Omega}{\partial E} = \frac{f}{2}\,\frac{\partial \ln E}{\partial E} = \frac{f}{2E}    (9.94)

and

    \lambda = -\frac{\partial \beta}{\partial E} = \frac{f}{2\bar E^2} > 0    (9.95)
Now, since about 68% of all states lie within σ of the average value, the question we need to answer is just how
large an energy change in one of the subsystems will give one standard deviation in the probability function. The
standard deviation is given by

    \sigma = \sqrt{\frac{1}{\lambda_0}}    (9.96)

where

    \lambda_0 = \lambda_1 + \lambda_2 = \lambda_1\Bigl(1 + \frac{\lambda_2}{\lambda_1}\Bigr) = \lambda_2\Bigl(1 + \frac{\lambda_1}{\lambda_2}\Bigr)    (9.97)

Notice that if λ₁ ≫ λ₂, λ₀ ≈ λ₁; while if λ₂ ≫ λ₁, λ₀ ≈ λ₂; and if λ₁ = λ₂, λ₀ = 2λ₁. Thus, letting
λ₀ ≈ f/(2Ē²), with f set by the largest of the λᵢ, we have

    \sigma = \sqrt{\frac{1}{\lambda_0}} \approx \bar E\,\sqrt{\frac{2}{f}}    (9.98)

If we let ΔE_σ be the energy fluctuation equal to one standard deviation, then ΔE_σ/Ē ≈ √(2/f), and for systems with a
large number of degrees of freedom (say 10²⁰), the percentage fluctuation in energy is of the order of 10⁻⁸ %.
Thus we expect only very small fluctuations about the equilibrium state of the system, so that to a good
approximation, once the system reaches equilibrium, the parameters of the system remain essentially constant.
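
The scale of these fluctuations follows directly from Eq. (9.98); a one-line sketch for a few assumed system sizes:

    import math

    # Relative one-sigma energy fluctuation, Eq. (9.98): sigma/E ~ sqrt(2/f).
    for f in (20.0, 1.0e6, 1.0e20):
        rel = math.sqrt(2.0 / f)
        print(f"f = {f:.0e}:  sigma/E = {rel:.1e}  ({100 * rel:.1e} %)")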

Summary
In summary, then, we find that when two systems interact with one another, the macroscopic parameters
of the two systems change in such a way as to increase the entropy (the number of microstates) of the combined
system, and the equilibrium values of these parameters are those for which the entropy is a maximum! And once
the equilibrium parameters of a system are reached, these equilibrium parameters remain constant to a precision
greater than we normally can measure. It is important to notice that the entropy of individual subsystems may
increase or decrease, and that our conclusions are based upon the assumption that the composite system A₀ is
isolated from the rest of the universe. Therefore, the second law of thermodynamics should be stated: for any
isolated system, the entropy of the system will always increase or remain the same!
APPENDIX 9.1
INTERNAL ENERGY OF A NON-IDEAL GAS

If we consider the case of a gas of N identical molecules enclosed in a container of volume V, the energy of
the system is given by

    E = K + U + E_{int}    (A9.1.1)

where E is the total energy of the system. The total kinetic energy of the system, K, is due to the translational
motion of the molecules, and is given by

    K = K(\vec p_1, \vec p_2, \ldots, \vec p_N) = \frac{1}{2m}\sum_{i=1}^{N} \vec p_i^{\,2}    (A9.1.2)

where the p⃗ᵢ's are the translational momenta of each particle (which obviously change whenever the molecules
collide with each other or with the walls of the container). The quantity U = U(r⃗₁, r⃗₂, …, r⃗_N) represents the total
potential energy of the system and is a function of the location of each and every particle of the system. It is a
function of the relative separations of all the particles, and, therefore, is constantly changing as the molecules
move around. Since we did not assume that the molecules were monatomic, each molecule can also rotate and
vibrate relative to its center of mass. The energy of this rotation and vibration we designate by E_int, the internal
energy of the system, which is the sum of the internal energy of each individual molecule. An expression for E_int
for diatomic molecules might look something like

    E_{int} = \sum_{i=1}^{N}\Bigl[\,\sum_{j}\frac{L_{ij}^2}{2I_j} + \frac{1}{2}\mu\dot R_i^2 + \frac{1}{2}\kappa\,(R_i - R_0)^2\Bigr]    (A9.1.3)

Typically, we designate the generalized momenta (linear momentum, angular momentum, etc.) relative to the
center of mass of the molecule by P₁, P₂, …, P_m, and the corresponding generalized coordinates (the
internuclear distance, the angles of rotation, etc.) by Q₁, Q₂, …, Q_m. Thus, a general expression for the internal
energy of the system might be

    E_{int} = \sum_{i=1}^{N}\sum_{j=1}^{m}\bigl(\alpha_j P_j^2 + \beta_j Q_j^2\bigr)    (A9.1.4)

where the α's and β's are constants (some of which may be zero).
Now if we denote the generalized momenta of the center of mass of individual molecules by p₁, p₂, p₃, and the
corresponding generalized coordinates of the center of mass of the individual molecules by q₁, q₂, q₃, we can write
the total energy of the system as

    E = \sum_{i=1}^{N}\sum_{j=1}^{3}\bigl(a_j p_j^2 + b_j q_j^2\bigr) + E_{int}    (A9.1.5)

where the a's and b's are constants (some of which may be zero). However, it is customary to simply number each
momentum variable, p_j, and its constant a_j from j = 1 to 3N, and correspondingly, each coordinate variable, q_j,
and its constant b_j from j = 1 to 3N, giving

    E = \sum_{j=1}^{3N}\bigl(a_j p_j^2 + b_j q_j^2\bigr) + E_{int}    (A9.1.6)

In some cases of interest, this equation can be greatly simplified. In the case of monatomic gases, for
example, the internal energy, E_int, is zero. Likewise, for the case where the intermolecular distances are large on
average, the potential energy of interaction is negligible. This implies that the total energy is independent of the
coordinates, q_j, of the center of mass, and that the total energy of the system can be expressed by

    E = \sum_{j=1}^{3N} a_j p_j^2    (A9.1.7)

This is the so-called "ideal gas" approximation.


To determine the number of states Ω(E) for a non-ideal gas, we simply "add up" all the states between the
energy E and E + δE by integrating over all the accessible coordinates and momenta of the system. This is
equivalent to integrating over all accessible volume elements in phase space, or, in general,

    \Omega(E) = \int_{E}^{E+\delta E} dp_1\cdots dp_{3N}\;dq_1\cdots dq_{3N}\;dP_1\cdots dP_m\;dQ_1\cdots dQ_m    (A9.1.8)

or, expressed in terms of three-dimensional momenta and coordinates,

    \Omega(E) = \int_{E}^{E+\delta E} d^3\vec p_1\cdots d^3\vec p_N\;d^3\vec r_1\cdots d^3\vec r_N\;dP_1\cdots dP_m\;dQ_1\cdots dQ_m    (A9.1.9)

where we have written d³p⃗ᵢ = dp_ix dp_iy dp_iz and d³r⃗ᵢ = dxᵢ dyᵢ dzᵢ. In fact, what we do is to evaluate the integral

    \Phi(E) = \int_{0}^{E} d^3\vec p_1\cdots d^3\vec p_N\;d^3\vec r_1\cdots d^3\vec r_N\;dP_1\cdots dP_m\;dQ_1\cdots dQ_m    (A9.1.10)

and then determine Ω(E) from the equation

    \Omega(E) = \frac{\partial \Phi(E)}{\partial E}\,\delta E    (A9.1.11)
For those cases in which the potential energy of interaction is negligible, the total energy is independent of
the intermolecular distances, and, therefore, of the coordinates of the molecules. This means that since the energy
is not a function of the coordinates, we can immediately integrate over all values of the coordinates accessible to
the molecules (the energy doesn't restrict which values of the coordinates we can integrate over). Each integral
over r⃗ᵢ just gives the volume accessible to molecule i, so that we have

    \int d^3\vec r_i = V    (A9.1.12)

for each particle in the system. This means that the number of states must be given by

    \Phi(E) = V^N \chi(E)    (A9.1.13)

where

    \chi(E) = \int_{0}^{E} d^3\vec p_1\cdots d^3\vec p_N\;dP_1\cdots dP_M\;dQ_1\cdots dQ_M    (A9.1.14)

is independent of the volume V, since neither K nor E_int depends upon the coordinates r⃗ᵢ.
APPENDIX 9.2
EVALUATION OF THE INTEGRATING CONSTANT FOR AN n-DIMENSIONAL SPACE

As pointed out in the body of the chapter, the number of states available to an ideal gas with energy
ranging from 0 to E is given by the integral

    \Phi(E) = \frac{1}{h_0^{3N}}\int_{0}^{E} d^3\vec p_1\cdots d^3\vec p_N\;d^3\vec r_1\cdots d^3\vec r_N    (A9.2.1)

from which we can determine Ω(E) from the equation

    \Omega(E) = \frac{\partial \Phi(E)}{\partial E}\,\delta E    (A9.2.2)

Since the potential energy of interaction between the molecules is zero for an ideal gas, and since we
assume no external force acting on the system, the total energy is independent of the location of the individual
molecules, so that we can perform the integration over spatial coordinates immediately to obtain

    \Phi(E) = \frac{V^N}{h_0^{3N}}\,\chi(E)    (A9.2.3)

since each particle can be anywhere within the volume V of the container. The function χ(E) is given by

    \chi(E) = \int_{0}^{E} d^3\vec p_1\cdots d^3\vec p_N = \int_{0}^{E} dp_1\,dp_2\,dp_3\,dp_4\cdots dp_{3N}    (A9.2.4)

and is a function of the total energy of the system, but independent of the volume. This integral is effectively a
volume integral in momentum space with the condition that the total energy of the system remain less than some
value E, which is given by

    E = \frac{\vec p_1^{\,2}}{2m} + \frac{\vec p_2^{\,2}}{2m} + \cdots = \sum_{i=1}^{N}\frac{\vec p_i^{\,2}}{2m}    (A9.2.5)

This means that, for a fixed value of the total energy E of the system, a change in the magnitude of the
momentum of one particle must be reflected by a change in the momentum of one or more other particles. But in
order to count all the possible states accessible to the system, we must sum up all possible combinations of the
different momenta which will give a total energy E!
This situation is illustrated in the diagram of three-dimensional momentum space shown in Figure A9.2.1.
In this diagram the tip of the vector p⃗ touches a sphere which represents all points that are a distance
|p⃗| = √(2mE) from the origin. This sphere, therefore, represents the locus of all vectors which have the constant
energy E. Such a vector can be oriented in any arbitrary direction so long as the tip of the vector is on the sphere.
The number of possible momentum states which have a total energy less than E must be proportional to the
volume of this sphere of radius |p⃗| = √(2mE). The volume of momentum space with energy less than E is given by

    \chi(E) = \frac{4}{3}\pi p^3 = \frac{4}{3}\pi\,(2mE)^{3/2}    (A9.2.6)

In two dimensions, this would be

    \chi(E) = \pi p^2 = \pi\,(2mE)^{2/2}    (A9.2.7)

For the case of n dimensions, we expect χ(E) to be of the form

    \chi(E) = C_n\,(2mE)^{n/2}    (A9.2.8)

where C_n is some constant which we want to evaluate.
[Figure: sphere in momentum space with axes p_x, p_y, p_z.]

Figure A9.2.1 The constant-energy sphere for a single particle in three dimensions. The momentum vector can
have any orientation in space, but must always have the same magnitude. The volume of this sphere is
proportional to the number of ways of choosing the different possible pᵢ's for the particle to have an energy
less than E = p²/2m.

In the diagram above we want to determine the volume covered by all possible combinations of p_x, p_y, and
p_z such that p² = p_x² + p_y² + p_z², or, expressed mathematically,

    V(p) = \iiint_{p_x^2+p_y^2+p_z^2 \le p^2} dp_x\,dp_y\,dp_z = \int_0^{p} 4\pi p'^2\,dp' = \frac{4}{3}\pi p^3    (A9.2.9)

where V(p) is the volume in three-dimensional space which is enclosed by the radius vector p. In two
dimensions, the radius vector encloses an area, and we write

    A(p) = \iint_{p_x^2+p_y^2 \le p^2} dp_x\,dp_y = \int_0^{p} 2\pi p'\,dp' = \pi p^2    (A9.2.10)

so we see that the volume 𝒱ₙ enclosed within a radius vector of length R in an n-dimensional space is of
the form

    \mathcal{V}_n(R) = \int\cdots\int_{x_1^2+\cdots+x_n^2 \le R^2} dx_1\,dx_2\cdots dx_n = \int_0^{R} \mathcal{S}_n(R')\,dR'    (A9.2.11)

where 𝒮ₙ(R) is the surface area of the enclosed volume. Examining the 2- and 3-dimensional results, we
expect the volume and surface area to be of the forms:

    \mathcal{V}_n(R) = C_n R^n    (A9.2.12)

and

    \mathcal{S}_n(R) = \frac{d\mathcal{V}_n(R)}{dR} = n\,C_n R^{n-1}    (A9.2.13)

Our task is to determine the general form of the constant C_n for an n-dimensional space.
To do this we consider the Gaussian integral

    \int_{-\infty}^{\infty} e^{-x^2}\,dx = \pi^{1/2}    (A9.2.14)

If we multiply this integral times itself n times we have

    \Bigl[\int_{-\infty}^{\infty} e^{-x^2}\,dx\Bigr]^{n} = \pi^{n/2}    (A9.2.15)

but this is just

    \pi^{n/2} = \int dx_1\,e^{-x_1^2}\int dx_2\,e^{-x_2^2}\cdots\int dx_n\,e^{-x_n^2}
              = \int\cdots\int dx_1\,dx_2\cdots dx_n\;e^{-(x_1^2 + x_2^2 + \cdots + x_n^2)}    (A9.2.16)

Now the term dx₁dx₂⋯dxₙ is just the differential volume d𝒱ₙ(R), which can be expressed in terms of the surface
area 𝒮ₙ(R),

    d\mathcal{V}_n(R) = \mathcal{S}_n(R)\,dR = n\,C_n R^{n-1}\,dR    (A9.2.17)

provided we integrate from R = 0 to R = ∞, giving

    \pi^{n/2} = \int_0^{\infty} n\,C_n R^{n-1} e^{-R^2}\,dR = n\,C_n\int_0^{\infty} R^{n-1} e^{-R^2}\,dR    (A9.2.18)

Now we make a change of variables, letting t = R², so that R = t^{1/2} and 2R dR = dt, i.e., dR = dt/(2t^{1/2}), so
that we have

    \pi^{n/2} = n\,C_n\int_0^{\infty} t^{(n-1)/2}\,e^{-t}\,\frac{dt}{2t^{1/2}} = \frac{n\,C_n}{2}\int_0^{\infty} t^{\frac{n}{2}-1} e^{-t}\,dt    (A9.2.19)

Now this integral is in the form of the Gamma function integral

    \Gamma(p) = \int_0^{\infty} t^{p-1} e^{-t}\,dt    (A9.2.20)

where p = n/2, so that our last equation can be written as

    \pi^{n/2} = \frac{n\,C_n}{2}\,\Gamma\Bigl(\frac{n}{2}\Bigr)    (A9.2.21)

Our constant C_n is, therefore, given by

    C_n = \frac{\pi^{n/2}}{\frac{n}{2}\,\Gamma(\frac{n}{2})}    (A9.2.22)

To check and make sure that this is correct, we will evaluate this constant for the cases of one, two, three, and
four dimensions:

    C_1 = \frac{2\pi^{1/2}}{1\cdot\Gamma(\frac{1}{2})} = \frac{2\pi^{1/2}}{\pi^{1/2}} = 2

    C_2 = \frac{2\pi}{2\,\Gamma(1)} = \frac{2\pi}{2\cdot 0!} = \pi    (A9.2.23)

    C_3 = \frac{2\pi^{3/2}}{3\,\Gamma(\frac{3}{2})} = \frac{2\pi^{3/2}}{3\cdot\frac{1}{2}\pi^{1/2}} = \frac{4}{3}\pi

    C_4 = \frac{2\pi^{2}}{4\,\Gamma(2)} = \frac{2\pi^{2}}{4\cdot 1!} = \frac{\pi^2}{2}

Note: Some useful properties of the Gamma function are

    \Gamma(n+1) = n\,\Gamma(n)
    \Gamma(n) = (n-1)\,\Gamma(n-1)
    \Gamma(n) = (n-1)! \quad (n\ \text{an integer})
    \Gamma(\tfrac{1}{2}) = \sqrt{\pi}
    \Gamma(m+\tfrac{1}{2}) = \frac{1\cdot 3\cdot 5\cdots(2m-1)}{2^m}\,\sqrt{\pi}
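
The coefficient Cₙ is easy to verify numerically (a sketch: the closed form is checked against a Monte Carlo
estimate of the unit n-ball volume, with a fixed seed for reproducibility):

    import math, random

    # C_n = pi^(n/2) / Gamma(n/2 + 1), Eq. (A9.2.22); the Monte Carlo estimate
    # is the fraction of random points of the cube [-1,1]^n landing inside the
    # unit ball, times the cube volume 2^n.
    random.seed(1)

    def c_exact(n):
        return math.pi ** (n / 2) / math.gamma(n / 2 + 1)

    def c_monte_carlo(n, samples=100000):
        hits = 0
        for _ in range(samples):
            if sum(random.uniform(-1.0, 1.0) ** 2 for _ in range(n)) <= 1.0:
                hits += 1
        return 2 ** n * hits / samples

    for n in (1, 2, 3, 4):
        print(f"n = {n}:  exact = {c_exact(n):.4f}   Monte Carlo = {c_monte_carlo(n):.4f}")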
