Transcript Document
Lecture 3
• Entropy in statistical mechanics.
• Thermodynamic contacts:
  i. mechanical contact,
  ii. heat contact,
  iii. diffusion contact.
• Equilibrium.
• Chemical potential.
• The main distributions in statistical mechanics.
• A system in the thermostat: the canonical ensemble.

Note: "ice melting" is a classic example of entropy increase, described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.
Entropy in Statistical Mechanics
From the principles of thermodynamics we learn that the thermodynamic entropy S has the following important properties:

dS is an exact differential and is equal to δQ/T for a reversible process, where δQ is the quantity of heat added to the system.

Entropy is additive: S = S1 + S2. The entropy of a combined system is the sum of the entropies of its two separate parts.

ΔS ≥ 0. If the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.
State Function
This statement means that entropy is a state function in
that the value of entropy does not depend on the past
history of the system but only on the actual state of the
system...
Indeed, one of the great accomplishments of statistical mechanics is to give us a physical picture of entropy.
The entropy of a system (in classical statistical physics) in statistical equilibrium can be defined as
$$\sigma = \ln \Omega, \tag{3.1}$$
where $\Omega$ is the volume of phase space accessible to the system, i.e., the volume corresponding to energies between $E - \tfrac{1}{2}\Delta E$ and $E + \tfrac{1}{2}\Delta E$.
Let us show first that changes in the entropy are independent of the system of units used to measure $\Omega$. As $\Omega$ is a volume in the phase space of N point particles, it has the dimensions
$$(\text{momentum} \times \text{length})^{3N} = (\text{action})^{3N}. \tag{3.2}$$
Let $h$ denote the unit of action; then $\Omega / h^{3N}$ is dimensionless. If we were to define
$$\sigma = \ln \frac{\Omega}{h^{3N}} = \ln \Omega - 3N \ln h, \tag{3.3}$$
we see that for changes
$$\Delta\sigma = \Delta \ln \Omega, \tag{3.4}$$
independent of the system of units. Planck's constant $h$ is a natural unit of action in phase space.
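The unit-independence of entropy changes, eq. (3.4), is easy to check numerically. A minimal sketch (the values of Ω and N are made up for illustration):

```python
import math

# Hypothetical phase-space volumes (in arbitrary action units) before
# and after some process, for N = 10 particles.
N = 10
Omega_initial = 1.0e40
Omega_final = 5.0e41

def sigma(Omega, h):
    # Entropy as defined in (3.3): sigma = ln(Omega / h^{3N})
    return math.log(Omega) - 3 * N * math.log(h)

# The change in entropy is the same for any choice of the unit of action h:
for h in (1.0, 6.626e-34, 42.0):
    delta_sigma = sigma(Omega_final, h) - sigma(Omega_initial, h)
    print(h, delta_sigma)  # delta_sigma = ln(Omega_final / Omega_initial) every time
```

The term $3N \ln h$ cancels in the difference, which is why the choice of unit drops out.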
It is obvious that the entropy $\sigma$, as defined by (3.1), has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. Once the ensemble is specified in terms of the spread in phase space, the entropy is known.
We see that, if $\Omega$ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is also to be interpreted as a measure of the imprecision or randomness.
Entropy is additive
It can easily be shown that $\sigma$ is additive. Let us consider a system made up of two parts, one with $N_1$ particles and the other with $N_2$ particles. Then
$$N = N_1 + N_2, \tag{3.5}$$
and the phase space of the combined system is the product space of the phase spaces of the individual parts:
$$\Omega = \Omega_1 \Omega_2. \tag{3.6}$$
The additive property of the entropy follows directly:
$$\sigma = \ln \Omega = \ln \Omega_1\Omega_2 = \ln \Omega_1 + \ln \Omega_2 = \sigma_1 + \sigma_2. \tag{3.7}$$
Thermodynamic contacts between
two systems
We have supposed that the condition of statistical
equilibrium is given by the most probable condition of a
closed system, and therefore we may also say that the
entropy is a maximum when a closed system is in
equilibrium condition.
The value of $\sigma$ for a system in equilibrium will depend on the energy E (strictly, $\langle E\rangle$) of the system; on the number $N_i$ of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.
Let us consider the condition for equilibrium in a system
made up of two interconnected subsystems, as in Fig. 3.1.
Initially a rigid, insulating, non-permeable barrier separates
the subsystems from each other.
Fig. 3.1. Two subsystems, 1 and 2, separated by a barrier, with insulation surrounding the whole.
Thermal contact
Thermal contact — the systems can exchange energy; in equilibrium there is no net flow of energy between them. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium.
In thermal equilibrium the entropy of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other.
Writing, by the additive property of the entropy,
$$\sigma = \sigma_1 + \sigma_2,$$
we have in equilibrium
$$\delta\sigma = \delta\sigma_1 + \delta\sigma_2 = 0, \tag{3.8}$$
$$\frac{\partial \sigma_1}{\partial E_1}\,\delta E_1 + \frac{\partial \sigma_2}{\partial E_2}\,\delta E_2 = 0. \tag{3.9}$$
We know, however, that
$$\delta E = \delta E_1 + \delta E_2 = 0, \tag{3.10}$$
as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus
$$\left(\frac{\partial \sigma_1}{\partial E_1} - \frac{\partial \sigma_2}{\partial E_2}\right)\delta E_1 = 0. \tag{3.11}$$
As $\delta E_1$ was an arbitrary variation, we must have
$$\frac{\partial \sigma_1}{\partial E_1} = \frac{\partial \sigma_2}{\partial E_2} \tag{3.12}$$
in thermal equilibrium. If we define a quantity $\tau$ by
$$\frac{1}{\tau} = \frac{\partial \sigma}{\partial E}, \tag{3.13}$$
then in thermal equilibrium
$$\tau_1 = \tau_2. \tag{3.14}$$
Here $\tau$ is known as the temperature and will be shown later to be related to the absolute temperature T by $\tau = kT$, where k is the Boltzmann constant, $1.380\times10^{-23}$ J/K.
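The equilibrium condition (3.12)–(3.14) can be illustrated numerically. As a sketch, take the hypothetical ideal-gas-like entropy $\sigma(E) = \tfrac{3N}{2}\ln E$ (an assumed model for illustration, not from the lecture); maximizing the total entropy over the split of a fixed total energy then equalizes $\partial\sigma/\partial E = 1/\tau$, i.e. $\tau = 2E/3N$ comes out the same on both sides:

```python
import numpy as np

N1, N2 = 100, 300        # particle numbers of the two subsystems
E_total = 8.0            # total (fixed) energy, arbitrary units

def sigma(E, N):
    # Hypothetical ideal-gas-like entropy: sigma = (3N/2) ln E
    return 1.5 * N * np.log(E)

# Scan all splits E1 + E2 = E_total and find the entropy maximum.
E1 = np.linspace(0.01, E_total - 0.01, 100_000)
E2 = E_total - E1
total_sigma = sigma(E1, N1) + sigma(E2, N2)
E1_star = E1[np.argmax(total_sigma)]

# At the maximum, 1/tau = d(sigma)/dE = 3N/(2E) is the same on both sides,
# so tau = 2E/(3N) is equalized, which means E1/N1 = E2/N2.
tau1 = 2 * E1_star / (3 * N1)
tau2 = 2 * (E_total - E1_star) / (3 * N2)
print(tau1, tau2)  # nearly equal
```

The maximum lands at $E_1 = E\,N_1/(N_1+N_2)$: the energy per particle, and hence the temperature, is shared equally.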
Mechanical contact
Mechanical contact — the systems are separated by a mobile barrier; equilibrium is reached in this case by the equalization of pressure on the two sides of the barrier. We now imagine that the wall is allowed to move, and also passes energy, but does not pass particles. The volumes $V_1$, $V_2$ of the two systems can readjust to maximize the entropy. In mechanical equilibrium
$$\delta\sigma = \frac{\partial\sigma_1}{\partial V_1}\,\delta V_1 + \frac{\partial\sigma_2}{\partial V_2}\,\delta V_2 + \frac{\partial\sigma_1}{\partial E_1}\,\delta E_1 + \frac{\partial\sigma_2}{\partial E_2}\,\delta E_2 = 0. \tag{3.15}$$
After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have
$$\frac{\partial\sigma_1}{\partial V_1}\,\delta V_1 + \frac{\partial\sigma_2}{\partial V_2}\,\delta V_2 = 0. \tag{3.16}$$
Now the total volume $V = V_1 + V_2$ is constant, so that
$$\delta V = \delta V_1 + \delta V_2 = 0. \tag{3.17}$$
We have then
$$\left(\frac{\partial\sigma_1}{\partial V_1} - \frac{\partial\sigma_2}{\partial V_2}\right)\delta V_1 = 0. \tag{3.18}$$
As $\delta V_1$ was an arbitrary variation, we must have
$$\frac{\partial\sigma_1}{\partial V_1} = \frac{\partial\sigma_2}{\partial V_2} \tag{3.19}$$
in mechanical equilibrium. If we define a quantity $\Pi$ by
$$\Pi = \tau\left(\frac{\partial\sigma}{\partial V}\right)_{E,N}, \tag{3.20}$$
we see that for a system in thermal equilibrium the condition for mechanical equilibrium is
$$\Pi_1 = \Pi_2. \tag{3.21}$$
We show now that $\Pi$ has the essential characteristics of the usual pressure p.
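For instance, with the hypothetical ideal-gas-like volume dependence $\sigma(V) = N \ln V + \text{const}$ (an assumption for illustration, not from the lecture), definition (3.20) reproduces the ideal-gas pressure $p = N kT / V$. A numerical check of the derivative:

```python
import math

N = 6.0e23                 # number of particles (illustrative)
tau = 1.381e-23 * 300.0    # tau = kT at T = 300 K, in joules
V = 0.0244                 # volume in m^3

def sigma(V):
    # Hypothetical ideal-gas-like entropy, volume dependence only: sigma = N ln V
    return N * math.log(V)

# Numerical derivative (central difference) of sigma with respect to V.
dV = 1e-9
dsigma_dV = (sigma(V + dV) - sigma(V - dV)) / (2 * dV)

Pi = tau * dsigma_dV       # definition (3.20)
p_ideal = N * tau / V      # ideal-gas law p = N kT / V
print(Pi, p_ideal)         # the two pressures agree (about 1 atm here)
```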
Material-transferring contact
The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have
$$\delta N_{i1} = -\delta N_{i2}. \tag{3.22}$$
For equilibrium,
$$\frac{\partial\sigma_1}{\partial N_{i1}}\,\delta N_{i1} + \frac{\partial\sigma_2}{\partial N_{i2}}\,\delta N_{i2} = 0, \tag{3.23}$$
or
$$\frac{\partial\sigma_1}{\partial N_{i1}} = \frac{\partial\sigma_2}{\partial N_{i2}}. \tag{3.24}$$
We define a quantity $\mu_i$ by the relation
$$\mu_i = -\tau\left(\frac{\partial\sigma}{\partial N_i}\right)_{E,V}. \tag{3.25}$$
The quantity $\mu_i$ is called the chemical potential of the i-th species. For equilibrium at constant temperature,
$$\mu_{i1} = \mu_{i2}. \tag{3.26}$$
The Canonical Ensemble
The microcanonical ensemble is a general statistical tool,
but it is often very difficult to use in practice because of
difficulty in evaluating the volume of phase space or the
number of states accessible to the system.
The canonical ensemble, invented by Gibbs, avoids some of these difficulties and leads us easily to the familiar Boltzmann factor exp(−ΔE/kT) for the ratio of populations of two states differing by ΔE in energy.
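As a quick numerical illustration of the Boltzmann factor (the energy gap and temperature here are made-up values):

```python
import math

k = 1.381e-23               # Boltzmann constant, J/K
T = 300.0                   # temperature, K (illustrative)
delta_E = 0.1 * 1.602e-19   # made-up energy gap of 0.1 eV, in joules

# Ratio of populations of the upper state to the lower state.
ratio = math.exp(-delta_E / (k * T))
print(ratio)  # about 0.02: the upper state is sparsely populated
```

A gap of a few kT is already enough to depopulate the upper state substantially.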
We shall see that the canonical ensemble describes systems in thermal contact with a heat reservoir; the microcanonical ensemble describes systems that are perfectly insulated.
Note: In 1901, at the age of 62, Gibbs (1839–1903) published a book called Elementary Principles in Statistical Mechanics (Dover, New York).
We imagine that each system of the ensemble is divided
up into a large number of subsystems, which are in mutual
thermal contact and can exchange energy with each other.
We direct our attention (Fig.3.2) to one subsystem
denoted s; the rest of the system will be denoted by r and
is sometimes referred to as a heat reservoir.
Fig. 3.2. The total system (t) consists of the subsystem (s) and the rest (r), the heat reservoir. The total system has the constant energy Et, as it is a member of a microcanonical ensemble. For each value of the energy we think of an ensemble of systems (and subsystems).
The subsystems will usually, but not necessarily, be
themselves of macroscopic dimensions.
The subsystem may be a single molecule if, as in a gas,
the interactions between molecules are very weak,
thereby permitting us to specify accurately the energy of a
molecule. In a solid, a single atom will not be a
satisfactory subsystem as the bond energy is shared with
neighbors.
Letting $dw_t$ denote the probability that the total system is in an element of volume $d\Gamma_t$ of the appropriate phase space, we have for a microcanonical ensemble
$$dw_t = C\,d\Gamma_t \ \ \text{if the energy is in } \Delta E \text{ at } E_t; \qquad dw_t = 0 \ \ \text{otherwise}, \tag{3.27}$$
where C is a constant. Then we can write
$$dw_t = C\,d\Gamma_s\,d\Gamma_r \ \ \text{if the region of phase space is accessible}; \qquad dw_t = 0 \ \ \text{otherwise}. \tag{3.28}$$
We ask now for the probability $dw_s$ that the subsystem is in $d\Gamma_s$, without specifying the condition of the reservoir, but still requiring that the total system be in $\Delta E$ at $E_t$. Then
$$dw_s = C\,d\Gamma_s\,\Omega_r, \tag{3.29}$$
where $\Omega_r$ is the volume of phase space of the reservoir which corresponds to the energy of the total system being in $\Delta E$ at $E_t$.
Our task is to evaluate $\Omega_r$; that is, if we know that the subsystem is in $d\Gamma_s$, how much phase space is accessible to the heat reservoir?
The entropy of the reservoir is
$$\sigma_r = \ln \Omega_r, \tag{3.30}$$
$$\Omega_r = e^{\sigma_r}. \tag{3.31}$$
Note that
$$E_r = E_t - E_s, \tag{3.32}$$
where we may take $E_s \ll E_t$ because the subsystem is assumed to be small in comparison with the total system. We expand
$$\sigma_r(E_r) = \sigma_r(E_t - E_s) \approx \sigma_r(E_t) - \frac{\partial \sigma_r(E_t)}{\partial E_t}\,E_s + \dots \tag{3.33}$$
Thus
$$\Omega_r = \exp\!\big[\sigma_r(E_t)\big]\exp\!\left[-\frac{\partial \sigma_r(E_t)}{\partial E_t}\,E_s\right]. \tag{3.34}$$
As $E_t$ is necessarily close to $E_r$, we can write, using (3.13),
$$\frac{\partial \sigma_r(E_t)}{\partial E_t} = \frac{1}{\tau} = \frac{1}{kT}. \tag{3.35}$$
Here $\tau$ is the temperature characterizing every part of the system, as thermal contact is assumed. Finally, from (3.29), (3.34) and (3.35),
$$dw_s = A\,e^{-E_s/\tau}\,d\Gamma_s, \tag{3.36}$$
where
$$A = C\,e^{\sigma_r(E_t)} \tag{3.37}$$
may be viewed as a quantity which takes care of the normalization:
$$\int dw_s = 1 = A \int e^{-E_s/\tau}\,d\Gamma_s. \tag{3.38}$$
Thus for the subsystem the probability density (distribution function) is given by the canonical ensemble:
$$\rho(E) = A\,e^{-E/kT}, \tag{3.39}$$
where here and henceforth the subscript s is dropped.
We emphasize that E is the energy of the entire
subsystem.
We note that $\ln \rho$ is additive for two subsystems in thermal contact:
$$\ln \rho_1 = \ln A_1 - E_1/\tau,$$
$$\ln \rho_2 = \ln A_2 - E_2/\tau,$$
$$\ln \rho_1\rho_2 = \ln A_1 A_2 - (E_1 + E_2)/\tau,$$
so that, with $\rho = \rho_1\rho_2$, $A = A_1 A_2$, $E = E_1 + E_2$, we have
$$\ln \rho = \ln A - E/\tau \tag{3.40}$$
for the combined systems. This additive property is central
to the use of the canonical ensemble.
The average value of any physical quantity f(p,q) over the canonical distribution is given by
$$\langle f \rangle = \frac{\displaystyle\int e^{-E(p,q)/kT}\,f(p,q)\,d\Gamma}{\displaystyle\int e^{-E(p,q)/kT}\,d\Gamma}.$$
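This average is straightforward to evaluate numerically. As a sketch, take the one-dimensional harmonic oscillator $E = p^2/2m + m\omega^2 q^2/2$ with $m = \omega = kT = 1$ (a made-up example for illustration); equipartition then predicts $\langle E \rangle = kT$, i.e. $kT/2$ from each quadratic term:

```python
import numpy as np

kT = 1.0       # temperature in energy units (illustrative)
m = 1.0
omega = 1.0

# Phase-space grid for a 1D harmonic oscillator, E(p, q) = p^2/2m + m w^2 q^2/2.
p = np.linspace(-8, 8, 801)
q = np.linspace(-8, 8, 801)
P, Q = np.meshgrid(p, q)
E = P**2 / (2 * m) + 0.5 * m * omega**2 * Q**2

# Canonical average of f = E; the grid cell area cancels in the ratio.
weight = np.exp(-E / kT)                    # Boltzmann factor e^{-E/kT}
E_avg = (weight * E).sum() / weight.sum()
print(E_avg)  # close to kT = 1 (equipartition: kT/2 per quadratic degree of freedom)
```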
We note that for a subsystem consisting of a large number of particles, the subsystem energy in a canonical ensemble is very well defined. This is because the density of energy levels, or the volume in phase space, is a strongly varying function of energy, as is also the distribution function (3.39).
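The sharpness of the subsystem energy can be seen in a quick Monte Carlo sketch: for N independent classical oscillators, each energy is exponentially distributed with mean kT (a model assumption for illustration), and the relative fluctuation of the total energy falls off as $1/\sqrt{N}$:

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 1.0
samples = 5_000

for N in (10, 100, 1_000):
    # Total energy of N independent oscillators, each ~ Exponential(mean = kT).
    E_total = rng.exponential(kT, size=(samples, N)).sum(axis=1)
    rel_fluct = E_total.std() / E_total.mean()
    print(N, rel_fluct)  # approximately 1/sqrt(N)
```

For macroscopic N the relative spread is utterly negligible, which is why the canonical and microcanonical descriptions agree for large systems.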