
The Statistical Interpretation of Entropy

The aim of this lecture is to show that entropy can be interpreted in terms of the degree of randomness of a system, as originally shown by Boltzmann.

Boltzmann showed that the entropy is given by

S = k_B ln Ω

where Ω is the probability that a given state exists.

For example, we consider a system composed of 3 particles with energy levels such that level o has energy 0, level 1 is u, level 2 is 2u and level 3 is 3u. Let the total energy of the system be U = 3u.

The total energy of 3u can be present with various configurations or complexions (microstates).

[Figure: Distinguishable complexions for U = 3u, showing levels o, 1, 2, 3 at energies 0, u, 2u, 3u and the three distributions with Ω_a = 1, Ω_b = 3, Ω_c = 6.]

a: all three particles in level 1; probability of occurrence 1/10
b: one particle in level 3, two particles in level o; probability 3/10
c: one particle in level 2, one particle in level 1, one particle in level o; probability 6/10

All of these complexions or microstates, grouped into the three distributions a, b and c, correspond to a single “observable” macrostate.

In general, the number of arrangements or complexions within a single distribution is given by

Ω = n! / (n_o! n_1! n_2! ⋯ n_i! ⋯)

where n particles are distributed among the energy levels such that n_o are in level o, n_1 are in level 1, etc.

For the three distributions:

Ω_a = 3! / (0! 3! 0! 0!) = 1
Ω_b = 3! / (2! 0! 0! 1!) = 3
Ω_c = 3! / (1! 1! 1! 0!) = 6

(the factorials in each denominator being the occupation numbers of levels o, 1, 2 and 3).

The most probable distribution is determined by the set of numbers n_i that maximizes Ω.

Since for real systems the numbers can be large (consider the number of particles in 1 mole of gas), Stirling’s approximation will be useful:

ln x! ≈ x ln x − x
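A quick numerical check of Stirling’s approximation (an illustration, not part of the lecture) shows the relative error shrinking as x grows:

```python
import math

for x in (10, 100, 1000, 10**6):
    exact = math.lgamma(x + 1)     # ln(x!) computed exactly
    approx = x * math.log(x) - x   # Stirling: x ln x - x
    rel_err = abs(exact - approx) / exact
    print(f"x = {x:>7}: relative error = {rel_err:.2e}")
```

By x = 10^6 (still far below a mole), the relative error is below one part per million.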

The observable macrostate is determined by constraints:

U = n_o ε_o + n_1 ε_1 + ⋯ + n_r ε_r = Σ_i ε_i n_i (constant energy in the system)

n = n_o + n_1 + ⋯ + n_r = Σ_i n_i (constant number of particles in the system)

Any interchange of particles among the energy levels is constrained by the conditions:

δU = Σ_i ε_i δn_i = 0 (A)

δn = Σ_i δn_i = 0 (B)

Also, using the definition of Ω and Stirling’s approximation,

ln Ω = ln n! − (ln n_o! + ln n_1! + ⋯ + ln n_i! + ⋯) ≈ n ln n − n − Σ_i (n_i ln n_i − n_i)

The constraints on the particle numbers impose a condition on Ω:

δ ln Ω = −Σ_i ln n_i δn_i = 0 (C)

What we need is to find the most probable distribution, and that will be given by the maximum value of Ω. This occurs when equations A, B and C are simultaneously satisfied.

We use the technique of Lagrange multipliers, which is a method for finding the extrema of a function of several variables subject to one or more constraints. We multiply equation A by a quantity β, which has the units of reciprocal energy:

β Σ_i ε_i δn_i = 0 (D)

Equation B is multiplied by a dimensionless constant α:

α Σ_i δn_i = 0 (E)

Equations C, D and E are added to give

Σ_i (ln n_i + α + β ε_i) δn_i = 0

i.e.,

(ln n_o + α + β ε_o) δn_o + (ln n_1 + α + β ε_1) δn_1 + ⋯ + (ln n_r + α + β ε_r) δn_r = 0.

This can only occur if each of the bracketed quantities is identically zero:

ln n_i + α + β ε_i = 0

Rearranging for the n_i,

n_i = e^(−α) e^(−β ε_i)

and summing over all r energy levels,

n = Σ_i n_i = e^(−α) Σ_i e^(−β ε_i)

The quantity

Σ_i e^(−β ε_i) = e^(−β ε_o) + e^(−β ε_1) + ⋯ + e^(−β ε_r) = P

is very important and occurs very often in the study of statistical mechanics. It is called the partition function, P.

Then,

n = e^(−α) P, i.e. e^(−α) = n / P

This allows us to write the expression for n_i in convenient form,

n_i = n e^(−β ε_i) / P

So, the distribution of particles maximizing Ω is one in which the occupancy or population of the energy levels decreases exponentially with increasing energy.
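This exponential falloff is easy to see numerically. A small sketch (the level spacing u = 1, β = 1 and n = 1000 are arbitrary illustrative choices, not values from the lecture):

```python
import math

n = 1000                 # total number of particles
beta = 1.0               # the Lagrange multiplier (later shown to be 1/kT)
energies = [0, 1, 2, 3]  # levels at 0, u, 2u, 3u with u = 1

# Partition function: P = sum over levels of exp(-beta * e_i)
P = sum(math.exp(-beta * e) for e in energies)

# Most probable populations: n_i = n * exp(-beta * e_i) / P
pops = [n * math.exp(-beta * e) / P for e in energies]

print([round(p, 1) for p in pops])  # populations fall exponentially
print(round(sum(pops), 6))         # and sum back to n
```

Each level holds e^(−β u) ≈ 0.37 times the population of the level below it, and the populations sum to n as the constraint requires.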

We can identify the undetermined multiplier β by connecting Ω with entropy, S, using the following argument. Consider two similar systems a and b in thermal contact, with entropies S_a and S_b and associated thermodynamic probabilities Ω_a and Ω_b.

Since entropy (upper case S) is an extensive variable, the total entropy of the composite system is

S = S_a + S_b

The thermodynamic probability of the composite system involves a product of the individual probabilities,

Ω = Ω_a Ω_b

Since our aim is to connect Ω with entropy S, we seek a function f such that S = f(Ω). Then we must have

f(Ω_a Ω_b) = f(Ω_a) + f(Ω_b)

The only function satisfying this is the logarithm, so that we must have

S = k ln Ω

where k is a constant.

Now we can identify the quantity β. We start with the condition

δ ln Ω = −Σ_i ln n_i δn_i = 0 (C)

and make the substitution in C for ln n_i from

ln n_i + α + β ε_i = 0,

which gives

δ ln Ω = −Σ_i ln n_i δn_i = Σ_i (α + β ε_i) δn_i

Expanding,

δ ln Ω = β Σ_i ε_i δn_i + α Σ_i δn_i

Since Σ_i δn_i = 0 while Σ_i ε_i δn_i = δU when energy is exchanged with the surroundings, this reduces to δ ln Ω = β δU, and solving for β,

β = δ ln Ω / δU

But we can see that

β = δ ln Ω / δU = d(ln Ω)/dU = (1/k) d(k ln Ω)/dU = (1/k) (∂S/∂U)_V

The constant volume condition results from the fixed number of energy states.

Then, from the combined 1st and 2nd Laws,

dU = T dS − p dV

(∂U/∂S)_V = T ; (∂S/∂U)_V = 1/T

and finally

β = 1/(kT), where the constant k is identified as Boltzmann’s constant, k_B.
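The identification β = d(ln Ω)/dU can be checked on a toy system of my own devising (not from the lecture): N two-level particles with level spacing ε, where ln Ω(n) = ln C(N, n) for n excited particles. Applying Stirling's approximation to ln C(N, n) gives d(ln Ω)/dU ≈ (1/ε) ln((N − n)/n), which the numerical derivative should reproduce:

```python
import math

N = 10**6    # number of two-level particles (toy model)
eps = 1.0    # level spacing, arbitrary energy units

def ln_omega(n):
    # ln of the number of complexions with n particles excited: ln C(N, n)
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

n = 200_000  # current number of excited particles, so U = n * eps
# Numerical beta = d(ln Omega)/dU, central difference with dU = eps
beta_numeric = (ln_omega(n + 1) - ln_omega(n - 1)) / (2 * eps)
# Analytic result from Stirling's approximation
beta_stirling = math.log((N - n) / n) / eps

print(math.isclose(beta_numeric, beta_stirling, rel_tol=1e-5))
```

Both values agree to a few parts per million at this system size, and both are positive here because adding energy still increases the number of complexions.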

Configurational and Thermal Entropy

Consider the mixing of red and blue spheres. For the unmixed state 1,

Ω = 1

For mixing of n_b blue and n_r red spheres,

Ω_conf = (n_b + n_r)! / (n_b! n_r!)

Then

S_conf = k_B ln [ (n_b + n_r)! / (n_b! n_r!) ]

The total entropy will be given by

S_total = S_thermal + S_conf
S_total = k_B ln Ω_th + k_B ln Ω_conf
S_total = k_B ln (Ω_th Ω_conf)

The number of spatial configurations available to 2 closed systems placed in thermal contact is unity, so for heat flow down a temperature gradient only Ω_th changes. Similarly, for the mixing of particles A and B, the only contribution to the entropy change will be ΔS_conf if the redistribution of particles does not cause any change in Ω_th, since then the total energy of the mixed system is identical to the sum of the energies of the individual systems. This occurs in nature only rarely.