Physics I - Chap 15 - Zhejiang University

Physics I
Entropy: Reversibility, Disorder, and Information
Prof. WAN, Xin
[email protected]
http://zimp.zju.edu.cn/~xinwan/
1st & 2nd Laws of Thermodynamics

The 1st law specifies that we cannot get more energy out of
a cyclic process by work than the amount of energy we put
in.
U  Q  W

The 2nd law states that we cannot break even because we
must put more energy in, at the higher temperature, than
the net amount of energy we get out by work.
$$\eta = \frac{W}{Q_h} = 1 - \frac{Q_c}{Q_h}, \qquad \eta_{\text{Carnot}} = 1 - \frac{T_c}{T_h}$$
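As a quick numeric aside (not from the slides), here is a short Python sketch comparing an engine's measured efficiency against the Carnot limit; all numbers are invented for illustration:

```python
# Carnot efficiency vs. an actual engine's efficiency.
# Temperatures must be absolute (kelvin); the numbers are illustrative.

def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of any engine between two reservoirs."""
    return 1.0 - t_cold / t_hot

def engine_efficiency(q_hot, q_cold):
    """Actual efficiency from heat absorbed (q_hot) and expelled (q_cold)."""
    return 1.0 - q_cold / q_hot

t_hot, t_cold = 500.0, 300.0     # K
q_hot, q_cold = 1000.0, 700.0    # J per cycle

print(f"Carnot limit: {carnot_efficiency(t_hot, t_cold):.3f}")   # 0.400
print(f"This engine:  {engine_efficiency(q_hot, q_cold):.3f}")   # 0.300
```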
Carnot’s Engine
Efficiency of a Carnot Engine
All Carnot engines operating between the same two temperatures have the same efficiency.
An Equality
Now putting in the proper signs ($Q_h$ positive, $Q_c$ negative),
$$\frac{Q_h}{T_h} + \frac{Q_c}{T_c} = 0$$
Carnot cycle: $\oint \frac{dQ}{T} = 0$
A Sum of Carnot Cycles
[Figure: a reversible cycle in the P-V plane, approximated by a sequence of Carnot cycles bounded by adiabats, with hot and cold isotherms at temperatures $T_{h,i}$ and $T_{c,i}$]
Any reversible process can be approximated by a sum of Carnot cycles, hence
$$\sum_i \left( \frac{Q_{h,i}}{T_{h,i}} + \frac{Q_{c,i}}{T_{c,i}} \right) = 0 \quad\Longrightarrow\quad \oint_C \frac{dQ}{T} = 0$$
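A numeric check (my own sketch, not from the lecture): for an ideal-gas Carnot cycle, the heats follow from the isothermal and adiabatic relations, and $Q_h/T_h + Q_c/T_c$ indeed comes out to zero:

```python
import math

# Ideal-gas Carnot cycle check: Q_h/T_h + Q_c/T_c = 0.
# States: 1 -> 2 isothermal expansion at T_h, 2 -> 3 adiabatic expansion,
# 3 -> 4 isothermal compression at T_c, 4 -> 1 adiabatic compression.
R, gamma = 8.314, 5.0 / 3.0           # 1 mol of monatomic ideal gas
t_hot, t_cold = 500.0, 300.0          # K
v1, v2 = 1.0, 2.0                     # arbitrary volumes

# Adiabats: T * V**(gamma - 1) = const fixes v3 and v4.
v3 = v2 * (t_hot / t_cold) ** (1.0 / (gamma - 1.0))
v4 = v1 * (t_hot / t_cold) ** (1.0 / (gamma - 1.0))

q_hot = R * t_hot * math.log(v2 / v1)    # heat absorbed at T_h (> 0)
q_cold = R * t_cold * math.log(v4 / v3)  # heat absorbed at T_c (< 0)

print(q_hot / t_hot + q_cold / t_cold)   # ~ 0 up to rounding
```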
Clausius Definition of Entropy
Entropy is a state function: the change in entropy during a process depends only on the end points and is independent of the actual path followed.
$$dS \equiv \frac{dQ_{\text{reversible}}}{T}$$
For a closed cycle made of path $C_1$ from state 1 to state 2 and path $C_2$ back from 2 to 1,
$$\oint_C dS = \int_{C_1,\,1\to 2} dS + \int_{C_2,\,2\to 1} dS = 0$$
$$S_2 - S_1 = \int_{C_1,\,1\to 2} dS = -\int_{C_2,\,2\to 1} dS = \int_{C_2,\,1\to 2} dS$$
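To make the state-function claim concrete, here is a small numeric sketch (my addition): integrate $dQ/T$ for one mole of a monatomic ideal gas along two different reversible paths between the same end states. The heats differ, but the entropy changes agree.

```python
import math

# dQ = Cv*dT + (R*T/V)*dV for 1 mol of ideal gas (monatomic: Cv = 1.5 R).
R = 8.314
CV = 1.5 * R
T0, V0 = 300.0, 1.0    # initial state
T1, V1 = 600.0, 3.0    # final state

# Path A: heat at constant volume (V0), then expand at constant T (T1).
q_a = CV * (T1 - T0) + R * T1 * math.log(V1 / V0)
ds_a = CV * math.log(T1 / T0) + R * math.log(V1 / V0)

# Path B: expand at constant T (T0), then heat at constant volume (V1).
q_b = R * T0 * math.log(V1 / V0) + CV * (T1 - T0)
ds_b = R * math.log(V1 / V0) + CV * math.log(T1 / T0)

print(f"Q along A: {q_a:.1f} J, Q along B: {q_b:.1f} J")       # heats differ
print(f"dS along A: {ds_a:.3f} J/K, along B: {ds_b:.3f} J/K")  # dS identical
```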
Return to Inexact Differential
Assume $dg = dx + \frac{x}{y}\,dy$.
Along the path $(1,1) \to (2,1) \to (2,2)$:
$$\int \left( dx + \frac{x}{y}\,dy \right) = 1 + 2\ln 2$$
Along the path $(1,1) \to (1,2) \to (2,2)$:
$$\int \left( dx + \frac{x}{y}\,dy \right) = \ln 2 + 1$$
The two results differ, so $dg$ is an inexact differential. Note, however, that
$$df = \frac{dg}{x} = \frac{dx}{x} + \frac{dy}{y}$$
is an exact differential; $1/x$ is an integrating factor, and $f(x, y) = \ln x + \ln y + f_0$.
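A quick numeric version of the same computation (my sketch, using scipy for the quadrature):

```python
import math
from scipy.integrate import quad

# dg = dx + (x/y) dy, integrated along two corner paths from (1,1) to (2,2).

# Path 1: (1,1) -> (2,1) (dy = 0), then (2,1) -> (2,2) (dx = 0, x = 2).
path1 = quad(lambda x: 1.0, 1, 2)[0] + quad(lambda y: 2.0 / y, 1, 2)[0]

# Path 2: (1,1) -> (1,2) (dx = 0, x = 1), then (1,2) -> (2,2) (dy = 0).
path2 = quad(lambda y: 1.0 / y, 1, 2)[0] + quad(lambda x: 1.0, 1, 2)[0]

print(path1, 1 + 2 * math.log(2))   # 2.386... : path dependent
print(path2, 1 + math.log(2))       # 1.693...

# After dividing by the integrating factor x, df = dx/x + dy/y is exact:
def f(x, y):
    return math.log(x) + math.log(y)

print(f(2, 2) - f(1, 1))            # 1.386... along *any* path
```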
Digression on Multivariate Calculus
Heat is path dependent:
$$dQ = dU + P\,dV$$
Therefore, $1/T$ is really the integrating factor for the differential form of heat. Now we can recast the 1st law of thermodynamics as
$$dU = T\,dS - P\,dV$$
Entropy is a state function, as are the internal energy and the volume.
Entropy of an Ideal Gas (1 mole)
$$p(T, V) = \frac{RT}{V}, \qquad U(T) = C_V^{\text{mol}}\, T = \frac{fR}{2}\, T$$
$$dS = \frac{1}{T}\left( dU + p\,dV \right) = \frac{C_V^{\text{mol}}\,dT}{T} + \frac{R\,dV}{V}$$
Integrating from $(T_0, V_0)$ to $(T, V)$:
$$S(T, V) = S_0 + C_V^{\text{mol}} \ln\frac{T}{T_0} + R \ln\frac{V}{V_0}$$
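The closed form translates directly into code. A minimal sketch (mine, with an illustrative diatomic gas, $f = 5$):

```python
import math

R = 8.314  # J/(mol K)

def entropy_change(t0, v0, t, v, f=5):
    """S(T,V) - S(T0,V0) for 1 mol of ideal gas with f degrees of freedom."""
    cv = f * R / 2.0
    return cv * math.log(t / t0) + R * math.log(v / v0)

# Doubling the volume at constant temperature:
print(entropy_change(300.0, 1.0, 300.0, 2.0))  # R*ln 2 ~ 5.76 J/K
```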
Carnot’s Theorem
No real heat engine operating between two energy reservoirs
can be more efficient than Carnot’s engine operating between
the same two reservoirs.
$$e' = 1 - \frac{|Q'_c|}{Q'_h} \le 1 - \frac{T_c}{T_h}$$
Putting in the proper signs ($Q'_h$ positive, $Q'_c$ negative),
$$\frac{Q'_h}{T_h} + \frac{Q'_c}{T_c} \le 0$$
What does this mean? Still, for any engine in a cycle ($S$ is a state function!),
$$\oint dS = 0$$
Counting the Heat Baths in
$$\Delta S_h = \frac{-Q'_h}{T_h} \qquad (Q'_h > 0)$$
$$\Delta S_{\text{gas}} = \oint dS = 0 \;\text{ after a cycle}$$
$$\Delta S_c = \frac{-Q'_c}{T_c} \qquad (Q'_c < 0)$$
$$\Delta S = \Delta S_h + \Delta S_{\text{gas}} + \Delta S_c = \frac{-Q'_h}{T_h} + \frac{-Q'_c}{T_c} \ge 0$$
The total entropy of an isolated system that undergoes a
change can never decrease.
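A bookkeeping sketch (my own numbers): the total entropy change of the two baths over one cycle of an engine less efficient than Carnot's.

```python
# Entropy bookkeeping for one cycle: the gas returns to its initial state
# (dS_gas = 0), so only the baths count. Illustrative numbers.
t_hot, t_cold = 500.0, 300.0   # K
q_hot = 1000.0                 # J absorbed by the engine from the hot bath
q_cold = -700.0                # J (negative: heat leaves the engine)

ds_hot = -q_hot / t_hot        # hot bath loses entropy
ds_cold = -q_cold / t_cold     # cold bath gains entropy
print(ds_hot + ds_cold)        # +0.333 J/K >= 0, as the 2nd law demands
```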
Example 1: Clausius Statement
Heat $Q > 0$ flows out of the hot bath and into the cold bath:
$$\Delta S_h = \frac{-Q}{T_h}, \qquad \Delta S_c = \frac{Q}{T_c}$$
$$\Delta S = \Delta S_h + \Delta S_c = -\frac{Q}{T_h} + \frac{Q}{T_c} > 0$$
Irreversible!
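In numbers (a tiny sketch of the same bookkeeping, values invented):

```python
# Direct heat flow Q from a hot bath (500 K) to a cold bath (300 K).
q, t_hot, t_cold = 100.0, 500.0, 300.0
ds = -q / t_hot + q / t_cold
print(ds)   # +0.133 J/K > 0: the flow is irreversible, and the reverse
            # (cold to hot, the Clausius-forbidden case) would give dS < 0.
```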
Example 2: Kelvin Statement
Work is converted entirely into heat $Q$ delivered to a single bath at temperature $T$:
$$\Delta S = \frac{Q}{T} > 0$$
Irreversible! (The reverse, converting heat from a single bath entirely into work, would make $\Delta S < 0$.)
Example 3: Mixing Water
Water A at $T_A$ is mixed with water B at $T_B$, where $T_A < T_B$; heat $Q$ flows from B to A until they reach a common temperature $T$:
$$T_A \to T: \quad Q = c m_A (T - T_A)$$
$$T_B \to T: \quad Q = c m_B (T_B - T)$$
$$\Longrightarrow \quad m_A (T - T_A) = m_B (T_B - T) \quad\Longrightarrow\quad T = \frac{m_A T_A + m_B T_B}{m_A + m_B}$$
Example 3: Mixing Water
$$\Delta S_A = \int_{T_A}^{T} \frac{c m_A\,dT'}{T'} = c m_A \ln\frac{T}{T_A} > 0$$
$$\Delta S_B = \int_{T_B}^{T} \frac{c m_B\,dT'}{T'} = c m_B \ln\frac{T}{T_B} < 0$$
For simplicity, assume $m_A = m_B = m$, so $T = (T_A + T_B)/2$:
$$\Delta S = \Delta S_A + \Delta S_B = c m \ln\frac{T^2}{T_A T_B} > 0,$$
since the arithmetic mean exceeds the geometric mean, $T = (T_A + T_B)/2 > \sqrt{T_A T_B}$.
Irreversible!
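A sketch that puts numbers to the mixing example (mine; the masses and temperatures are invented):

```python
import math

def mix(m_a, t_a, m_b, t_b, c=4186.0):
    """Final temperature and total entropy change when two water
    masses (kg, K) are mixed; c is water's specific heat in J/(kg K)."""
    t = (m_a * t_a + m_b * t_b) / (m_a + m_b)
    ds_a = c * m_a * math.log(t / t_a)   # colder water: > 0
    ds_b = c * m_b * math.log(t / t_b)   # hotter water: < 0
    return t, ds_a + ds_b

t, ds = mix(1.0, 280.0, 1.0, 360.0)
print(t, ds)   # 320.0 K, dS ~ +66 J/K > 0: irreversible
```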
Example 4: Free Expansion
U  Q  W  0  S  0
?
We can only calculate S with a
reversible process! In this case, we
replace the free expansion by the
isothermal process with the same
initial and final states.
Vf
S  
Vi
V f nRdV
dQ V f PdV
V f 


 nR ln
0
V
V
V
i
i
i
T
T
V

Irreversible!
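Numerically (my sketch), integrating $dQ/T = nR\,dV/V$ along the replacement isotherm reproduces the closed form:

```python
import math
from scipy.integrate import quad

n, R = 1.0, 8.314          # 1 mol
v_i, v_f = 1.0, 2.0        # volume doubles

ds_numeric = quad(lambda v: n * R / v, v_i, v_f)[0]
print(ds_numeric, n * R * math.log(v_f / v_i))   # both ~ 5.763 J/K
```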
The Second Law in terms of Entropy
The total entropy of an isolated system that undergoes a change can never decrease.
– If the process is irreversible, then the total entropy of an isolated system always increases.
– In a reversible process, the total entropy of an isolated system remains constant.
The change in entropy of the Universe must be greater than zero for an irreversible process and equal to zero for a reversible process:
$$\Delta S_{\text{Universe}} \ge 0$$
Order versus Disorder
Isolated systems tend toward disorder; entropy is a measure of this disorder.
Ordered: all molecules on the left side.
Disordered: molecules on both the left and right.
Macrostate versus Microstate
Each of the microstates is equally probable.
An ordered microstate is very unlikely, because random motions tend to distribute molecules uniformly.
There are many more disordered microstates than ordered microstates.
A macrostate corresponding to a large number of equivalent disordered microstates is much more probable than a macrostate corresponding to a small number of equivalent ordered microstates.
How much more probable? (See the sketch below.)
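A counting sketch (my addition): for $N$ molecules that can each be on the left or right half of a box, compare the number of microstates of the all-left macrostate with the 50/50 macrostate.

```python
from math import comb

# N distinguishable molecules, each independently left or right.
# Microstates in the macrostate "k molecules on the left" = C(N, k).
N = 100
all_left = comb(N, N)        # exactly 1 microstate
half_half = comb(N, N // 2)  # ~ 1.01e29 microstates

print(half_half / all_left)  # the 50/50 macrostate is ~1e29 times likelier
```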
Entropy: A Measure of Disorder
We assume that each molecule occupies some microscopic volume $V_m$, so the number of ways to place $N$ molecules is
$$W_i = \left(\frac{V_i}{V_m}\right)^N, \qquad W_f = \left(\frac{V_f}{V_m}\right)^N, \qquad \frac{W_f}{W_i} = \left(\frac{V_f}{V_i}\right)^N$$
For the free expansion ($V_f = 2V_i$),
$$\Delta S = N k_B \ln\frac{V_f}{V_i} = N k_B \ln 2,$$
suggesting
$$S = k_B \ln W \qquad \text{(Boltzmann)}$$
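A consistency check (my sketch): the statistical result $\Delta S = N k_B \ln 2$ for one mole reproduces the macroscopic $nR \ln 2$ from the free-expansion calculation.

```python
import math

k_B = 1.380649e-23      # J/K
N_A = 6.02214076e23     # 1/mol
R = k_B * N_A           # ~ 8.314 J/(mol K)

ds_micro = N_A * k_B * math.log(2)   # N k_B ln 2 for one mole
ds_macro = R * math.log(2)           # nR ln(V_f/V_i) with V_f = 2 V_i
print(ds_micro, ds_macro)            # both ~ 5.763 J/K
```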
A Similar Probability Problem
Let’s Play Cards
Imagine shuffling a deck of playing cards:
– Systems have a natural tendency to become more and more disordered.
The reason disorder almost always increases is that disordered states hugely outnumber highly ordered states, so the system inevitably settles into one of the more disordered states.

"Computers are useless. They can only give us answers." ---- Pablo Picasso
Information and Entropy
(1927) Bell Labs, Ralph Hartley
– Measure for information in a message
– Logarithm: 8 bits = $2^8$ = 256 different numbers
(1948) Bell Labs, Claude Shannon
– "A Mathematical Theory of Communication"
– Probability of a particular message
– Information ~ $-\log(\text{probability})$ ~ negative entropy
$$S_{\text{information}} = -\sum_i P_i \log P_i$$
[Cartoon: a certain message, "You are not winning the lottery," carries no information; an improbable one, "You are going to win the lottery," carries a lot.]
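A minimal Shannon-entropy sketch (my addition), measured in bits:

```python
import math

def shannon_entropy(probs):
    """S = -sum p_i log2 p_i, in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))      # 1.0 bit  (fair coin)
print(shannon_entropy([1.0]))           # 0.0 bits (a certain message)
print(shannon_entropy([1/256] * 256))   # 8.0 bits (Hartley's example)
```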
"It is already in use under that name. ... and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway." ---- John von Neumann
Maxwell’s Demon
To determine whether to let a molecule through, the demon must acquire
information about the state of the molecule. However well prepared, the
demon will eventually run out of information storage space and must begin to
erase the information it has previously gathered. Erasing information is a
thermodynamically irreversible process that increases the entropy of a system.
Landauer’s Principle & Verification
Computation needs to involve heat dissipation only when you do something irreversible with the information.
Experimental verification by the Lutz group (2012):
$$\frac{Q}{k_B T} \ge \ln 2 \approx 0.693$$
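In absolute units (my sketch), the Landauer bound at room temperature:

```python
import math

k_B = 1.380649e-23   # J/K
T = 300.0            # K, room temperature

q_min = k_B * T * math.log(2)   # minimum heat to erase one bit
print(q_min)                    # ~ 2.9e-21 J per bit erased
```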
Homework (for the 2nd Law)
CHAP. 24 Exercises: 25, 30, 35 (P565); 4, 8 (P566)
Homework
Reading (downloadable from my website):
– Charles Bennett and Rolf Landauer, The fundamental physical limits of computation.
– Antoine Bérut et al., Experimental verification of Landauer's principle linking information and thermodynamics, Nature (2012).
– Seth Lloyd, Ultimate physical limits to computation, Nature (2000).
Dare to adventure where you have not been!