Statistical Mechanics
statistical distributions (Mar. 25: go to slide 9)
Maxwell-Boltzmann Statistics (Apr. 4: go to slide 24)
molecular energies in an ideal gas
quantum statistics
Since I’ve been unjustly picking on chemists: “All science is either Physics
or stamp collecting.”—Ernest Rutherford, physicist and 1908 Nobel Prize
winner in Chemistry.
Chapter 9
Statistical Mechanics
spinach: good stuff!
spinach: get it outta here
As a grade schooler, I went to a Catholic school. They served
lots of stewed spinach. Some of us got sick just from the
fumes wafting up from the cafeteria 2 floors down.
The nuns made us “clean our plate” at lunch. It had something
to do with the starving children in China.
They would inspect our trays as we passed through the “dump
the trash” line.
What to do on spinach day?
Stuff it in your empty milk carton and
hope the nuns didn’t inspect it?
Sit next to the one kid in class who liked
stewed spinach,* and see how much you
could pass off to him?
*The most valuable kid in school that day.
What does this have to do with statistical mechanics?
Physics faculty tend to think of thermodynamics (including
statistical mechanics) as the stewed spinach of college physics
courses.
The wacko faculty member who actually likes teaching that stuff
is a treasured friend whenever it comes time to give out
teaching assignments.
Just thought you might want to know* that before we start this
chapter on statistical mechanics (thermodynamics).
Fair warnings and all that. There are more cautionary tales, but not for now.
I don’t want to scare you all off at once.
Before we get into chapter 9, let’s think about this course a bit.
We started with relativity. A logical starting point, because
relativity heralded the beginning of modern physics.
• Relativity forced us to start accepting previously
unthinkable ideas.
• Relativity is more “fun” than most of the rest of the
material, so it won’t drive prospective students away from
the class.
Relativity shows us that photons have momentum—a particle
property, and gets us thinking about particle properties of
waves.
Waves having particle properties leads us (e.g., de Broglie) to
ask if particles have wave properties.
After chapter 3, we backtracked to try to explain the properties
of hydrogen, the simplest atom.
• This is backtracking, because we had to move
backwards in the time sequence of discoveries, from
de Broglie back to Rutherford.
• It doesn’t break the logical chain because we find we
can’t explain hydrogen without invoking wave
properties.
The puzzle of hydrogen forces us to completely re-think the
fundamental ideas of physics.
• If physics can’t explain hydrogen—the simplest of all
atoms—it is in dire shape. Something drastic must be
done.
• “Something drastic” = “quantum mechanics”
Once quantum mechanics is discovered we rush off to find
applications and confirmations.
• A logical place to start testing quantum mechanics
(Schrödinger’s equation) is to start with simple model
systems (particle in box) and move up from there.
Once we’ve practiced with model systems, we go full circle and
apply quantum mechanics to the system that started this
trouble in the first place—hydrogen.
The next step up from hydrogen is atoms (chapter 7).
The next step up from atoms is a few atoms bonded together
(chapter 8).
What’s the next step up from a few atoms bonded together?
Good! Lots of atoms. Let’s start with them interacting but not
bonded together in large masses (chapter 9). Then we’ll be
able to tackle lots of atoms bonded together (chapter 10).
There’s a logic to this, isn’t there? No wonder most modern physics books
follow the same sequence.
9.1 Statistical Distributions
Statistical mechanics deals with the behavior of systems of a
large number of particles.
We give up trying to keep track of individual particles. If we
can’t solve Schrödinger’s equation in closed form for helium (4
particles), what hope do we have of solving it for the gas
molecules in this room (10^(really big number) particles)?
Statistical mechanics handles many particles by calculating the
most probable behavior of the system as a whole, rather than
by being concerned with the behavior of individual particles.
not again?
yes, again
In statistical mechanics, we assume that the more ways there
are to arrange the particles to give a particular distribution of
energies, the more probable is that distribution. (Seems
reasonable?)
6 units of energy, 3 particles to give it to (each particle gets at least one unit):
The distribution 4, 1, 1 can be arranged 3 ways: 411, 141, 114.
The distribution 3, 2, 1 can be arranged 6 ways: 321, 312, 231, 213, 132, 123.
6 ways beats 3 ways, so the 3, 2, 1 distribution is more likely.
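If you like brute force, here’s a quick Python check of that counting (my own sketch, not from Beiser): hand 6 units of energy to 3 distinguishable particles, at least one unit each, and count how many arrangements produce each distribution.

```python
from itertools import product
from collections import Counter

# Brute-force count: 6 units of energy, 3 distinguishable particles,
# at least one unit each (as in the example above).
total_energy, n_particles = 6, 3

ways = Counter()
for arrangement in product(range(1, total_energy + 1), repeat=n_particles):
    if sum(arrangement) == total_energy:
        # the "distribution" is the set of energies, ignoring which particle has which
        ways[tuple(sorted(arrangement, reverse=True))] += 1

for distribution, count in ways.most_common():
    print(distribution, "->", count, "ways")
```

It also turns up the 2, 2, 2 distribution, which can be arranged only one way, so 3, 2, 1 is still the most likely.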
(repeating) In statistical mechanics, we assume that the more
ways there are to arrange the particles to give a particular
distribution of energies, the more probable is that distribution.
(Seems reasonable.)
We begin with an assumption that we believe describes
nature.
We see if the consequences of the assumption
correspond in any way with reality.
It is not “bad” to begin with an assumption, as long as
we realize what we have done, and discard (or modify)
the assumption when it fails to describe things we
measure and observe.
A brief note...
Beiser mentions W, which is the number of ways to
arrange particles to give a particular distribution of
energies.
The idea is to calculate W, and find when W is maximized.
That gives us the most probable state of the system.
W doesn't appear anywhere else in this chapter. In previous
editions, it was calculated in an appendix, where Beiser derived
all the distribution functions we will use.
So you don’t need to worry about W.
Here, in words, is the equation we will be working with in this
chapter:
(# of particles in a state of energy E) = (# of states having
energy E) x (probability that a particle occupies the state of
energy E).
If we know the distribution function, the (probability that a
particle occupies a state of energy E), we can make a number
of useful calculations.
Mathematically, the equation is written
nε  = gε  f ε 
It is common to use epsilon to represent energy; I will call it "E"
when I say it.
In systems such as atoms, only discrete energy levels are
occupied, and the distribution g(ε) of energies is not
continuous.
On the other hand, it may be that the distribution of energies is
continuous, or at least can be approximated as being
continuous. In that case, we replace g(ε) by g(ε)dε, the
number of states between ε and ε+dε.
We will find that there are several possible distributions f(ε)
which depend on whether particles are distinguishable, and
what their spins are.
Beiser mentions them (Maxwell-Boltzmann, Bose-Einstein,
Fermi-Dirac) in this section. Let’s wait and introduce them one
at a time.
9.2 Maxwell-Boltzmann Statistics
We take another step back in time from quantum mechanics
(1930’s) to statistical mechanics (late 1800’s).
Classical particles which are identical but far enough apart to be
distinguishable obey Maxwell-Boltzmann statistics.
classical → “slow,” wave functions don’t overlap
distinguishable → you would know if two particles changed places (you could put your finger on one and follow it as it moves about)
Two particles can be considered distinguishable if their
separation is large compared to their de Broglie wavelength.
Example: ideal gas molecules.
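To get a feel for the numbers, here’s a rough estimate (my own sketch; the N2-at-0 ºC numbers are mine, not Beiser’s) comparing the de Broglie wavelength of a molecule having the average energy with the mean spacing between molecules in a gas at atmospheric pressure.

```python
import math

# Rough check (my own numbers, not from the slides) that ideal-gas molecules
# at ordinary conditions satisfy the Maxwell-Boltzmann criterion:
# average separation >> de Broglie wavelength.
k = 1.38e-23      # Boltzmann constant, J/K
h = 6.63e-34      # Planck constant, J*s
T = 273.0         # temperature, K
P = 1.013e5       # atmospheric pressure, Pa
m = 4.65e-26      # mass of an N2 molecule, kg (28 u)

# de Broglie wavelength of a molecule with the average energy: p = sqrt(2m * 3kT/2)
wavelength = h / math.sqrt(3 * m * k * T)

# mean spacing between molecules: one molecule per volume kT/P (ideal gas law)
spacing = (k * T / P) ** (1.0 / 3.0)

print(f"de Broglie wavelength ~ {wavelength:.2e} m")
print(f"mean molecular spacing ~ {spacing:.2e} m")
print(f"ratio spacing/wavelength ~ {spacing / wavelength:.0f}")
```

The spacing comes out roughly a hundred times the wavelength, so treating ideal gas molecules as distinguishable classical particles is safe.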
The Maxwell-Boltzmann distribution function is
f  ε  = A e-ε/kT .
I’ll explain the various symbols in a minute.
Boltzmann discovered statistical mechanics
and was a pioneer of quantum mechanics.
His work contained elements of relativity
and quantum mechanics, including discrete
atomic energy levels.
“In his statistical interpretation of the second law of
thermodynamics he introduced the theory of probability into a
fundamental law of physics and thus broke with the classical
prejudice, that fundamental laws have to be strictly
deterministic.” (Flamm, 1997.)
“With Boltzmann's pioneering work the probabilistic
interpretation of quantum mechanics had already a precedent.”
Boltzmann constantly battled for acceptance of his work. He
also struggled with depression and poor health. He committed
suicide in 1906. Most of us believe thermodynamics was the
cause. See a biography here.
Paul Ehrenfest, who wrote Boltzmann’s
eulogy, carried on (among other things)
the development of statistical
thermodynamics for nearly three
decades.
Ehrenfest was filled with self-doubt and
deeply troubled by the disagreements
between his friends (Bohr, Einstein, etc.)
which arose during the development of
quantum mechanics.
Ehrenfest shot himself in 1933.
US physicist Percy
Bridgman (the man on the
right, winner of the 1946
Nobel Prize) took up the
banner of thermodynamics,
and studied the physics of
matter under high pressure.
Bridgman committed suicide in 1961.
There’s no need for you to worry; I’ve never lost a
student as a result of chapter 9 yet…
Back to physics…
The facts above are accurate but rather selectively presented for dramatic effect.
f  ε  = A e-ε/kT
Maxwell-Boltzmann distribution function
The number of particles having energy ε at temperature T is
n  ε  = A g  ε  e-ε/kT .
A is like a normalization constant; we integrate n(ε) over all
energies to get N, the total number of particles. A is fixed to
give the "right" answer for the number of particles. For some
calculations, we may not care exactly what the value of A is.
ε is the particle energy, k is Boltzmann's constant
(k = 1.38×10⁻²³ J/K), and T is the temperature in Kelvin.
Often k is written kB. When k and T appear together, you can
be sure that k is Boltzmann's constant.
n ε  = A g ε  e-ε/kT
We still need g(ε), the number of states having energy ε. We
will find that g(ε) depends on the problem under study.
Beiser justifies this distribution in Chapter 9 but doesn’t derive
it in the current text. I won’t go through the justification here.
You can read it for yourself.
Before we do an example… monatomic hydrogen is less stable
than H2, so are you more likely to find H2 or H in nature?
H2, of course!
Nevertheless, suppose we could “make” a cubic meter of H
atoms. How many atoms would we have?
Example 9.1 A cubic meter of atomic H at 0 ºC and
atmospheric pressure contains about 2.7×10²⁷ H atoms. Find
how many are in their first excited state, n=2.
Gas atoms at atmospheric pressure and temperature behave
like ideal gases. Furthermore, they are far enough apart that
Maxwell-Boltzmann statistics can be used.
For the hydrogen atoms in the ground state,
n(ε1) = A g(ε1) e^(-ε1/kT) .
For the hydrogen atoms in the first excited state,
n(ε2) = A g(ε2) e^(-ε2/kT) .
We can divide the equation for n(ε2) by the equation for n(ε1) to
get
n  ε2 
A g  ε2  e-ε2 /kT
=
.
-ε1 /kT
n  ε1 
A g  ε1  e
Important: this is
temperature in K,
not in C!
We know ε1, ε2, and T. We need to calculate the g(ε)'s, which
are the number of states of energy ε. We don’t need to know
A, because it divides out.
We get g(ε) for hydrogen by counting up the number of allowed
electron states corresponding to each ε.
Or we can simply recall that there are 2n² states corresponding
to each n, so that g(ε1) = 2(1)² and g(ε2) = 2(2)².
Plugging all of our numbers into the above equation gives
n(ε2)/n(ε1) = 1.3×10⁻¹⁸⁸. In other words, none of the atoms are
in the n=2 state.
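Here’s the arithmetic of Example 9.1 as a few lines of Python (a sketch of mine; I’m assuming the usual hydrogen level energies, −13.6 eV for n=1 and −3.4 eV for n=2):

```python
import math

# Numerical check of Example 9.1 (atomic hydrogen at 0 degrees C).
k_eV = 8.617e-5              # Boltzmann constant in eV/K
T = 273.0                    # 0 C in kelvin
e1, e2 = -13.6, -3.4         # hydrogen n=1 and n=2 level energies, eV
g1, g2 = 2 * 1**2, 2 * 2**2  # 2n^2 states per level

ratio = (g2 / g1) * math.exp(-(e2 - e1) / (k_eV * T))
print(f"n(e2)/n(e1) = {ratio:.1e}")        # ~10^-188: essentially no atoms in n=2

N = 2.7e27                                 # atoms per cubic meter (from the example)
print(f"atoms in n=2 per m^3 ~ {N * ratio:.1e}")   # effectively zero
```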
Caution: the solution to example 9.1 and g(εn) = 2n² only
works for energy levels in atomic H and not for other assigned
problems! For example, to do it for H2 would require
knowledge of H2 molecular energy levels.
Skip example 9.2; I won’t test you on it.
9.3 Molecular Energies in an Ideal Gas
The example in section 9.2 dealt with atomic electronic energy
levels in atomic hydrogen. In this section, we apply Maxwell-Boltzmann statistics to ideal gas molecules in general.
We use the Maxwell-Boltzmann distribution to learn about the
energies and speeds of molecules in an ideal gas.
We already have f(). We assume a continuous distribution of
energies (why?), so that
n ε  dε = A g ε  e-ε/kT dε .
We need to calculate g(ε), the number of states having an energy
in the range ε to ε+dε.
g(ε) is called the “density of states.”
It turns out to be easier to find the number of momentum
states corresponding to a momentum p, and transform back to
energy states.
Why? Every classical particle has a position and momentum
given by the usual equations of classical mechanics.
Corresponding to every value of momentum is a value of
energy.
Momentum is a 3-dimensional vector
quantity. Every (px,py,pz)
corresponds to some energy.
Think of the (px, py, pz) as forming a 3D grid in momentum
space. We count how many momentum states there are in a
region of that space (the density of momentum states) and then
transform to the density of energy states.
We need to find how many momentum states are in a thin
spherical shell in momentum space.
The Maxwell-Boltzmann distribution is for classical particles, so
we write
p = √(2mε) = √(px² + py² + pz²) .
The number of momentum states in a spherical shell from p to
p+dp is proportional to 4πp² dp (the volume of the shell).
Thus, we can write the number of states having momentum
between p and p+dp as
gp dp = B p2dp
where B is a proportionality constant, which we will worry about
later.
Because each p corresponds to a single ε,
g ε  dε = gp dp = B p2dp .
Now,
p² = 2mε ,
p = √(2mε) ,
dp = √(2m) (1/2) ε^(-1/2) dε ,
so that
p² dp ∝ ε · ε^(-1/2) dε = √ε dε ,
g(ε) dε ∝ √ε dε ,
and
n(ε) dε = C √ε e^(-ε/kT) dε .
The constant C contains B and all the other proportionality
constants lumped together.
If the derivation on the previous four slides went by rather fast
and seems quite confusing… don’t worry, that’s quite normal.
It’s only the final result (which we haven’t got to yet) which I
want you to be able to use.
Here are a couple of links presenting the same (or similar)
derivation:
• hyperphysics
• Britney Spears' Guide to Semiconductor
Physics: Density of States
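If you’d rather poke at the counting argument numerically than follow the links, here’s a rough sketch (mine, not Beiser’s): put momentum states on a uniform 3-D grid, bin them by energy, and check that the number of states per energy bin grows like √ε.

```python
import numpy as np

# Rough sketch of the density-of-states argument: momentum states on a
# uniform 3-D grid (arbitrary units, mass chosen so that energy = p^2),
# counted in equal-width energy bins and compared with sqrt(energy).
n = 60
grid = np.arange(-n, n + 1)
px, py, pz = np.meshgrid(grid, grid, grid, indexing="ij")
energy = (px**2 + py**2 + pz**2).ravel()

bins = np.linspace(0, n**2, 21)            # equal-width energy bins
counts, _ = np.histogram(energy, bins=bins)
centers = 0.5 * (bins[:-1] + bins[1:])

# If g(e) is proportional to sqrt(e), counts/sqrt(center) should be roughly constant.
print(counts / np.sqrt(centers))
```

Apart from the coarse lowest bins, the printed ratios are roughly constant, which is the g(ε) ∝ √ε behavior we just argued for.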
n  ε  dε = C ε e-ε/kT dε
To find the constant C, we evaluate


0
0
N =  n  ε  dε = C 
ε e-ε/kT dε
where N is the total number of particles in the system.
Could you do the integral? Could I do the integral? No, not
any more. Could I look the integral up in a table? Absolutely!
The result is
N = (C/2) √π (kT)^(3/2) ,
so that
n(ε) dε = [ 2πN / (πkT)^(3/2) ] √ε e^(-ε/kT) dε .
This is the number of molecules having energy between ε and
ε+dε in a sample containing N molecules at temperature T.
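As a sanity check (a little Python sketch of mine, not part of the text), we can type the distribution in and integrate it numerically; it should give back N.

```python
import numpy as np

# Numerical sanity check: integrate n(e) over all energies and recover N.
k = 1.38e-23                  # Boltzmann constant, J/K
T = 300.0                     # K
N = 1.0                       # any N just scales the curve; use 1 per "molecule"

def n_of_e(e):
    """Maxwell-Boltzmann energy distribution, number per unit energy."""
    return (2 * np.pi * N / (np.pi * k * T) ** 1.5) * np.sqrt(e) * np.exp(-e / (k * T))

e, de = np.linspace(0.0, 20 * k * T, 200_001, retstep=True)
print(np.sum(n_of_e(e)) * de)   # ~1.0, i.e. ~N (0 to 20 kT catches essentially everything)
```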
Wikipedia says: “The Maxwell-Boltzmann distribution is an
important relationship that finds many applications in physics
and chemistry.”
“It forms the basis of the kinetic theory of gases, which
accurately explains many fundamental gas properties, including
pressure and diffusion.”
“The Maxwell-Boltzmann distribution also finds important
applications in electron transport and other phenomena.”
Webchem.net shows how the Maxwell-Boltzmann distribution is
important for its influence on reaction rates and catalytic
reactions.
Here’s a plot of
the distribution:
Notice how “no” molecules have ε = 0, few molecules have high
energy (a few kT or greater), and there is no upper limit on the
molecular energy.
k has units of [energy]/[temperature] so kT has units of energy.
Here’s how the distribution changes with temperature (each
vertical grid line corresponds to 1 kT).
Notice how the
distribution for higher
temperature is skewed
towards higher
energies (makes
sense!) but all three
curves have the same
total area (also makes
sense).
Notice how the probability of a particle having energy greater
than 3kT (in this example) increases as T increases.
If you aren’t interested enough in the derivation of g(ε) to visit
Britney Spears' Guide to Semiconductor Physics: Density of
States…
n  ε  dε =
2N
 kT 
3/2
ε e-ε/kT dε .
… you might miss this graphic. (Can you tell which one has the
Ph.D.?)
Continuing with the physics, the total energy of the system is
E = ∫₀^∞ ε n(ε) dε = [ 2πN / (πkT)^(3/2) ] ∫₀^∞ ε^(3/2) e^(-ε/kT) dε .
Evaluation of the integral gives
E = 3NkT/2 .
This is the total energy for the N molecules, so the average
energy per molecule is
ε̄ = (3/2) kT ,
exactly the result you get from elementary kinetic theory of
gases.
E = 3NkT/2          ε̄ = (3/2) kT
Things to note about our ideal gas energy:
• The energy is independent of the molecular mass.
• Which gas molecules will move faster at a given
temperature: lighter or heavier ones? Why?
• ε̄ at room temperature is about 40 meV, or (1/25) eV (see the
quick numerical check after this list). This is not a large amount of energy.
• kT/2 of energy "goes with" each degree of freedom.
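Here’s the quick numerical check promised above (again my own sketch): average ε over the distribution and compare with (3/2)kT, then convert to meV.

```python
import numpy as np

# Numerical check that the average molecular energy is (3/2) kT,
# and that it comes to a few tens of meV at ordinary temperatures.
k, T = 1.38e-23, 300.0                          # J/K, K (room temperature)
e, de = np.linspace(0.0, 30 * k * T, 300_001, retstep=True)
weight = np.sqrt(e) * np.exp(-e / (k * T))      # shape of n(e); constants cancel in an average

avg = np.sum(e * weight) / np.sum(weight)
print(avg / (k * T))                            # ~1.5
print(avg / 1.6e-19 * 1000, "meV")              # ~39 meV at 300 K
```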
Because ε = mv²/2, we can also calculate the number of
molecules having speeds between v and v + dv.
The result is
n(v) dv = √(2/π) N (m/kT)^(3/2) v² e^(-mv²/2kT) dv .
“We” (Beiser) call this n(v); the hyperphysics web page calls it f(v).
Here’s a plot (number having a given speed vs. speed). It “looks
like” the n(ε) plot: nothing at speed = 0, long exponential tail.
The speed of a molecule having the average energy comes from
solving
ε̄ = mv²/2 = (3/2) kT
for v. The result is
vrms = √⟨v²⟩ = √(3kT/m) .
vrms is the speed of a molecule having the average energy ε̄.
It is an rms speed because we took the square root of the
average of the squared speed.
The average speed v̄ can be calculated from
v̄ = ∫₀^∞ v n(v) dv / ∫₀^∞ n(v) dv .
The result is
v̄ = √(8kT/πm) .
Comparing this with vrms, we find that
vrms = 1.09 v̄ .
Because the velocity distribution curve is skewed towards high
energies, this result makes sense (why?).
You can also set dn(v)/dv = 0 to find the most probable
speed. The result is
vp = √(2kT/m) .
The subscript “p” means “most probable.”
Summarizing the different velocity results:
vrms = √(3kT/m)          v̄ = √(8kT/πm)          vp = √(2kT/m)          vrms = 1.09 v̄
Plot of the velocity distribution n(v) again:
This plot comes from the hyperphysics web site. The R’s and
M’s in the equations are a result of a different scaling than we
used. See here for how it works (not testable material).
Example 9.4 Find the rms speed of oxygen molecules at
0 ºC.
Would anybody (who hasn’t done the calculation yet) care to
guess the rms speed before we do the calculation?
0 m/s? 10 m/s? 100 m/s? 1,000 m/s? 10,000 m/s?
You need to know that an oxygen molecule is O2. The atomic
mass of O is 16 u (1 u = 1 atomic mass unit = 1.66×10⁻²⁷ kg).
mass of O2 = (2)(16 u)(1.66×10⁻²⁷ kg/u) = 5.31×10⁻²⁶ kg
vrms = √(3kT/m)
vrms = √[ 3 (1.38×10⁻²³ J/K) (0 + 273 K) / (5.31×10⁻²⁶ kg) ]
vrms = 461 m/s
vrms = (461 m/s) (1 mile / 1610 m) (3600 s/h)
vrms = 1031 miles / hour .
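Here’s Example 9.4 redone as a few lines of Python (a sketch; the 1610 m per mile conversion follows the slide), with the average and most probable speeds thrown in for comparison:

```python
import math

# Example 9.4 in code, plus the average and most probable speeds
# from the summary slide, for O2 at 0 degrees C.
k = 1.38e-23                     # Boltzmann constant, J/K
T = 273.0                        # K
m = 2 * 16 * 1.66e-27            # mass of an O2 molecule, kg

v_rms = math.sqrt(3 * k * T / m)
v_avg = math.sqrt(8 * k * T / (math.pi * m))
v_p = math.sqrt(2 * k * T / m)

print(f"v_rms = {v_rms:.0f} m/s  ({v_rms * 3600 / 1610:.0f} miles/hour)")
print(f"v_avg = {v_avg:.0f} m/s,  v_rms/v_avg = {v_rms / v_avg:.2f}")
print(f"v_p   = {v_p:.0f} m/s")
```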
Holy cow! You think you’d feel all these zillions of O2 molecules
constantly crashing into your skin at more than 1000 mph!
And why no sonic booms?? (No—this is not a question I expect
you to answer.)
Click here and scroll down for a handy molecular speed
calculator.
Now, we've gone through a lot of math without thinking much
about the physics of our system of gas molecules. We should
step back and consider what we've done.
A statistical approach has let us calculate the properties of an
ideal gas. We've looked at a few of these properties (energies
and speeds).
Who cares about ideal gases? Anybody interested in the
atmosphere, or things moving through it, or machines moving
gases around. Chemists. Biologists. Engineers. Physicists.
Etc.
This is "modern" physics, but not quantum physics.
9.4 Quantum Statistics
Here we deal with ideal particles whose wave functions overlap.
We introduce quantum physics because of this overlap.
Remember:
n  ε  = g ε  f  ε 
The function f(ε) for quantum statistics depends on whether or
not the particles obey the Pauli exclusion principle.
“The weird thing about the half-integral spin particles (also known as
fermions) is that when you rotate one of them by 360 degrees, its
wavefunction changes sign. For integral spin particles (also known as
bosons), the wavefunction is unchanged.” –Phil Fraundorf of UMSL,
discussing why Balinese candle dancers have understood quantum
mechanics for centuries.
Links:
http://hyperphysics.phy-astr.gsu.edu/hbase/math/statcon.html#c1
http://www.chem.uidaho.edu/~honors/boltz.html
http://www.wikipedia.org/wiki/Boltzmann_distribution
http://www.webchem.net/notes/how_far/kinetics/maxwell_boltzmann.htm
http://www.physics.nwu.edu/classes/2002Spring/Phyx103taylor/mbdist.html
http://mats.gmd.de/~skaley/pwc/boltzmann/Boltzmann.html
http://britneyspears.ac/physics/dos/dos.htm