Stat Mechanics
statistical distributions
Maxwell-Boltzmann Statistics
molecular energies in an ideal gas
quantum statistics
Since I've been unjustly picking on chemists: "All science is either Physics or stamp collecting." (Ernest Rutherford, physicist and 1908 Nobel Prize winner in Chemistry.)
Chapter 9
Statistical Mechanics
As a grade schooler, I went to a Catholic school. They served
lots of stewed spinach. Some of us got sick just from the
fumes wafting up from the cafeteria 2 floors down.
The nuns made us clean our plate at lunch. It had something
to do with the starving children in China.
They would inspect our trays as we passed through the dump
the trash line.
What to do on spinach day?
Stuff it in your empty milk carton and
hope the nuns didn't inspect it?
Sit next to the one kid in class who liked
stewed spinach,* and see how much you
could pass off to him?
*The most valuable kid in school that day.
What does this have to do with statistical mechanics?
Physics faculty tend to think of thermodynamics (including
statistical mechanics) as the stewed spinach of college physics
courses.
The wacko faculty member who actually likes teaching that stuff
is a treasured friend whenever it comes time to give out
teaching assignments.
Just thought you might want to know* that before we start this
chapter on statistical mechanics (thermodynamics).
Fair warnings and all that. There are more cautionary tales, but not for now.
I don't want to scare you all off at once.
Before we get into chapter 9, let's think about this course a bit.
We started with relativity. A logical starting point, because
relativity heralded the beginning of modern physics.
Relativity forced us to start accepting previously
unthinkable ideas.
Relativity is more fun than most of the rest of the material, so it won't drive prospective students away from the class.
Relativity shows us that photons have momentum, a particle property, and gets us thinking about particle properties of waves.
Waves having particle properties leads us (e.g., de Broglie) to
ask if particles have wave properties.
After chapter 3, we backtracked to try to explain the properties
of hydrogen, the simplest atom.
This is backtracking, because we had to move
backwards in the time sequence of discoveries, from
de Broglie back to Rutherford.
It doesn't break the logical chain, because we find we can't explain hydrogen without invoking wave properties.
The puzzle of hydrogen forces us to completely re-think the
fundamental ideas of physics.
If physics can't explain hydrogen, the simplest of all atoms, it is in dire shape. Something drastic must be done.
Something drastic = quantum mechanics
Once quantum mechanics is discovered we rush off to find
applications and confirmations.
A logical place to start testing quantum mechanics (Schrödinger's equation) is to start with simple model systems (particle in a box) and move up from there.
Once we've practiced with model systems, we go full circle and apply quantum mechanics to the system that started this trouble in the first place: hydrogen.
The next step up from hydrogen is atoms (chapter 7).
The next step up from atoms is a few atoms bonded together
(chapter 8).
What's the next step up from a few atoms bonded together?
Good! Lots of atoms. Let's start with them interacting but not bonded together in large masses (chapter 9). Then we'll be able to tackle lots of atoms bonded together (chapter 10).
There's a logic to this, isn't there? No wonder most modern physics books follow the same sequence.
9.1 Statistical Distributions
Statistical mechanics deals with the behavior of systems of a
large number of particles.
We give up trying to keep track of individual particles. If we can't solve Schrödinger's equation in closed form for helium (4 particles), what hope do we have of solving it for the gas molecules in this room (10^(really big number) particles)?
Statistical mechanics handles many particles by calculating the
most probable behavior of the system as a whole, rather than
by being concerned with the behavior of individual particles.
In statistical mechanics, we assume that the more ways there
are to arrange the particles to give a particular distribution of
energies, the more probable is that distribution. (Seems
reasonable?)
6 units of energy, 3 particles to give it to:

energies (3, 2, 1): 3 2 1 / 3 1 2 / 2 1 3 / 2 3 1 / 1 2 3 / 1 3 2  (6 ways)
energies (1, 1, 4): 1 1 4 / 4 1 1 / 1 4 1  (3 ways)

The (3, 2, 1) distribution is more likely.
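The counting above is easy to check by brute force. A minimal sketch (my own check, assuming three distinguishable particles and whole-number energy units):

```python
from itertools import product
from collections import Counter

# Enumerate every way 3 distinguishable particles can share 6 units
# of energy; group the ways by which set of energies they realize.
ways = Counter()
for e in product(range(7), repeat=3):   # (e1, e2, e3), each 0..6
    if sum(e) == 6:
        ways[tuple(sorted(e))] += 1     # key = unordered energy set

print(ways[(1, 2, 3)])  # 6 ways to realize energies {3, 2, 1}
print(ways[(1, 1, 4)])  # 3 ways to realize energies {1, 1, 4}
```

The {3, 2, 1} distribution wins because more arrangements produce it, which is exactly the assumption statistical mechanics starts from.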
We begin with an assumption that we believe describes
nature.
We see if the consequences of the assumption
correspond in any way with reality.
It is not bad to begin with an assumption, as long as
we realize what we have done, and discard (or modify)
the assumption when it fails to describe things we
measure and observe.
(repeating) In statistical mechanics, we assume that the more
ways there are to arrange the particles to give a particular
distribution of energies, the more probable is that distribution.
(Seems reasonable.)
Beiser mentions W, which is the number of ways to
arrange particles to give a particular distribution of
energies.
The idea is to calculate W, and find when W is maximized.
That gives us the most probable state of the system.
W doesn't appear anywhere else in this chapter. In previous
editions, it was calculated in an appendix, where Beiser derived
all the distribution functions we will use.
So you don't need to worry about W.
A brief note...
http://www.rozies.com/Zzzz/920/alpha.html
Here, in words, is the equation we will be working with in this chapter:

(# of particles in a state of energy ε) = (# of states having energy ε) × (probability that a particle occupies the state of energy ε).
If we know the distribution function, the (probability that a
particle occupies a state of energy E), we can make a number
of useful calculations.
Mathematically, the equation is written
n(ε) = g(ε) f(ε)
It is common to use epsilon to represent energy; I will call it "E"
when I say it.
In systems such as atoms, only discrete energy levels are occupied, and the distribution g(ε) of energies is not continuous.

On the other hand, it may be that the distribution of energies is continuous, or at least can be approximated as being continuous. In that case, we replace g(ε) by g(ε)dε, the number of states between ε and ε+dε.
We will find that there are several possible distributions f(ε), which depend on whether the particles are distinguishable, and on what their spins are.
Beiser mentions them (Maxwell-Boltzmann, Bose-Einstein,
Fermi-Dirac) in this section. Let's wait and introduce them one at a time.
9.2 Maxwell-Boltzmann Statistics
Classical particles which are identical but far enough apart to be
distinguishable obey Maxwell-Boltzmann statistics.
classical: slow, wave functions don't overlap
distinguishable: you would know if two particles changed places (you could put your finger on one and follow it as it moves about)
Example: ideal gas molecules.
We take another step back in time from quantum mechanics
(1930s) to statistical mechanics (late 1800s).
Two particles can be considered distinguishable if their
separation is large compared to their de Broglie wavelength.
The Maxwell-Boltzmann distribution function is

f(ε) = A e^(−ε/kT).

I'll explain the various symbols in a minute.
Boltzmann discovered statistical mechanics
and was a pioneer of quantum mechanics.
His work contained elements of relativity
and quantum mechanics, including discrete
atomic energy levels.
"In his statistical interpretation of the second law of thermodynamics he introduced the theory of probability into a fundamental law of physics and thus broke with the classical prejudice that fundamental laws have to be strictly deterministic." (Flamm, 1997.)
With Boltzmann's pioneering work the probabilistic
interpretation of quantum mechanics had already a precedent.
Boltzmann constantly battled for acceptance of his work. He
also struggled with depression and poor health. He committed
suicide in 1906. Most of us believe thermodynamics was the
cause. See a biography here.
Paul Ehrenfest, who wrote Boltzmann's eulogy, carried on (among other things)
the development of statistical
thermodynamics for nearly three
decades.
Ehrenfest was filled with self-doubt and
deeply troubled by the disagreements
between his friends (Bohr, Einstein, etc.)
which arose during the development of
quantum mechanics.
Ehrenfest shot himself in 1933.
Back to physics
US physicist Percy Bridgman (the man on the right, winner of the 1946 Nobel Prize) took up the banner of thermodynamics, and studied the physics of matter under high pressure.
Bridgman committed suicide in 1961.
There's no need for you to worry; I've never lost a student as a result of chapter 9 yet.

The facts above are accurate, but rather selectively presented for dramatic effect.
The number of particles having energy ε at temperature T is

n(ε) = A g(ε) e^(−ε/kT).

A is like a normalization constant; we integrate n(ε) over all energies to get N, the total number of particles. A is fixed to give the "right" answer for the number of particles. For some calculations, we may not care exactly what the value of A is.

ε is the particle energy, k is Boltzmann's constant (k = 1.38×10⁻²³ J/K), and T is the temperature in kelvin. Often k is written k_B. When k and T appear together, you can be sure that k is Boltzmann's constant.
Maxwell-Boltzmann distribution function:

n(ε) = A g(ε) e^(−ε/kT)
We still need g(), the number of states having energy . We
will find that g() depends on the problem under study.
Beiser justifies this distribution in Chapter 9, but doesn't derive it in the current text. I won't go through all this justification. You can read it for yourself.
Before we do an example: monatomic hydrogen is less stable than H₂, so are you more likely to find H₂ or H in nature? H₂, of course!

Nevertheless, suppose we could make a cubic meter of H atoms. How many atoms would we have?
Example 9.1 A cubic meter of atomic H at 0°C and atmospheric pressure contains about 2.7×10²⁷ H atoms. Find how many are in their first excited state, n=2.

Gas atoms at atmospheric pressure and temperature behave like ideal gases. Furthermore, they are far enough apart that Maxwell-Boltzmann statistics can be used.

For the hydrogen atoms in the ground state,

n(ε₁) = A g(ε₁) e^(−ε₁/kT).

For the hydrogen atoms in the first excited state,

n(ε₂) = A g(ε₂) e^(−ε₂/kT).
We can divide the equation for n(ε₂) by the equation for n(ε₁) to get

n(ε₂) / n(ε₁) = [g(ε₂) e^(−ε₂/kT)] / [g(ε₁) e^(−ε₁/kT)].

We know ε₁, ε₂, and T. We need to calculate the g(ε)'s, which are the number of states of energy ε. We don't need to know A, because it divides out.
We get g(ε) for hydrogen by counting up the number of allowed electron states corresponding to each ε. Or we can simply recall that there are 2n² states corresponding to each n, so that g(ε₁) = 2(1)² and g(ε₂) = 2(2)².

Plugging all of our numbers into the above equation gives n(ε₂)/n(ε₁) = 1.3×10⁻¹⁸⁸. In other words, none of the atoms are in the n=2 state.

Important: this is temperature in K, not in °C!
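The arithmetic of Example 9.1 is easy to redo numerically. A minimal sketch (the constants in eV are my own choices; small rounding differences in the constants shift the last digits of the answer):

```python
import math

# Ratio of H atoms in n=2 to n=1 at T = 273 K (Example 9.1 check).
k = 8.617e-5               # Boltzmann constant in eV/K
T = 273.0                  # 0 degrees C, expressed in kelvin
E1 = -13.6                 # hydrogen ground-state energy, eV
E2 = E1 / 4                # first excited state: E_n = E1 / n^2
g1, g2 = 2 * 1**2, 2 * 2**2   # degeneracies g(eps_n) = 2n^2

ratio = (g2 / g1) * math.exp(-(E2 - E1) / (k * T))
print(ratio)               # of order 10^-188: essentially no atoms in n=2
```

Note that A never appears: it divides out of the ratio, exactly as claimed above.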
Caution: the solution to example 9.1 and g(ε_n) = 2n² only work for energy levels in atomic H, and not for other assigned problems! For example, to do this for H₂ would require knowledge of H₂ molecular energy levels.
Skip example 9.2; I won't test you on it.
9.3 Molecular Energies in an Ideal Gas
The example in section 9.2 dealt with atomic electronic energy
levels in atomic hydrogen. In this section, we apply Maxwell-
Boltzmann statistics to ideal gas molecules in general.
We use the Maxwell-Boltzmann distribution to learn about the
energies and speeds of molecules in an ideal gas.
We already have f(ε). We assume a continuous distribution of energies (why?), so that

n(ε) dε = A g(ε) e^(−ε/kT) dε.

We need to calculate g(ε), the number of states having an energy in the range ε to ε+dε.
It turns out to be easier to find the number of momentum
states corresponding to a momentum p, and transform back to
energy states.
g(ε) is called the density of states.
Why? Every classical particle has a position and momentum
given by the usual equations of classical mechanics.
Corresponding to every value of momentum is a value of
energy.
Momentum is a 3-dimensional vector quantity. Every (p_x, p_y, p_z) corresponds to some energy.
We need to find how many momentum states are in this
spherical shell.
Think of the (p_x, p_y, p_z) as forming a 3D grid in momentum space. We count how many momentum states there are in a region of that space (the density of momentum states) and then transform to the density of energy states.

The Maxwell-Boltzmann distribution is for classical particles, so we write

p² = 2mε = p_x² + p_y² + p_z².
The number of momentum states in a spherical shell from p to p+dp is proportional to 4πp²dp (the volume of the shell). Thus, we can write the number of states having momentum between p and p+dp as

g(p) dp = B p² dp,

where B is a proportionality constant, which we will worry about later.
Because each p corresponds to a single ε,

g(ε) dε = g(p) dp = B p² dp.

Now, p² = 2mε, so p = √(2mε) and dp = √(2m) (1/2) ε^(−1/2) dε, so that

p² dp ∝ ε^(1/2) dε,

and

g(ε) dε ∝ ε^(1/2) dε.

This gives

n(ε) dε = C ε^(1/2) e^(−ε/kT) dε.
The constant C contains B and all the other proportionality
constants lumped together.
If the derivation on the previous four slides went by rather fast and seems quite confusing, don't worry, that's quite normal. It's only the final result (which we haven't got to yet) which I want you to be able to use.
Here are a couple of links presenting the same (or similar)
derivation:
hyperphysics
Britney Spears' Guide to Semiconductor
Physics: Density of States
To find the constant C, we evaluate

N = ∫₀^∞ n(ε) dε = ∫₀^∞ C ε^(1/2) e^(−ε/kT) dε,

where N is the total number of particles in the system.
Could you do the integral? Could I do the integral? No, not any more. Could I look the integral up in a table? Absolutely!
The result is

N = C (√π / 2) (kT)^(3/2),

so that

n(ε) dε = [2πN / (πkT)^(3/2)] ε^(1/2) e^(−ε/kT) dε.

This is the number of molecules having energy between ε and ε+dε in a sample containing N molecules at temperature T.
Wikipedia says: The Maxwell-Boltzmann distribution is an
important relationship that finds many applications in physics
and chemistry.
It forms the basis of the kinetic theory of gases, which
accurately explains many fundamental gas properties, including
pressure and diffusion.
The Maxwell-Boltzmann distribution also finds important
applications in electron transport and other phenomena.
Webchem.net shows how the Maxwell-Boltzmann distribution is
important for its influence on reaction rates and catalytic
reactions.
Here's a plot of the distribution:
k has units of [energy]/[temperature] so kT has units of energy.
Notice how no molecules have E=0, few molecules have high
energy (a few kT or greater), and there is no maximum of
molecular energy.
Here's how the distribution changes with temperature (each vertical grid line corresponds to 1 kT).
Notice how the
distribution for higher
temperature is skewed
towards higher
energies (makes
sense!) but all three
curves have the same
total area (also makes
sense).
Notice how the probability of a particle having energy greater
than 3kT (in this example) increases as T increases.
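Two of the claims above can be checked numerically: the area under n(ε) is always N (so the curves have equal total area), and the average energy works out to (3/2)kT. A rough sketch in units where kT = 1 and N = 1, using a crude Riemann sum (step size and cutoff are my own choices):

```python
import math

# n(eps) = [2*pi*N / (pi*k*T)^(3/2)] * sqrt(eps) * exp(-eps/kT);
# with N = 1 and kT = 1 the coefficient collapses to 2/sqrt(pi).
def n(eps):
    return (2.0 / math.sqrt(math.pi)) * math.sqrt(eps) * math.exp(-eps)

de = 1e-4                                    # energy step, units of kT
grid = [i * de for i in range(1, 400000)]    # integrate out to 40 kT
total = sum(n(e) * de for e in grid)         # should come out near N = 1
mean = sum(e * n(e) * de for e in grid)      # should come out near 3/2 kT

print(round(total, 3), round(mean, 3))
```

The tail beyond 40 kT contributes essentially nothing, which is the same point the plot makes: few molecules have energies of more than a few kT.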
If you aren't interested enough in the derivation of g(ε) to visit Britney Spears' Guide to Semiconductor Physics: Density of States
you might miss this graphic. (Can you tell which one has the
Ph.D.?)
n(ε) dε = [2πN / (πkT)^(3/2)] ε^(1/2) e^(−ε/kT) dε
Continuing with the physics, the total energy of the system is

E = ∫₀^∞ ε n(ε) dε = [2πN / (πkT)^(3/2)] ∫₀^∞ ε^(3/2) e^(−ε/kT) dε.

Evaluation of the integral gives

E = 3NkT / 2.

This is the total energy for the N molecules, so the average energy per molecule is

ε̄ = (3/2) kT,

exactly the result you get from elementary kinetic theory of gases.
E = 3NkT / 2 and ε̄ = (3/2) kT
Things to note about our ideal gas energy:
The energy is independent of the molecular mass.
Which gas molecules will move faster at a given
temperature: lighter or heavier ones? Why?
ε̄ at room temperature is about 40 meV, or (1/25) eV. This is not a large amount of energy.
kT/2 of energy "goes with" each degree of freedom.
Because ε = mv²/2, we can also calculate the number of molecules having speeds between v and v + dv. The result is

n(v) dv = 4πN [m / (2πkT)]^(3/2) v² e^(−mv²/2kT) dv.
Here's a plot (number having a given speed vs. speed). It looks like the n(ε) plot: nothing at speed = 0, long exponential tail.

We (Beiser) call this n(v). The hyperphysics web page calls it f(v).
The speed of a molecule having the average energy comes from solving

mv² / 2 = ε̄ = (3/2) kT

for v. The result is

v = v_rms = √(3kT/m).

v_rms is the speed of a molecule having the average energy ε̄.
The average speed is

v̄ = [∫₀^∞ v n(v) dv] / [∫₀^∞ n(v) dv].

The result is

v̄ = √(8kT/πm).
Comparing this with v_rms, we find that

v_rms = 1.09 v̄.

Because the velocity distribution curve is skewed towards high energies, this result makes sense (why?).
You can also set dn(v)/dv = 0 to find the most probable speed. The result is

v_p = √(2kT/m).

The subscript p means most probable.
Summarizing the different velocity results:

v_rms = √(3kT/m)
v̄ = √(8kT/πm)
v_rms = 1.09 v̄
v_p = √(2kT/m)
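The three speeds differ only by numerical factors, so their ratios are fixed regardless of gas or temperature. A quick check of the quoted relationships, in units where kT/m = 1 (my own choice, purely to compare the factors):

```python
import math

# The three characteristic speeds with kT/m set to 1.
v_rms = math.sqrt(3.0)            # sqrt(3 kT/m)
v_avg = math.sqrt(8.0 / math.pi)  # sqrt(8 kT / pi m)
v_p = math.sqrt(2.0)              # sqrt(2 kT/m)

print(round(v_rms / v_avg, 2))    # 1.09, as quoted above
print(v_p < v_avg < v_rms)        # True: most probable < average < rms
```

The ordering v_p < v̄ < v_rms is another way of seeing that the distribution is skewed toward high speeds.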
Plot of the velocity distribution n(v) again. This plot comes from the hyperphysics web site. The R's and M's in the equations are a result of a different scaling than we used. See here for how it works (not testable material).
Example 9.4 Find the rms speed of oxygen molecules at 0°C.

You need to know that an oxygen molecule is O₂. The atomic mass of O is 16 u (1 u = 1 atomic mass unit = 1.66×10⁻²⁷ kg).

mass of O₂ = 2 (16 u) (1.66×10⁻²⁷ kg/u) = 5.31×10⁻²⁶ kg

v_rms = √(3kT/m)
Would anybody (who hasn't done the calculation yet) care to guess the rms speed before we do the calculation? 0 m/s? 10 m/s? 100 m/s? 1,000 m/s? 10,000 m/s?
v_rms = √[3 (1.38×10⁻²³ J/K) (0+273 K) / (5.31×10⁻²⁶ kg)]

v_rms = 461 m/s

v_rms = (461 m/s) (1 mile / 1610 m) (3600 s/h) = 1031 miles/hour.
Holy cow! You'd think you'd feel all these zillions of O₂ molecules constantly crashing into your skin at more than 1000 mph! And why no sonic booms? (No, this is not a question I expect you to answer.)
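Example 9.4 reduces to a few lines of arithmetic; a sketch of the same calculation:

```python
import math

# rms speed of O2 at 0 degrees C (Example 9.4 redone in code).
k = 1.38e-23                 # Boltzmann constant, J/K
T = 273.0                    # 0 degrees C in kelvin
m = 2 * 16 * 1.66e-27        # O2 mass: two 16 u oxygen atoms, in kg

v_rms = math.sqrt(3 * k * T / m)
print(round(v_rms))                # 461 m/s
print(round(v_rms / 1610 * 3600))  # 1031 miles/hour
```

Swapping in a different molecular mass (say, H₂ at about 2 u) gives the corresponding rms speed for any other ideal gas.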
Now, we've gone through a lot of math without thinking much
about the physics of our system of gas molecules. We should
step back and consider what we've done.
A statistical approach has let us calculate the properties of an
ideal gas. We've looked at a few of these properties (energies
and speeds).
This is "modern" physics, but not quantum physics.
Click here and scroll down for a handy molecular speed
calculator.
Who cares about ideal gases? Anybody interested in the
atmosphere, or things moving through it, or machines moving
gases around. Chemists. Biologists. Engineers. Physicists.
Etc.
9.4 Quantum Statistics
Here we deal with ideal particles whose wave functions overlap.
We introduce quantum physics because of this overlap.
Remember:

n(ε) = g(ε) f(ε)
The function f(ε) for quantum statistics depends on whether or not the particles obey the Pauli exclusion principle.
"The weird thing about the half-integral spin particles (also known as fermions) is that when you rotate one of them by 360 degrees, its wavefunction changes sign. For integral spin particles (also known as bosons), the wavefunction is unchanged." (Phil Fraundorf of UMSL, discussing why Balinese candle dancers have understood quantum mechanics for centuries.)
Links:
http://hyperphysics.phy-astr.gsu.edu/hbase/math/statcon.html#c1
http://www.chem.uidaho.edu/~honors/boltz.html
http://www.wikipedia.org/wiki/Boltzmann_distribution
http://www.webchem.net/notes/how_far/kinetics/maxwell_boltzmann.htm
http://www.physics.nwu.edu/classes/2002Spring/Phyx103taylor/mbdist.html
http://mats.gmd.de/~skaley/pwc/boltzmann/Boltzmann.html
http://britneyspears.ac/physics/dos/dos.htm