
Lecture Note 20S

Supplement to Chapter 20

Entropy
Introduction to Statistical Mechanics
Micro-canonical ensembles
Canonical ensembles

1. Definition of entropy in statistical mechanics


In statistical thermodynamics the entropy is defined as being proportional to the
logarithm of the number of microscopic configurations that result in the observed
macroscopic description of the thermodynamic system:

S = kB ln W

where kB is Boltzmann's constant and W is the number of microstates corresponding


to the observed thermodynamic macrostate. This definition is considered to be the
fundamental definition of entropy (as all other definitions can be mathematically derived
from it, but not vice versa). In Boltzmann's 1896 Lectures on Gas Theory, he showed that
this expression gives a measure of entropy for systems of atoms and molecules in the gas
phase, thus providing a measure for the entropy of classical thermodynamics.
In 1877, Boltzmann visualized a probabilistic way to measure the entropy of an
ensemble of ideal gas particles, in which he defined entropy to be proportional to the
logarithm of the number of microstates such a gas could occupy. Since then, the
essential problem in statistical thermodynamics, according to Erwin Schrödinger, has
been to determine the distribution of a given amount of energy E over N identical systems.
Statistical mechanics explains entropy as the amount of uncertainty (or
"mixedupness" in the phrase of Gibbs) which remains about a system, after its observable
macroscopic properties have been taken into account. For a given set of macroscopic
variables, like temperature and volume, the entropy measures the degree to which the
probability of the system is spread out over different possible quantum states. The more
states available to the system with higher probability, the greater the entropy. More
specifically, entropy is a logarithmic measure of the density of states. In essence, the
most general interpretation of entropy is as a measure of our uncertainty about a system.
The equilibrium state of a system maximizes the entropy because we have lost all
information about the initial conditions except for the conserved variables; maximizing
the entropy maximizes our ignorance about the details of the system. This uncertainty is
not of the everyday subjective kind, but rather the uncertainty inherent to the
experimental method and interpretative model.

2. Boltzmann’s principle

Microcanonical ensemble
We consider a system in the energy interval between E and E + ΔE, where ΔE is the
order of uncertainty. Each state in this range is assumed to have the same probability.
Thus we have an ensemble characterized by the equal probability wn of each state n, or

wn = w = constant for E < En < E + ΔE,

which expresses the principle of equal probability. The number of microscopic
states accessible to a macroscopic system is called the statistical weight (thermodynamic
weight). The number of states for the system with the energy between E and E + ΔE (the
statistical weight) may be written as

W(E, ΔE) = ρ(E) ΔE

where (E) is the density of states. This particular ensemble is known as the
microcanonical ensemble.

Here we show that the entropy S in the equilibrium state of the system can be
expressed by

S  k B ln W

where W is the number of states.


Suppose that the entropy S is expressed by a function of W as

S = f(W)

We assume that there are two independent systems (A and B). These systems are in the
macroscopic equilibrium states γa and γb, respectively. The numbers of states for
these states are given by Wa and Wb. We now consider the combined system (C = A + B).
The number of states for the state γc (= γa + γb) is given by

Wc = Wa Wb.

The entropy of the system C is

SC = SA + SB

from the additivity of the entropy. Then we have the relation

f(Wa Wb) = f(Wa) + f(Wb)

For simplicity, we put

α = Wa,  β = Wb.

Then the relation is rewritten as

f(αβ) = f(α) + f(β)

Taking the derivative of both sides with respect to α,

β f′(αβ) = f′(α)

Taking the derivative of both sides with respect to β,

α f′(αβ) = f′(β)

From these we get

α f′(α) = β f′(β) = k

where k is independent of α and β. In other words, k should be a constant. The solution of
this first-order differential equation is

f(α) = k ln α

When k = kB (Boltzmann constant), the entropy is given by

S = kB ln W

3. Microcanonical ensemble for an ideal gas


The basic assumption of statistical mechanics is

all microstates are equally probable.


We consider the system of N free particles in a container with volume V. Each state of
the system is specified by a point in a 6N-dimensional space (3N coordinates, 3N momenta),
the so-called phase space:

(r1, r2,…, rN, p1, p2,…, pN) = (q1, q2, q3,…, q3N, p1, p2,…, p3N)

This state is continuous, not discrete. For convenience, we assume that the 6N-
dimensional phase space consists of unit cells with the volume (ΔqΔp)^(3N), where the
size of Δp and Δq is arbitrary. There is one state per volume (ΔqΔp)^(3N). Note that
ΔqΔp = h (Planck’s constant) from Heisenberg’s principle of uncertainty.
We define the number of states in a volume element dq1 dq2 … dq3N dp1 dp2 … dp3N as

[1/(ΔqΔp)^(3N)] dq1 dq2 … dq3N dp1 dp2 … dp3N

The total energy E is a sum of the kinetic energies of the particles. It is independent of the
3N coordinates {ri}. We calculate the number of states when the total energy is between
E and E + ΔE:

W(E, ΔE) = ρ(E) ΔE
         = [1/(ΔqΔp)^(3N)] ∫ dq1 dq2 … dq3N dp1 dp2 … dp3N
         = [V^N/(ΔqΔp)^(3N)] ∫ dp1 dp2 … dp3N

where the momentum integral runs over the region

E ≤ (p1^2 + p2^2 + … + p3N^2)/(2m) ≤ E + ΔE

The volume of the momentum space is the spherical shell between the sphere with radius
R = √(2m(E + ΔE)) and the sphere of radius R = √(2mE).
We calculate the number of states between E and E + ΔE as follows. First we
calculate the volume of the 3N-dimensional sphere (in momentum space) with radius R given by

R = √(2mE)

The volume of the sphere is obtained as

[2 π^(3N/2)/(3N Γ(3N/2))] (2mE)^(3N/2)

where Γ(x) is the Gamma function of x. The total number of states N(E) for the system
with energy between 0 and E is given by

N(E) = [1/(ΔqΔp)^(3N)] (V^N/N!) [2 π^(3N/2)/(3N Γ(3N/2))] (2mE)^(3N/2)

where N! is included because the N particles are indistinguishable. Note that N(E) is
introduced for mathematical convenience, since we consider a system having the
energy only between E and E + ΔE. The number of states between E and E + ΔE is
obtained as

W(E, ΔE) = ρ(E) ΔE = [dN(E)/dE] ΔE = [1/(ΔqΔp)^(3N)] (V^N/N!) [(2πmE)^(3N/2)/Γ(3N/2)] (ΔE/E)

where ρ(E) is the density of states. We use Heisenberg’s principle of uncertainty,
ΔqΔp = h (Planck’s constant), which comes from the quantum mechanical requirement.
Then we have

W(E, ΔE) = [dN(E)/dE] ΔE = [V^N/(N! Γ(3N/2))] (2πmE/h^2)^(3N/2) (ΔE/E)
         = [V^N/(N! Γ(3N/2))] [mE/(2πħ^2)]^(3N/2) (ΔE/E)

where ħ = h/(2π) (ħ: Dirac constant).

Using Stirling’s formula,

ln Γ(3N/2) ≈ (3N/2) ln(3N/2) − 3N/2   for N ≫ 1,

ln N! ≈ N(ln N − 1)   for N ≫ 1,

we have

ln W(E, ΔE) = N ln V + (3N/2) ln[mE/(2πħ^2)] + ln(ΔE/E) − N(ln N − 1) − [(3N/2) ln(3N/2) − 3N/2]
            = N[ln(V/N) + (3/2) ln(mE/(3πħ^2 N)) + 5/2] + ln(ΔE/E)

4. Entropy S
The entropy S is defined by

S = kB ln W(E, ΔE)
  = kB [5N/2 + N ln(V/N) + (3N/2) ln(mE/(3πħ^2 N)) + ln(ΔE/E)]

In the limit of N → ∞,

S = N kB [5/2 + ln(V/N) + (3/2) ln(mE/(3πħ^2 N))]

So the entropy S is found to be an extensive parameter. The entropy S depends on N, E
and V:

S = S(N, E, V)

dS = [∂S(N, E, V)/∂E] dE + [∂S(N, E, V)/∂V] dV
   = (1/T) dE + (P/T) dV

Then we have

∂S(N, E, V)/∂E = kB ∂ln W(E, ΔE)/∂E = 1/T

and

∂S(N, E, V)/∂V = P/T

From this relation, E can be derived as a function of N, V, and T:

E = E(N, V, T)

The heat capacity CV is given by

CV = (∂E/∂T)V

In the above example, we have

∂S(N, E, V)/∂E = 1/T = (3/2) N kB (1/E)

or

E = (3/2) N kB T   (internal energy)

and

∂S(N, E, V)/∂V = P/T = N kB (1/V)

so we have

PV = N kB T   (Boyle's law)

5. Typical Examples
5.1 Adiabatic free expansion (sudden expansion into a vacuum)
The free expansion is an irreversible process. Nevertheless, since the entropy is a
state function, the change in entropy can be evaluated along a reversible process
connecting the same initial and final states. The entropy is given by

S = N kB [5/2 + ln(V/N) + (3/2) ln(mE/(3πħ^2 N))]
  = N kB [5/2 + ln(V/N) + (3/2) ln((m/(3πħ^2))·(3/2) kB T)]
  = N kB [5/2 + ln(V/N) + (3/2) ln(m kB T/(2πħ^2))]
  = N kB (ln V + (3/2) ln T) + const

since E = (3/2) N kB T. This expression is exactly the same as that derived previously
(Chapter 20). When the volume changes from V = V1 to V2 at constant T, the change in
entropy (ΔS) is obtained as

ΔS = N kB ln(V2/V1)
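For instance (hypothetical numbers, not from the text), doubling the volume of one mole of gas gives ΔS = NA kB ln 2 = R ln 2 ≈ 5.76 J/K:

```python
import math

kB = 1.380649e-23        # J/K
NA = 6.02214076e23       # 1/mol

# Free expansion of one mole from V1 to V2 = 2 V1 at constant T
dS = NA * kB * math.log(2.0)
print(dS)   # R ln 2 ≈ 5.76 J/K
```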

5.2 Isentropic process

The entropy remains constant if

ln(V T^(3/2)) = const.

In an expansion at constant entropy from V1 to V2, we have

V1 T1^(3/2) = V2 T2^(3/2)

for an ideal monatomic gas. Since

P1 V1/T1 = P2 V2/T2,

we have

V1 (P1 V1)^(3/2) = V2 (P2 V2)^(3/2),  or  P1 V1^(5/3) = P2 V2^(5/3)

5.3 Mixing entropy of ideal gas

We consider two samples of the same ideal gas, separated by a barrier AB.
The temperature T and the pressure P are the same on both sides:

P V1 = N1 kB T,  P V2 = N2 kB T,  or  V1/N1 = V2/N2 = kB T/P

and

P(V1 + V2) = (N1 + N2) kB T,  or  (V1 + V2)/(N1 + N2) = kB T/P

Suppose that the barrier is suddenly removed. We discuss the change of entropy of
this system, using the expression for the entropy

S = N kB [5/2 + ln(V/N) + (3/2) ln T + σ0]

where σ0 is a constant independent of N, V and T. Before opening the barrier AB, we have

Si = S1 + S2

where

S1 = N1 kB [5/2 + ln(V1/N1) + (3/2) ln T + σ0]
   = N1 kB [5/2 + ln(kB T/P) + (3/2) ln T + σ0]

S2 = N2 kB [5/2 + ln(V2/N2) + (3/2) ln T + σ0]
   = N2 kB [5/2 + ln(kB T/P) + (3/2) ln T + σ0]

After opening the barrier AB, the total number of atoms is N1 + N2 and the volume is V1 +
V2. Then we have the final entropy,

Sf = (N1 + N2) kB [5/2 + ln((V1 + V2)/(N1 + N2)) + (3/2) ln T + σ0]
   = (N1 + N2) kB [5/2 + ln(kB T/P) + (3/2) ln T + σ0]

Since

ΔS = Sf − (S1 + S2) = 0,

there is no change of entropy in this event, as expected for identical gases.
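A quick numerical check (hypothetical values for N1 and N2 at a common kB T/P) confirms that ΔS vanishes for identical gases; the constant σ0 cancels and can be set to zero:

```python
import math

kB = 1.380649e-23
T = 300.0
P = 1.01325e5                    # Pa
v = kB * T / P                   # common volume per particle, V/N = kB T/P

def S(N, V, sigma0=0.0):
    # S = N kB [5/2 + ln(V/N) + (3/2) ln T + sigma0]; sigma0 cancels in dS
    return N * kB * (2.5 + math.log(V / N) + 1.5 * math.log(T) + sigma0)

N1, N2 = 2.0e22, 5.0e22          # hypothetical particle numbers
V1, V2 = N1 * v, N2 * v          # both sides at the same T and P

dS = S(N1 + N2, V1 + V2) - (S(N1, V1) + S(N2, V2))
assert abs(dS) < 1e-12           # no mixing entropy for identical gases
```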

((Gibbs paradox))
This success is partly due to the N! term in the denominator of W(E, ΔE). This is closely
related to the fact that the particles are indistinguishable in our system.

((Micro-canonical ensemble))
So far we have considered the micro-canonical ensemble, where the system is isolated from
its surroundings. The energy of the system is conserved. It is assumed that the system
has an energy between E and E + ΔE, where ΔE is a small width. The concept of the thermal
reservoir does not appear. Since the energy of the system is constant, the possible
microstates have the same probability. For the energy E, the volume V, and the number of
particles N, one can obtain the number of the possible microstates. Then the entropy
S(E, N, V) can be defined by

S = kB ln W(E, ΔE).

6. Canonical ensemble (system with constant temperature)

The theory of the micro-canonical ensemble is useful when the system depends on N,
E, and V. In principle, this method is correct. In real calculations, however, it is not so
easy to calculate the number of states W(E, ΔE) in the general case. We have an alternative
method, which is much more useful for calculations on real systems. The formulation of
the canonical ensemble is a little different from that of the micro-canonical ensemble.
Both of these ensembles lead to the same result for the same macroscopic system.

Canonical ensemble: (N, T, V constant)


Suppose that the system depends on N, T, and V. A practical method of keeping the
temperature of a system constant is to immerse it in a very large material with a large
heat capacity. If the material is very large, its temperature is not changed even if some
energy is given or taken by the system in contact. Such a heat reservoir serves as a
thermostat.

We consider the case of a small system S(I) in thermal contact with a heat reservoir
(II). The system S(I) is in thermal equilibrium with the reservoir W(II); S(I) and W(II) have
a common temperature T. The system S(I) is a relatively small macroscopic system. The
energy of S(I) is not fixed; only the total energy of the combined system is,

ET = EII + Ei

We assume that WII(EII) is the number of states where the thermal bath has the energy EII.
If S(I) is in one definite state i, the probability of finding the system (I) in the state
i is proportional to WII(EII); the thermal bath is in one of the many states with the
energy ET − Ei:

pi ∝ WII(EII) = WII(ET − Ei)

or

ln pi = ln[WII(ET − Ei)] + const

Since ET ≫ Ei, ln[WII(ET − Ei)] can be expanded as

ln[WII(ET − Ei)] ≈ ln WII(ET) − Ei [d ln WII(EII)/dEII]|EII=ET

Then we obtain

pi ∝ exp(−Ei [d ln WII(EII)/dEII]|EII=ET)    (1)

Here we recall the definition of entropy and temperature for the reservoir, as in the micro-
canonical ensemble:

SII = kB ln WII(EII)

and

∂SII/∂EII = 1/TII

or

d ln WII(EII)/dEII = (1/kB) ∂SII/∂EII = 1/(kB TII)

In thermal equilibrium, we have

TII = TI = T

Then Eq.(1) can be rewritten as

pi ∝ exp(−Ei/(kB T)) = exp(−βEi)

where β = 1/(kB T). This is called a Boltzmann factor. We define the partition function Z
as

Z = Σi e^(−βEi)

The probability is expressed by

pi = (1/Z) e^(−βEi)

The summation in Z is over all states i of the system. We note that

Σi p(Ei) = 1

The average energy of the system is given by

⟨E⟩ = (1/Z) Σi Ei e^(−βEi) = −∂ln Z/∂β

since

∂ln Z/∂β = (1/Z) ∂Z/∂β = (1/Z) Σi (−Ei) e^(−βEi) = −(1/Z) Σi Ei e^(−βEi)

Note that

⟨E⟩ = −∂ln Z/∂β = −[1/(∂β/∂T)] ∂ln Z/∂T = kB T^2 ∂ln Z/∂T.
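As an illustration (a hypothetical two-level system with energies 0 and ε, not discussed in the text), ⟨E⟩ computed directly from the Boltzmann weights agrees with −∂ln Z/∂β evaluated by a finite difference:

```python
import math

def lnZ(beta, energies):
    # partition function Z = sum_i exp(-beta E_i)
    return math.log(sum(math.exp(-beta * E) for E in energies))

energies = [0.0, 1.0]          # two-level system, eps = 1 (arbitrary units)
beta = 2.0                     # beta = 1/(kB T)

# direct average: <E> = (1/Z) sum_i E_i exp(-beta E_i)
Z = sum(math.exp(-beta * E) for E in energies)
E_avg = sum(E * math.exp(-beta * E) for E in energies) / Z

# <E> = -d lnZ/d beta, by central finite difference
h = 1e-6
E_from_lnZ = -(lnZ(beta + h, energies) - lnZ(beta - h, energies)) / (2 * h)

assert math.isclose(E_avg, E_from_lnZ, rel_tol=1e-6)
print(E_avg)   # eps e^(-beta eps)/(1 + e^(-beta eps)) ≈ 0.1192 for beta eps = 2
```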

In summary, the representative points of the system I are distributed with the
probability density proportional to exp(−βEi). This is called the canonical ensemble, and
this distribution of representative points is called the canonical distribution. The factor
exp(−βEi) is often referred to as the Boltzmann factor.

7. Pressure
The pressure P is defined as

P = Σi Pi (1/Z) e^(−βEi) = Σi (−∂Ei/∂V)(1/Z) e^(−βEi) = [1/(βZ)] ∂Z/∂V = (1/β) ∂ln Z/∂V

where Pi = −∂Ei/∂V is the pressure in the state i. Here we define the Helmholtz free energy F as

F = ⟨E⟩ − ST.

dF = d⟨E⟩ − SdT − TdS
   = TdS − PdV − SdT − TdS
   = −PdV − SdT

F is a function of T and V. From this equation, we have

S = −(∂F/∂T)V

P = −(∂F/∂V)T

8. Helmholtz free energy and entropy

The Helmholtz free energy F is given by

F = −kB T ln Z

((Proof))
We note that

∂(F/T)/∂T = [T(∂F/∂T) − F]/T^2 = (−ST − F)/T^2 = −⟨E⟩/T^2 = −kB ∂ln Z/∂T,

using ⟨E⟩ = kB T^2 ∂ln Z/∂T. Integrating with respect to T leads to

F/T = −kB ln Z,  or  F = −kB T ln Z

What is the expression for the entropy in a canonical ensemble? The entropy is given by

S = (⟨E⟩ − F)/T

where ⟨E⟩ is the average energy of the system,

⟨E⟩ = −∂ln Z/∂β

Then the entropy S is rewritten as

S = −(1/T) ∂ln Z/∂β + kB ln Z
  = (1/T)(1/Z) Σi Ei e^(−βEi) + kB ln Z
  = kB Σi βEi e^(−βEi)/Z + kB ln Z
  = kB Σi βEi pi + kB ln Z
  = kB Σi (−ln pi − ln Z) pi + kB ln Z
  = −kB Σi pi ln pi

or

S = −kB Σi pi ln pi,

where pi is the probability of the state i, given by

pi = (1/Z) e^(−βEi)

The logarithm of pi is

ln pi = −βEi − ln Z

((Note))
We finally obtain a useful expression for the entropy, which is also applicable to
information theory:

S = −kB Σi pi ln pi.

Suppose that there are 50 boxes, and one jewel in one of the 50 boxes. ps is the
probability of finding the jewel in the s-th box in one trial.

(a) There is no hint where the jewel is:

p1 = p2 = ….. = p50 = 1/50

S = −kB Σs ps ln ps = −kB · 50 · (1/50) ln(1/50) = kB ln 50 ≈ 3.912 kB

(b) There is a hint that the jewel is in one of the even-numbered boxes:

p1 = p3 = ….. = p49 = 0
p2 = p4 = ….. = p50 = 1/25

S = −kB Σs ps ln ps = −kB · 25 · (1/25) ln(1/25) = kB ln 25 ≈ 3.219 kB

(c) If you know that the jewel is in the 10-th box:

p10 = 1,  ps = 0 (s ≠ 10)

S = −kB p10 ln p10 = 0

The more information we have, the smaller the information entropy becomes.
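The three cases above can be reproduced directly (entropy in units of kB):

```python
import math

def entropy(probs):
    # S/kB = -sum_s p_s ln p_s, with the convention 0 ln 0 = 0
    return -sum(p * math.log(p) for p in probs if p > 0)

no_hint = [1 / 50] * 50                      # (a) jewel equally likely in any box
even_only = [0, 1 / 25] * 25                 # (b) only the 25 even boxes possible
known = [1 if s == 10 else 0 for s in range(1, 51)]   # (c) box 10 for certain

print(entropy(no_hint))    # ln 50 ≈ 3.912
print(entropy(even_only))  # ln 25 ≈ 3.219
print(entropy(known))      # 0.0
```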

9. Application
9.1. Partition function Z
The partition function Z for the ideal gas can be calculated as

Z = [V^N/(N! h^(3N))] [∫ dp exp(−p^2/(2m kB T))]^(3N) = (V^N/N!) (2πm kB T/h^2)^(3N/2)

Using this expression for Z, the Helmholtz free energy F can be calculated as

F = −N kB T [ln(V/N) + (3/2) ln(2πm kB T/h^2) + 1].

The internal energy E is

E = −∂ln Z/∂β = (3/2) N kB T.

The heat capacity CV at constant volume is given by

CV = (∂E/∂T)V = (3/2) N kB.

The entropy S is

S = (E − F)/T = N kB [ln(V/N) + (3/2) ln(m kB T/(2πħ^2)) + 5/2].

The pressure P is

P = −(∂F/∂V)T = N kB T/V.

9.2 Maxwell’s distribution function

The Maxwell distribution function can be derived as follows:

n(v)dv = n(v) 4πv^2 dv = f(v)dv = A exp(−mv^2/(2 kB T)) 4πv^2 dv

The normalization condition:

∫ n(v)dv = ∫ n(v) 4πv^2 dv = ∫ f(v)dv = 1

The constant A is calculated as

A = [m/(2π kB T)]^(3/2).

Then we have

f(v) = 4π [m/(2π kB T)]^(3/2) v^2 exp(−mv^2/(2 kB T)).

Since M = m NA and R = NA kB, we have

f(v) = [M/(2πRT)]^(3/2) 4πv^2 exp(−Mv^2/(2RT)),

which agrees with the expression for f(v) in Chapter 19.

((Mathematica))
f1 = Integrate[Exp[-(m v^2)/(2 kB T)] 4 π v^2, {v, 0, ∞},
  GenerateConditions → False]

  2 Sqrt[2] π^(3/2) (kB T/m)^(3/2)

eq1 = A f1 == 1;
Solve[eq1, A]

  {{A → m^(3/2)/(2 Sqrt[2] π^(3/2) (kB T)^(3/2))}}
2

10. Comparison of the expression of S in the canonical ensemble with the
original definition of S in the microcanonical ensemble

The partition function Z can be written as

Z = Σi e^(−βEi) = ∫ ρ(ε) e^(−βε) dε

where we use ε instead of E (or Ei) in the expression. The partition function Z is thus the
Laplace transform of the density of states ρ(ε). Here we show that

Z = ∫ ρ(ε) e^(−βε) dε = ∫ g(ε) dε ≈ √(2π) Δε* ρ(ε*) exp(−βε*)

where the function g(ε) is given by

g(ε) = ρ(ε) e^(−βε).

We assume that g(ε) can be approximated by a Gaussian function,

g(ε) ≈ g(ε*) exp[−(ε − ε*)^2/(2(Δε*)^2)]

where

g(ε*) = ρ(ε*) exp(−βε*)

Fig. g(ε) vs ε. g(ε) has a Gaussian distribution with the width Δε* around
ε = ε* = ⟨E⟩.

The function g(ε) has a local maximum at ε = ε* = ⟨E⟩. The logarithm of g(ε) is
rewritten as

ln g(ε) = ln[ρ(ε) exp(−βε)] = ln ρ(ε) − βε

We take the derivative of ln g(ε) with respect to ε,

d ln g(ε)/dε = g′(ε)/g(ε) = ρ′(ε)/ρ(ε) − β

from which

[d ln ρ(ε)/dε]|ε=ε* = β

since

g′(ε*) = 0.

Here we define the number of states W(ε, Δε) by

W(ε, Δε) = ρ(ε*) Δε,  with Δε = √(2π) Δε*

Then we have

[d ln W(ε, Δε)/dε]|ε=ε* = β    (1)

since

[∂ln W(ε, Δε)/∂ε]|ε=ε* = [∂ln ρ(ε)/∂ε]|ε=ε*

with fixed Δε. We note that

Z = ∫ g(ε) dε ≈ g(ε*) ∫ exp[−(ε − ε*)^2/(2(Δε*)^2)] dε = √(2π) Δε* g(ε*)
  = √(2π) Δε* ρ(ε*) exp(−βε*)

Then we have

ln Z = ln[√(2π) Δε* ρ(ε*)] − βε*

The entropy S is calculated as

S = −(1/T) ∂ln Z/∂β + kB ln Z
  = ε*/T + kB ln[√(2π) Δε* ρ(ε*)] − ε*/T
  = kB ln[√(2π) Δε* ρ(ε*)]    (2)
  = kB ln W(ε, Δε)

Using Eqs.(1) and (2), we get

∂S/∂E = kB β = 1/T

or

∂ln W(ε, Δε)/∂ε = 1/(kB T)
11. Boltzmann-Planck’s method
Finally we show the standard method of the derivation, which characterizes well the
theory of canonical ensembles.

We consider the ways of distributing M total ensembles among states with energies Ej.
Let Mj be the number of ensembles in the energy level Ej: M1 ensembles for the energy E1,
M2 ensembles for the energy E2, and so on. The number of ways of distributing M
ensembles is given by

W = M!/(M1! M2! …)

where

Σj Mj = M

and the average energy is given by

⟨E⟩ = Σj Ej Mj/M

The entropy S is proportional to ln W,

ln W = ln M! − Σj ln(Mj!)

Using Stirling's formula,

ln W ≈ M(ln M − 1) − Σj Mj(ln Mj − 1)
     = M ln M − Σj Mj ln Mj

in the limit of large M and Mj. We note that the probability of finding the state j is
simply given by

P(Ej) = Mj/M

Then we have

(1/M) ln W = ln M − (1/M) Σj Mj ln Mj
           = ln M − (1/M) Σj M P(Ej) ln[M P(Ej)]
           = ln M − Σj P(Ej) [ln P(Ej) + ln M]
           = −Σj P(Ej) ln[P(Ej)]

which is subject to the conditions

Σj P(Ej) = 1

Σj Ej P(Ej) = ⟨E⟩.

Treating P(Ej) as continuous variables, we have the variational equation

δ Σj [P(Ej) ln P(Ej) + α P(Ej) + β Ej P(Ej)]
  = Σj {ln P(Ej) + (α + 1) + β Ej} δP(Ej) = 0,

which gives P(Ej) for the maximum W. Here α and β are Lagrange’s indeterminate
multipliers. Thus we obtain

ln P(Ej) + (α + 1) + β Ej = 0

or

P(Ej) = C exp(−βEj)

or

P(Ej) = [1/Z(β)] exp(−βEj)

where

Z(β) = Σj exp(−βEj)

and

β = 1/(kB T).

With the above P(Ej), the entropy S is expressed by

S = kB ln W = −M kB Σj P(Ej) ln[P(Ej)]

for the total system composed of M ensembles. Therefore, the entropy per ensemble
is

S = −kB Σj P(Ej) ln[P(Ej)]
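We can verify numerically that the Boltzmann form maximizes −Σ P ln P under the two constraints. Below (a hypothetical three-level system, not from the text), any small perturbation that preserves Σ P = 1 and Σ Ej P = ⟨E⟩ lowers the entropy:

```python
import math

E = [0.0, 1.0, 2.0]        # hypothetical energy levels
beta = 0.7

Z = sum(math.exp(-beta * e) for e in E)
P = [math.exp(-beta * e) / Z for e in E]   # canonical distribution

def entropy(p):
    # S/kB = -sum_j p_j ln p_j
    return -sum(x * math.log(x) for x in p if x > 0)

# The direction d = (1, -2, 1) satisfies sum(d) = 0 and sum(E*d) = 0,
# so P + eps*d keeps both constraints for small eps
S0 = entropy(P)
for eps in (1e-3, -1e-3, 5e-3):
    Q = [P[0] + eps, P[1] - 2 * eps, P[2] + eps]
    assert entropy(Q) < S0    # any feasible perturbation lowers the entropy
```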

12. Density of states for quantum box (ideal gas)

(a) Energy levels in 1D system
We consider a free electron gas in a 1D system. The Schrödinger equation is given by

H ψk(x) = (p^2/2m) ψk(x) = −(ħ^2/2m) d^2ψk(x)/dx^2 = εk ψk(x),    (1)

where

p = (ħ/i) d/dx,

and εk is the energy of the particle in the orbital.

The orbital is defined as a solution of the wave equation for a system of only one
electron: the one-electron problem.
Using a periodic boundary condition, ψk(x + L) = ψk(x), we have

ψk(x) ~ e^(ikx),    (2)

with

εk = (ħ^2/2m) k^2 = (ħ^2/2m)(2πn/L)^2,

e^(ikL) = 1  or  k = (2π/L) n,

where n = 0, ±1, ±2,…, and L is the size of the system.

(b) Energy levels in 3D system

We consider the Schrödinger equation of an electron confined to a cube of edge L:

H ψk = (p^2/2m) ψk = −(ħ^2/2m) ∇^2 ψk = εk ψk.    (3)

It is convenient to introduce wavefunctions that satisfy periodic boundary conditions
(Born–von Karman boundary conditions):

ψk(x + L, y, z) = ψk(x, y, z),

ψk(x, y + L, z) = ψk(x, y, z),

ψk(x, y, z + L) = ψk(x, y, z).

The wavefunctions are of the form of a traveling plane wave,

ψk(r) = e^(ik·r),    (4)

with

kx = (2π/L) nx, (nx = 0, ±1, ±2, ±3,…),

ky = (2π/L) ny, (ny = 0, ±1, ±2, ±3,…),

kz = (2π/L) nz, (nz = 0, ±1, ±2, ±3,…).

The components of the wavevector k are the quantum numbers, along with the quantum
number ms of the spin direction. The energy eigenvalue is

ε(k) = (ħ^2/2m)(kx^2 + ky^2 + kz^2) = (ħ^2/2m) k^2.    (5)

Here

p ψk(r) = (ħ/i) ∇ψk(r) = ħk ψk(r),    (6)

so that the plane wave function ψk(r) is an eigenfunction of p with the eigenvalue ħk.


In the ground state of a system of N electrons, the occupied orbitals are represented as
points inside a sphere in k-space.

(c) Density of states

Because we assume that the electrons are non-interacting, we can build up the N-
electron ground state by placing electrons into the allowed one-electron levels we have
just found. The one-electron levels are specified by the wavevectors k and by the
projection of the electron’s spin along an arbitrary axis, which can take either of the two
values ±ħ/2. Therefore associated with each allowed wavevector k are two levels:

|k, ↑⟩,  |k, ↓⟩.

Fig. Density of states in the 3D k-space. There is one state per volume (2π/L)^3.

There is one state per volume (2π/L)^3 of k-space. We consider the number of one-
electron levels in the energy range from ε to ε + dε, D(ε)dε:

D(ε)dε = 2 (L/2π)^3 4πk^2 dk,    (13)

where D(ε) is called the density of states.
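The rule "one state per (2π/L)³ of k-space" can be tested by brute-force counting (a small counting sketch in reduced units with L = 2π, so the allowed k are integer vectors; the cutoff kmax = 20 is arbitrary). The number of states with |k| ≤ kmax should approach twice the sphere volume:

```python
import math

kmax = 20
count = 0
# allowed wavevectors: k = (nx, ny, nz) with integer components (L = 2 pi)
for nx in range(-kmax, kmax + 1):
    for ny in range(-kmax, kmax + 1):
        for nz in range(-kmax, kmax + 1):
            if nx * nx + ny * ny + nz * nz <= kmax * kmax:
                count += 2        # factor 2 for the two spin states

expected = 2 * (4 * math.pi / 3) * kmax**3   # sphere volume x 2 for spin
assert abs(count - expected) / expected < 0.02
print(count, expected)
```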

13. Application of canonical ensemble for ideal gas

(a) Partition function for the system with one atom: Z1
The partition function Z1 is given by

Z1 = Σk exp(−βħ^2 k^2/(2m))
   = [V/(2π)^3] ∫ dk exp(−βħ^2 k^2/(2m))
   = [V/(2π)^3] ∫ 4πk^2 dk exp(−βħ^2 k^2/(2m))
   = V/(8 π^(3/2) C^(3/2))

where V = L^3,

C = βħ^2/(2m),  β = 1/(kB T)

((Mathematica))
Clear["Global`*"];
f1 = (V/(2 π)^3) 4 π k^2 Exp[-C1 k^2];
Integrate[f1, {k, 0, ∞}] // Simplify[#, C1 > 0] &

  V/(8 C1^(3/2) π^(3/2))

Then the partition function Z1 can be rewritten as

Z1 = [V/(8 π^(3/2))] (2m kB T/ħ^2)^(3/2) = V [m kB T/(2πħ^2)]^(3/2) = nQ V

where nQ is the quantum concentration, defined by

nQ = [m kB T/(2πħ^2)]^(3/2).

nQ is the concentration associated with one atom in a cube of side equal to the thermal
average de Broglie wavelength.

Fig. Definition of the quantum concentration. The de Broglie wavelength is on the order
of the interatomic distance.

p = m v̄ = h/λ = 2πħ/λ,

λ = 2πħ/(m v̄),

where v̄ is the average thermal velocity of the atoms. Using the equipartition law, we get
the relation

(1/2) m v̄^2 = (3/2) kB T,  or  v̄ = √(3 kB T/m).

Then we have

λ = 2πħ/(m v̄) = 2πħ/√(3m kB T) = √(2π/3) √(2πħ^2/(m kB T)) = 1.447 nQ^(−1/3) ≈ nQ^(−1/3)

where

nQ = [m kB T/(2πħ^2)]^(3/2)

It follows that

nQ ≈ 1/λ^3.

((Definition))

n/nQ ≪ 1 → classical regime

An ideal gas is defined as a gas of non-interacting atoms in the classical regime.

((Example))
For 4He gas at P = 1 atm and T = 300 K, the concentration n is evaluated as

n = N/V = P/(kB T) = 2.446 × 10^19 /cm^3.

The quantum concentration nQ is calculated as

nQ = 7.8122 × 10^24 /cm^3,

so that n ≪ nQ: the gas is in the classical regime. Note that the mass of 4He is

m = 4u = 6.6422 × 10^−24 g,

where u is the atomic mass unit.

((Mathematica))
Clear["Global`*"];
rule1 = {kB → 1.3806504 10^-16, NA → 6.02214179 10^23,
   ħ → 1.054571628 10^-27, amu → 1.660538782 10^-24,
   atm → 1.01325 10^6};

T1 = 300; P1 = 1 atm /. rule1;

m1 = 4 amu /. rule1
  6.64216 × 10^-24

nQ = (m1 kB T1/(2 π ħ^2))^(3/2) /. rule1
  7.81219 × 10^24

n1 = P1/(kB T1) /. rule1
  2.44631 × 10^19
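The same numbers can be reproduced in Python (CGS units, as in the Mathematica session above):

```python
import math

# CGS units
kB = 1.3806504e-16        # erg/K
hbar = 1.054571628e-27    # erg s
amu = 1.660538782e-24     # g
atm = 1.01325e6           # dyn/cm^2

T, P = 300.0, 1.0 * atm
m = 4 * amu               # mass of a 4He atom

n = P / (kB * T)                                    # concentration, 1/cm^3
nQ = (m * kB * T / (2 * math.pi * hbar**2)) ** 1.5  # quantum concentration

print(n)    # ≈ 2.446e19 /cm^3
print(nQ)   # ≈ 7.812e24 /cm^3
assert n / nQ < 1e-4      # deep in the classical regime
```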

(b) Partition function of the system with N atoms

Suppose that the gas contains N atoms in a volume V. The partition function ZN,
which takes into account the indistinguishability of the atoms (division by the factor N!), is
given by

ZN = Z1^N/N!.

Using Z1 = nQ V, we get

ln ZN = N ln(nQ V) − ln N! = N[ln(nQ V) + 1 − ln N]

where we use Stirling’s formula,

ln N! ≈ N ln N − N = N(ln N − 1),

in the limit of large N. The Helmholtz free energy is given by

F = −kB T ln ZN
  = −N kB T [ln(nQ V) + 1 − ln N]
  = −N kB T [ln(nQ V/N) + 1]
  = −N kB T ln(nQ/n) − N kB T
  = −N kB T [ln(V/N) + (3/2) ln T + (3/2) ln(m kB/(2πħ^2)) + 1]

since

ln(nQ V/N) = ln[(m kB T/(2πħ^2))^(3/2) (V/N)] = ln(V/N) + (3/2) ln T + (3/2) ln(m kB/(2πħ^2))

The entropy S is obtained as

S = −(∂F/∂T)V
  = N kB [ln(V/N) + (3/2) ln T + (3/2) ln(m kB/(2πħ^2)) + 5/2]
  = N kB [ln(V/N) + (3/2) ln T + σ0]

where

σ0 = 5/2 + (3/2) ln(m kB/(2πħ^2))

Note that S can be rewritten as

S = N kB ln(nQ/n) + (5/2) N kB    (Sackur–Tetrode equation)

((Sackur-Tetrode equation))
The Sackur–Tetrode equation is named for Hugo Martin Tetrode (1895–1931) and
Otto Sackur (1880–1914), who developed it independently as a solution of Boltzmann's
gas statistics and entropy equations, at about the same time in 1912.
https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation
In the classical regime (n/nQ ≪ 1, or nQ/n ≫ 1), we have

ln(nQ/n) > 0

The internal energy E is given by

E = F + ST
  = −N kB T ln(nQ/n) − N kB T + N kB T ln(nQ/n) + (5/2) N kB T
  = (3/2) N kB T

Note that E depends only on T for the ideal gas (Joule’s law). The factor 3/2 arises from
the exponent of T in nQ because the gas is in 3D; in 1D or 2D, the factor
would be 1/2 or 1, respectively.

(c) Pressure P
The pressure P is given by

P = −(∂F/∂V)T = N kB T/V,

leading to Boyle’s law. Then PV is

PV = N kB T = (2/3) E    (Bernoulli’s equation)

(d) Heat capacity

The heat capacity at fixed volume V is given by

CV = T (∂S/∂T)V = (3/2) N kB.

When N = NA, we have

CV = (3/2) R.

CP is the heat capacity at constant P. Since

dE = TdS − PdV

or

TdS = dE + PdV,

we get

CP = T (∂S/∂T)P = (∂E/∂T)P + P (∂V/∂T)P

P (∂V/∂T)P = P (NA kB/P) = NA kB = R

We note that

E = (3/2) NA kB T.

E is independent of P and V, and depends only on T (Joule’s law):

(∂E/∂T)P = (∂E/∂T)V = (3/2) NA kB = CV

Thus we have

CP = CV + R = (3/2) R + R = (5/2) R.

((Mayer’s relation))

CP − CV = R for one mole of ideal gas.

The ratio γ is defined by

γ = CP/CV = 5/3.

(e) Isentropic process (constant entropy)

The entropy S is given by

S = N kB [ln(V/N) + (3/2) ln T + σ0] = N kB [ln(V T^(3/2)) − ln N + σ0]

The isentropic process is described by

V T^(3/2) = const,  or  T V^(2/3) = const.

Using Boyle’s law (PV = RT for one mole), we get

(PV/R) V^(2/3) = const,  or  P V^(5/3) = const.

Since γ = 5/3, we get the relation

P V^γ = constant

14. The expression of entropy: S = kB ln W(E)

The entropy is related to the number of states; in particular, it is closely related to
ln W. In order to find such a relation, we start with the partition function

Z = Σ exp(−βε) = ∫ dE W(E) exp(−βE) = ∫ exp[−βN f(E)] dE

where W(E) is the number of states with the energy E. The function f(E) is defined by

f(E) = [E − (1/β) ln W(E)]/N = [E − T kB ln W(E)]/N

In the limit of large N, f(E) is expanded using the Taylor expansion, as

f(E) = f(E*) + [∂f(E)/∂E]|E=E* (E − E*) + …

where

∂f(E)/∂E = (1/N)[1 − kB T ∂ln W(E)/∂E] = 0  at E = E*

or

1/T = kB [∂ln W(E)/∂E]|E=E*

At E = E*,

Z ≈ exp[−βN f(E*)]

For simplicity, we use E instead of E*. The Helmholtz free energy F is defined by

F = −kB T ln Z = −kB T [−βN f(E)] = N f(E)

or

F = E − kB T ln W(E) = E − ST,

leading to the expression of the entropy S as

S = kB ln W(E),

and

1/T = ∂S/∂E.

15. The expression of entropy: S = −kB Σ p ln p

We consider the probability given by

p = (1/Z) e^(−βE),

where

Z = Σ e^(−βE),

ln p = −ln Z − βE.

The average energy ⟨E⟩ is given by

⟨E⟩ = Σ p E = (1/Z) Σ E e^(−βE) = −(1/Z) ∂Z/∂β = −∂ln Z/∂β

The entropy is a logarithmic measure of the number of states with significant probability
of being occupied. The Helmholtz free energy F is defined by

F = ⟨E⟩ − ST = −kB T ln Z.

The entropy S is obtained as

S = (⟨E⟩ − F)/T = kB ln Z + ⟨E⟩/T

We note that

−kB Σ p ln p = −kB Σ p (−ln Z − βE)
             = kB (ln Z + β⟨E⟩)
             = ⟨E⟩/T + kB ln Z
             = (⟨E⟩ − F)/T
             = S

Thus it follows that the entropy S is given by

S = −kB Σ p ln p.

16. Thermal average of energy fluctuation

⟨E^2⟩ − ⟨E⟩^2 = (1/Z) Σ E^2 e^(−βE) − [(1/Z) Σ E e^(−βE)]^2

Since

d⟨E⟩/dT = (dβ/dT) d⟨E⟩/dβ = −[1/(kB T^2)] d⟨E⟩/dβ

and

d⟨E⟩/dβ = (d/dβ)[(1/Z) Σ E e^(−βE)]
        = −(1/Z) Σ E^2 e^(−βE) − (1/Z^2)(dZ/dβ) Σ E e^(−βE)
        = −[⟨E^2⟩ − ⟨E⟩^2],

where

dZ/dβ = −Σ E e^(−βE) = −Z⟨E⟩,

we have

d⟨E⟩/dT = [1/(kB T^2)] [⟨E^2⟩ − ⟨E⟩^2]

Since d⟨E⟩/dT = CV, we get the relation

CV/kB = [⟨E^2⟩ − ⟨E⟩^2]/(kB T)^2
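This fluctuation relation can be checked numerically for a small system (a hypothetical three-level spectrum; kB = 1 in reduced units):

```python
import math

kB = 1.0
E_levels = [0.0, 1.0, 3.0]     # hypothetical energy spectrum

def averages(T):
    beta = 1.0 / (kB * T)
    Z = sum(math.exp(-beta * E) for E in E_levels)
    E1 = sum(E * math.exp(-beta * E) for E in E_levels) / Z
    E2 = sum(E * E * math.exp(-beta * E) for E in E_levels) / Z
    return E1, E2

T = 2.0
E1, E2 = averages(T)

# C_V = d<E>/dT by central finite difference
h = 1e-5
CV = (averages(T + h)[0] - averages(T - h)[0]) / (2 * h)

# C_V = (<E^2> - <E>^2)/(kB T^2)
assert math.isclose(CV, (E2 - E1 * E1) / (kB * T * T), rel_tol=1e-6)
```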

17. Example: 4He atom as ideal gas

We consider the 4He atom with mass

m = 4u = 6.64216 × 10^−24 g.

The number density n at P = 1 atm and T = 300 K is

n = 2.44631 × 10^19 /cm^3.

The number of atoms in the volume V = 10^3 cm^3 is

N = nV = 2.44631 × 10^22.

The internal energy is

E = (3/2) N kB T = 151.987 J.

The entropy is

S = N kB [ln(V/N) + (3/2) ln T + (3/2) ln(m kB/(2πħ^2)) + 5/2] = 5.125 J/K.

((Mathematica))
Clear["Global`*"];
rule1 = {kB → 1.3806504 10^-16, NA → 6.02214179 10^23,
   ħ → 1.054571628 10^-27, amu → 1.660538782 10^-24,
   atm → 1.01325 10^6, bar → 10^6, J → 10^7};

n1 = 2.44631 10^19; V1 = 10^3; T1 = 300;

N1 = n1 V1
  2.44631 × 10^22

m1 = 4 amu /. rule1
  6.64216 × 10^-24

P1 = kB N1 T1/V1 /. rule1
  1.01325 × 10^6

P1/bar /. rule1
  1.01325

E1 = (3/2) kB N1 T1 /. rule1
  1.51987 × 10^9

E1/J /. rule1
  151.987

S1 = kB N1 (Log[V1/N1] + (3/2) Log[T1] + (3/2) Log[m1 kB/(2 π ħ^2)] + 5/2) /. rule1
  5.12503 × 10^7

S1/J /. rule1
  5.12503

18. Link
Entropy (Wikipedia)
http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)


________________________________________________________________________
APPENDIX

Density of states for N particles for micro-canonical ensemble

Using the density of states for the 1-particle system, D1(ε) = C ε^(1/2), the density of
states for the 2-particle system is estimated as

D2(E2) = ∫0^E2 D1(ε1) D1(E2 − ε1) dε1
       = C^2 ∫0^E2 ε1^(1/2) (E2 − ε1)^(1/2) dε1
       = C^2 E2^2 ∫0^1 x^(1/2) (1 − x)^(1/2) dx = C^2 E2^2 f(2)

where E2 is the total energy of the two-particle system, and

f(n) = ∫0^1 x^(1/2) (1 − x)^(3(n−1)/2 − 1) dx = Γ(3/2) Γ(3(n−1)/2)/Γ(3n/2)

Similarly, the density of states for the 3-particle system is

D3(E3) = ∫0^E3 D1(ε1) D2(E3 − ε1) dε1
       = C^3 f(2) ∫0^E3 ε1^(1/2) (E3 − ε1)^2 dε1
       = C^3 E3^(7/2) f(2) ∫0^1 x^(1/2) (1 − x)^2 dx
       = C^3 E3^(7/2) f(2) f(3)

where E3 is the total energy of the three particles.


Suppose that Dn has the form

Dn(En) = C^n f(2) f(3) … f(n) En^(3(n−1)/2 + 1/2)

Then D_{n+1} can be obtained as

D_{n+1}(E_{n+1}) = C^n f(2) f(3) … f(n) ∫0^E_{n+1} C ε1^(1/2) (E_{n+1} − ε1)^(3(n−1)/2 + 1/2) dε1
                 = C^(n+1) f(2) f(3) … f(n) f(n+1) E_{n+1}^(3n/2 + 1/2)

Hence the density of states for the N-particle system is given by

DN(E) = C^N f(2) f(3) … f(N) E^(3(N−1)/2 + 1/2)

where E = EN.

f(2) f(3) … f(N) = [Γ(3/2)Γ(3/2)/Γ(3)] · [Γ(3/2)Γ(3)/Γ(9/2)] · [Γ(3/2)Γ(9/2)/Γ(6)] … [Γ(3/2)Γ(3(N−1)/2)/Γ(3N/2)]
                = [Γ(3/2)]^N/Γ(3N/2)

One then gets

DN(E) = C^N [Γ(3/2)]^N/Γ(3N/2) · E^(3(N−1)/2 + 1/2)

With C = (V/4π^2)(2m/ħ^2)^(3/2) for one particle,

DN(E) = [V/(4π^2)]^N (2m/ħ^2)^(3N/2) [Γ(3/2)]^N/Γ(3N/2) · E^(3N/2)/E
      = [V^N/Γ(3N/2)] [m/(2πħ^2)]^(3N/2) E^(3N/2)/E,

where

Γ(3/2) = √π/2.
The number of states whose energy lies between E and E + ΔE is given by

W(E, ΔE) = (1/N!) DN(E) ΔE
         = [V^N/(N! Γ(3N/2))] [m/(2πħ^2)]^(3N/2) E^(3N/2) (ΔE/E),

where N! is included because the N particles are indistinguishable.

This can be rewritten as

W(E, ΔE) = [V^N/(N! Γ(3N/2))] [mE/(2πħ^2)]^(3N/2) (ΔE/E),

which is exactly the same as the expression derived from the classical approach.
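The convolution step can be verified numerically: with D1(ε) = C√ε and C = 1 (a small sketch in arbitrary units), the formula predicts D2(E) = E² f(2) with f(2) = Γ(3/2)²/Γ(3) = π/8:

```python
import math

def D1(e):
    # one-particle density of states, D1(eps) = C sqrt(eps) with C = 1
    return math.sqrt(e)

def D2_numeric(E, n=200000):
    # D2(E) = integral_0^E D1(e) D1(E - e) de, midpoint rule
    de = E / n
    return sum(D1((i + 0.5) * de) * D1(E - (i + 0.5) * de) * de
               for i in range(n))

E = 3.0
f2 = math.gamma(1.5) ** 2 / math.gamma(3.0)    # Beta(3/2, 3/2) = pi/8
assert math.isclose(f2, math.pi / 8)
assert math.isclose(D2_numeric(E), E * E * f2, rel_tol=1e-4)
```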
