Statistical mechanics-1 notes 1
Maxwellian particles: particles that obey the Maxwell-Boltzmann distribution, i.e., their energies or
velocities follow Maxwell-Boltzmann statistics. Maxwellian particles are:
1. distributed according to the Maxwell-Boltzmann formulas of classical statistical mechanics, in velocity, speed, or energy;
2. common in practice: plasma systems (under some conditions, e.g., in fusion devices) are often approximated as Maxwellian plasmas;
3. of any spin, integer or half-integer, with spin-related quantum effects ignored.
Note: At high temperatures and/or low densities, both bosons and fermions behave like classical
particles — so you can use Maxwell-Boltzmann statistics, and they are effectively Maxwellian particles,
regardless of their spin.
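As an illustrative check (not part of the notes), the Maxwell-Boltzmann speed distribution can be integrated numerically and its mean speed compared with the closed form √(8k_BT/πm). The mass and temperature below are assumptions (roughly an N₂ molecule at room temperature):

```python
import math

# Maxwell-Boltzmann speed distribution:
#   f(v) = 4*pi*(m/(2*pi*kB*T))**1.5 * v**2 * exp(-m*v**2/(2*kB*T))
# Assumed values: m ~ mass of an N2 molecule, T = room temperature.
kB = 1.380649e-23   # J/K
m = 4.65e-26        # kg
T = 300.0           # K

def f(v):
    a = m / (2.0 * math.pi * kB * T)
    return 4.0 * math.pi * a**1.5 * v**2 * math.exp(-m * v**2 / (2.0 * kB * T))

dv = 0.1
vs = [i * dv for i in range(1, 60000)]      # speeds up to 6000 m/s
norm = sum(f(v) * dv for v in vs)           # should be ~1 (f is normalized)
v_mean = sum(v * f(v) * dv for v in vs)     # numerical mean speed <v>
v_mean_exact = math.sqrt(8.0 * kB * T / (math.pi * m))

print(norm, v_mean, v_mean_exact)
```

The numerical mean speed agrees with the analytic √(8k_BT/πm) to within the integration step.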
Bose-Einstein Statistics
Particles are indistinguishable, and multiple particles can occupy the same state.
Instead of distributing n particles over g states and dividing by n!, we count the number of ways
to put indistinguishable particles into distinguishable boxes (states), allowing multiple
occupancy.
In MB statistics, by contrast, we count how many ways we can arrange particles into energy levels. Since
the particles are distinguishable, order matters, so we use permutations.
Fermi-Dirac (FD) statistics: describe the distribution of particles over energy states in systems consisting
of fermions, which are particles that obey the Pauli exclusion principle (no two identical fermions can
occupy the same quantum state simultaneously). These statistics are crucial in understanding the
behavior of systems like electrons in metals, neutrons in neutron stars, and semiconductors.
Pauli Exclusion Principle: No two fermions can occupy the same quantum state.
Quantum States: Specific configurations characterized by quantum numbers (e.g., energy, spin).
Given:
Let S = f(ω). Since S = S₁ + S₂ and ω = ω₁ω₂,
f(ω₁ω₂) = f(ω₁) + f(ω₂) ………(1)
Differentiating (1) with respect to ω₁, with ω₂ held constant:
ω₂ f′(ω₁ω₂) = f′(ω₁) ………(2)
Similarly, differentiating with respect to ω₂, with ω₁ held constant:
ω₁ f′(ω₁ω₂) = f′(ω₂) ………(3)
Dividing (3) by (2):
ω₁/ω₂ = f′(ω₂)/f′(ω₁)
⇒ ω₁ f′(ω₁) = ω₂ f′(ω₂)
Since ω₁ and ω₂ are independent, each side must equal the same constant, which we call K_B:
ω f′(ω) = constant = K_B
⇒ ω df(ω)/dω = K_B
⇒ df(ω)/dω = K_B/ω
⇒ df(ω) = K_B dω/ω
Integrating both sides:
f(ω) = K_B ln(ω)
If f(ω) = S (entropy), the above equation becomes:
S = K_B ln(ω)
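A quick numerical sanity check (illustrative, not part of the derivation): f(ω) = K_B ln(ω) does satisfy the functional equation f(ω₁ω₂) = f(ω₁) + f(ω₂) that started the argument. The multiplicities below are arbitrary assumed values:

```python
import math

# Check that f(w) = kB*ln(w) satisfies f(w1*w2) = f(w1) + f(w2).
kB = 1.380649e-23  # J/K

def f(w):
    return kB * math.log(w)

w1, w2 = 3.0e5, 7.0e9   # arbitrary multiplicities (assumed values)
lhs = f(w1 * w2)
rhs = f(w1) + f(w2)
print(lhs, rhs)          # the two agree to floating-point precision
```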
H-function: there are two H-functions: (i) H(X) (information entropy) and (ii) H(t) (Boltzmann's H-function).
Information Theory Entropy – H(X)
1st Definition:
2nd Definition:
In Boltzmann’s H-theorem, the H-function is used to describe the approach of a gas towards
thermodynamic equilibrium, i.e., towards the macrostate with the most corresponding microstates.
H(t) = ∫ f(v⃗, t) ln f(v⃗, t) d³v, where f(v⃗, t) is the velocity distribution of particles at time t.
Purpose: Describes the entropy-like behavior of a gas particle distribution over time.
Function: f(v⃗,t) is the probability density (not discrete probabilities!) of finding a particle with
velocity v⃗ at time t.
Units & values: Not always non-negative, and not the same units as Shannon entropy.
H-Theorem: H-theorem is physics’ explanation of the "arrow of time" from a statistical point of view. Its
main purpose is to provide a statistical explanation for the second law of thermodynamics, specifically
the idea that entropy tends to increase over time in an isolated system. The H-theorem shows that, for
a dilute gas described by Boltzmann's equation, a certain quantity called H (related to entropy) never
increases over time. Since entropy is inversely related to H (entropy ∝ −H), this means that entropy
increases, consistent with the second law of thermodynamics. The function H is minimized when the
gas reaches equilibrium (the Maxwell-Boltzmann distribution), which corresponds to maximum entropy.
The H-theorem states that dH/dt ≤ 0.
Proof: We know that nᵢ = fᵢ δτᵢ ⇒ fᵢ = nᵢ/δτᵢ. The H-function can then be written as
H(t) = ∑ fᵢ ln(fᵢ) δτᵢ
⇒ H(t) = ∑ (nᵢ/δτᵢ) ln(nᵢ/δτᵢ) δτᵢ
⇒ H(t) = ∑ nᵢ ln(nᵢ/δτᵢ) (∵ (nᵢ/δτᵢ) δτᵢ = nᵢ)
⇒ H(t) = ∑ nᵢ (ln nᵢ − ln δτᵢ)
⇒ H(t) = ∑ (nᵢ ln nᵢ − nᵢ ln δτᵢ)
Using pᵢ = nᵢ/N and ∑ pᵢ ln pᵢ = −S/(N K_B), the first term gives
⇒ H(t) = −S/K_B + constant
⇒ K_B H(t) = −S + constant
Differentiating with respect to time:
⇒ −K_B dH(t)/dt = dS/dt ………(2)
As dS/dt ≥ 0, eq. (2) ⇒ −K_B dH(t)/dt ≥ 0
Or K_B dH(t)/dt ≤ 0
Or dH(t)/dt ≤ 0
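The monotone decrease of H can be illustrated with a toy model (an assumption, not Boltzmann's actual collision dynamics): a discrete probability distribution relaxes toward the uniform equilibrium distribution, and H = ∑ pᵢ ln pᵢ never increases along the way:

```python
import math

# Toy relaxation: p mixes toward the uniform (equilibrium) distribution each step,
# and H = sum(p_i * ln(p_i)) is non-increasing, mirroring dH/dt <= 0.
def H(p):
    return sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.7, 0.2, 0.05, 0.05]          # arbitrary initial non-equilibrium distribution
uniform = [1.0 / len(p)] * len(p)   # equilibrium: every state equally likely
lam = 0.1                           # mixing rate per step (assumed)

H_values = [H(p)]
for _ in range(50):
    p = [(1 - lam) * pi + lam * ui for pi, ui in zip(p, uniform)]
    H_values.append(H(p))

# H decreases monotonically toward its minimum ln(1/4) at equilibrium
print(H_values[0], H_values[-1], math.log(0.25))
```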
Q: The H-theorem provides a statistical explanation for the second law of thermodynamics. Entropy
does this as well, so what is the need for the H-theorem?
This term is used to describe the real system with actual particles (atoms, molecules, etc.) interacting
and evolving in time.
Imagine you have a box containing a gas of 1,000 identical particles at room temperature and fixed
volume.
It exists in a single microstate at any instant, though it keeps changing over time.
What you see in a lab: A gas with some macroscopic properties like pressure, volume,
temperature.
So, when we say "Maxwell-Boltzmann assembly", we’re talking about this collection of real gas
particles whose statistical behavior follows Maxwell-Boltzmann statistics.
e−βEi tells you how likely a system is to be found in the microstate with energy Ei.
Lower-energy states (small Ei) have higher weight — they are more probable.
Higher-energy states (large Ei) have lower weight — they are less probable.
Weight = 1 if Eᵢ = E, and 0 otherwise (microcanonical). In the microcanonical ensemble, the "partition
function" is just the count of microstates Ω, not a sum over exponentials like in canonical
ensembles. Equivalently, δ(H(p,q) − E) is the mathematical expression of the weight factor for the
microcanonical ensemble.
Weight = e^(−β(E − μN)) (for grand canonical)
Partition Function: A key mathematical quantity in many ensembles (especially the canonical ensemble)
is the partition function Z, which is the sum of the Boltzmann weight factors of all possible microstates
of the system: Z = ∑ᵢ e^(−βEᵢ). The partition function is thus the sum of all weight factors, but not in the
microcanonical ensemble: there, the "partition function" is simply the total number of accessible
microstates at the fixed energy E. Sometimes, instead of a discrete count, we use the density of states
ρ(E)ΔE if the energy levels are continuous. Then: Ω(E, V, N) = ρ(E)ΔE
Non-degenerate states
In a non-degenerate system, each energy level is unique, meaning there's only one microstate
corresponding to each energy level.
Z = ∑ᵢ e^(−βEᵢ)
In a degenerate system, some energy levels have multiple microstates associated with the same energy.
This multiplicity is called the degeneracy gᵢ, the number of states with energy Eᵢ.
Z = ∑ᵢ gᵢ e^(−βEᵢ)
Here, gi reflects how many states have the same energy Ei, increasing that level's contribution to Z.
This function is crucial because it allows us to calculate thermodynamic properties like free energy,
entropy, and heat capacity.
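A minimal sketch of the degenerate partition function, Z = ∑ gᵢ e^(−βEᵢ). The three-level system below (energies in eV, degeneracies g, and the temperature) is an assumed example, not from the notes:

```python
import math

# Partition function with degeneracies: Z = sum(g_i * exp(-E_i/(kB*T))).
kB_eV = 8.617333e-5                       # Boltzmann constant in eV/K
T = 1000.0                                # K (assumed)
levels = [(0.0, 1), (0.1, 3), (0.3, 5)]   # (E_i in eV, g_i) -- assumed example

beta = 1.0 / (kB_eV * T)
Z = sum(g * math.exp(-beta * E) for E, g in levels)

# Occupation probability of each level: g_i * exp(-beta*E_i) / Z
probs = [g * math.exp(-beta * E) / Z for E, g in levels]
print(Z, probs)
```

The gᵢ factor visibly boosts the weight of the degenerate levels relative to a non-degenerate system with the same energies.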
System: a large collection of microscopic particles (like atoms or molecules) whose collective behavior
we're trying to understand using statistical methods.
The term "assembly" is used more generally to describe the collection of systems, while "ensemble" is
used to refer to a mathematical construct for studying these systems.
Definition: In physics, and especially in statistical mechanics, an ensemble is a large collection of virtual
copies of a system, considered all at once, where each copy represents a possible state the system could
be in, according to certain physical conditions.
Imagine you're studying a gas in a box — but instead of just one box, you imagine infinitely many
identical boxes, each representing the gas with:
the same macroscopic properties (like temperature, volume, number of particles), but
possibly different microscopic arrangements (like positions and velocities of the particles)
When you observe one physical system over a long time, you can (sometimes) gather enough
info about all its possible states — this is called the time average.
But in statistical mechanics, it's hard to follow one system over time.
So instead, we imagine an ensemble — many virtual copies of the system, each in a different
allowed state at the same moment.
Then we calculate the ensemble average — the average over all these possible states.
examples:
The individual systems of canonical ensembles are separated by rigid, impermeable but
conducting walls.
As the separating walls are conducting, heat can be exchanged by the systems. As a result, all the
systems will arrive at a common temperature (T).
You have a system: a box with a gas of N = 1000 particles.
The box is in thermal contact with a heat reservoir (like a giant thermal bath).
The temperature T, volume V, and number of particles N are all fixed.
But the energy of the system is not fixed — it can fluctuate as heat is exchanged with the
reservoir.
Instead of just this one gas box, you imagine infinitely many identical copies of it: a huge
number of boxes, each obeying the same macroscopic conditions (same T, N, V).
Each box is in a different microstate: the particles have different positions and velocities in each
copy.
The collection of these boxes is called a canonical ensemble.
System can exchange energy with a heat bath, but not particles.
Where:
β = 1/(K_B T)
Z(T, V, N) = (1/(h^(3N) N!)) ∫ e^(−βH(p,q)) d^(3N)p d^(3N)q
C_V = ∂⟨E⟩/∂T, and pressure (for gases): P = −(∂F/∂V)_T
E₀ = 0
E₁ = ϵ = 2 eV
T = 300 K
Z(T, V, N) = ∑ᵢ e^(−βEᵢ) = e^(−βE₀) + e^(−βE₁) = 1 + e^(−77.4)
Z ≈ 1
⟨E⟩ = (1/Z)(E₀ e^(−βE₀) + E₁ e^(−βE₁)) = (1/1)(0 + 2e^(−77.4)) ≈ 0
Interpretation: At room temperature, the system almost never reaches the excited state. It stays in the
ground state.
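The two-level estimate above can be reproduced directly (E₀ = 0, E₁ = 2 eV, T = 300 K, so βϵ ≈ 77.4):

```python
import math

# Two-level system from the example: Z = 1 + exp(-beta*eps), <E> = eps*exp(-beta*eps)/Z.
kB_eV = 8.617333e-5                 # Boltzmann constant in eV/K
T = 300.0                           # K
eps = 2.0                           # eV

beta = 1.0 / (kB_eV * T)
x = beta * eps                      # ~77.4
Z = 1.0 + math.exp(-x)              # e^{-beta*E0} + e^{-beta*E1}
E_avg = (0.0 + eps * math.exp(-x)) / Z
print(x, Z, E_avg)                  # x ~ 77.4, Z ~ 1, <E> ~ 0
```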
2. Microcanonical Ensemble: it consists of a large number of essentially independent systems having the
same energy, volume, and number of particles. The individual systems of the microcanonical ensemble
are separated by rigid, impermeable, and well-insulated walls, such that the values of E, V, and N for a
particular system are not affected by the presence of other systems.
All accessible states have the same energy and are equally probable.
Entropy: S=kBlnΩ
Fluctuating quantities: none (E, V, and N are all fixed).
Ω (E, V, N). This represents the number of microstates accessible to the system at exactly energy E.
ω(E, V, N) = (1/(h^(3N) N!)) ∫ δ(H(p,q) − E) d^(3N)p d^(3N)q
Or ω(E, V, N) = V^N (2πmE)^(3N/2) / (h^(3N) N! Γ(3N/2)),
where ∫ d^(3N)q = V^N and ∫ δ(H − E) d^(3N)p = (2πmE)^(3N/2) / Γ(3N/2)
Use: The most direct use of the microcanonical partition function is to calculate the entropy:
S (E, V, N) =kBlnΩ
It tells you the amount of disorder or the number of accessible microstates at energy E.
Example: N = 2
V = 1.0 m³
h = 6.626×10⁻³⁴ J·s
m = 4.65×10⁻²⁶ kg
E = 1.0 J
N! = 2
Γ(3N/2) = Γ(3) = 2! = 2
2πmE = 2 · π · 4.65×10⁻²⁶ · 1 = 2.922×10⁻²⁵
Now raise it to the power 3N/2 = 3:
(2πmE)^(3N/2) = (2.922×10⁻²⁵)³ = 2.50×10⁻⁷⁴
and h^(3N) = (6.626×10⁻³⁴)⁶ = 8.46×10⁻²⁰⁰
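Completing the worked numbers above in code (SI units assumed for the given values: V in m³, E in J, m in kg), ω follows from ω = V^N (2πmE)^(3N/2) / (h^(3N) N! Γ(3N/2)):

```python
import math

# omega(E,V,N) = V**N * (2*pi*m*E)**(3N/2) / (h**(3N) * N! * Gamma(3N/2))
# with the values from the notes: N=2, V=1 m^3, E=1 J, m=4.65e-26 kg.
N = 2
V = 1.0          # m^3
E = 1.0          # J
m = 4.65e-26     # kg
h = 6.626e-34    # J*s

num = V**N * (2.0 * math.pi * m * E) ** (3 * N / 2)             # (2*pi*m*E)^3
den = h ** (3 * N) * math.factorial(N) * math.gamma(3 * N / 2)  # h^6 * 2! * Gamma(3)
omega = num / den
print(num, den, omega)
```

The enormous result (~10¹²⁵ microstates) is typical: S = K_B ln ω then comes out to an ordinary macroscopic entropy.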
Grand Canonical Ensemble:
Z(T, V, μ) = ∑_{N=0}^{∞} e^(βμN) Z_N(T, V)
Where:
β = 1/(K_B T)
It is a weighted sum over all particle numbers N, where each term is the canonical partition function Z_N,
weighted by e^(βμN), the N-th power of the fugacity e^(βμ).
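A minimal sketch of this sum for the simplest case (an assumed example, not from the notes): a single adsorption site that can hold 0 or 1 particle of energy ε, so Z₀ = 1, Z₁ = e^(−βε), and the sum truncates after two terms:

```python
import math

# Grand-canonical sum for one adsorption site holding 0 or 1 particle:
#   Xi = sum_N e^{beta*mu*N} Z_N = 1 + e^{beta*(mu - eps)}
kB_eV = 8.617333e-5     # eV/K
T = 300.0               # K
eps = 0.1               # site energy in eV (assumed)
mu = 0.05               # chemical potential in eV (assumed)

beta = 1.0 / (kB_eV * T)
Xi = 1.0 + math.exp(beta * (mu - eps))

# Average occupation <N> = (1/Xi) * sum_N N * e^{beta*mu*N} * Z_N
N_avg = math.exp(beta * (mu - eps)) / Xi
print(Xi, N_avg)
```

Raising μ toward ε pushes the average occupation ⟨N⟩ toward 1, which is how the chemical potential controls particle exchange with the reservoir.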
Points to be noted:
(i) Microcanonical: all microstates are equally probable, and pᵢ ∝ 1/Ω.
(ii) Canonical: the microstates are not all equally probable; pᵢ = e^(−βEᵢ)/Z.
(iii) If the energy of the i-th state is Eᵢ and its probability is Pᵢ, its contribution to the average is PᵢEᵢ,
and the average energy of the ensemble is ∑ PᵢEᵢ.
Q: Why does the partition function of the microcanonical ensemble not have a simple form like the
canonical and grand canonical partition functions?
Ans:
Canonical and grand canonical ensembles involve exponential functions (easy to integrate)
The microcanonical partition function counts the number of microstates with exactly energy E
→ It involves a delta function: δ(H−E)
The canonical and grand canonical partition functions use an exponential weighting,
e^(−βE), which is smoother and easier to integrate or sum.
Partition Function in Statistical Mechanics:
The partition function is a mathematical function that encodes the statistical properties of a system in
thermodynamic equilibrium. It serves as the cornerstone of the canonical ensemble.
Derivation of β=1/KT
Now, to establish the connection between β and T, we look at the partition function and internal energy
from statistical mechanics.
For a system described by the canonical ensemble, the average energy E is related to the partition
function Z as follows:
E = −∂lnZ/∂β, with β to be identified.
We know dE = TdS
⇒ dE/dS = T (∵ dE = TdS)
OR ∂S/∂E = 1/T ………(1)
Also, we know that
S = KβE + KN lnZ
⇒ ∂S/∂E = Kβ ………(2)
Now comparing (1) & (2) we get
β = 1/(KT)
Conclusion:
The parameter β is the inverse temperature, and it plays a central role in the connection between
thermodynamic and statistical mechanical descriptions of a system.
ln ω = ln(N!/∏ nᵢ!)
⇒ ln ω = ln N! − ∑ ln nᵢ!
⇒ ln ω = N ln N − N − ∑ nᵢ ln nᵢ + ∑ nᵢ (Stirling’s approximation)
⇒ ln ω = N ln N − N − ∑ nᵢ ln nᵢ + N (∵ ∑ nᵢ = N)
Put nᵢ = (N/Z) e^(−βEᵢ) inside the logarithm:
ln ω = N ln N − ∑ nᵢ ln((N/Z) e^(−βEᵢ))
⇒ ln ω = N ln N − ∑ nᵢ (ln N − βEᵢ − ln Z)
⇒ ln ω = N ln N − ∑ nᵢ ln N + β ∑ nᵢ Eᵢ + ∑ nᵢ ln Z
⇒ ln ω = βE + N ln Z (∵ ∑ nᵢ = N and ∑ nᵢEᵢ = E)
⇒ P = −(∂(−K_B T lnZ)/∂V)_T
⇒ P = K_B T (∂lnZ/∂V)_T
This is a fundamental equation in statistical mechanics that allows you to compute the pressure from the
partition function, provided Z depends on volume (which it usually does, e.g., in an ideal gas).
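This can be checked numerically for an ideal gas, where lnZ = N ln V + (V-independent terms), so the derivative reproduces P = N K_B T/V. The particle number and volume below are assumptions:

```python
import math

# Pressure from the partition function: P = kB*T*(d lnZ / dV)_T.
# For an ideal gas, only the N*ln(V) part of lnZ depends on V.
kB = 1.380649e-23     # J/K
T = 300.0             # K
N = 6.022e20          # particles (assumed, ~1 mmol)
V = 1.0e-3            # m^3 (assumed, 1 litre)

def lnZ(V):
    return N * math.log(V)          # V-dependent part of lnZ only

dV = 1.0e-9
dlnZ_dV = (lnZ(V + dV) - lnZ(V - dV)) / (2.0 * dV)   # central finite difference
P_numeric = kB * T * dlnZ_dV
P_exact = N * kB * T / V            # ideal-gas law, N*kB*T/V
print(P_numeric, P_exact)
```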
S = −∂F/∂T = −∂(−K_B T lnZ)/∂T
⇒ S = K_B lnZ + K_B T ∂lnZ/∂T ………(3)
We know ∂lnZ/∂β = −U
It can be written as (∂lnZ/∂T)·(∂T/∂β) = −U
Since β = 1/(K_B T), ∂β/∂T = −1/(K_B T²), i.e. ∂T/∂β = −K_B T², so
⇒ (∂lnZ/∂T)(−K_B T²) = −U
⇒ ∂lnZ/∂T = U/(K_B T²)
⇒ T ∂lnZ/∂T = U/(K_B T)
Put the value of T ∂lnZ/∂T = U/(K_B T) in eq (3); we get
S = K_B lnZ + K_B T ∂lnZ/∂T
⇒ S = K_B lnZ + K_B U/(K_B T)
⇒ S = K_B lnZ + U/T = K_B lnZ + K_B βU (∵ β = 1/(K_B T))
This links entropy directly to the partition function and internal energy in the canonical ensemble.
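As a numerical check (on an assumed two-level system, energies chosen near the K_B T scale so both states matter), S = K_B lnZ + U/T agrees with the Gibbs form S = −K_B ∑ pᵢ ln pᵢ:

```python
import math

# Compare S = kB*lnZ + U/T against the Gibbs entropy -kB*sum(p_i*ln(p_i)).
kB = 1.380649e-23    # J/K
T = 300.0            # K
E = [0.0, 4.0e-21]   # level energies in J (assumed, ~kB*T scale)

beta = 1.0 / (kB * T)
weights = [math.exp(-beta * e) for e in E]
Z = sum(weights)
p = [w / Z for w in weights]

U = sum(pi * ei for pi, ei in zip(p, E))           # U = sum p_i E_i
S_formula = kB * math.log(Z) + U / T               # the result derived above
S_gibbs = -kB * sum(pi * math.log(pi) for pi in p)
print(S_formula, S_gibbs)                          # the two expressions agree
```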
F = U − TS ………(1)
We also know
S = K_B lnZ + U/T ……………(2)
Put the value of S in (1); we get
F = U − T (K_B lnZ + U/T)
⇒ F = U − T K_B lnZ − U
⇒ F = −K_B T lnZ
This expression is foundational in statistical mechanics because it connects microscopic properties (via Z)
to a macroscopic thermodynamic potential F, from which many other quantities (entropy, pressure,
energy) can be derived.
Q: Prove that the internal energy is U = −∂lnZ/∂β.
PROOF: The internal energy is the average energy of the system:
U = ∑ PᵢEᵢ (∵ if the energy of the i-th state is Eᵢ with probability Pᵢ, the average energy of the
ensemble is ∑ PᵢEᵢ)
With Pᵢ = e^(−βEᵢ)/Z:
U = (1/Z) ∑ Eᵢ e^(−βEᵢ) = −(1/Z) ∂Z/∂β = −∂lnZ/∂β
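The identity U = −∂lnZ/∂β can be verified by a finite difference on an assumed three-level system (energies and β in arbitrary matching units):

```python
import math

# Check U = -d(lnZ)/d(beta) against the direct average sum(P_i * E_i).
E = [0.0, 1.0, 2.5]      # energies in arbitrary units (assumed)
beta = 1.3               # inverse temperature in matching units (assumed)

def lnZ(b):
    return math.log(sum(math.exp(-b * e) for e in E))

# Direct ensemble average: U = sum(P_i * E_i) with P_i = e^{-beta*E_i}/Z
Z = sum(math.exp(-beta * e) for e in E)
U_avg = sum(e * math.exp(-beta * e) for e in E) / Z

db = 1.0e-6
U_deriv = -(lnZ(beta + db) - lnZ(beta - db)) / (2.0 * db)   # central difference
print(U_avg, U_deriv)    # the two agree to finite-difference accuracy
```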