Statistical mechanics-1 notes 1

The document provides an overview of statistical mechanics, focusing on Maxwellian particles, Bose-Einstein, and Fermi-Dirac statistics. It explains the characteristics of these particles, their statistical behaviors, and the significance of the partition function in thermodynamics. Additionally, it discusses the H-theorem and its relation to entropy, along with the definitions of assembly and ensemble in the context of statistical mechanics.


Statistical mechanics-I NOTES

Maxwellian particles: particles that obey the Maxwell-Boltzmann distribution, i.e. their
energies or velocities follow Maxwell-Boltzmann statistics. Maxwellian particles are:

1. Classical – quantum effects (like wavefunction overlap) are negligible.

2. Distinguishable – each particle is treated as unique.

3. Non-interacting (or weakly interacting) – their behavior can be described independently.

4. Obeying the Maxwell-Boltzmann distribution – i.e., their velocities, speeds, or energies are
distributed according to the formulas from classical statistical mechanics.

Examples of Maxwellian Systems

1. Ideal gases at relatively high temperatures and low densities.


2. Air molecules at room temperature.
3. Noble gases like helium or argon in a container.

4. Plasma systems (under some conditions, e.g., in fusion devices) are often approximated as
Maxwellian plasmas.

5. They can have any spin (integer or half-integer); spin-related quantum effects are ignored.

Note: At high temperatures and/or low densities, both bosons and fermions behave like classical
particles — so you can use Maxwell-Boltzmann statistics, and they are effectively Maxwellian particles,
regardless of their spin.
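The Maxwell-Boltzmann speed distribution that such particles obey can be sketched numerically. This is only an illustrative check (the mass is roughly that of an N₂-like molecule; the normalization is verified by crude trapezoidal integration):

```python
import math

def maxwell_speed_pdf(v, m, T, kB=1.380649e-23):
    """Maxwell-Boltzmann speed distribution f(v) for a classical gas (SI units)."""
    a = m / (2.0 * kB * T)
    return 4.0 * math.pi * (a / math.pi) ** 1.5 * v**2 * math.exp(-a * v**2)

# Illustrative values: m is roughly the mass of a nitrogen-like molecule
m, T, kB = 4.65e-26, 300.0, 1.380649e-23
v_p = math.sqrt(2 * kB * T / m)   # most probable speed (peak of f(v))

# crude normalization check: trapezoidal integration from 0 to 6000 m/s
vs = [i * 5.0 for i in range(1200)]
area = sum(0.5 * (maxwell_speed_pdf(vs[i], m, T) + maxwell_speed_pdf(vs[i + 1], m, T)) * 5.0
           for i in range(len(vs) - 1))
print(round(v_p), round(area, 3))
```

The integrated probability comes out very close to 1, confirming the distribution is normalized.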

Bose-Einstein Statistics

 Particles are indistinguishable, and multiple particles can occupy the same state.

 Instead of distributing n particles over g states and dividing by n! we count the number of ways
to put indistinguishable particles into distinguishable boxes (states), allowing multiple
occupancy.

Main difference among the three statistics:

Which particle is in which state (MB statistics)

How many particles per state (BE statistics)

How many particles per state, with a maximum of 1 (FD statistics)

Q: Why is Permutation Used in MB?

ANS: In MB statistics, we count how many ways we can arrange particles into energy levels. Since the
particles are distinguishable, order matters, so we use permutations.
Fermi-Dirac (FD) statistics: describe the distribution of particles over energy states in systems consisting
of fermions, which are particles that obey the Pauli exclusion principle (no two identical fermions can
occupy the same quantum state simultaneously). These statistics are crucial in understanding the
behavior of systems like electrons in metals, neutrons in neutron stars, and semiconductors.

 Fermions: Particles with half-integer spin (e.g., electrons, protons, neutrons).

 Pauli Exclusion Principle: No two fermions can occupy the same quantum state.

 Quantum States: Specific configurations characterized by quantum numbers (e.g., energy, spin).

Bose-Einstein (BE) statistics uses COMBINATION, not permutation.

Fermi-Dirac (FD) statistics uses COMBINATION, not permutation.

MAXWELL-BOLTZMANN (MB) statistics uses PERMUTATION, not combination.
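The three counting rules above can be checked for a tiny system. This sketch uses the standard formulas (gⁿ for MB, C(n+g−1, n) for BE, C(g, n) for FD), which follow from the permutation/combination distinction described in the text:

```python
from math import comb

def count_MB(n, g):
    # distinguishable particles, any occupancy: which particle is in which state
    return g ** n

def count_BE(n, g):
    # indistinguishable particles, any occupancy: stars-and-bars combination
    return comb(n + g - 1, n)

def count_FD(n, g):
    # indistinguishable particles, max one per state: choose which states are filled
    return comb(g, n)

# 2 particles in 3 states
print(count_MB(2, 3), count_BE(2, 3), count_FD(2, 3))  # 9 6 3
```

Note that FD counting gives zero whenever n > g, reflecting the Pauli exclusion principle.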

Statistical interpretation of thermodynamic probability

Given:

Let S = f(ω). Since entropies add (S = S₁ + S₂) while thermodynamic probabilities multiply (ω = ω₁ω₂):

f(ω₁ω₂) = f(ω₁) + f(ω₂) ………… (1)

Differentiating (1) with respect to ω₁, keeping ω₂ constant:

ω₂ f′(ω₁ω₂) = f′(ω₁) ………… (2)

Differentiating (1) with respect to ω₂, keeping ω₁ constant:

ω₁ f′(ω₁ω₂) = f′(ω₂) ………… (3)

Dividing (3) by (2), we get:

ω₁/ω₂ = f′(ω₂)/f′(ω₁)

⇒ ω₁ f′(ω₁) = ω₂ f′(ω₂)

Since ω₁ and ω₂ are independent, each side must equal the same constant:

ω f′(ω) = constant = K_B

⇒ df(ω)/dω = K_B/ω

⇒ df(ω) = K_B dω/ω

Integrating both sides:

f(ω) = K_B ln(ω)

If f(ω) = S (entropy), the above equation becomes:

S = K_B ln(ω)
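The property that drives this derivation (probabilities multiply, entropies add) can be checked numerically for S = K_B ln ω. A minimal sketch with arbitrary illustrative values of ω:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def S(omega):
    """Entropy from the thermodynamic probability: S = kB ln(omega)."""
    return kB * math.log(omega)

# Two independent subsystems: thermodynamic probabilities multiply,
# entropies add, which is exactly f(w1*w2) = f(w1) + f(w2)
w1, w2 = 1e20, 5e12
print(math.isclose(S(w1 * w2), S(w1) + S(w2), rel_tol=1e-12))  # True
```

Only the logarithm turns a product into a sum, which is why the derivation forces f to be logarithmic.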

H-function: There are two H-functions: (i) H(x) (information entropy) and (ii) H(t) (Boltzmann's H-function).
Information Theory Entropy – H(X)

1st Definition:

H(x) = −Σᵢ p(xᵢ) log p(xᵢ)

i. Context: Information theory (Claude Shannon).


ii. Purpose: Measures the uncertainty or information content of a discrete random
variable x.
iii. Interpretation: How unpredictable or "random" a source is. Higher entropy → more
uncertainty.
iv. Probabilities: p(xi) is the probability of outcome xi.
v. Logarithm base: Often base 2 (bits), but sometimes natural log (nats).
vi. Always non-negative, with 0 meaning no uncertainty (one outcome is certain).
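The formula H(x) = −Σ p(xᵢ) log p(xᵢ) can be computed directly. A minimal sketch (base 2, so the result is in bits):

```python
import math

def shannon_entropy(probs, base=2):
    """H(X) = -sum p(x) log p(x); base 2 gives bits, math.e gives nats."""
    # terms with p = 0 contribute nothing (lim p->0 of p log p is 0)
    return sum(-p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))             # fair coin: 1 bit
print(shannon_entropy([1.0]))                  # certain outcome: 0 bits
print(round(shannon_entropy([0.9, 0.1]), 3))   # biased coin: less than 1 bit
```

A fair coin maximizes the uncertainty over two outcomes; any bias lowers H below 1 bit, and a certain outcome gives H = 0, matching points iii and vi above.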

2nd Definition:
In Boltzmann’s H-theorem, the H-function is used to describe the approach of a gas towards
thermodynamic equilibrium, i.e. towards the macrostate with the most microstates.

H(t) = ∫ f(v⃗, t) ln f(v⃗, t) d³v, where f(v⃗, t) is the velocity distribution of particles at time t.

 Context: Statistical mechanics (Ludwig Boltzmann).

 Purpose: Describes the entropy-like behavior of a gas particle distribution over time.

 Interpretation: As time progresses, this function tends to decrease (according to Boltzmann’s H-


theorem), leading toward equilibrium (maximum entropy).

 Function: f(v⃗,t) is the probability density (not discrete probabilities!) of finding a particle with
velocity v⃗ at time t.

 Units & values: Not always non-negative, and not the same units as Shannon entropy.

 Natural log is always used.

H-Theorem: H-theorem is physics’ explanation of the "arrow of time" from a statistical point of view. Its
main purpose is to provide a statistical explanation for the second law of thermodynamics, specifically
the idea that entropy tends to increase over time in an isolated system. The H-theorem shows that, for
a dilute gas described by Boltzmann's equation, a certain quantity called H (related to entropy) never
increases over time. Since entropy is inversely related to H (entropy ∝ -H), this means that entropy
increases, consistent with the second law of thermodynamics. The function H is minimized when the
gas reaches equilibrium (the Maxwell-Boltzmann distribution), which corresponds to maximum entropy.

The H-theorem states that dH/dt ≤ 0.

Proof: We know that nᵢ = fᵢ δτᵢ ⇒ fᵢ = nᵢ/δτᵢ. The H-function can be written as

H(t) = Σ fᵢ ln(fᵢ) δτᵢ

⇒ H(t) = Σ (nᵢ/δτᵢ) ln(nᵢ/δτᵢ) δτᵢ     ∵ fᵢ = nᵢ/δτᵢ

⇒ H(t) = Σ nᵢ ln(nᵢ/δτᵢ)

⇒ H(t) = Σ nᵢ (ln nᵢ − ln δτᵢ)

⇒ H(t) = Σ (nᵢ ln nᵢ − nᵢ ln N + nᵢ ln N − nᵢ ln δτᵢ)     ∵ add & subtract nᵢ ln N

⇒ H(t) = Σ nᵢ (ln nᵢ − ln N) + ln N Σ nᵢ − ln δτ Σ nᵢ     (taking equal cells, δτᵢ = δτ)

⇒ H(t) = N Σ (nᵢ/N) ln(nᵢ/N) + N ln N − N ln δτ     ∵ Σ nᵢ = N

⇒ H(t) = N Σ (nᵢ/N) ln(nᵢ/N) + constant ………… (1)     where N ln N − N ln δτ = constant

If pᵢ = nᵢ/N is the probability of particles being in the iᵗʰ cell of phase space, eq (1) becomes

H(t) = N Σ pᵢ ln pᵢ + constant

⇒ H(t) = −N · S/(N K_B) + constant     ∵ Σ pᵢ ln pᵢ = −S/(N K_B)

⇒ H(t) = −S/K_B + constant

⇒ K_B H(t) = −S + constant

Differentiating with respect to t:

⇒ −K_B dH(t)/dt = dS/dt ………… (2)

As dS/dt ≥ 0, eq (2) gives −K_B dH(t)/dt ≥ 0

Or K_B dH(t)/dt ≤ 0

Or dH(t)/dt ≤ 0
Q: The H-theorem provides a statistical explanation for the second law of thermodynamics. Entropy does
this as well, so what is the need for the H-theorem?

ANSWER: H-theorem doesn't replace entropy — it justifies it.

Chapter no 3: formulation of statistical methods


Assembly

Definition: An assembly is the actual physical collection of many particles or subsystems being studied.

This term is used to describe the real system with actual particles (atoms, molecules, etc.) interacting
and evolving in time.

Often used when talking about Bose-Einstein, Fermi-Dirac, or Maxwell-Boltzmann assemblies —


referring to the way particles are statistically distributed according to their quantum behavior.

Example: Ideal Gas in a Box

Imagine you have a box containing a gas of 1,000 identical particles at room temperature and fixed
volume.

 This is the actual gas in the box.

 It has real atoms bouncing around, colliding, exchanging energy.

 It exists in a single microstate at any instant, though it keeps changing over time.

 What you see in a lab: A gas with some macroscopic properties like pressure, volume,
temperature.

So, when we say "Maxwell-Boltzmann assembly", we’re talking about this collection of real gas
particles whose statistical behavior follows Maxwell-Boltzmann statistics.

Q: What is meant by the weight factor e^{−βEᵢ}?

e^{−βEᵢ} tells you how likely a system is to be found in the microstate with energy Eᵢ.

 Lower-energy states (small Ei) have higher weight — they are more probable.

 Higher-energy states (large Ei) have lower weight — they are less probable.

 Each ensemble has its own weight factor

 Weight = 1 if Eᵢ = E, 0 otherwise (microcanonical). In the microcanonical ensemble, the "partition function" is just the count of microstates Ω, not a sum over exponentials as in the canonical ensemble. Equivalently, δ(H(p,q) − E) is the mathematical expression of the weight factor for the microcanonical ensemble.

 Weight = e^{−β(E−μN)} (grand canonical)

Partition Function: A key mathematical quantity in many ensembles (especially the canonical ensemble)
is the partition function Z, which is the sum of the Boltzmann weight factors of all possible microstates of the system:
Z = Σ e^{−βEᵢ}. The partition function is just the sum of all weight factors, but not in the microcanonical
ensemble. In the microcanonical ensemble, the "partition function" is simply the total number of
accessible microstates at that fixed energy E. Sometimes, instead of a discrete count, we use the density
of states ρ(E) ΔE if the energy levels are continuous. Then: Ω(E, V, N) = ρ(E) ΔE

Non-degenerate states

In a non-degenerate system, each energy level is unique, meaning there's only one microstate
corresponding to each energy level.

The partition function Z is given by:

Z = Σ e^{−βEᵢ}

in a degenerate system, some energy levels have multiple microstates associated with the same energy.
This multiplicity is called degeneracy gi, the number of states with energy Ei.

The partition function becomes:

Z = Σ gᵢ e^{−βEᵢ}

Here, gi reflects how many states have the same energy Ei, increasing that level's contribution to Z.

This function is crucial because it allows us to calculate thermodynamic properties like free energy,
entropy, and heat capacity.
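The degenerate partition function Z = Σ gᵢ e^{−βEᵢ} can be sketched in a few lines. The three-level system below is hypothetical (illustrative energies in eV and degeneracies), not taken from the notes:

```python
import math

kB_eV = 8.617e-5  # Boltzmann constant in eV/K

def partition_function(levels, T):
    """Z = sum of g_i * exp(-E_i / (kB*T)) over (E_i, g_i) pairs."""
    beta = 1.0 / (kB_eV * T)
    return sum(g * math.exp(-beta * E) for E, g in levels)

# hypothetical three-level system: (energy in eV, degeneracy)
levels = [(0.0, 1), (0.05, 3), (0.10, 5)]
Z = partition_function(levels, 300.0)
print(round(Z, 3))
```

Each gᵢ simply multiplies that level's Boltzmann factor, so a degenerate level contributes more to Z than a non-degenerate one at the same energy.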

System: a large collection of microscopic particles (like atoms or molecules) whose collective behavior
we're trying to understand using statistical methods.

Assembly: definition: the physical collection of systems being studied.

Ensemble: definition: the virtual collection of systems being studied.

The term "assembly" is used more generally to describe the collection of systems, while "ensemble" is
used to refer to a mathematical construct for studying these systems.

Definition: In physics and especially in statistical mechanics, an ensemble is a large collection of virtual
copies of a system, considered all at once, where each copy represents a possible state, the system could
be in — according to certain physical conditions.

Imagine you're studying a gas in a box — but instead of just one box, you imagine infinitely many
identical boxes, each representing the gas with:

 the same macroscopic properties (like temperature, volume, number of particles), but
 possibly different microscopic arrangements (like positions and velocities of the particles)

 When you observe one physical system over a long time, you can (sometimes) gather enough
info about all its possible states — this is called the time average.
 But in statistical mechanics, it's hard to follow one system over time.
 So instead, we imagine an ensemble — many virtual copies of the system, each in a different
allowed state at the same moment.
 Then we calculate the ensemble average — the average over all these possible states.

examples:

1. Canonical Ensemble: It is the collection of a large number of essentially independent systems


having the same temperature (T), volume (V), and same number of particles (N).

The individual systems of canonical ensembles are separated by rigid, impermeable but
conducting walls.
As the separating walls are conducting, heat can be exchanged by the systems. As a result, all the
systems will arrive at a common temperature (T).
 You have a system: a box with a gas of N = 1000 particles.
 The box is in thermal contact with a heat reservoir (like a giant thermal bath).
 The temperature T, volume V, and number of particles N are all fixed.
 But the energy of the system is not fixed — it can fluctuate as heat is exchanged with the
reservoir.
 Instead of just this one gas box, you imagine infinitely many identical copies of it — a huge
number of boxes, each obeying the same macroscopic conditions (same T, N, V,).
 Each box is in a different microstate: the particles have different positions and velocities in each
copy.
 The collection of these boxes is called a canonical ensemble.

 System can exchange energy with a heat bath, but not particles.

 Energy fluctuates; temperature is fixed.


 Probability of a state with energy Eᵢ: Pᵢ = e^{−Eᵢ/K_B T}/Z, where Z = Σ e^{−Eᵢ/K_B T} is the partition function

 Fixed Quantities are T, V, N

 Fluctuate: energy only

 Distribution: Boltzmann distribution

Canonical partition function: The canonical partition function is defined as:

Z(T, V, N) = Σ e^{−βEᵢ}

Where:

 β=1/kBT

 Ei is the energy of the ith microstate

 The sum runs over all possible microstates

For classical systems, this becomes an integral over phase space:

Z(T, V, N) = (1/(h^{3N} N!)) ∫ e^{−βH(p,q)} d^{3N}p d^{3N}q

Use: from the canonical partition function Z we can compute:

Helmholtz free energy F = −K_B T lnZ, average energy ⟨E⟩ = −∂lnZ/∂β, entropy S = −(∂F/∂T)_V, heat capacity C_V = ∂⟨E⟩/∂T, and pressure (for gases) P = −(∂F/∂V)_T.

EXAMPLE: find partition function and average energy.

 E0=0

 E1=ϵ=2 eV

 T=300 K

 β = 1/(k_B T) ≈ 1/(8.617×10⁻⁵ × 300) ≈ 38.7 eV⁻¹

Now compute the partition function:

Z = e^{−βE₀} + e^{−βE₁} = 1 + e^{−(38.7)(2)} = 1 + e^{−77.4} ≈ 1

Because e^{−77.4} is essentially zero, we get:

Z ≈ 1

Computing the average energy:

⟨E⟩ = (1/Z)(E₀ e^{−βE₀} + E₁ e^{−βE₁}) = (1/1)(0 + 2e^{−77.4}) ≈ 0
Interpretation: At room temperature, the system almost never reaches the excited state. It stays in the
ground state.
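The worked example above can be reproduced directly in code, using the same numbers (energies in eV):

```python
import math

kB = 8.617e-5          # Boltzmann constant, eV/K
E0, E1 = 0.0, 2.0      # level energies from the example, eV
T = 300.0
beta = 1.0 / (kB * T)  # ~38.7 per eV

# partition function of the two-level system
Z = math.exp(-beta * E0) + math.exp(-beta * E1)
# average energy as the Boltzmann-weighted mean
E_avg = (E0 * math.exp(-beta * E0) + E1 * math.exp(-beta * E1)) / Z

print(Z, E_avg)  # Z is essentially 1, <E> essentially 0
```

The excited-state Boltzmann factor e^{−77.4} is below double-precision resolution, so numerically Z = 1 and ⟨E⟩ ≈ 0, matching the interpretation that the system stays in the ground state.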

2. Microcanonical Ensemble: It consists of a large number of essentially independent systems having the
same energy, volume, and number of particles. The individual systems of the microcanonical ensemble are
separated by rigid, impermeable and well-insulated walls, such that the values of E, V, and N for a
particular system are not affected by the presence of the other systems.

 Used for isolated systems (no exchange of energy or particles).

 All accessible states have the same energy and are equally probable.

 Entropy: S=kBlnΩ

 Fixed Quantities are E, V, N

 Fluctuate: nothing

 Distribution: Equal probability for all states

Partition Function of micro canonical ensemble:


In the microcanonical ensemble, the system is isolated, meaning it has a fixed energy (E), volume (V),
and number of particles (N). Unlike the canonical ensemble, where the partition function is a sum over
states weighted by e−βE, in the microcanonical ensemble, all accessible microstates with energy E are
considered equally probable. In this ensemble, the “partition function” is better known as the density of
states or the multiplicity function, denoted by:

Ω (E, V, N). This represents the number of microstates accessible to the system at exactly energy E.

Ω(E, V, N) = (1/(h^{3N} N!)) ∫ δ(H(p,q) − E) d^{3N}p d^{3N}q

Or Ω(E, V, N) = (V^N/(h^{3N} N!)) · (2πmE)^{3N/2}/Γ(3N/2)

where ∫ d^{3N}q = V^N and ∫ δ(H − E) d^{3N}p = (2πmE)^{3N/2}/Γ(3N/2)
Where:

 H(p,q) is the Hamiltonian (total energy) of the system,

 δ is the Dirac delta function ensuring energy is exactly E,

 h is Planck's constant to make the expression dimensionless (important in statistical mechanics),

 N! accounts for indistinguishability of particles.

Use: The most direct use of the microcanonical partition function is to calculate the entropy:

S (E, V, N) =kBlnΩ

This is the fundamental thermodynamic quantity in the microcanonical ensemble.

It tells you the amount of disorder or the number of accessible microstates at energy E.
 Example: N = 2, V = 1.0 m³, h = 6.626×10⁻³⁴ J·s, m = 4.65×10⁻²⁶ kg, E = 1.0 J

N! = 2 and Γ(3N/2) = Γ(3) = 2! = 2

2πmE = 2π × 4.65×10⁻²⁶ × 1.0 = 2.922×10⁻²⁵

Raising it to the power 3N/2 = 3:

(2πmE)^{3N/2} = (2.922×10⁻²⁵)³ = 2.49×10⁻⁷⁴

h^{3N} = (6.626×10⁻³⁴)⁶ = 8.47×10⁻²⁰⁰, so h^{3N} N! = 1.69×10⁻¹⁹⁹

Now Ω(E, V, N) = (V^N/(h^{3N} N!)) · (2πmE)^{3N/2}/Γ(3N/2)

⇒ Ω(E, V, N) = (1/1.69×10⁻¹⁹⁹) × (2.49×10⁻⁷⁴/2)

⇒ Ω(E, V, N) ≈ 7.4×10¹²⁴

This is the number of microstates available to the system at energy E = 1 J.
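The formula Ω(E, V, N) = V^N (2πmE)^{3N/2} / (h^{3N} N! Γ(3N/2)) can be evaluated directly, which avoids rounding errors in the intermediate powers of h. A sketch with the example's parameters (SI units assumed):

```python
import math

# parameters from the worked example (SI units assumed)
N, V, E = 2, 1.0, 1.0          # particles, m^3, J
h = 6.626e-34                  # Planck's constant, J*s
m = 4.65e-26                   # particle mass, kg

# Omega = V^N (2*pi*m*E)^(3N/2) / (h^(3N) * N! * Gamma(3N/2))
num = V**N * (2 * math.pi * m * E) ** (3 * N / 2)
den = h ** (3 * N) * math.factorial(N) * math.gamma(3 * N / 2)
omega = num / den
print(f"{omega:.2e}")
```

Because h^{3N} is so small, Ω is astronomically large even for two particles, which is why S = k_B ln Ω remains a modest number.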

3. Grand Canonical Ensemble: It is the collection of a large number of essentially independent
systems having the same temperature (T), volume (V), and chemical potential (μ). The individual systems
of the grand canonical ensemble are separated by rigid, permeable and conducting walls.

 System can exchange both energy and particles with a reservoir.


 Described by T,V,μ
 Used to derive Fermi-Dirac and Bose-Einstein statistics.
 Fluctuate: Energy & particle number both

 Distribution: Fermi-Dirac / Bose-Einstein etc.


Partition function of the grand canonical ensemble:

The grand partition function is defined as:

Z(T, V, μ) = Σ_{N=0}^{∞} Σᵢ e^{−β(Eᵢ − μN)}

⇒ Z(T, V, μ) = Σ_{N=0}^{∞} e^{βμN} Z_N(T, V)

Where:

 β=1/kBT

 μ is the chemical potential

 ZN is the canonical partition function for N particles

It’s a weighted sum over all particle numbers N, where each term is the canonical partition function Z_N
weighted by e^{βμN}, i.e. by powers of the fugacity e^{βμ}.
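The grand canonical machinery can be sketched for the simplest possible case: a single fermionic level, where the sum over N stops at N = 1 and the mean occupation reduces to the Fermi-Dirac distribution. The numerical values below are illustrative assumptions:

```python
import math

def fd_occupancy(E, mu, T, kB=8.617e-5):
    """Mean occupation of one fermionic level (energies in eV).

    For a single level the grand partition function is
        Xi = sum over N in {0, 1} of e^(beta*mu*N) * Z_N = 1 + e^(-beta*(E - mu)),
    and <N> = (1/beta) dlnXi/dmu = 1 / (e^(beta*(E - mu)) + 1), the Fermi-Dirac form.
    """
    beta = 1.0 / (kB * T)
    return 1.0 / (math.exp(beta * (E - mu)) + 1.0)

# a level exactly at the chemical potential is half filled
print(fd_occupancy(E=1.0, mu=1.0, T=300.0))  # 0.5
```

This is the sense in which the grand canonical ensemble "derives" Fermi-Dirac statistics: the distribution drops out of the grand partition function with no extra assumptions.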

Points to be noted:

(i) Microcanonical: all microstates are equally probable; pᵢ = 1/Ω
(ii) Canonical: microstates are not equally probable; pᵢ = e^{−βEᵢ}/Z
(iii) If Eᵢ is the energy of the iᵗʰ microstate and Pᵢ its probability, the energy contributed by that
state is Pᵢ Eᵢ, and the average energy of the ensemble is Σ Pᵢ Eᵢ

Q: Why does the partition function of the microcanonical ensemble not have a simple form like the
canonical and grand canonical partition functions?

Ans: The canonical and grand canonical ensembles involve exponential functions (easy to integrate);
the microcanonical ensemble involves delta functions (harder to work with).

The microcanonical partition function counts the number of microstates with exactly energy E
→ it involves a delta function: δ(H − E)

The canonical and grand canonical partition functions use an exponential weighting:
→ e^{−βE}, which is smoother and easier to integrate or sum.
Partition Function in Statistical Mechanics:

The partition function is a mathematical function that encodes the statistical properties of a system in
thermodynamic equilibrium. It serves as the cornerstone of the canonical ensemble.

CH#4 PARTITION FUNCTION

Derivation of β = 1/(K_B T)

Now, to establish the connection between β and T, we look at the partition function and internal energy
from statistical mechanics.

For a system described by the canonical ensemble, the average energy E is related to the partition
function Z as follows:

E = −∂lnZ/∂β

We know dE = T dS. Differentiating E with respect to S:

dE/dS = T     ∵ dE = T dS

OR ∂S/∂E = 1/T ………… (1)

Also, we know that

S = K_B βE + K_B N lnZ

⇒ ∂S/∂E = K_B β ………… (2)

Now comparing (1) & (2) we get

β = 1/(K_B T)
Conclusion:

The parameter β is the inverse temperature, and it plays a central role in the connection between
thermodynamic and statistical mechanical descriptions of a system.

Q: Prove that S = K_B N lnZ + K_B βE

Proof: We know that S = K_B ln ω ………… (1)

The Maxwell-Boltzmann statistic for a non-degenerate state gives

ω = N!/∏ nᵢ!

Taking ln on both sides:

ln ω = ln N! − Σ ln nᵢ!

Using the Stirling approximation (ln n! ≈ n ln n − n):

⇒ ln ω = N lnN − N − Σ(nᵢ ln nᵢ − nᵢ)

⇒ ln ω = N lnN − N − Σ nᵢ ln nᵢ + Σ nᵢ

⇒ ln ω = N lnN − N − Σ nᵢ ln nᵢ + N     ∵ Σ nᵢ = N

⇒ ln ω = N lnN − Σ nᵢ ln nᵢ ………… (2)

From the Maxwell-Boltzmann distribution function for a non-degenerate state, put nᵢ = (N/Z) e^{−βEᵢ} in eq (2):

ln ω = N lnN − Σ nᵢ ln[(N/Z) e^{−βEᵢ}]

⇒ ln ω = N lnN − Σ nᵢ (lnN − βEᵢ − lnZ)

⇒ ln ω = N lnN − Σ nᵢ lnN + β Σ nᵢ Eᵢ + Σ nᵢ lnZ

⇒ ln ω = N lnN − N lnN + βE + N lnZ     ∵ Σ nᵢ = N and Σ nᵢ Eᵢ = E

⇒ ln ω = βE + N lnZ

Now putting this value of ln ω in eq (1) we get

S = K_B βE + K_B N lnZ
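The Stirling approximation ln N! ≈ N ln N − N used in the proof above can be checked numerically; its relative error shrinks as N grows, which is why it is safe for macroscopic particle numbers. A sketch:

```python
import math

# compare ln N! (via the log-gamma function) with Stirling's N ln N - N
errs = []
for N in (10, 100, 1000):
    exact = math.lgamma(N + 1)          # ln N!
    stirling = N * math.log(N) - N
    errs.append(abs(exact - stirling) / exact)

print([round(e, 4) for e in errs])  # relative error drops as N grows
```

For the ~10²³ particles of a real gas the neglected terms are utterly negligible, which justifies using Stirling freely in these derivations.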

Q: Prove that P = K_B T (∂lnZ/∂V)_T

From the canonical ensemble, the relation between the Helmholtz free energy
and the partition function is F = −K_B T lnZ ………… (1)

Since P = −(∂F/∂V)_T, plugging in the value of F we get

⇒ P = −(∂(−K_B T lnZ)/∂V)_T

⇒ P = K_B T (∂lnZ/∂V)_T
This is a fundamental equation in statistical mechanics that allows you to compute the pressure from the
partition function, provided Z depends on volume (which it usually does, e.g., in an ideal gas).

Q: Prove that S = −(∂F/∂T)_V = K_B(lnZ + βU)

Proof: From statistical mechanics:

F = −K_B T lnZ ………… (1)

We know S = −(∂F/∂T)_V ………… (2)

Put F = −K_B T lnZ in (2):

S = ∂(K_B T lnZ)/∂T

⇒ S = K_B lnZ + K_B T ∂lnZ/∂T ………… (3)

We know ∂lnZ/∂β = −U. By the chain rule,

(∂lnZ/∂T)(∂T/∂β) = −U

With β = 1/(K_B T), ∂β/∂T = −1/(K_B T²), so

∂lnZ/∂T = −U ∂β/∂T = U/(K_B T²)

⇒ T ∂lnZ/∂T = U/(K_B T)

Putting this value in eq (3) we get

S = K_B lnZ + K_B · U/(K_B T)

⇒ S = K_B lnZ + U/T = K_B lnZ + K_B βU     ∵ 1/T = K_B β

⇒ S = K_B(lnZ + βU)

This links entropy directly to the partition function and internal energy in the canonical ensemble.

Q: Prove that F = −K_B T lnZ

Ans: We know that the Helmholtz free energy is:

F = U − TS ………… (1)

We also know

S = K_B lnZ + U/T ………… (2)

Put the value of S in (1) we get

F = U − T(K_B lnZ + U/T)

⇒ F = U − T K_B lnZ − U

⇒ F = −K_B T lnZ

This expression is foundational in statistical mechanics because it connects microscopic properties (via Z)
to a macroscopic thermodynamic potential F, from which many other quantities (entropy, pressure,
energy) can be derived.

Q: Prove that the internal energy is U = −∂lnZ/∂β

PROOF: The internal energy is the average energy of the system:

U = Σ Pᵢ Eᵢ     ∵ if Eᵢ is the energy of the iᵗʰ state and Pᵢ its probability, the ensemble average energy is Σ Pᵢ Eᵢ

With Pᵢ = e^{−βEᵢ}/Z,

U = (1/Z) Σ Eᵢ e^{−βEᵢ} = −(1/Z) ∂Z/∂β = −∂lnZ/∂β
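The identity U = Σ Pᵢ Eᵢ = −∂lnZ/∂β can be verified numerically with a finite difference. The energies below are hypothetical illustrative values:

```python
import math

Es = [0.0, 0.5, 1.0, 1.5]   # hypothetical microstate energies (arbitrary units)

def lnZ(beta):
    return math.log(sum(math.exp(-beta * E) for E in Es))

beta = 2.0

# U as the ensemble average sum of P_i * E_i
Z = sum(math.exp(-beta * E) for E in Es)
U_avg = sum(E * math.exp(-beta * E) / Z for E in Es)

# U as -dlnZ/dbeta, via a central finite difference
h = 1e-6
U_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

print(round(U_avg, 6), round(U_deriv, 6))  # the two values agree
```

Both routes give the same number, confirming that differentiating lnZ with respect to β reproduces the Boltzmann-weighted average energy.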
