Lecture Note 20S
Supplement to Chapter 20: Entropy
Introduction to Statistical Mechanics: Micro-canonical ensembles, Canonical ensembles

1. Definition of entropy in statistical mechanics
In statistical thermodynamics the entropy is defined as being proportional to the logarithm of the number of microscopic configurations that result in the observed macroscopic description of the thermodynamic system:
S = k_B \ln W,

where k_B is Boltzmann's constant and W is the number of microstates corresponding to the observed thermodynamic macrostate. This definition is considered to be the fundamental definition of entropy, as all other definitions can be mathematically derived from it, but not vice versa. In Boltzmann's 1896 Lectures on Gas Theory, he showed that this expression gives a measure of entropy for systems of atoms and molecules in the gas phase, thus providing a measure for the entropy of classical thermodynamics.
In 1877, Boltzmann visualized a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy. Since then, the essential problem of statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy E over N identical systems.
Statistical mechanics explains entropy as the amount of uncertainty (or "mixedupness" in the phrase of Gibbs) which remains about a system after its observable macroscopic properties have been taken into account. For a given set of macroscopic variables, like temperature and volume, the entropy measures the degree to which the probability of the system is spread out over different possible quantum states. The more states available to the system with higher probability, the greater the entropy. More specifically, entropy is a logarithmic measure of the density of states. In essence, the most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system. This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.

2. Boltzmann's principle: microcanonical ensemble
We consider a system in the energy interval between E and E + ΔE, where ΔE is the order of the uncertainty. Each state in this range is assumed to have the same probability. Thus we have an ensemble characterized by the equal probability w_n of each state n:

w_n = w = \text{const} \quad \text{for } E < E_n < E + \Delta E,

which expresses the principle of equal a priori probability. The number of microscopic states accessible to a macroscopic system is called the statistical weight (thermodynamic weight). The number of states for the system with energy between E and E + ΔE (the statistical weight) may be written as

W(E, \Delta E) = \rho(E)\,\Delta E,

where ρ(E) is the density of states. This particular ensemble is known as the microcanonical ensemble.
Here we show that the entropy S in the equilibrium state of the system can be expressed by

S = k_B \ln W,

where W is the number of states. Suppose that the entropy S is expressed by some function of W,

S = f(W).

We assume that there are two independent systems (A and B). These systems are in the macroscopic equilibrium states a and b, respectively, and the numbers of states for these states are W_a and W_b. We now consider the combined system (C = A + B). The number of states for the state c (= a + b) is given by

W_c = W_a W_b.

The entropy of the system C is

S_C = S_A + S_B

from the additivity of the entropy. Then we have the relation

f(W_a W_b) = f(W_a) + f(W_b).

For simplicity, we put α = W_a, β = W_b. Then the relation is rewritten as

f(\alpha\beta) = f(\alpha) + f(\beta).

Taking the derivative of both sides with respect to α,

\beta f'(\alpha\beta) = f'(\alpha).

Taking the derivative of both sides with respect to β,

\alpha f'(\alpha\beta) = f'(\beta).

From these we get

\alpha f'(\alpha) = \beta f'(\beta) = k,

where k is independent of α and β; in other words, k must be a constant. The solution of this first-order differential equation is

f(\alpha) = k \ln\alpha.

When k = k_B (the Boltzmann constant), the entropy is given by

S = k_B \ln W.

3. Microcanonical ensemble for an ideal gas
The basic assumption of statistical mechanics is
all microstates are equally probable.
We consider the system of N free particles in a container with volume V. Each state of the system is specified by a point at 6N-dimensional space (3N coordinates, 3N momenta) (so called, phase space).
(r_1, r_2, \ldots, r_N, p_1, p_2, \ldots, p_N) = (q_1, q_2, q_3, \ldots, q_{3N}, p_1, p_2, \ldots, p_{3N}).

These states form a continuum, not a discrete set. For convenience, we assume that the 6N-dimensional phase space consists of unit cells of volume (ΔqΔp)^{3N}, where the size of Δq and Δp is arbitrary; there is one state per volume (ΔqΔp)^{3N}. Note that ΔqΔp = h (Planck's constant) from Heisenberg's principle of uncertainty.

We define the number of states in a volume element dq_1 dq_2 \cdots dq_{3N}\, dp_1 dp_2 \cdots dp_{3N} as

\frac{1}{(\Delta q\,\Delta p)^{3N}}\, dq_1 dq_2 \cdots dq_{3N}\, dp_1 dp_2 \cdots dp_{3N}.
The total energy E is a sum of the kinetic energy of each particle. It is independent of the 3N coordinates {ri}. We calculate the number of states when the total energy is between E and E + E.
W(E, \Delta E) = \frac{1}{(\Delta q\,\Delta p)^{3N}} \int dq_1 dq_2 \cdots dq_{3N} \int dp_1 dp_2 \cdots dp_{3N} = \left(\frac{V}{(\Delta q\,\Delta p)^3}\right)^N \int dp_1 dp_2 \cdots dp_{3N},

where the momentum integral is taken over the shell

E \le \frac{1}{2m}\left(p_1^2 + p_2^2 + \cdots + p_{3N}^2\right) \le E + \Delta E.
The volume of the momentum space is the spherical shell between the sphere with radius R = \sqrt{2m(E + \Delta E)} and the sphere of radius R = \sqrt{2mE}.

We calculate the number of states between E and E + ΔE as follows. First we calculate the volume of the 3N-dimensional sphere (in momentum space) with radius

R = \sqrt{2mE}.

The volume of the sphere is obtained as

\frac{2\pi^{3N/2}}{3N\,\Gamma(3N/2)}\,R^{3N} = \frac{2\pi^{3N/2}}{3N\,\Gamma(3N/2)}\,(2mE)^{3N/2},

where Γ(x) is the Gamma function of x. The total number of states N(E) for the system with energy between 0 and E is given by

N(E) = \frac{1}{N!}\left(\frac{V}{(\Delta q\,\Delta p)^3}\right)^N \frac{2\pi^{3N/2}}{3N\,\Gamma(3N/2)}\,(2mE)^{3N/2},

where N! is included because the N particles are non-distinguishable. Note that N(E) is introduced for the convenience of the mathematics, since we consider a system having the energy only between E and E + ΔE.
The number of states between E and E + ΔE is obtained as

W(E, \Delta E) = \rho(E)\,\Delta E = \frac{dN(E)}{dE}\,\Delta E = \frac{1}{N!}\left(\frac{V}{(\Delta q\,\Delta p)^3}\right)^N \frac{\pi^{3N/2}(2mE)^{3N/2}}{\Gamma(3N/2)}\,\frac{\Delta E}{E},

where ρ(E) is the density of states. We use Heisenberg's principle of uncertainty,

\Delta q\,\Delta p = h \quad \text{(Planck's constant)},

which comes from the quantum mechanical requirement. Then we have

W(E, \Delta E) = \frac{dN(E)}{dE}\,\Delta E = \frac{V^N}{N!\,\Gamma(3N/2)}\left(\frac{2\pi m E}{h^2}\right)^{3N/2}\frac{\Delta E}{E} = \frac{V^N}{N!\,\Gamma(3N/2)}\left(\frac{mE}{2\pi\hbar^2}\right)^{3N/2}\frac{\Delta E}{E},

where ħ = h/(2π) (ħ: the Dirac constant).
Using Stirling's formula,

\ln\Gamma\!\left(\frac{3N}{2}\right) \simeq \frac{3N}{2}\ln\frac{3N}{2} - \frac{3N}{2} \quad (N \gg 1),

\ln N! \simeq N\ln N - N = N(\ln N - 1) \quad (N \gg 1),

we then have

\ln W(E, \Delta E) = N\ln V + \frac{3N}{2}\ln\left(\frac{mE}{2\pi\hbar^2}\right) - N(\ln N - 1) - \left[\frac{3N}{2}\ln\frac{3N}{2} - \frac{3N}{2}\right] + \ln\frac{\Delta E}{E}
 = N\left[\ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{mE}{3\pi\hbar^2 N}\right) + \frac{5}{2}\right] + \ln\frac{\Delta E}{E}.
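The two Stirling approximations invoked above can be checked numerically. The following is a quick sketch (not part of the original note), using `math.lgamma` for the exact ln N!:

```python
import math

def stirling(n):
    # Leading-order Stirling approximation: ln n! ~ n ln n - n
    return n * math.log(n) - n

for n in (10, 100, 1000):
    exact = math.lgamma(n + 1)          # exact ln(n!)
    approx = stirling(n)
    print(n, abs(exact - approx) / exact)   # relative error shrinks with n
```

The relative error falls below 10^-3 already at n = 1000, so for N ~ 10^22 particles the approximation is essentially exact.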
4. Entropy S

The entropy S is defined by

S = k_B \ln W(E, \Delta E) = Nk_B\left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{mE}{3\pi\hbar^2 N}\right)\right] + k_B\ln\frac{\Delta E}{E}.

In the limit of N → ∞,

S = Nk_B\left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{mE}{3\pi\hbar^2 N}\right)\right].

So the entropy S is found to be an extensive parameter. The entropy S depends on N, E, and V:

S = S(N, E, V),

dS = \left(\frac{\partial S(N,E,V)}{\partial E}\right)_V dE + \left(\frac{\partial S(N,E,V)}{\partial V}\right)_E dV = \frac{1}{T}\,dE + \frac{P}{T}\,dV.

Then we have

\left(\frac{\partial S(N,E,V)}{\partial E}\right)_V = \frac{1}{T} = k_B\,\frac{\partial \ln W(E,\Delta E)}{\partial E}

and

\left(\frac{\partial S(N,E,V)}{\partial V}\right)_E = \frac{P}{T}.

From this relation, E can be derived as a function of N, V, and T:

E = E(N, V, T).

The heat capacity C_V is given by

C_V = \left(\frac{\partial E}{\partial T}\right)_V.

In the above example, we have

\left(\frac{\partial S(N,E,V)}{\partial E}\right)_V = \frac{3}{2}\frac{Nk_B}{E} = \frac{1}{T},

or

E = \frac{3}{2}Nk_BT \quad \text{(internal energy)},

and

\left(\frac{\partial S(N,E,V)}{\partial V}\right)_E = \frac{Nk_B}{V} = \frac{P}{T},

so we have

PV = Nk_BT \quad \text{(Boyle's law)}.

5. Typical Examples

5.1 Adiabatic free expansion (sudden expansion into a vacuum)
The free expansion is an irreversible process. Nevertheless, the entropy change can be evaluated along a reversible process connecting the same initial and final states. The entropy is given by
S = Nk_B\left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{mE}{3\pi\hbar^2 N}\right)\right] = Nk_B\left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{mk_BT}{2\pi\hbar^2}\right)\right] = Nk_B\ln V + \frac{3}{2}Nk_B\ln T + \text{const},

since E = \frac{3}{2}Nk_BT. This expression is exactly the same as that derived previously (Chapter 20). When the volume changes from V = V_1 to V_2 at constant T, the change in entropy ΔS is obtained as

\Delta S = Nk_B\ln\left(\frac{V_2}{V_1}\right).
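As a quick numerical sketch (assuming one mole and a volume doubling; these values are illustrative, not from the note):

```python
import math

kB = 1.380649e-23    # J/K
N  = 6.02214076e23   # one mole of atoms

# Entropy change for V1 -> V2 at constant T: dS = N kB ln(V2/V1)
V1, V2 = 1.0e-3, 2.0e-3   # m^3
dS = N * kB * math.log(V2 / V1)
print(dS)   # about 5.76 J/K, i.e. R ln 2
```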
5.2 Isentropic process

The entropy remains constant if

\ln(VT^{3/2}) = \text{const}.

In an expansion at constant entropy from V_1 to V_2, we have

V_1 T_1^{3/2} = V_2 T_2^{3/2}

for an ideal monatomic gas. Since

\frac{P_1V_1}{T_1} = \frac{P_2V_2}{T_2},

we have

V_1 (P_1V_1)^{3/2} = V_2 (P_2V_2)^{3/2}, \quad \text{or} \quad P_1V_1^{5/3} = P_2V_2^{5/3}.

5.3 Mixing entropy of ideal gas
We consider two samples of the same ideal gas, separated by a barrier AB. The temperature T and the pressure P are the same on both sides.
PV_1 = N_1 k_B T, \quad PV_2 = N_2 k_B T,

or

\frac{V_1}{N_1} = \frac{V_2}{N_2} = \frac{k_B T}{P},

and

P(V_1 + V_2) = (N_1 + N_2)k_B T, \quad \frac{V_1 + V_2}{N_1 + N_2} = \frac{k_B T}{P}.
Suppose that the barrier is suddenly removed. We discuss the change of entropy of this system, using the expression for the entropy

S = Nk_B\left[\frac{5}{2} + \ln\frac{V}{N} + \frac{3}{2}\ln(c_0 T)\right], \quad c_0 = \frac{mk_B}{2\pi\hbar^2}.
Before opening the barrier AB, we have

S_i = S_1 + S_2,

where (with c_0 = mk_B/(2\pi\hbar^2))

S_1 = N_1k_B\left[\frac{5}{2} + \ln\frac{V_1}{N_1} + \frac{3}{2}\ln(c_0T)\right] = N_1k_B\left[\frac{5}{2} + \ln\frac{k_BT}{P} + \frac{3}{2}\ln(c_0T)\right],

S_2 = N_2k_B\left[\frac{5}{2} + \ln\frac{V_2}{N_2} + \frac{3}{2}\ln(c_0T)\right] = N_2k_B\left[\frac{5}{2} + \ln\frac{k_BT}{P} + \frac{3}{2}\ln(c_0T)\right].
After opening the barrier AB, the total number of atoms is N_1 + N_2 and the volume is V_1 + V_2. Then we have the final entropy (again with c_0 = mk_B/(2\pi\hbar^2)),

S_f = (N_1+N_2)k_B\left[\frac{5}{2} + \ln\frac{V_1+V_2}{N_1+N_2} + \frac{3}{2}\ln(c_0T)\right] = (N_1+N_2)k_B\left[\frac{5}{2} + \ln\frac{k_BT}{P} + \frac{3}{2}\ln(c_0T)\right].

Since

\Delta S = S_f - (S_1 + S_2) = 0,

as is expected (the process is reversible), we have no change of entropy in this event.

((Gibbs paradox))

This success is partly due to the N! in the denominator of W(E, ΔE). This is closely related to the fact that the particles are not distinguishable in our system.

((Micro-canonical ensemble))
So far we have considered the micro-canonical ensemble, where the system is isolated from its surroundings and the energy of the system is conserved. It is assumed that the energy of the system lies between E and E + ΔE, where ΔE is a small width. The concept of the thermal reservoir does not appear. Since the energy of the system is constant, the possible microstates have the same probability. For given energy E, volume V, and number of particles N, one can count the number of possible microstates. Then the entropy S(E, N, V) can be defined by

S = k_B \ln W(E, \Delta E).

6. Canonical ensemble (system with constant temperature)
The theory of the micro-canonical ensemble is useful when the system is specified by N, E, and V. In principle, this method is correct. In real calculations, however, it is not easy to evaluate the number of states W(E, ΔE) in the general case. We have an alternative method, which is much more useful for calculations on real systems. The formulation of the canonical ensemble is a little different from that of the micro-canonical ensemble, but both ensembles lead to the same result for the same macroscopic system. Canonical ensemble: (N, T, V constant).
Suppose that the system depends on N, T, and V. A practical method of keeping the
temperature of a system constant is to immerse it in a very large material with a large heat capacity. If the material is very large, its temperature is not changed even if some energy is given or taken by the system in contact. Such a heat reservoir serves as a thermostat.
We consider the case of a small system S(I) in thermal contact with a heat reservoir W(II). The system S(I) is in thermal equilibrium with the reservoir W(II); S(I) and W(II) have a common temperature T. The system S(I) is a relatively small macroscopic system. The energy of S(I) is not fixed; only the total energy of the combined system is fixed:
E_T = E_{II} + E_i.

We assume that W_{II}(E_{II}) is the number of states in which the thermal bath has the energy E_{II}. If S(I) is in one definite state i, the thermal bath is in one of the many states with energy E_T - E_i, and the probability p_i of finding the system (I) in the state i is proportional to W_{II}(E_T - E_i):

p_i \propto W_{II}(E_{II}) = W_{II}(E_T - E_i),

or

\ln p_i = \ln[W_{II}(E_T - E_i)] + \text{const}.

Since E_i \ll E_T, \ln[W_{II}(E_T - E_i)] can be expanded as

\ln[W_{II}(E_T - E_i)] \simeq \ln W_{II}(E_T) - \left.\frac{d\ln W_{II}(E_{II})}{dE_{II}}\right|_{E_{II}=E_T} E_i.

Then we obtain

p_i \propto \exp\left(-\left.\frac{d\ln W_{II}(E_{II})}{dE_{II}}\right|_{E_{II}=E_T} E_i\right). \quad (1)
Here we recall the definitions of entropy and temperature for the reservoir in the micro-canonical ensemble:

S_{II} = k_B \ln W_{II}(E_{II}) \quad \text{and} \quad \frac{\partial S_{II}}{\partial E_{II}} = \frac{1}{T_{II}},

or

\frac{d\ln W_{II}(E_{II})}{dE_{II}} = \frac{1}{k_B}\frac{\partial S_{II}}{\partial E_{II}} = \frac{1}{k_B T_{II}}.

In thermal equilibrium, we have T_{II} = T_i = T. Then Eq. (1) can be rewritten as

p_i \propto \exp\left(-\frac{E_i}{k_B T}\right) = \exp(-\beta E_i),

where β = 1/(k_B T). This is called the Boltzmann factor. We define the partition function Z as

Z = \sum_i e^{-\beta E_i}.

The probability is expressed by

p_i = \frac{1}{Z} e^{-\beta E_i}.

The summation in Z is over all states i of the system. We note that

\sum_i p(E_i) = 1.
The average energy of the system is given by

E = \langle E_i\rangle = \frac{1}{Z}\sum_i E_i e^{-\beta E_i} = -\frac{\partial\ln Z}{\partial\beta},

since

-\frac{\partial\ln Z}{\partial\beta} = -\frac{1}{Z}\frac{\partial Z}{\partial\beta} = \frac{1}{Z}\sum_i E_i e^{-\beta E_i}.

Note that

E = -\frac{\partial\ln Z}{\partial\beta} = k_B T^2\,\frac{\partial\ln Z}{\partial T},

since \partial/\partial\beta = -k_B T^2\,\partial/\partial T.
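A minimal numerical sketch of these relations, for a hypothetical two-level system (energies 0 and Δ, units with k_B = 1; not part of the original note): the direct average Σ p_i E_i agrees with −∂ ln Z/∂β evaluated by finite differences.

```python
import math

# Hypothetical two-level system with energies 0 and delta (units kB = 1)
delta, T = 1.0, 0.7
beta = 1.0 / T
levels = [0.0, delta]

Z = sum(math.exp(-beta * e) for e in levels)
p = [math.exp(-beta * e) / Z for e in levels]
E_avg = sum(pi * ei for pi, ei in zip(p, levels))

# Compare with E = -d ln Z / d beta via a central finite difference
h = 1e-6
lnZ = lambda b: math.log(sum(math.exp(-b * e) for e in levels))
E_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

print(E_avg, E_deriv)   # the two values agree
```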
In summary, the representative points of the system I are distributed with a probability density proportional to exp(−βE_i). This is called the canonical ensemble, and this distribution of representative points is called the canonical distribution. The factor exp(−βE_i) is often referred to as the Boltzmann factor.

7. Pressure
The pressure P is defined as

P = \langle P_i\rangle = \frac{1}{Z}\sum_i P_i e^{-\beta E_i} = \frac{1}{Z}\sum_i\left(-\frac{\partial E_i}{\partial V}\right)e^{-\beta E_i} = \frac{1}{\beta Z}\frac{\partial Z}{\partial V} = \frac{1}{\beta}\frac{\partial\ln Z}{\partial V}.

Here we define the Helmholtz free energy F as

F = E - TS.

Then

dF = dE - TdS - SdT = (TdS - PdV) - TdS - SdT = -PdV - SdT.

F is a function of T and V. From this equation, we have

P = -\left(\frac{\partial F}{\partial V}\right)_T, \quad S = -\left(\frac{\partial F}{\partial T}\right)_V.
8. Helmholtz free energy and entropy
The Helmholtz free energy F is given by

F = -k_B T\ln Z.

((Proof)) We note that

E = F + TS = F - T\left(\frac{\partial F}{\partial T}\right)_V = -T^2\frac{\partial}{\partial T}\left(\frac{F}{T}\right) = k_B T^2\,\frac{\partial\ln Z}{\partial T},

which leads to

F = -k_B T\ln Z.

What is the expression for the entropy in a canonical ensemble? The entropy is given by
S = \frac{E - F}{T},

where E is the average energy of the system,

E = -\frac{\partial\ln Z}{\partial\beta}.

Then the entropy S is rewritten as

S = \frac{1}{T}\left(\frac{1}{Z}\sum_i E_i e^{-\beta E_i}\right) + k_B\ln Z = k_B\sum_i p_i\,\beta E_i + k_B\ln Z\sum_i p_i = k_B\sum_i p_i(\beta E_i + \ln Z) = -k_B\sum_i p_i\ln p_i,

or

S = -k_B\sum_i p_i\ln p_i,

where p_i is the probability of the state i,

p_i = \frac{1}{Z}e^{-\beta E_i},

and the logarithm of p_i is

\ln p_i = -\beta E_i - \ln Z.

((Note))
We finally get a useful expression for the entropy which is also used in information theory:

S = -k_B\sum_i p_i\ln p_i.
Suppose that there are 50 boxes, with one jewel in one of the 50 boxes. p_i is the probability of finding the jewel in the i-th box in one trial.

(a) There is no hint where the jewel is:

p_1 = p_2 = \cdots = p_{50} = \frac{1}{50},

S = -k_B\sum_s p_s\ln p_s = -k_B\ln\frac{1}{50} = k_B\ln 50 = 3.91\,k_B.

(b) There is a hint that the jewel is in one of the even-numbered boxes:

p_1 = p_3 = \cdots = p_{49} = 0, \quad p_2 = p_4 = \cdots = p_{50} = \frac{1}{25},

S = -k_B\sum_{s\ \text{even}} p_s\ln p_s = k_B\ln 25 = 3.219\,k_B.

(c) If you know that the jewel is in the 10th box:

p_{10} = 1, \quad p_s = 0 \; (s \ne 10),

S = -k_B\,p_{10}\ln p_{10} = 0.
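The three cases can be evaluated with a short script (a sketch in units of k_B; the helper name `info_entropy` is ours):

```python
import math

def info_entropy(p):
    # S / kB = -sum p ln p, with 0 ln 0 taken as 0
    return 0.0 - sum(x * math.log(x) for x in p if x > 0)

S_a = info_entropy([1/50] * 50)            # no hint
S_b = info_entropy([1/25] * 25 + [0] * 25) # jewel in an even-numbered box
S_c = info_entropy([1.0])                  # the box is known
print(S_a, S_b, S_c)   # ln 50 = 3.912..., ln 25 = 3.219..., 0.0
```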
If you know more information, the information entropy becomes smaller.

9. Application

9.1 Partition function Z

The partition function Z for the ideal gas can be calculated as
Z = \frac{1}{N!}\left[\frac{V}{h^3}\int\exp\left(-\frac{\beta p^2}{2m}\right)d^3p\right]^N = \frac{V^N}{N!}\left(\frac{2\pi mk_BT}{h^2}\right)^{3N/2}.

Using this expression of Z, the Helmholtz free energy F can be calculated as

F = -Nk_BT\left[\ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{2\pi mk_BT}{h^2}\right) + 1\right].

The internal energy E is

E = -\frac{\partial\ln Z}{\partial\beta} = \frac{3}{2}Nk_BT.

The heat capacity C_V at constant volume is given by

C_V = \left(\frac{\partial E}{\partial T}\right)_V = \frac{3}{2}Nk_B.

The entropy S is

S = \frac{E - F}{T} = Nk_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{2\pi mk_BT}{h^2}\right) + \frac{5}{2}\right].

The pressure P is

P = -\left(\frac{\partial F}{\partial V}\right)_T = \frac{Nk_BT}{V}.
9.2 Maxwell’s distribution function
The Maxwell distribution function can be derived as follows.
dn(v) = n(v)\,dv = f(v)\,4\pi v^2\,dv = A\exp\left(-\frac{mv^2}{2k_BT}\right)4\pi v^2\,dv.

The normalization condition is

\int_0^\infty f(v)\,4\pi v^2\,dv = 1.

The constant A is calculated as

A = \left(\frac{m}{2\pi k_BT}\right)^{3/2}.

Then we have

f(v)\,4\pi v^2 = 4\pi\left(\frac{m}{2\pi k_BT}\right)^{3/2} v^2\exp\left(-\frac{mv^2}{2k_BT}\right).

Since M = mN_A and R = N_Ak_B, we have

f(v)\,4\pi v^2 = 4\pi\left(\frac{M}{2\pi RT}\right)^{3/2} v^2\exp\left(-\frac{Mv^2}{2RT}\right),

which agrees with the expression for f(v) in Chapter 19.

((Mathematica))
f1 = Integrate[Exp[-m v^2/(2 kB T)] 4 Pi v^2, {v, 0, Infinity},
   GenerateConditions -> False]
 (* 2 Sqrt[2] Pi^(3/2) / (m/(kB T))^(3/2) *)
eq1 = A f1 == 1;
Solve[eq1, A]
 (* {{A -> (m/(kB T))^(3/2)/(2 Sqrt[2] Pi^(3/2))}} *)
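A numerical cross-check of f(v) (a sketch with SI constants and the ⁴He mass, paralleling the Mathematica above): the distribution integrates to 1 and gives ⟨v²⟩ = 3k_BT/m.

```python
import math

kB = 1.380649e-23            # J/K
m  = 4 * 1.66053906660e-27   # kg, 4He for illustration
T  = 300.0

def fv(v):
    # Maxwell speed distribution f(v) * 4 pi v^2
    a = m / (2 * math.pi * kB * T)
    return 4 * math.pi * a**1.5 * v * v * math.exp(-m * v * v / (2 * kB * T))

# Simple midpoint integration over a range that captures the distribution
vmax, n = 2.0e4, 20000
dv = vmax / n
norm  = sum(fv((i + 0.5) * dv) for i in range(n)) * dv
v2avg = sum(fv((i + 0.5) * dv) * ((i + 0.5) * dv)**2 for i in range(n)) * dv

print(norm)                      # close to 1
print(v2avg * m / (3 * kB * T))  # close to 1, since <v^2> = 3 kB T / m
```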
10. Comparison of the expression of S in the canonical ensemble with the original definition of S in the microcanonical ensemble

The partition function Z can be written as

Z = \sum_i e^{-\beta E_i} = \int\rho(\varepsilon)e^{-\beta\varepsilon}\,d\varepsilon,

where we use ε instead of E (or E_j) in the expression. The partition function Z is the Laplace transform of the density of states ρ(ε). Here we show that

Z = \int\Phi(\varepsilon)\,d\varepsilon \simeq \sqrt{2\pi}\,\Delta^*\,\rho(\varepsilon^*)\exp(-\beta\varepsilon^*),

where the function Φ(ε) is given by

\Phi(\varepsilon) = \rho(\varepsilon)e^{-\beta\varepsilon}.

We assume that Φ(ε) can be approximated by a Gaussian function,

\Phi(\varepsilon) \simeq \Phi(\varepsilon^*)\exp\left[-\frac{(\varepsilon-\varepsilon^*)^2}{2(\Delta^*)^2}\right],

where

\Phi(\varepsilon^*) = \rho(\varepsilon^*)\exp(-\beta\varepsilon^*).

Fig. Φ(ε) vs ε. Φ(ε) has a Gaussian distribution with the width Δ* around ε = ε* = E.

The function Φ(ε) has a local maximum at ε* ≃ E. The logarithm of Φ(ε) is rewritten as

\ln\Phi(\varepsilon) = \ln[\rho(\varepsilon)\exp(-\beta\varepsilon)] = \ln\rho(\varepsilon) - \beta\varepsilon.

We take the derivative of lnΦ(ε) with respect to ε:

\frac{d\ln\Phi(\varepsilon)}{d\varepsilon} = \frac{\Phi'(\varepsilon)}{\Phi(\varepsilon)} = \frac{\rho'(\varepsilon)}{\rho(\varepsilon)} - \beta,

where

\beta = \left.\frac{d\ln\rho(\varepsilon)}{d\varepsilon}\right|_{\varepsilon=\varepsilon^*},

since Φ'(ε*) = 0.

Here we define the number of states W(β, ε) by

W(\beta, \varepsilon^*) = \sqrt{2\pi}\,\Delta^*\rho(\varepsilon^*).

Then we have

\beta = \left.\frac{d\ln W(\beta,\varepsilon)}{d\varepsilon}\right|_{\varepsilon=\varepsilon^*} \quad (1)

since

\ln W(\beta,\varepsilon) = \ln\rho(\varepsilon) + \ln(\sqrt{2\pi}\,\Delta^*)

with fixed β. We note that

Z = \int\Phi(\varepsilon)\,d\varepsilon \simeq \rho(\varepsilon^*)e^{-\beta\varepsilon^*}\int_{0}^{\infty}\exp\left[-\frac{(\varepsilon-\varepsilon^*)^2}{2(\Delta^*)^2}\right]d\varepsilon \simeq \sqrt{2\pi}\,\Delta^*\rho(\varepsilon^*)e^{-\beta\varepsilon^*}.

Then we have

\ln Z = \ln[\sqrt{2\pi}\,\Delta^*\rho(\varepsilon^*)] - \beta\varepsilon^*.

The entropy S is calculated as

S = \frac{E - F}{T} = \frac{\varepsilon^*}{T} + k_B\ln Z = \frac{\varepsilon^*}{T} + k_B\ln[\sqrt{2\pi}\,\Delta^*\rho(\varepsilon^*)] - k_B\beta\varepsilon^* = k_B\ln W(\beta,\varepsilon^*). \quad (2)

Using Eqs. (1) and (2), we get

\frac{\partial S}{\partial E} = k_B\left.\frac{d\ln W(\beta,\varepsilon)}{d\varepsilon}\right|_{\varepsilon=\varepsilon^*} = k_B\beta = \frac{1}{T}.
11. Boltzmann-Planck’s method
Finally we show the standard method of derivation, which characterizes well the theory of canonical ensembles.

We consider the ways of distributing M total ensembles among states with energies E_j. Let M_j be the number of ensembles in the energy level E_j: M_1 ensembles for the energy E_1, M_2 ensembles for the energy E_2, and so on. The number of ways of distributing the M ensembles is given by

W = \frac{M!}{M_1!\,M_2!\cdots},

where

\sum_j M_j = M,

and the average energy E is given by

E = \sum_j E_j\,\frac{M_j}{M}.

The entropy S is proportional to ln W:

\ln W = \ln M! - \sum_j\ln(M_j!).

Using Stirling's formula,

\ln W \simeq M(\ln M - 1) - \sum_j M_j(\ln M_j - 1) = M\ln M - \sum_j M_j\ln M_j

in the limit of large M and M_j. We note that the probability of finding the state j is simply given by

P(E_j) = \frac{M_j}{M}.

Then we have

\frac{1}{M}\ln W = \ln M - \frac{1}{M}\sum_j M_j\ln M_j = -\sum_j P(E_j)\ln\frac{M_j}{M} = -\sum_j P(E_j)\ln[P(E_j)],

which is subject to the conditions

\sum_j P(E_j) = 1, \quad \sum_j E_jP(E_j) = E.

Treating the P(E_j) as continuous variables, we have the variational equation

\delta\left[-\sum_j P(E_j)\ln P(E_j) - \gamma\sum_j P(E_j) - \beta\sum_j E_jP(E_j)\right] = \sum_j\left\{-\ln P(E_j) - (1+\gamma) - \beta E_j\right\}\delta P(E_j) = 0,

which gives the P(E_j) for the maximum W. Here γ and β are Lagrange's indeterminate multipliers. Thus we obtain

-\ln P(E_j) - (1+\gamma) - \beta E_j = 0,

or

P(E_j) = C\exp(-\beta E_j),

or

P(E_j) = \frac{1}{Z(\beta)}\exp(-\beta E_j),

where

Z(\beta) = \sum_j\exp(-\beta E_j)

and β = 1/(k_BT). With the above P(E_j), the entropy S is expressed by

S = k_B\ln W = -Mk_B\sum_j P(E_j)\ln[P(E_j)]

for the total system composed of M ensembles. Therefore, the entropy of each ensemble is

S = -k_B\sum_j P(E_j)\ln[P(E_j)].
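The variational result can be illustrated numerically: for a hypothetical three-level system, scan distributions satisfying both constraints and check that none exceeds the entropy of the Boltzmann distribution (a sketch; the grid scan is ours, not part of the note):

```python
import math

# Hypothetical three-level system, units with kB = 1
E = [0.0, 1.0, 2.0]
beta = 0.9

Z = sum(math.exp(-beta * e) for e in E)
p_boltz = [math.exp(-beta * e) / Z for e in E]
E_mean = sum(p * e for p, e in zip(p_boltz, E))

def entropy(p):
    # S / kB = -sum p ln p (0 ln 0 -> 0)
    return -sum(x * math.log(x) for x in p if x > 0)

# Scan feasible distributions (p0, p1, p2) with p0 + p1 + p2 = 1 and
# p1 + 2 p2 = E_mean; none should beat the Boltzmann entropy.
best = 0.0
for k in range(1, 1000):
    p2 = (k / 1000) * (E_mean / 2)
    p1 = E_mean - 2 * p2
    p0 = 1.0 - p1 - p2
    if min(p0, p1, p2) >= 0:
        best = max(best, entropy([p0, p1, p2]))

print(entropy(p_boltz) >= best)   # True: the Boltzmann factor maximizes W
```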
12. Density of states for quantum box (ideal gas)

(a) Energy levels in 1D system

We consider a free electron gas in a 1D system. The Schrödinger equation is given by

H\psi_k(x) = \frac{p^2}{2m}\psi_k(x) = -\frac{\hbar^2}{2m}\frac{d^2\psi_k(x)}{dx^2} = \varepsilon_k\psi_k(x), \quad (1)

where

p = -i\hbar\frac{d}{dx},

and ε_k is the energy of the particle in the orbital.

The orbital is defined as a solution of the wave equation for a system of only one electron: the one-electron problem.

Using a periodic boundary condition, ψ_k(x + L) = ψ_k(x), we have

\psi_k(x) \sim e^{ikx}, \quad (2)

with

\varepsilon_k = \frac{\hbar^2k^2}{2m} = \frac{\hbar^2}{2m}\left(\frac{2\pi}{L}\right)^2 n^2,

e^{ikL} = 1, \quad \text{or} \quad k = \frac{2\pi}{L}n,

where n = 0, ±1, ±2, …, and L is the size of the system.

(b) Energy levels in 3D system
We consider the Schrödinger equation of an electron confined to a cube of edge L:

H\psi_k = \frac{p^2}{2m}\psi_k = -\frac{\hbar^2}{2m}\nabla^2\psi_k = \varepsilon_k\psi_k. \quad (3)

It is convenient to introduce wavefunctions that satisfy periodic boundary conditions (Born–von Karman boundary conditions):

\psi_k(x+L, y, z) = \psi_k(x, y, z),
\psi_k(x, y+L, z) = \psi_k(x, y, z),
\psi_k(x, y, z+L) = \psi_k(x, y, z).

The wavefunctions are of the form of a traveling plane wave,

\psi_k(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}}, \quad (4)

with

k_x = (2\pi/L)n_x, \quad (n_x = 0, \pm1, \pm2, \pm3, \ldots),
k_y = (2\pi/L)n_y, \quad (n_y = 0, \pm1, \pm2, \pm3, \ldots),
k_z = (2\pi/L)n_z, \quad (n_z = 0, \pm1, \pm2, \pm3, \ldots).

The components of the wavevector k are the quantum numbers, along with the quantum number m_s of the spin direction. The energy eigenvalue is

\varepsilon(\mathbf{k}) = \frac{\hbar^2}{2m}(k_x^2 + k_y^2 + k_z^2) = \frac{\hbar^2k^2}{2m}. \quad (5)

Here

\mathbf{p}\,\psi_k(\mathbf{r}) = -i\hbar\nabla\psi_k(\mathbf{r}) = \hbar\mathbf{k}\,\psi_k(\mathbf{r}), \quad (6)

so that the plane wave function ψ_k(r) is an eigenfunction of p with the eigenvalue ħk. In the ground state of a system of N electrons, the occupied orbitals are represented as points inside a sphere in k-space.

(c) Density of states
Because we assume that the electrons are non-interacting, we can build up the N-electron ground state by placing electrons into the allowed one-electron levels we have just found. The one-electron levels are specified by the wavevector k and by the projection of the electron's spin along an arbitrary axis, which can take either of the two values ±ħ/2. Therefore associated with each allowed wavevector k are two levels:

|k, ↑⟩, |k, ↓⟩.

Fig. Density of states in the 3D k-space. There is one state per (2π/L)^3.

There is one state per volume (2π/L)^3 of k-space. We consider the number of one-electron levels in the energy range from ε to ε + dε, D(ε)dε:

D(\varepsilon)\,d\varepsilon = 2\,\frac{L^3}{(2\pi)^3}\,4\pi k^2\,dk, \quad (13)

where D(ε) is called the density of states.

13. Application of canonical ensemble for ideal gas

(a) Partition function for the system with one atom, Z_1
The partition function Z_1 is given by

Z_1 = \sum_k\exp\left(-\frac{\beta\hbar^2k^2}{2m}\right) = \frac{V}{(2\pi)^3}\int d^3k\,\exp\left(-\frac{\beta\hbar^2k^2}{2m}\right) = \frac{V}{(2\pi)^3}\int_0^\infty 4\pi k^2\exp\left(-\frac{\beta\hbar^2k^2}{2m}\right)dk = \frac{V}{8\pi^{3/2}C_1^{3/2}},

where V = L^3,

C_1 = \frac{\beta\hbar^2}{2m}, \quad \beta = \frac{1}{k_BT}.

((Mathematica))
Clear["Global`*"];
f1 = V/(2 Pi)^3 4 Pi k^2 Exp[-C1 k^2];
Integrate[f1, {k, 0, Infinity}] // Simplify[#, C1 > 0] &
 (* V/(8 C1^(3/2) Pi^(3/2)) *)
Then the partition function Z_1 can be rewritten as

Z_1 = \frac{V}{8\pi^{3/2}}\left(\frac{2mk_BT}{\hbar^2}\right)^{3/2} = V\left(\frac{mk_BT}{2\pi\hbar^2}\right)^{3/2} = n_QV,

where n_Q is the quantum concentration, defined by

n_Q = \left(\frac{mk_BT}{2\pi\hbar^2}\right)^{3/2}.
n_Q is the concentration associated with one atom in a cube of side equal to the thermal average de Broglie wavelength.

Fig. Definition of quantum concentration. The de Broglie wavelength is on the order of the interatomic distance.

p = mv = \frac{h}{\lambda} = \frac{2\pi\hbar}{\lambda}, \quad \lambda = \frac{2\pi\hbar}{mv},

where v is the average thermal velocity of the atoms. Using the equipartition law, we get the relation

\frac{1}{2}mv^2 = \frac{3}{2}k_BT, \quad \text{or} \quad v = \sqrt{\frac{3k_BT}{m}}.

Then we have

\lambda = \frac{2\pi\hbar}{mv} = \frac{2\pi\hbar}{\sqrt{3mk_BT}} = \sqrt{\frac{2\pi}{3}}\left(\frac{2\pi\hbar^2}{mk_BT}\right)^{1/2} = \sqrt{\frac{2\pi}{3}}\,\frac{1}{n_Q^{1/3}} \approx 1.447\,\frac{1}{n_Q^{1/3}},

where

n_Q = \left(\frac{mk_BT}{2\pi\hbar^2}\right)^{3/2}.

It follows that

n_Q \approx \frac{1}{\lambda^3}.
((Definition))
n ≪ n_Q → classical regime.
An ideal gas is defined as a gas of non-interacting atoms in the classical regime.

((Example))
For ⁴He gas at P = 1 atm and T = 300 K, the concentration n is evaluated as

n = \frac{N}{V} = \frac{P}{k_BT} = 2.44631\times10^{19}\ \text{cm}^{-3}.

The quantum concentration n_Q is calculated as

n_Q = 7.8122\times10^{24}\ \text{cm}^{-3},

which means that n ≪ n_Q: the gas is in the classical regime. Note that the mass of ⁴He is given by

m = 4u = 6.64216\times10^{-24}\ \text{g},

where u is the atomic mass unit.
((Mathematica))
Clear["Global`*"];
rule1 = {kB -> 1.3806504*10^-16, NA -> 6.02214179*10^23,
   hbar -> 1.054571628*10^-27, amu -> 1.660538782*10^-24,
   atm -> 1.01325*10^6};  (* cgs units *)
T1 = 300; P1 = 1 atm /. rule1;
m1 = 4 amu /. rule1
 (* 6.64216*10^-24 *)
nQ = (m1 kB T1/(2 Pi hbar^2))^(3/2) /. rule1
 (* 7.81219*10^24 *)
n1 = P1/(kB T1) /. rule1
 (* 2.44631*10^19 *)
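The same numbers can be reproduced in SI units; the following Python sketch (CODATA constants) parallels the cgs Mathematica above:

```python
import math

# SI constants (CODATA); the note's Mathematica block uses cgs instead
kB   = 1.380649e-23       # J/K
hbar = 1.054571817e-34    # J s
u    = 1.66053906660e-27  # kg
m    = 4 * u              # 4He
T, P = 300.0, 101325.0    # K, Pa

n  = P / (kB * T)                                  # number density
nQ = (m * kB * T / (2 * math.pi * hbar**2))**1.5   # quantum concentration

print(n * 1e-6, nQ * 1e-6)   # per cm^3: ~2.45e19 and ~7.81e24
print(n / nQ)                # ~3e-6: deep in the classical regime
```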
(b) Partition function of the system with N atoms
Suppose that the gas contains N atoms in a volume V. The partition function Z_N, which takes into account the indistinguishability of the atoms (through the factor 1/N!), is given by

Z_N = \frac{1}{N!}Z_1^N.
Using Z_1 = n_QV, we get

\ln Z_N = N\ln(n_QV) - \ln N! = N[\ln(n_QV) - \ln N + 1],

where we use Stirling's formula,

\ln N! \simeq N\ln N - N = N(\ln N - 1),

in the limit of large N. The Helmholtz free energy is given by

F = -k_BT\ln Z_N = -Nk_BT[\ln(n_QV) - \ln N + 1] = -Nk_BT\left[\ln\left(\frac{n_QV}{N}\right) + 1\right] = -Nk_BT\left[\ln\frac{V}{N} + \frac{3}{2}\ln T + \frac{3}{2}\ln\left(\frac{mk_B}{2\pi\hbar^2}\right) + 1\right],

since

\ln\left(\frac{n_QV}{N}\right) = \ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{mk_BT}{2\pi\hbar^2}\right) = \ln\frac{V}{N} + \frac{3}{2}\ln T + \frac{3}{2}\ln\left(\frac{mk_B}{2\pi\hbar^2}\right).
The entropy S is obtained as

S = -\left(\frac{\partial F}{\partial T}\right)_V = Nk_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln T + \frac{3}{2}\ln\left(\frac{mk_B}{2\pi\hbar^2}\right) + \frac{5}{2}\right] = Nk_B\left(\ln\frac{V}{N} + \frac{3}{2}\ln T + \sigma_0\right),

where

\sigma_0 = \frac{5}{2} + \frac{3}{2}\ln\left(\frac{mk_B}{2\pi\hbar^2}\right).

Note that S can be rewritten as

S = Nk_B\left[\ln\frac{n_Q}{n} + \frac{5}{2}\right] \quad \text{(Sackur–Tetrode equation)},

where n = N/V.
((Sackur–Tetrode equation))
The Sackur–Tetrode equation is named for Hugo Martin Tetrode (1895–1931) and Otto Sackur (1880–1914), who developed it independently as a solution of Boltzmann's gas statistics and entropy equations at about the same time, in 1912.
https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation

In the classical regime (n/n_Q ≪ 1, i.e. n_Q/n ≫ 1), we have

\ln\frac{n_Q}{n} > 0,

so the entropy is positive.
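A sketch evaluating the Sackur–Tetrode form for the ⁴He example used later in Sec. 17 (SI constants; V = 10³ cm³):

```python
import math

kB   = 1.380649e-23
hbar = 1.054571817e-34
m    = 4 * 1.66053906660e-27   # kg, 4He
T    = 300.0
V    = 1.0e-3                  # m^3 (10^3 cm^3, as in Sec. 17)
n    = 2.44631e25              # m^-3, from P = 1 atm
N    = n * V

nQ = (m * kB * T / (2 * math.pi * hbar**2))**1.5
S  = N * kB * (math.log(nQ / n) + 2.5)   # Sackur-Tetrode
print(S)   # about 5.125 J/K
```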
The internal energy E is given by

E = F + TS = -Nk_BT\left[\ln\frac{n_Q}{n} + 1\right] + Nk_BT\left[\ln\frac{n_Q}{n} + \frac{5}{2}\right] = \frac{3}{2}Nk_BT.

Note that E depends only on T for the ideal gas (Joule's law). The factor 3/2 arises from the exponent of T in n_Q because the gas is in 3D; if n_Q were for 1D or 2D, the factor would be 1/2 or 1, respectively.

(c) Pressure P
The pressure P is defined by

P = -\left(\frac{\partial F}{\partial V}\right)_T = \frac{Nk_BT}{V},

leading to Boyle's law. Then PV is

PV = Nk_BT = \frac{2}{3}E \quad \text{(Bernoulli's equation)}.
(d) Heat capacity

The heat capacity at fixed volume V is given by

C_V = T\left(\frac{\partial S}{\partial T}\right)_V = \frac{3}{2}Nk_B.

When N = N_A, we have

C_V = \frac{3}{2}R.

C_P is the heat capacity at constant P. Since

TdS = dE + PdV,

we get

C_P = T\left(\frac{\partial S}{\partial T}\right)_P = \left(\frac{\partial E}{\partial T}\right)_P + P\left(\frac{\partial V}{\partial T}\right)_P,

with

P\left(\frac{\partial V}{\partial T}\right)_P = N_Ak_B = R.

We note that

E = \frac{3}{2}N_Ak_BT

is independent of P and V and depends only on T (Joule's law), so

\left(\frac{\partial E}{\partial T}\right)_P = \left(\frac{\partial E}{\partial T}\right)_V = \frac{3}{2}N_Ak_B = C_V.

Thus we have

C_P = C_V + R = \frac{3}{2}R + R = \frac{5}{2}R.

((Mayer's relation))
C_P - C_V = R for an ideal gas with 1 mole.

The ratio γ is defined by

\gamma = \frac{C_P}{C_V} = \frac{5}{3}.
(e) Isentropic process (constant entropy)

The entropy S is given by

S = Nk_B\left(\ln\frac{V}{N} + \frac{3}{2}\ln T + \sigma_0\right) = Nk_B[\ln(VT^{3/2}) - \ln N + \sigma_0].

The isentropic process is described by

VT^{3/2} = \text{const}, \quad \text{or} \quad TV^{2/3} = \text{const}.

Using Boyle's law (PV = RT for one mole), we get

\frac{PV}{R}V^{2/3} = \text{const}, \quad \text{or} \quad PV^{5/3} = \text{const}.

Since γ = 5/3, we get the relation

PV^\gamma = \text{const}.

14. The expression of entropy: S = k_B ln W(E)
The entropy is related to the number of states; in particular, it is closely related to ln W. In order to find such a relation, we start with the partition function

Z = \int W(E)\exp(-\beta E)\,dE = \int\exp[-Nf(E)]\,dE,

where W(E) is the number of states with the energy E. The function f(E) is defined by

f(E) = \frac{1}{N}[\beta E - \ln W(E)] = \frac{\beta}{N}[E - k_BT\ln W(E)].

In the limit of large N, f(E) is expanded, using the Taylor expansion, as

f(E) = f(E^*) + \left.\frac{\partial f(E)}{\partial E}\right|_{E=E^*}(E - E^*) + \cdots,

where

\frac{\partial f(E)}{\partial E} = \frac{1}{N}\left[\beta - \frac{\partial\ln W(E)}{\partial E}\right] = 0 \quad \text{at } E = E^*,

or

\frac{1}{k_BT} = \left.\frac{\partial\ln W(E)}{\partial E}\right|_{E=E^*}.

At E = E^*,

Z \simeq \exp[-Nf(E^*)].

For simplicity, we use E instead of E^*. The Helmholtz free energy F is defined by

F = -k_BT\ln Z = k_BT\,Nf(E) = E - k_BT\ln W(E) = E - TS,

leading to the expression of the entropy S as

S = k_B\ln W(E), \quad \text{and} \quad \frac{1}{T} = \frac{\partial S}{\partial E}.
15. The expression of entropy: S = -k_B Σ p ln p

We consider the probability given by

p = \frac{1}{Z}e^{-\beta E},

where

Z = \sum e^{-\beta E}, \quad \ln p = -\beta E - \ln Z.

The energy E is given by

E = \sum pE = \frac{1}{Z}\sum Ee^{-\beta E} = -\frac{1}{Z}\frac{\partial Z}{\partial\beta} = -\frac{\partial\ln Z}{\partial\beta}.

The entropy is a logarithmic measure of the number of states with significant probability of being occupied. The Helmholtz energy F is defined by

F = E - TS = -k_BT\ln Z.

The entropy S is obtained as

S = \frac{E - F}{T} = \frac{E}{T} + k_B\ln Z.

We note that

-k_B\sum p\ln p = k_B\sum p(\beta E + \ln Z) = k_B\beta E + k_B\ln Z = \frac{E}{T} + k_B\ln Z = \frac{E - F}{T} = S.

Thus it follows that the entropy S is given by

S = -k_B\sum p\ln p.
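Both entropy expressions can be compared directly for a hypothetical three-level system (units with k_B = 1; a sketch, not part of the note):

```python
import math

# Hypothetical three-level system, units with kB = 1
E = [0.0, 0.5, 1.3]
T = 0.8
beta = 1.0 / T

Z = sum(math.exp(-beta * e) for e in E)
p = [math.exp(-beta * e) / Z for e in E]
E_avg = sum(pi * ei for pi, ei in zip(p, E))

S_gibbs = -sum(pi * math.log(pi) for pi in p)   # S = -sum p ln p
S_thermo = E_avg / T + math.log(Z)              # S = E/T + kB ln Z (kB = 1)
print(S_gibbs, S_thermo)   # the two expressions agree
```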
16. Thermal average of energy fluctuation
\langle(\Delta E)^2\rangle = \langle E^2\rangle - \langle E\rangle^2 = \frac{1}{Z}\sum_i E_i^2e^{-\beta E_i} - \left(\frac{1}{Z}\sum_i E_ie^{-\beta E_i}\right)^2.

We note that

\frac{d\langle E\rangle}{dT} = \frac{d\beta}{dT}\frac{d\langle E\rangle}{d\beta} = -\frac{1}{k_BT^2}\frac{d}{d\beta}\left[\frac{1}{Z}\sum_i E_ie^{-\beta E_i}\right] = -\frac{1}{k_BT^2}\left[-\frac{1}{Z}\sum_i E_i^2e^{-\beta E_i} - \frac{1}{Z^2}\frac{dZ}{d\beta}\sum_i E_ie^{-\beta E_i}\right] = \frac{1}{k_BT^2}\left[\langle E^2\rangle - \langle E\rangle^2\right],

where

\frac{dZ}{d\beta} = -\sum_i E_ie^{-\beta E_i} = -Z\langle E\rangle.

Since \dfrac{d\langle E\rangle}{dT} = C_V, we get the relation

C_V = \frac{1}{k_BT^2}\left[\langle E^2\rangle - \langle E\rangle^2\right].
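A numerical sketch for a hypothetical two-level system (units with k_B = 1): the fluctuation formula matches C_V = d⟨E⟩/dT computed by finite differences.

```python
import math

# Hypothetical two-level system, units with kB = 1
delta = 1.0
levels = [0.0, delta]

def averages(T):
    beta = 1.0 / T
    Z = sum(math.exp(-beta * e) for e in levels)
    p = [math.exp(-beta * e) / Z for e in levels]
    E1 = sum(pi * ei for pi, ei in zip(p, levels))          # <E>
    E2 = sum(pi * ei * ei for pi, ei in zip(p, levels))     # <E^2>
    return E1, E2

T = 0.6
E1, E2 = averages(T)
C_fluct = (E2 - E1 * E1) / T**2   # C_V = (<E^2> - <E>^2) / (kB T^2)

h = 1e-5
C_deriv = (averages(T + h)[0] - averages(T - h)[0]) / (2 * h)   # C_V = dE/dT
print(C_fluct, C_deriv)   # the two values agree
```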
17. Example: 4He atom as ideal gas
We consider the ⁴He atom with mass

m = 4u = 6.64216\times10^{-24}\ \text{g}.

The number density n at P = 1 atm and T = 300 K is

n = 2.44631\times10^{19}\ \text{cm}^{-3}.

The number of atoms in the volume V = 10^3 cm^3 is

N = nV = 2.44631\times10^{22}.

The internal energy is

E = \frac{3}{2}Nk_BT = 151.987\ \text{J}.

The entropy S is

S = Nk_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln T + \frac{3}{2}\ln\left(\frac{mk_B}{2\pi\hbar^2}\right) + \frac{5}{2}\right] = 5.125\ \text{J/K}.
((Mathematica))
Clear["Global`*"];
rule1 = {kB -> 1.3806504*10^-16, NA -> 6.02214179*10^23,
   hbar -> 1.054571628*10^-27, amu -> 1.660538782*10^-24,
   atm -> 1.01325*10^6, bar -> 10^6, J -> 10^7};  (* cgs units *)
n1 = 2.44631*10^19; V1 = 10^3; T1 = 300;
N1 = n1 V1
 (* 2.44631*10^22 *)
m1 = 4 amu /. rule1
 (* 6.64216*10^-24 *)
P1 = kB N1 T1/V1 /. rule1
 (* 1.01325*10^6 *)
P1/bar /. rule1
 (* 1.01325 *)
E1 = 3/2 kB N1 T1 /. rule1
 (* 1.51987*10^9 *)
E1/J /. rule1
 (* 151.987 *)
S1 = kB N1 (3/2 Log[T1] + Log[V1/N1] + 3/2 Log[m1 kB/(2 Pi hbar^2)] + 5/2) /. rule1
 (* 5.12503*10^7 *)
S1/J /. rule1
 (* 5.12503 *)

18. Link

Entropy (Wikipedia):
http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

_______________________________________________________________________

REFERENCES
M. Toda, R. Kubo, and N. Saito, Statistical Physics I: Equilibrium Statistical Mechanics (Springer, 1983).
L.D. Landau and E.M. Lifshitz, Statistical Physics (Pergamon Press, 1980).
R.P. Feynman, Statistical Mechanics: A Set of Lectures (The Benjamin/Cummings, 1982).
E. Fermi, Thermodynamics (Dover, 1956).
S.J. Blundell and K.M. Blundell, Concepts in Thermal Physics (Oxford, 2006).
C. Kittel and H. Kroemer, Thermal Physics (W.H. Freeman and Company, 1980).
C. Kittel, Elementary Statistical Physics (John Wiley & Sons, 1958).
P. Atkins, The Laws of Thermodynamics: A Very Short Introduction (Oxford, 2010).
D.V. Schroeder, An Introduction to Thermal Physics (Addison-Wesley, 2000).
F. Reif, Fundamentals of Statistical and Thermal Physics (McGraw-Hill, 1965).
F. Bloch, Fundamentals of Statistical Mechanics (World Scientific, 2000).
D. ter Haar, Elements of Statistical Mechanics (Butterworth-Heinemann, 1995).
F. Schwabl, Statistical Mechanics, 2nd ed. (Springer, 2006).
J.W. Halley, Statistical Mechanics (Cambridge, 2007).
H. Kawamura, Statistical Physics (Maruzen, 1997) [in Japanese].
D. Yoshioka, Statistical Physics: An Introduction (Springer, 2007).

________________________________________________________________________

APPENDIX
Density of states for N particles for the micro-canonical ensemble
Using the density of states for the 1-particle system, D_1(ε) = C_1 ε^{1/2}, the density of states for the 2-particle system is estimated as

D_2(\varepsilon_2) = \int_0^{\varepsilon_2} D_1(\varepsilon_1)D_1(\varepsilon_2-\varepsilon_1)\,d\varepsilon_1 = C_1^2\int_0^{\varepsilon_2}\varepsilon_1^{1/2}(\varepsilon_2-\varepsilon_1)^{1/2}\,d\varepsilon_1 = C_1^2\,\varepsilon_2^2\int_0^1 x^{1/2}(1-x)^{1/2}\,dx = C_1^2 f(2)\,\varepsilon_2^2,

where ε_2 is the total energy of the two-particle system, and

f(n) = \int_0^1 x^{1/2}(1-x)^{\frac{3(n-1)}{2}-1}\,dx = \frac{\Gamma(\frac{3}{2})\,\Gamma\!\left(\frac{3(n-1)}{2}\right)}{\Gamma(\frac{3n}{2})}.
Similarly, the density of states for the 3-particle system is

D_3(\varepsilon_3) = \int_0^{\varepsilon_3} D_1(\varepsilon_1)D_2(\varepsilon_3-\varepsilon_1)\,d\varepsilon_1 = C_1^3 f(2)\int_0^{\varepsilon_3}\varepsilon_1^{1/2}(\varepsilon_3-\varepsilon_1)^2\,d\varepsilon_1 = C_1^3 f(2)f(3)\,\varepsilon_3^{7/2},

where ε_3 is the total energy of the three particles.

Suppose that D_n has the form

D_n(\varepsilon_n) = C_1^n f(2)f(3)\cdots f(n)\,\varepsilon_n^{3(n-1)/2+1/2};

then D_{n+1} can be described as

D_{n+1}(\varepsilon_{n+1}) = \int_0^{\varepsilon_{n+1}} D_1(\varepsilon_1)D_n(\varepsilon_{n+1}-\varepsilon_1)\,d\varepsilon_1 = C_1^{n+1} f(2)f(3)\cdots f(n)f(n+1)\,\varepsilon_{n+1}^{3n/2+1/2}.

Hence the density of states for the N-particle system is given by

D_N(E) = C_1^N f(2)f(3)\cdots f(N)\,E^{3(N-1)/2+1/2},

where E = E_N.
The product of the f's telescopes:

f(2)f(3)\cdots f(N) = \frac{\Gamma(\frac{3}{2})\Gamma(\frac{3}{2})}{\Gamma(3)}\cdot\frac{\Gamma(\frac{3}{2})\Gamma(3)}{\Gamma(\frac{9}{2})}\cdot\frac{\Gamma(\frac{3}{2})\Gamma(\frac{9}{2})}{\Gamma(6)}\cdots\frac{\Gamma(\frac{3}{2})\Gamma\!\left(\frac{3(N-1)}{2}\right)}{\Gamma(\frac{3N}{2})} = \frac{[\Gamma(\frac{3}{2})]^N}{\Gamma(\frac{3N}{2})}.

With C_1 = \frac{V}{4\pi^2}\left(\frac{2m}{\hbar^2}\right)^{3/2} and \Gamma(\frac{3}{2}) = \frac{\sqrt{\pi}}{2}, one can get

D_N(E) = C_1^N\,\frac{[\Gamma(\frac{3}{2})]^N}{\Gamma(\frac{3N}{2})}\,E^{3N/2-1} = \frac{V^N}{\Gamma(\frac{3N}{2})}\left(\frac{m}{2\pi\hbar^2}\right)^{3N/2}E^{3N/2-1}.

The number of states whose energy lies between E and E + ΔE is given by

W(E,\Delta E) = \frac{1}{N!}D_N(E)\,\Delta E = \frac{V^N}{N!\,\Gamma(\frac{3N}{2})}\left(\frac{mE}{2\pi\hbar^2}\right)^{3N/2}\frac{\Delta E}{E},

where N! is included because the N particles are non-distinguishable. This is exactly the same as the expression derived from the classical approach (Sec. 3).