
1

Lecture 3

• Entropy in statistical mechanics.

• Thermodynamic contacts:

i. mechanical contact,

ii. heat contact,

iii. diffusion contact.

• Equilibrium.

• Chemical potential.

• The main distributions in statistical mechanics.

• A system in the canonical ensemble. Thermostat.

"Ice melting" is the classic example of entropy increase, described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.

2

Entropy in Statistical Mechanics

From the principles of thermodynamics we learn that the thermodynamic entropy S has the following important properties:

dS is an exact differential and is equal to DQ/T for a reversible process, where DQ is the heat quantity added to the system.

Entropy is additive: S = S₁ + S₂. The entropy of the combined system is the sum of the entropies of the two separate parts.

ΔS ≥ 0. If the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.

3

State Function

This statement means that entropy is a state function in that the value of entropy does not depend on the past history of the system but only on the actual state of the system...

In this case one of the great accomplishments of statistical mechanics is to give us a physical picture of entropy.

The entropy of a system (in classical statistical physics) in statistical equilibrium can be defined as

σ = ln ΔΓ, (3.1)

where ΔΓ is the volume of phase space accessible to the system, i.e., the volume corresponding to the energies between E − Δ/2 and E + Δ/2.

4

Let us show first that changes in the entropy are independent of the system of units used to measure ΔΓ. As ΔΓ is a volume in the phase space of N point particles, it has dimensions

(Momentum × Length)^3N = (Action)^3N. (3.2)

Let Δω denote the unit of action; then ΔΓ/(Δω)^3N is dimensionless. If we were to define

σ′ = ln[ΔΓ/(Δω)^3N] = ln ΔΓ − 3N ln Δω, (3.3)

we see that for changes

Δσ′ = Δσ = Δ ln ΔΓ, (3.4)

5

independent of the system of units. ℏ = Planck's constant is a natural unit of action in phase space.
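The unit-independence argument of (3.3)-(3.4) is easy to check numerically. In the following Python sketch the phase-space volumes and the units of action are arbitrary illustrative numbers, not values from the lecture:

```python
import math

# Check of Eqs. (3.3)-(3.4): changes in the entropy sigma do not depend
# on the unit of action used to measure the phase-space volume.
N = 10                              # number of point particles (assumed)
gamma_a, gamma_b = 2.0e5, 7.0e8    # two phase-space volumes, arbitrary units

for omega in (1.0, 0.5, 1e-3):     # three different units of action
    sigma_a = math.log(gamma_a / omega**(3 * N))
    sigma_b = math.log(gamma_b / omega**(3 * N))
    # the change sigma_b - sigma_a equals ln(gamma_b/gamma_a) every time,
    # since the term 3N*ln(omega) cancels in the difference
    assert math.isclose(sigma_b - sigma_a, math.log(gamma_b / gamma_a))
```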

It is obvious that the entropy σ, as defined by (3.1), has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. Once the ensemble is specified in terms of the spread Δ in the phase space, the entropy is known. We see that, if ΔΓ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is also to be interpreted as a measure of the imprecision or randomness.

6

Entropy is additive

It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N₁ particles and the other with N₂ particles. Then

N = N₁ + N₂, (3.5)

and the phase space of the combined system is the product space of the phase spaces of the individual parts:

ΔΓ = ΔΓ₁ ΔΓ₂. (3.6)

The additive property of the entropy follows directly:

σ = ln ΔΓ = ln ΔΓ₁ΔΓ₂ = ln ΔΓ₁ + ln ΔΓ₂ = σ₁ + σ₂. (3.7)
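As a quick numerical sanity check of this additivity, the following Python sketch uses made-up multiplicities standing in for the phase-space volumes ΔΓ₁ and ΔΓ₂ and verifies that σ = ln(ΔΓ₁ΔΓ₂) equals σ₁ + σ₂:

```python
import math

# Toy check of entropy additivity, Eq. (3.7): Omega plays the role of the
# phase-space volume DeltaGamma, and sigma = ln(Omega).
# The multiplicities below are illustrative values, not from the lecture.
omega1 = 120      # states accessible to subsystem 1 (assumed)
omega2 = 3500     # states accessible to subsystem 2 (assumed)

sigma1 = math.log(omega1)
sigma2 = math.log(omega2)
sigma_combined = math.log(omega1 * omega2)   # product space, Eq. (3.6)

# sigma = sigma1 + sigma2 to floating-point accuracy
assert abs(sigma_combined - (sigma1 + sigma2)) < 1e-12
```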

7

Thermodynamic contacts between two systems

We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy is a maximum when a closed system is in the equilibrium condition.

The value of σ for a system in equilibrium will depend on the energy E (= ⟨E⟩) of the system; on the number Nᵢ of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.

8

[Fig. 3.1: subsystems 1 and 2 separated by a barrier; the whole system is surrounded by insulation.]

Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other.

9

Thermal contact

Thermal contact - the systems can exchange energy. In equilibrium there is no net flow of energy between the systems. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium. In thermal equilibrium the entropy of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,

10

δσ = δσ₁ + δσ₂ = 0, (3.8)

we have in equilibrium

(∂σ₁/∂E₁)δE₁ + (∂σ₂/∂E₂)δE₂ = 0. (3.9)

We know, however, that

δE = δE₁ + δE₂ = 0, (3.10)

as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus

(∂σ₁/∂E₁ − ∂σ₂/∂E₂)δE₁ = 0. (3.11)

11

As δE₁ was an arbitrary variation, we must have

∂σ₁/∂E₁ = ∂σ₂/∂E₂ (3.12)

in thermal equilibrium. If we define a quantity τ by

1/τ = ∂σ/∂E, (3.13)

then in thermal equilibrium

τ₁ = τ₂. (3.14)

Here τ is known as the temperature and will be shown later to be related to the absolute temperature T by τ = kT, where k is the Boltzmann constant, 1.380×10⁻²³ J/K.
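The equal-temperature condition (3.12)-(3.14) can be illustrated numerically. The sketch below uses two Einstein solids as stand-in subsystems (a model and sizes chosen here for illustration, not taken from the lecture): it finds the energy split that maximizes the total entropy and checks that the finite-difference slopes ∂σ/∂E of the two parts agree at that maximum:

```python
from math import comb, log

# Two Einstein solids in thermal contact: solid j has Nj oscillators and
# holds qj integer units of energy.  Multiplicity: Omega(N, q) = C(q+N-1, q),
# and the dimensionless entropy is sigma = ln(Omega).
N1, N2, q_total = 300, 200, 1000   # illustrative sizes (assumed)

def sigma(N, q):
    return log(comb(q + N - 1, q))

# find the energy split q1 that maximizes the total entropy (Eq. 3.8)
best_q1 = max(range(q_total + 1),
              key=lambda q1: sigma(N1, q1) + sigma(N2, q_total - q1))

# at the maximum, d(sigma1)/dE1 = d(sigma2)/dE2 (Eq. 3.12), i.e. equal
# inverse temperatures 1/tau; check via one-unit finite differences
beta1 = sigma(N1, best_q1 + 1) - sigma(N1, best_q1)
beta2 = sigma(N2, q_total - best_q1 + 1) - sigma(N2, q_total - best_q1)
print(best_q1, beta1, beta2)   # beta1 and beta2 nearly equal
```

The maximizing split puts energy in proportion to the subsystem sizes, which is exactly the statement that the two temperatures have equalized.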

12

Mechanical contact

Mechanical contact - the systems are separated by a movable barrier. Equilibrium is reached in this case by equalization of the pressure on both sides of the barrier. We now imagine that the wall is allowed to move and also passes energy, but does not pass particles. The volumes V₁, V₂ of the two systems can readjust to maximize the entropy. In mechanical equilibrium

δσ = (∂σ₁/∂V₁)δV₁ + (∂σ₂/∂V₂)δV₂ + (∂σ₁/∂E₁)δE₁ + (∂σ₂/∂E₂)δE₂ = 0. (3.15)

13

After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have

(∂σ₁/∂V₁)δV₁ + (∂σ₂/∂V₂)δV₂ = 0. (3.16)

Now the total volume V = V₁ + V₂ is constant, so that

δV = δV₁ + δV₂ = 0. (3.17)

We have then

(∂σ₁/∂V₁ − ∂σ₂/∂V₂)δV₁ = 0. (3.18)

14

As δV₁ was an arbitrary variation, we must have

∂σ₁/∂V₁ = ∂σ₂/∂V₂ (3.19)

in mechanical equilibrium. If we define a quantity Π by

Π/τ = (∂σ/∂V)_{E,N}, (3.20)

we see that for a system in thermal equilibrium the condition for mechanical equilibrium is

Π₁ = Π₂. (3.21)

We show now that Π has the essential characteristics of the usual pressure p.

15

Material-transferring contact

The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have

δN₁ᵢ + δN₂ᵢ = 0. (3.22)

For equilibrium

(∂σ₁/∂N₁ᵢ)δN₁ᵢ + (∂σ₂/∂N₂ᵢ)δN₂ᵢ = 0, (3.23)

or

∂σ₁/∂N₁ᵢ = ∂σ₂/∂N₂ᵢ. (3.24)

16

We define a quantity μᵢ by the relation

μᵢ/τ = −(∂σ/∂Nᵢ)_{E,V}. (3.25)

The quantity μᵢ is called the chemical potential of the i-th species. For equilibrium at constant temperature

μ₁ᵢ = μ₂ᵢ. (3.26)

17

The Canonical Ensemble

The microcanonical ensemble is a general statistical tool, but it is often very difficult to use in practice because of the difficulty of evaluating the volume of phase space or the number of states accessible to the system. The canonical ensemble, invented by Gibbs, avoids some of these difficulties and leads us easily to the familiar Boltzmann factor exp(−ΔE/kT) for the ratio of populations of two states differing by ΔE in energy.

We shall see that the canonical ensemble describes systems in thermal contact with a heat reservoir; the microcanonical ensemble describes systems that are perfectly insulated.

In 1901, at the age of 62, Gibbs (1839-1903) published a book called Elementary Principles in Statistical Mechanics (Dover, New York).
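The Boltzmann factor mentioned above is simple to evaluate. In this sketch the energy gap ΔE and the temperature are arbitrary illustrative values, not quantities from the lecture:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Population ratio exp(-dE/kT) between two states separated by dE in energy
dE = 1.0e-20   # energy difference between the two states, J (assumed)
T = 300.0      # temperature, K (assumed)

ratio = math.exp(-dE / (k_B * T))
print(f"upper/lower population ratio at {T} K: {ratio:.4f}")
```

Raising the temperature drives the ratio toward 1, and lowering it drives the ratio toward 0, as the exponent −ΔE/kT suggests.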

18

We imagine that each system of the ensemble is divided up into a large number of subsystems, which are in mutual thermal contact and can exchange energy with each other. We direct our attention (Fig. 3.2) to one subsystem, denoted s; the rest of the system will be denoted by r and is sometimes referred to as a heat reservoir.

[Fig. 3.2: a small subsystem (s) inside the total system (t); the rest (r) acts as a heat reservoir.]

The total system is denoted by t and has the constant energy Eₜ, as it is a member of a microcanonical ensemble. For each value of the energy we think of an ensemble of systems (and subsystems).

19

The subsystems will usually, but not necessarily, be themselves of macroscopic dimensions.

The subsystem may be a single molecule if, as in a gas, the interactions between molecules are very weak, thereby permitting us to specify accurately the energy of a molecule. In a solid, a single atom will not be a satisfactory subsystem, as the bond energy is shared with neighbors.

Letting dwₜ denote the probability that the total system is in an element of volume dΓₜ of the appropriate phase space, we have for a microcanonical ensemble

20

dwₜ = C dΓₜ if the energy of the total system is in ΔE at Eₜ; dwₜ = 0 otherwise, (3.27)

where C is a constant. Then we can write

dwₜ = C dΓₛ dΓᵣ if the region of phase space is accessible; dwₜ = 0 otherwise. (3.28)

We ask now for the probability dwₛ that the subsystem is in dΓₛ, without specifying the condition of the reservoir, but still requiring that the energy of the total system be in ΔE at Eₜ. Then

dwₛ = C ΔΓᵣ dΓₛ. (3.29)

21

The entropy of the reservoir is

σᵣ = ln ΔΓᵣ, (3.30)

so that

ΔΓᵣ = e^{σᵣ}, (3.31)

where ΔΓᵣ is the volume of the phase space of the reservoir which corresponds to the energy of the total system being in ΔE at Eₜ.

Our task is to evaluate ΔΓᵣ; that is, if we know that the subsystem is in dΓₛ, how much phase space is accessible to the heat reservoir? Note that

Eᵣ = Eₜ − Eₛ. (3.32)

22

We expand:

σᵣ(Eᵣ) = σᵣ(Eₜ − Eₛ) = σᵣ(Eₜ) − Eₛ(∂σᵣ/∂Eₜ) + …, (3.33)

where we may take Eₛ ≪ Eₜ because the subsystem is assumed to be small in comparison with the total system. Thus

ΔΓᵣ = e^{σᵣ(Eᵣ)} ≅ exp[σᵣ(Eₜ)] exp(−Eₛ ∂σᵣ/∂Eₜ). (3.34)

As Eₜ is necessarily close to Eᵣ, we can write, using (3.13),

∂σᵣ/∂Eₜ ≅ ∂σᵣ/∂Eᵣ = 1/τ = 1/kT. (3.35)

Here τ is the temperature characterizing every part of the system, as thermal contact is assumed. Finally, from (3.29), (3.34) and (3.35),

23

dwₛ = A e^{−Eₛ/τ} dΓₛ, (3.36)

where

A = C exp[σᵣ(Eₜ)] (3.37)

may be viewed as a quantity which takes care of the normalization:

∫ dwₛ = A ∫ e^{−Eₛ/τ} dΓₛ = 1. (3.38)

Thus for the subsystem the probability density (distribution function) is given by the canonical ensemble:

ρ(E) = A e^{−E/kT}. (3.39)
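A discrete analogue of (3.36)-(3.39) makes the role of the normalization constant A concrete. The energy levels below are arbitrary illustrative values in units of kT, not taken from the lecture:

```python
import math

# Discrete analogue of the canonical distribution: probabilities
# p_i = A * exp(-E_i/kT), with A fixed by normalization (cf. Eq. 3.38).
levels = [0.0, 0.5, 1.3, 2.0]              # E_i / kT (assumed values)

weights = [math.exp(-e) for e in levels]   # Boltzmann factors e^{-E/kT}
A = 1.0 / sum(weights)                     # normalization constant
probs = [A * w for w in weights]

assert math.isclose(sum(probs), 1.0)       # probabilities sum to one
print(probs)                               # lowest level is most probable
```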

24

We note that ln ρ is additive for two subsystems in thermal contact:

ln ρ₁ = ln A₁ − E₁/τ,  ln ρ₂ = ln A₂ − E₂/τ,

ln ρ₁ρ₂ = ln A₁A₂ − (E₁ + E₂)/τ,

so that, with ρ = ρ₁ρ₂, A = A₁A₂, and E = E₁ + E₂, we have

ln ρ = ln A − E/τ (3.40)

for the combined systems. This additive property is central to the use of the canonical ensemble. Here and henceforth the subscript s is dropped; we emphasize that E is the energy of the entire subsystem.

25

We have to note that for a subsystem consisting of a large number of particles, the subsystem energy in a canonical ensemble is very well defined. This is because the density of energy levels, or the volume in phase space, is a strongly varying function of energy, as is also the distribution function (3.39).

The average value of any physical quantity f(p,q) over the canonical distribution is given by

⟨f⟩ = ∫ e^{−E(p,q)/kT} f(p,q) dΓ / ∫ e^{−E(p,q)/kT} dΓ.
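This average can be checked numerically. The sketch below evaluates the canonical-average formula for a classical one-dimensional harmonic oscillator in dimensionless units (kT = 1, E = (p² + q²)/2, an example chosen here for illustration); equipartition predicts ⟨E⟩ = kT for the two quadratic degrees of freedom:

```python
import math

# Canonical average <E> for a classical 1D harmonic oscillator, computed
# by brute-force midpoint integration over a phase-space grid (a sketch,
# not an optimized integrator).  Units: kT = 1, E(p, q) = (p^2 + q^2)/2.
def E(p, q):
    return 0.5 * (p * p + q * q)

n, L = 400, 8.0          # grid points per axis, integration cutoff (assumed)
h = 2 * L / n
num = den = 0.0
for i in range(n):
    p = -L + (i + 0.5) * h
    for j in range(n):
        q = -L + (j + 0.5) * h
        w = math.exp(-E(p, q))   # Boltzmann weight e^{-E/kT}
        num += w * E(p, q)       # numerator integrand of <E>
        den += w                 # denominator (normalization) integrand
print(num / den)                 # close to 1.0 (= kT), as equipartition says
```

Note that the grid spacing h cancels between numerator and denominator, just as the unit of phase-space volume cancels in the formula for ⟨f⟩.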