
    Chapter 1 Introduction to Equilibrium Statistical Mechanics

    T. L. Hill, An Introduction to Statistical Thermodynamics, 1986.

    D. A. McQuarrie, Statistical Mechanics, 2000.

    1.1 Background

1.1.1 Stirling "Approximation"

Consider the following identity:

\ln x! = \ln 1 + \ln 2 + \cdots + \ln x = \sum_{i=1}^{x} \ln i

For a very large x, the sum can be approximated by an integral:

\ln x! \approx \int_{1}^{x} \ln y \, dy = x \ln x - x

For numbers on the order of Avogadro's number, the Stirling "approximation" is very accurate.
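As a quick numerical check (a minimal Python sketch; `stirling` is our own helper, and `math.lgamma(n + 1)` supplies the exact ln n! without overflow):

```python
import math

def stirling(n: float) -> float:
    """Stirling estimate of ln(n!)."""
    return n * math.log(n) - n

# Compare the exact ln(n!) with the Stirling estimate for increasing n.
for n in (10, 100, 10_000, 1_000_000):
    exact = math.lgamma(n + 1)          # ln(n!) = lgamma(n + 1)
    approx = stirling(n)
    rel_err = (exact - approx) / exact
    print(f"n = {n:>9}: exact = {exact:.4e}, Stirling = {approx:.4e}, "
          f"relative error = {rel_err:.2e}")
```

Already at n = 10^4 the relative error is below 10^-4, so for Avogadro-scale numbers the neglected terms are utterly negligible.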

    1.1.2 Sharpness of Multiplicity Function

Results of coin tossing suggest that the outcome becomes increasingly well defined as the number of trials increases.


For a system of N magnets, we denote the numbers of "up" and "down" states as

N_\uparrow = \tfrac{1}{2}N + s \qquad \text{and} \qquad N_\downarrow = \tfrac{1}{2}N - s

where s is an integer. Hence, the spin excess is calculated as

N_\uparrow - N_\downarrow = 2s

We define the multiplicity function W(N, s) as follows:

W(N, s) = \frac{N!}{N_\uparrow! \, N_\downarrow!}

The total number of states is

\sum_{s=-N/2}^{N/2} W(N, s) = 2^N

For a very large N, let us consider the logarithm of the multiplicity function:

\ln W = \ln N! - \ln N_\uparrow! - \ln N_\downarrow!

Using the Stirling approximation,

\ln W \approx N\ln N - N - (N_\uparrow \ln N_\uparrow - N_\uparrow) - (N_\downarrow \ln N_\downarrow - N_\downarrow) = N\ln N - N_\uparrow\ln N_\uparrow - N_\downarrow\ln N_\downarrow

Knowing that

\ln N_\uparrow = \ln\!\left[\tfrac{1}{2}N\!\left(1 + \tfrac{2s}{N}\right)\right] = \ln\tfrac{1}{2}N + \ln\!\left(1 + \tfrac{2s}{N}\right)

and expanding the logarithm for 2s/N \ll 1, we have

N_\uparrow \ln N_\uparrow \approx \left(\tfrac{1}{2}N + s\right)\left(\ln\tfrac{1}{2}N + \frac{2s}{N} - \frac{2s^2}{N^2}\right)

and similarly for N_\downarrow \ln N_\downarrow.


Consequently,

\ln W(N, s) \approx N\ln N - N\ln\tfrac{1}{2}N - \frac{2s^2}{N} = \ln W(N, 0) - \frac{2s^2}{N}

so that

W(N, s) \approx W(N, 0)\, e^{-2s^2/N}

The above result implies, for example, that

\frac{W(N, s = N^{1/2})}{W(N, 0)} = e^{-2}

That is, when N is very large, the distribution is exceedingly sharply peaked at s = 0: its width, of order N^{1/2}, is negligible compared with N.

We know from common experience that systems held at constant temperature usually have well-defined properties; this stability of physical properties follows as a consequence of the exceedingly sharp peak in the multiplicity function and of the steep variation of that function away from the peak.
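A short numerical check of this Gaussian form (a sketch; the exact ln W is evaluated with lgamma):

```python
import math

def ln_W(N: int, s: int) -> float:
    """Exact ln of the multiplicity W(N, s) = N! / ((N/2 + s)! (N/2 - s)!)."""
    return (math.lgamma(N + 1)
            - math.lgamma(N // 2 + s + 1)
            - math.lgamma(N // 2 - s + 1))

N = 1_000_000
for s in (0, 100, 1_000, 5_000):
    exact = ln_W(N, s) - ln_W(N, 0)            # ln[W(N, s)/W(N, 0)]
    gauss = -2.0 * s**2 / N                    # Gaussian approximation
    print(f"s = {s:>6}:  exact ln ratio = {exact:10.3f},  -2s^2/N = {gauss:10.3f}")
```

At s = N^{1/2} = 1000 the exact log-ratio is essentially -2, in agreement with the statement above.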

    1.1.3 Method of Lagrange Undetermined Multipliers

Assume that there are n possible outcomes for an experiment: events 1, 2, 3, ..., n.

Perform the experiment N times and denote the frequency of outcome i as N_i: N_1, N_2, ..., N_n.

Obviously, we must have

\sum_i N_i = N

For a particular set \{N_1, N_2, \ldots, N_n\}, the number of possible ways is

W = \frac{N!}{N_1!\, N_2! \cdots N_n!}

The total number of possible outcomes is n^N.

The probability of obtaining the distribution \{N_1, N_2, \ldots, N_n\} is

p(N_1, \ldots, N_n) = \frac{W}{n^N}

Take the logarithm of W:

\ln W = \ln N! - \sum_i \ln N_i!

If N is extremely large, we can employ the Stirling approximation:

\ln W = N\ln N - N - \sum_i \left(N_i \ln N_i - N_i\right)


= -N\sum_i p_i \ln p_i

where p_i \equiv N_i/N is the probability of the appearance of a particular event i.

Question: As all p_i vary, what is the maximum value of W? What is the most probable distribution?

Consider the following function of the p_i:

R = -\sum_{i=1}^{n} p_i \ln p_i

We want to maximize R subject to the following constraint:

\sum_{i=1}^{n} p_i = 1

Define the function L = R + \alpha_0\left(1 - \sum_i p_i\right), where \alpha_0 is a constant (the undetermined multiplier).

As the p_i vary, a maximum of L gives a maximum of R:

dL = dR - \alpha_0 \sum_i dp_i

Given that

dR = -\sum_i (1 + \ln p_i)\, dp_i

we have

dL = -\sum_i (1 + \alpha_0 + \ln p_i)\, dp_i

In the presence of \alpha_0, which is not yet determined, we can treat all the p_i as independent variables. Thus,

dL = 0 \quad\Rightarrow\quad 1 + \alpha_0 + \ln p_i = 0 \quad\Rightarrow\quad p_i = e^{-(1+\alpha_0)}

The value of \alpha_0 is fixed by the condition \sum_{i=1}^{n} p_i = 1. That is,

\sum_{i=1}^{n} e^{-(1+\alpha_0)} = n\, e^{-(1+\alpha_0)} = 1 \quad\Rightarrow\quad p_i = \frac{1}{n}

Consequently, the maximum of \ln W = -N\sum_i p_i \ln p_i is attained for the uniform distribution, giving \ln W_{\max} = N\ln n.
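The same conclusion can be checked by brute force (a small sketch; random distributions are generated by cutting the interval [0, 1] at n - 1 random points):

```python
import numpy as np

def R(p: np.ndarray) -> float:
    """R = -sum_i p_i ln(p_i) for a probability vector p (0 ln 0 := 0)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

n, rng = 4, np.random.default_rng(1)
best = 0.0
for _ in range(100_000):
    cuts = np.sort(np.concatenate(([0.0], rng.random(n - 1), [1.0])))
    p = np.diff(cuts)                      # a random distribution over n outcomes
    best = max(best, R(p))

print(f"largest R found among random distributions: {best:.5f}")
print(f"uniform distribution, R = ln n            : {np.log(n):.5f}")
```

No random trial exceeds ln n, the value reached by the uniform distribution.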

    1.2 Ensembles and Postulates

The object of statistical mechanics is to provide a molecular theory, or interpretation, of the equilibrium properties of macroscopic systems; that is, to answer the question "why".


Mechanical variables (M): properties that can be defined in purely mechanical terms, such as pressure, energy, volume, number of molecules, etc.

    Non-mechanical variables: temperature, entropy, chemical potential, etc.

Ensemble: a collection of a very large number of systems, each constructed to be a replica of a well-defined thermodynamic state.

Postulate 1: In the limit as the number of ensemble members approaches infinity, the ensemble average of M corresponds to the parallel thermodynamic property, provided that the systems of the ensemble replicate the thermodynamic state and environment of the actual system of interest. (Valid for all kinds of ensembles.)

Postulate 2: In an ensemble of constant N, V, and E, the systems are distributed uniformly, i.e., with equal probability or frequency, over the possible quantum states consistent with the specified values of N, V, and E. (Principle of equal a priori probabilities.)

Postulates 1 and 2 together imply the quantum ergodic hypothesis: the single isolated system of interest spends equal amounts of time, over a long period of time, in each of the available quantum states. That is, the time average and the ensemble average of M are identical.

    1.3 Canonical Ensemble

    Canonical ensemble: A huge collection of closed isothermal systems (N, V, T).

Quantum states belonging to different energy levels E will have to be considered for a canonical ensemble.

In the following canonical ensemble (a lattice of replica systems, each a square box with the same N, V, T), the outer border provides thermal insulation.


Each system (a square box with definite N, V, T) in the canonical ensemble has the same set of energy states, viz., E_1, E_2, etc., which are determined by N and V only.

Assume there are n_t systems in the ensemble.

The entire ensemble, which is insulated, is effectively an isolated system with volume n_t V, number of molecules n_t N, and total energy E_t. Thus, it can be considered a thermodynamic "supersystem", to which we can apply Postulate 2.

Quantum state j :   1           2           3           ...
Energy E_j      :   E_1 = 0.1   E_2 = 0.2   E_3 = 0.3   ...
Systems n_j     :   n_1         n_2         n_3         ...

Total number of systems:  n_t = n_1 + n_2 + n_3 + \cdots
Total energy:             E_t = n_1 E_1 + n_2 E_2 + \cdots = \sum_j n_j E_j

For an "ensemble" with n_t = 6 and E_t = 1.5, one possibility is

Distribution 1: n_1 = 1, n_2 = 3, n_4 = 2 (all other n_j = 0)

Permutations: (1, 2, 2, 2, 4, 4), (1, 4, 2, 2, 4, 2), (4, 1, 2, 2, 2, 4), etc.

Other distributions with the same n_t and E_t are also possible (see the enumeration sketch below).
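Here is the enumeration sketch referred to above (our own construction, assuming the level ladder E_j = 0.1 j implied by the table); it lists every distribution consistent with n_t = 6 and E_t = 1.5, its weight Omega_t(n), and the resulting average fraction of systems found in each state:

```python
import math
from itertools import combinations_with_replacement
from collections import Counter

n_t, levels = 6, range(1, 11)          # 6 systems; states j = 1..10 with E_j = 0.1 j
E_t_units = 15                          # total energy 1.5 in units of 0.1

multiplicities = {}                     # distribution n -> Omega_t(n)
for assign in combinations_with_replacement(levels, n_t):
    if sum(assign) != E_t_units:
        continue
    counts = Counter(assign)            # occupation numbers n_j
    omega = math.factorial(n_t)
    for n_j in counts.values():
        omega //= math.factorial(n_j)
    multiplicities[tuple(sorted(counts.items()))] = omega

total = sum(multiplicities.values())
print("distributions consistent with n_t = 6, E_t = 1.5:")
for dist, omega in sorted(multiplicities.items(), key=lambda kv: -kv[1]):
    print(f"  {dict(dist)}  Omega_t = {omega}")

# Fraction of systems in state j, averaged over all equally weighted
# permutations (Postulate 2).
p = {j: 0.0 for j in levels}
for dist, omega in multiplicities.items():
    for j, n_j in dist:
        p[j] += omega * n_j / n_t
p = {j: v / total for j, v in p.items()}
print("p_j:", {j: round(v, 4) for j, v in p.items() if v > 0})
```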

Application of Postulate 2: all the permutations having the same E_t are equally probable.

For a particular distribution n = \{n_1, n_2, \ldots\} of the systems over the energy states, the corresponding number of possible ways (permutations) in the canonical ensemble, \Omega_t(\mathbf{n}), is

\Omega_t(\mathbf{n}) = \frac{n_t!}{n_1!\, n_2! \cdots}    (1)

n_1 represents the number of systems found in state E_1, and so on.

In general, the required probability of observing a given quantum state E_j in an arbitrary system of a canonical ensemble is

p_j = \frac{\bar{n}_j}{n_t} = \frac{1}{n_t}\,\frac{\sum_{\mathbf{n}} n_j(\mathbf{n})\,\Omega_t(\mathbf{n})}{\sum_{\mathbf{n}} \Omega_t(\mathbf{n})}

where n_j(\mathbf{n}) denotes the value of n_j in the distribution \mathbf{n}.


For n_t \to \infty, we can regard all other weights \Omega_t(\mathbf{n}) as negligible compared with \Omega_t(\mathbf{n}^*), the weight of the most probable distribution. Hence,

p_j = \frac{1}{n_t}\,\frac{n_j^*\,\Omega_t(\mathbf{n}^*)}{\Omega_t(\mathbf{n}^*)} = \frac{n_j^*}{n_t}

where n_j^* is the value of n_j in the most probable distribution \mathbf{n}^*.

The supersystem has a well-defined total energy at a particular temperature:

\sum_j n_j^* E_j = E_t \quad\Rightarrow\quad \sum_j \frac{n_j^*}{n_t} E_j = \frac{E_t}{n_t} \equiv \bar{E}

That is,

\sum_j p_j E_j = \bar{E}    (2)

where

\sum_j p_j = 1    (3)

Question: Which of all possible sets of n_j satisfying eqns (2) and (3) gives the largest \Omega_t? (That is, we want to determine n_j^*.)

From the definition of \Omega_t, we know that the problem is quite similar to the one treated in Section 1.1.3, where we maximized the function R = -\sum_i p_i \ln p_i; but this time the maximization is subject to the constraints of eqns (2) and (3):

L = R + \alpha\left(1 - \sum_i p_i\right) - \beta\left(\sum_i p_i E_i - \bar{E}\right)

where \alpha and \beta are the undetermined multipliers.

To maximize L with respect to p_j, we must have

\frac{\partial L}{\partial p_j} = 0

As shown in Section 1.1.3,

dR = -\sum_i (1 + \ln p_i)\, dp_i \quad\Rightarrow\quad \frac{\partial R}{\partial p_j} = -(1 + \ln p_j)

Thus, we find

-(1 + \ln p_j) - \alpha - \beta E_j = 0

After rearrangement,

p_j = e^{-(1+\alpha)}\, e^{-\beta E_j}


As a result, the most probable distribution in a canonical ensemble is given by

p_j = \frac{e^{-\beta E_j}}{Q}

where the normalization \sum_j p_j = 1 fixes e^{1+\alpha} = \sum_j e^{-\beta E_j} \equiv Q.

Then we can obtain the ensemble average of a mechanical variable such as the energy:

\bar{E} = \sum_j p_j E_j

d\bar{E} = \sum_j E_j\, dp_j + \sum_j p_j\, dE_j

Compare this with dU = \delta q + \delta w; from Postulate 1, \bar{E} = U.

Work done on or by the system may change V, on which the energy levels E_j depend:

\delta w = \sum_j p_j\, dE_j = \sum_j p_j\left(\frac{\partial E_j}{\partial V}\right)_N dV = -\bar{P}\, dV

\bar{P} = \sum_j p_j P_j, \qquad P_j = -\left(\frac{\partial E_j}{\partial V}\right)_N

where P_j is the pressure when the system is in state E_j.

Hence,

\delta q = \sum_j E_j\, dp_j

Thus, for a very small change of internal energy, work corresponds to a weighted average of the changes in the energy levels, and heat corresponds to a change of the distribution over the energy levels.

One may be tempted to express the factor \beta in terms of energies. However, the independent variables of real interest here are N, V, and T. Therefore, we invoke a thermodynamic argument to trace the connection between \beta and T (a non-mechanical thermodynamic variable).

From the ensemble average of the energy,

\bar{E} = \sum_j E_j\, \frac{e^{-\beta E_j}}{Q}

we obtain

\left(\frac{\partial \bar{E}}{\partial V}\right)_{N,\beta} = \sum_j \left(\frac{\partial E_j}{\partial V}\right)_N \frac{e^{-\beta E_j}}{Q} - \beta\sum_j E_j\left(\frac{\partial E_j}{\partial V}\right)_N \frac{e^{-\beta E_j}}{Q} - \sum_j E_j\, \frac{e^{-\beta E_j}}{Q^2}\left(\frac{\partial Q}{\partial V}\right)_{N,\beta}


= -\bar{P} + \beta\left(\overline{EP} - \bar{E}\,\bar{P}\right)

From the ensemble average of the pressure,

\bar{P} = \sum_j P_j\, \frac{e^{-\beta E_j}}{Q}

we can similarly obtain

\left(\frac{\partial \bar{P}}{\partial \beta}\right)_{N,V} = \bar{E}\,\bar{P} - \overline{EP}

Thus, we finally obtain the following relation:

\left(\frac{\partial \bar{E}}{\partial V}\right)_{N,\beta} + \bar{P} = -\beta\left(\frac{\partial \bar{P}}{\partial \beta}\right)_{N,V}

which is very similar to the following thermodynamic equation:

\left(\frac{\partial U}{\partial V}\right)_{T,N} + P = T\left(\frac{\partial P}{\partial T}\right)_{N,V} = -\frac{1}{T}\left(\frac{\partial P}{\partial (1/T)}\right)_{N,V}

By virtue of Postulate 1, we can associate the thermodynamic pressure and internal energy with the ensemble averages of P and E. Hence, it can be deduced that

\beta = \frac{1}{kT}

where k is a constant to be determined.

From the definition of \bar{E}, we obtain

d\bar{E} = \sum_j E_j\, dp_j + \sum_j p_j\, dE_j

Given that p_j = e^{-\beta E_j}/Q, we have

e^{-\beta E_j} = Q\, p_j \quad\Rightarrow\quad \ln p_j = -\ln Q - \beta E_j \quad\Rightarrow\quad E_j = -\frac{1}{\beta}\left(\ln p_j + \ln Q\right)

Substituting this expression for E_j and replacing dE_j by (\partial E_j/\partial V)_N\, dV:

d\bar{E} = -\frac{1}{\beta}\sum_j \left(\ln p_j + \ln Q\right) dp_j + \sum_j p_j\left(\frac{\partial E_j}{\partial V}\right)_N dV


= -\frac{1}{\beta}\, d\!\left(\sum_j p_j \ln p_j\right) - \bar{P}\, dV

where we have used \sum_j dp_j = 0.

For a closed system (constant N) in thermodynamics,

dU = T\, dS - P\, dV

Comparing the two expressions (with \bar{E} = U, \bar{P} = P, and \beta = 1/kT), we obtain

S(N, V, T) = -k\sum_j p_j \ln p_j    (4)

It can be shown that if any two systems are in thermal contact, they will have the same \beta and T at equilibrium. Thus, k is a universal constant, known as the Boltzmann constant, and we can evaluate it for a judiciously chosen system:

k = 1.381 \times 10^{-23}\ \mathrm{J\,K^{-1}}

It was Max Planck who first gave it an experimental value, based on the law of black-body radiation.

We finally obtain the well-known expression of the Boltzmann distribution:

p_j(N, V, T) = \frac{e^{-E_j/kT}}{Q} \qquad\text{and}\qquad Q(N, V, T) = \sum_j e^{-E_j/kT}

where Q is called the "canonical ensemble partition function".

When T \to \infty, Q approaches the total number of accessible states and every state becomes equally probable. At ordinary temperatures, the Boltzmann factor e^{-E_j/kT} weights these probabilities significantly.

Substituting the above results into eqn (4):

S = -k\sum_j p_j\left(-\frac{E_j}{kT} - \ln Q\right) = \frac{\bar{E}}{T} + k\ln Q

In thermodynamics, the Helmholtz free energy is related to the entropy as follows:

S = \frac{U - A}{T}

Therefore,

A(N, V, T) = -kT\ln Q(N, V, T)

Hence, if Q is available, we can obtain a rather complete set of thermodynamic functions from the derivatives of the Helmholtz free energy.

Given that dA = -S\,dT - P\,dV + \mu\,dN, we have

P = -\left(\frac{\partial A}{\partial V}\right)_{T,N} = kT\left(\frac{\partial \ln Q}{\partial V}\right)_{T,N}


U = kT^2\left(\frac{\partial \ln Q}{\partial T}\right)_{N,V} = -\left(\frac{\partial \ln Q}{\partial \beta}\right)_{N,V}
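As an illustration of how thermodynamic functions follow from Q, here is a minimal Python sketch for a hypothetical three-level system (the level spacing eps is an arbitrary choice, not from the text); it evaluates A = -kT ln Q directly and U by numerical differentiation of ln Q:

```python
import numpy as np

k_B = 1.380649e-23                          # J/K
eps = 1.0e-21                               # J; hypothetical level spacing
E_levels = np.array([0.0, eps, 2 * eps])    # toy three-level system

def ln_Q(T: float) -> float:
    """ln of the canonical partition function Q = sum_j exp(-E_j / kT)."""
    return np.log(np.sum(np.exp(-E_levels / (k_B * T))))

def thermo(T: float, dT: float = 1e-3):
    """A = -kT ln Q, U = kT^2 d(ln Q)/dT (central difference), S = (U - A)/T."""
    lnQ = ln_Q(T)
    dlnQ_dT = (ln_Q(T + dT) - ln_Q(T - dT)) / (2 * dT)
    A = -k_B * T * lnQ
    U = k_B * T**2 * dlnQ_dT
    S = (U - A) / T
    return A, U, S

for T in (50.0, 300.0, 1000.0):
    A, U, S = thermo(T)
    print(f"T = {T:6.1f} K:  A = {A:.3e} J,  U = {U:.3e} J,  S = {S:.3e} J/K")
```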

    1.3.1 Fluctuation

In a canonical ensemble, N, V, and T are held fixed, and we can investigate fluctuations in the energy, pressure, and related mechanical properties, because these are the quantities that can vary from system to system.

The variance of the system energy is calculated as

\sigma_E^2 = \overline{E^2} - \bar{E}^2 = \sum_j p_j E_j^2 - \bar{E}^2

The above equation can be rewritten in a more convenient form by noting that

\sum_j p_j E_j^2 = \frac{1}{Q}\sum_j E_j^2\, e^{-\beta E_j} = -\frac{1}{Q}\frac{\partial}{\partial \beta}\sum_j E_j\, e^{-\beta E_j} = -\frac{\partial \bar{E}}{\partial \beta} + \bar{E}^2

Given that

\bar{E} = -\left(\frac{\partial \ln Q}{\partial \beta}\right)_{N,V}

we have

\sigma_E^2 = \sum_j p_j E_j^2 - \bar{E}^2 = -\left(\frac{\partial \bar{E}}{\partial \beta}\right)_{N,V}

Consequently,

\sigma_E^2 = kT^2\left(\frac{\partial \bar{E}}{\partial T}\right)_{N,V} = kT^2 C_V \qquad\Rightarrow\qquad \sigma_E = \sqrt{kT^2 C_V}

The heat capacity thus reflects the magnitude of the energy fluctuations; diamond, for example, has a relatively low C_V.

To appreciate the significance of the energy fluctuation, we write

\frac{\sigma_E}{\bar{E}} = \frac{\sqrt{kT^2 C_V}}{\bar{E}}

    From thermodynamics of an ideal gas, we find


\frac{\sigma_E}{\bar{E}} = O\!\left(N^{-1/2}\right)

For N = 10^{23}, the probability that a system of the canonical ensemble departs appreciably from \bar{E} is therefore virtually zero. Essentially every system has the same energy \bar{E}, and the canonical ensemble is therefore equivalent to the microcanonical ensemble in practice.
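A one-line illustration of this estimate, using the monatomic ideal-gas results \bar{E} = (3/2)NkT and C_V = (3/2)Nk derived in Section 1.7.1 (so that \sigma_E/\bar{E} = \sqrt{2/3N}):

```python
import math

# Relative energy fluctuation for a monatomic ideal gas:
# E = (3/2) N k T and C_V = (3/2) N k, so sigma_E / E = sqrt(2 / (3N)).
for N in (10, 1_000, 1e9, 6.022e23):
    rel = math.sqrt(2.0 / (3.0 * N))
    print(f"N = {N:.3e}:  sigma_E / <E> = {rel:.3e}")
```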

Note that energy fluctuations can be quite significant in nanomaterials or nanoclusters. Also, density fluctuations become very significant near the critical temperature.

    1.4 Microcanonical Ensemble

Assume that the number of quantum states with energy E is \Omega(N, V, E). Then

p_j = \frac{e^{-E/kT}}{\sum_j e^{-E/kT}} = \frac{1}{\Omega}

where \Omega is the number of microstates with the same energy E.

From eqn (4),

S = -k\sum_j \frac{1}{\Omega}\ln\frac{1}{\Omega}

and thus we have the best-known equation in statistical mechanics:

S = k\ln\Omega

Interestingly, Boltzmann himself never gave thought to the possibility of carrying out an experiment to measure the "Boltzmann constant".


    For any isolated system whatever, the more quantum states available to the system, the higher

    the entropy.

    1.4.1 Statistical-Mechanical Basis of the Third Law

On the basis of experimental observations, T. W. Richards and W. Nernst independently found that for any isothermal process involving only pure phases in internal equilibrium,

\lim_{T\to 0} \Delta S = 0

That is,

\Delta S = k\ln\Omega_0' - k\ln\Omega_0 = 0    (*)

where \Omega_0 and \Omega_0' indicate the degeneracies of the initial and final states at 0 K, respectively.

Furthermore, a value of \Omega_0 in the range

1 \le \Omega_0 \ll e^N

would be indistinguishable from \Omega_0 = 1, since then k\ln\Omega_0 \ll Nk.

Verification: for a gas of N particles, the typical order of magnitude of the entropy is Nk (or R for one mole of particles). Even if the degeneracy of the ground state has the same order of magnitude as N, practically we can still write \Omega_0 = 1:

S(T\to 0) = k\ln\Omega_0 \approx k\ln N \ll Nk

Practically, we have S_0 \approx 0.

"The entropy of each pure element or substance in a perfect crystalline form is zero at absolute zero." (Max Planck, the third law of thermodynamics)

    1.5 Molecular Partition Function

Consider the ensemble of a single polyatomic molecule. Based on the Born-Oppenheimer approximation, the Hamiltonian can be decomposed into the various degrees of freedom:

H = H_{trans} + H_{rot} + H_{vib} + H_{elec}

That is, we assume that the variables of each degree of freedom are independent. Hence, the energy of the system is the sum of the individual energies, and the wave function is a product of the corresponding wave functions.

Q = \sum_i\sum_j\sum_k\sum_l \exp\!\left(-\frac{E_i^{trans}}{kT}\right)\exp\!\left(-\frac{E_j^{rot}}{kT}\right)\exp\!\left(-\frac{E_k^{vib}}{kT}\right)\exp\!\left(-\frac{E_l^{elec}}{kT}\right)

Consequently, we can write the molecular partition function as:

q = q_{trans}\, q_{rot}\, q_{vib}\, q_{elec}


    1.5.1 Translational Partition Function

The energy states of a particle in a three-dimensional infinite well can be used to obtain the translational partition function q_{trans}:

q_{trans} = \sum_{n_x, n_y, n_z} \exp\!\left[-\frac{h^2\left(n_x^2 + n_y^2 + n_z^2\right)}{8mV^{2/3}kT}\right] = \left[\sum_{n=1}^{\infty}\exp\!\left(-\frac{h^2 n^2}{8mV^{2/3}kT}\right)\right]^{3}

where n is a generic index representing n_x, n_y, and n_z.

The above summation cannot be expressed in terms of any simple analytic function. Fortunately, for macroscopic V and ordinary temperatures T, successive terms in the summation differ by a very small amount, and the summation can therefore be replaced by an integration:

q_{trans} = \left[\int_0^{\infty}\exp\!\left(-\frac{h^2 n^2}{8mV^{2/3}kT}\right)dn\right]^{3}

Hence,

q_{trans} = \left(\frac{2\pi m k T}{h^2}\right)^{3/2} V = \frac{V}{\Lambda^3}, \qquad \Lambda = \frac{h}{\sqrt{2\pi m k T}}

For a particle with energy \epsilon \approx kT, the de Broglie wavelength is

\lambda = \frac{h}{p} \approx \frac{h}{\sqrt{2mkT}}

and therefore \Lambda is commonly referred to as the thermal de Broglie wavelength.
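As a numerical illustration (a sketch; argon at 300 K in a one-litre box is an arbitrary choice):

```python
import math

h   = 6.62607015e-34        # J s
k_B = 1.380649e-23          # J/K
amu = 1.66053906660e-27     # kg

def thermal_wavelength(m: float, T: float) -> float:
    """Thermal de Broglie wavelength Lambda = h / sqrt(2 pi m k T)."""
    return h / math.sqrt(2.0 * math.pi * m * k_B * T)

def q_trans(m: float, T: float, V: float) -> float:
    """Translational partition function q_trans = V / Lambda^3."""
    return V / thermal_wavelength(m, T) ** 3

m_Ar, T, V = 39.948 * amu, 300.0, 1.0e-3      # argon atom, 300 K, 1 litre
print(f"Lambda  = {thermal_wavelength(m_Ar, T):.3e} m")
print(f"q_trans = {q_trans(m_Ar, T, V):.3e}")
```

The wavelength comes out far below atomic dimensions and q_trans is enormous (of order 10^29), consistent with the estimate quoted in Section 1.6.3.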

1.5.2 Electronic Partition Function

By convention, we take the zero of the electronic energy to be the separated, electronically unexcited atoms at rest.


With this convention, the energy of the molecular ground state becomes -D_0, which can be determined spectroscopically (the difference between D_0 and D_e is the zero-point energy).

Hence, the partition function reads

q_{elec} = g_0\exp\!\left(\frac{D_0}{kT}\right) + g_1\exp\!\left(-\frac{E_1}{kT}\right) + \cdots

where the degeneracies are denoted by g_i and E_1 is the energy difference between the ground state and the first excited state.

If we define a characteristic temperature for an electronic transition as

\theta_e = \frac{E_1}{k}

we would have \theta_e on the order of 10^4 K for E_1 = 1 eV.

For example, the first two levels of halogen atoms are ^2P_{3/2} and ^2P_{1/2}, with degeneracies 4 and 2; the probability of finding the atom in the first excited state is

p_1 = \frac{2\exp(-E_1/kT)}{4 + 2\exp(-E_1/kT)}

We have E_1 = 0.050 eV for fluorine and 0.94 eV for iodine. At 1000 K, we find p_1 = 0.22 and 9 \times 10^{-6} for F and I, respectively.
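These numbers are easy to reproduce (a minimal sketch; `p_excited` is our own helper):

```python
import math

k_B_eV = 8.617333262e-5      # Boltzmann constant in eV/K

def p_excited(E1_eV: float, T: float) -> float:
    """Population of the 2P_1/2 level (g = 2) relative to 2P_3/2 (g = 4)."""
    boltz = math.exp(-E1_eV / (k_B_eV * T))
    return 2.0 * boltz / (4.0 + 2.0 * boltz)

for atom, E1 in (("F", 0.050), ("I", 0.94)):
    print(f"{atom}: p1 at 1000 K = {p_excited(E1, 1000.0):.2e}")
```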

    1.6 Partition Functions of Many-Body Systems

For an ideal gas system containing N particles, it is a good approximation to write

H_{tot} = H_1 + H_2 + \cdots + H_N

There are many other problems in physics in which the Hamiltonian, by a proper and clever selection of variables, can be written as a sum of terms for individual quasi-particles, which mathematically behave like independent real particles. Examples include photons, phonons, plasmons, magnons, rotons, and other "ons."

1.6.1 Distinguishable Particles

Denote the individual energy states by \epsilon_i^a, where a labels the particle and i labels the energy state. Thus, the canonical partition function is

Q(N, V, T) = \sum_{i,j,k,\ldots} e^{-(\epsilon_i^a + \epsilon_j^b + \epsilon_k^c + \cdots)/kT} = q_a\, q_b\, q_c \cdots

where q(V, T) is the molecular partition function.

If all the particles are identical, we have

Q(N, V, T) = \left[q(V, T)\right]^N    (non-interacting and distinguishable particles)


    1.6.2 Indistinguishable Particles

    In QM, all known particles fall into two classes:

Fermions - wave function antisymmetric under the operation of interchanging two identical particles

    Bosons - wave function symmetric under the operation of interchanging two identical

    particles

Consider again the partition function

Q(N, V, T) = \sum_{i,j,k,\ldots} e^{-(\epsilon_i + \epsilon_j + \epsilon_k + \cdots)/kT}

For bosons, the terms corresponding to (\epsilon_1 + \epsilon_1 + \epsilon_3 + \cdots) and (\epsilon_1 + \epsilon_3 + \epsilon_1 + \cdots) are identical and should not be counted twice. The corresponding distribution is known as Bose-Einstein statistics.

For fermions, terms in which two or more indices are the same cannot be included in the summation. The corresponding distribution is known as Fermi-Dirac statistics.

That is, the canonical partition functions of bosons and fermions cannot be written as q^N.

    1.6.3 Boltzmann Statistics

Consider the energy states of a single particle in a three-dimensional infinite well:

\epsilon_{n_x, n_y, n_z} = \frac{h^2}{8ma^2}\left(n_x^2 + n_y^2 + n_z^2\right), \qquad n_x, n_y, n_z = 1, 2, 3, \ldots

Let us define

R^2 = \frac{8ma^2\epsilon}{h^2} = n_x^2 + n_y^2 + n_z^2

For very large R, the number of molecular quantum states with energy up to \epsilon can be well approximated by the number of lattice points in the positive octant of a sphere of radius R:

\Phi(\epsilon) \approx \frac{1}{8}\cdot\frac{4}{3}\pi R^3 = \frac{1}{8}\cdot\frac{4}{3}\pi\left(\frac{8ma^2\epsilon}{h^2}\right)^{3/2} = \frac{\pi}{6}\left(\frac{8m\epsilon}{h^2}\right)^{3/2} V

where a^3 is the volume V.

A calculation for one particle in a cube of one liter, with \epsilon of order kT at room temperature, gives an order of magnitude of 10^{30}.

    Thus, the number of molecular quantum states available to a molecule at room temperature is

    much greater than the number of molecules in the system. Alternatively, we can state that the


orbital occupancy is small in comparison with unity. This regime, which is favored by large mass, high temperature, and low density, is known as the classical limit.

In the classical limit, the number of terms in each single-particle summation is much larger than N:

q^N = \left(\sum_i e^{-\epsilon_i^a/kT}\right)\left(\sum_j e^{-\epsilon_j^b/kT}\right)\cdots \quad (N\ \text{factors, each with far more than}\ N\ \text{terms})

As an illustration of how the index combinations distribute, consider

(1 + 2 + 3)(1 + 2 + 3)(1 + 2 + 3):
(1 1 1), (2 2 2), (3 3 3), (1 2 3), (1 3 2), (2 1 3), (2 3 1), (3 1 2), (3 2 1),
(1 1 2), (1 2 1), (2 1 1), (1 1 3), (1 3 1), (3 1 1), (2 2 1), (2 1 2), (1 2 2), (2 2 3), (2 3 2), (3 2 2),
(3 3 1), (3 1 3), (1 3 3), (3 3 2), (3 2 3), (2 3 3)

(1 + 2 + 3 + 4)(1 + 2 + 3 + 4):
(1 1), (2 2), (3 3), (4 4),
(1 2), (1 3), (1 4), (2 1), (2 3), (2 4), (3 1), (3 2), (3 4), (4 1), (4 2), (4 3)

As such, q^N will be dominated by terms which have all indices different. Hence, the canonical partition function of fermions or bosons will be given identically as

Q(N, V, T) = \frac{q^N}{N!}

where we have corrected the sum by N!, since each set of N different indices appears N! times in q^N but corresponds to a single state of indistinguishable particles.

The corresponding distribution is known as the Boltzmann statistics.

To assure the validity of the Boltzmann statistics, we have required that

\frac{\pi}{6}\left(\frac{8m\epsilon}{h^2}\right)^{3/2} V \gg N

With \epsilon of order kT, we can rewrite the condition as

\frac{\pi}{6}\left(\frac{8mkT}{h^2}\right)^{3/2} V \gg N \qquad\Longleftrightarrow\qquad \left(\frac{V}{N}\right)^{1/3} \gg \Lambda    (*)

Because (V/N)^{1/3} is a distance of the order of the average nearest-neighbor distance between molecules, eqn (*) asserts that quantum effects (requiring the Bose-Einstein or Fermi-Dirac


    statistics) will be absent if neighboring molecules are far apart relative to the "thermal" de

    Broglie wavelength.
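A quick order-of-magnitude check of criterion (*) (a sketch; nitrogen at 0 °C and 1 atm is an arbitrary choice, with N/V taken from the ideal-gas law):

```python
import math

h, k_B, amu = 6.62607015e-34, 1.380649e-23, 1.66053906660e-27

def classical_limit_check(m: float, T: float, P: float) -> None:
    """Compare the mean intermolecular spacing (V/N)^(1/3) with the
    thermal de Broglie wavelength Lambda = h / sqrt(2 pi m k T)."""
    n_density = P / (k_B * T)                  # N/V from the ideal-gas law
    spacing = n_density ** (-1.0 / 3.0)
    Lam = h / math.sqrt(2.0 * math.pi * m * k_B * T)
    print(f"(V/N)^(1/3) = {spacing:.2e} m,  Lambda = {Lam:.2e} m,  "
          f"ratio = {spacing / Lam:.1f}")

classical_limit_check(28.0 * amu, 273.15, 101325.0)   # N2 gas at 0 degC, 1 atm
```

The spacing exceeds the thermal wavelength by more than two orders of magnitude, so ordinary gases are safely in the classical limit.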

    1.7 Ideal Gas

    We can safely apply the Boltzmann statistics for gaseous systems with ideal gas behavior.

    1.7.1 Ideal Monatomic Gas

We can obtain the Helmholtz energy as

A = -kT\ln Q = -kT\ln\frac{\left(q_{trans}\, q_{elec}\right)^N}{N!} = -kT\left(N\ln q_{trans} + N\ln q_{elec} - N\ln N + N\right)

For brevity, we assume that q_{elec} = 1 (i.e., no electronic excitation and no degeneracy):

A = -NkT\left(\ln\frac{q_{trans}}{N} + 1\right)

so that

\ln Q = N\ln\left[\left(\frac{2\pi m k}{h^2}\right)^{3/2}\frac{e}{N}\right] + \frac{3N}{2}\ln T + N\ln V

Thus, for one mole of gas particles:

P = -\left(\frac{\partial A}{\partial V}\right)_{T,N} = kT\left(\frac{\partial \ln Q}{\partial V}\right)_{T,N} = \frac{NkT}{V} \qquad\Rightarrow\qquad PV = RT

U = kT^2\left(\frac{\partial \ln Q}{\partial T}\right)_{N,V} = kT^2\cdot\frac{3N}{2T} = \frac{3}{2}NkT = \frac{3}{2}RT

C_V = \frac{3}{2}R
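These results can also be verified numerically from ln Q alone (a sketch; the particle number N = 1000 and the argon mass are arbitrary choices):

```python
import math

h, k_B, amu = 6.62607015e-34, 1.380649e-23, 1.66053906660e-27
m, N = 39.948 * amu, 1000              # argon atoms (a small N keeps numbers tame)

def ln_Q(T: float, V: float) -> float:
    """ln Q = N ln(q_trans/N) + N for an ideal monatomic gas (Stirling form)."""
    q = (2.0 * math.pi * m * k_B * T / h**2) ** 1.5 * V
    return N * math.log(q / N) + N

T, V, dT, dV = 300.0, 1.0e-3, 1e-3, 1e-9
P = k_B * T * (ln_Q(T, V + dV) - ln_Q(T, V - dV)) / (2 * dV)     # P = kT (d lnQ / dV)
U = k_B * T**2 * (ln_Q(T + dT, V) - ln_Q(T - dT, V)) / (2 * dT)  # U = kT^2 (d lnQ / dT)

print(f"P = {P:.4e} Pa   (N k T / V = {N * k_B * T / V:.4e} Pa)")
print(f"U = {U:.4e} J    (3/2 N k T = {1.5 * N * k_B * T:.4e} J)")
```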

    1.7.2 Chemical Equilibrium in Ideal Gas Mixture

Consider an ideal gas mixture made up of N_X molecules of type X and N_Y molecules of type Y in a closed container with fixed V and T. We assume no chemical reaction takes place between X and Y.

The canonical ensemble partition function for this binary system is

Q(N_X, N_Y, V, T) = \frac{q_X^{N_X}}{N_X!}\,\frac{q_Y^{N_Y}}{N_Y!}

and the Helmholtz free energy is

A = -kT\ln Q

Suppose a catalyst is added, making possible the reaction

X \rightleftharpoons 2Y


Because the system is closed and has constant V and T, the second law states that the Helmholtz free energy of the system will decrease as the reaction proceeds. At equilibrium, A will be at a minimum, i.e., Q will be at a maximum.

For convenience, we note that each X contains two Y, so that

2N_X + N_Y = N

where N is a constant.

Consequently (using the Stirling approximation and N_Y = N - 2N_X), we obtain

\ln Q = N_X\ln q_X + (N - 2N_X)\ln q_Y - N_X\ln N_X + N_X - (N - 2N_X)\ln(N - 2N_X) + (N - 2N_X)

Hence,

\left(\frac{\partial \ln Q}{\partial N_X}\right)_{N,V,T} = 0 \qquad\Rightarrow\qquad \frac{N_Y^2}{N_X} = \frac{q_Y^2}{q_X}

Knowing that

q_{trans} = \left(\frac{2\pi m k T}{h^2}\right)^{3/2} V

one can remove the dependence on V by considering the concentrations:

K_{eq} \equiv \frac{[Y]^2}{[X]} = \frac{(N_Y/V)^2}{N_X/V} = \frac{(q_Y/V)^2}{q_X/V} = \left(\frac{2\pi k T}{h^2}\right)^{3/2}\frac{m_Y^3}{m_X^{3/2}}

That is, K_{eq} is a function of temperature only.

It is also an important point to note that

K_{eq} = \frac{(q_Y/V)^2}{q_X/V}
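As an illustration, the sketch below solves for the equilibrium composition of a hypothetical dissociation X ⇌ 2Y using translational partition functions only (the masses, temperature, volume, and total amount N are all arbitrary choices):

```python
import math

h, k_B, amu = 6.62607015e-34, 1.380649e-23, 1.66053906660e-27

def q_over_V(m: float, T: float) -> float:
    """Translational partition function per unit volume, q_trans/V."""
    return (2.0 * math.pi * m * k_B * T / h**2) ** 1.5

# Hypothetical dissociation X <-> 2Y with m_X = 2 m_Y (translational parts only).
m_Y = 10.0 * amu
m_X = 2.0 * m_Y
T, V, N = 300.0, 1.0e-3, 1.0e28          # temperature, volume, total number of Y units

K = q_over_V(m_Y, T) ** 2 / q_over_V(m_X, T)     # K_eq = [Y]^2/[X], units m^-3

# Solve N_Y^2 / N_X = q_Y^2/q_X = K V with 2 N_X + N_Y = N  (quadratic in N_Y).
a, b, c = 1.0, 0.5 * K * V, -0.5 * K * V * N
N_Y = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
N_X = 0.5 * (N - N_Y)
print(f"K_eq = {K:.3e} m^-3,  N_Y = {N_Y:.3e},  N_X = {N_X:.3e}")
print(f"check: (N_Y^2)/(N_X V) = {N_Y**2 / (N_X * V):.3e}  vs  K_eq = {K:.3e}")
```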

    1.8 Principle of Detailed Balance

    1.8.1 State Functions in Thermodynamics

Consider the following cyclic reaction scheme, in which species A, B, and C interconvert (A ⇌ B, B ⇌ C, C ⇌ A).


In principle, we could reach a steady state

\frac{d[A]}{dt} = 0

by having only

\frac{d[A]}{dt} = -k_1[A] + k_2[C] = 0

i.e., by balancing the loss of A around one branch of the cycle against the gain of A from the other. But then, by adding a catalyst, we could affect "[A]_{eq}". Thus, the steady state d[A]/dt = 0 cannot by itself lead to true equilibrium.

According to the principle of detailed balance, true equilibrium can be obtained only by having balance in every individual step:

K_1 = \frac{[B]}{[A]}, \qquad K_3 = \frac{[C]}{[B]}, \qquad K_2 = \frac{[A]}{[C]}

That is,

K_1 K_2 K_3 = 1

RT\left(\ln K_1 + \ln K_2 + \ln K_3\right) = 0

RT\ln K_1 + RT\ln K_2 + RT\ln K_3 = 0

As discussed in thermodynamics,

\Delta G_1^0 = -RT\ln K_1, \qquad \Delta G_2^0 = -RT\ln K_2, \qquad \Delta G_3^0 = -RT\ln K_3

Thus,

\Delta G_1^0 + \Delta G_2^0 + \Delta G_3^0 = 0

i.e., the Gibbs free energy is a state function.

1.8.2 Boltzmann Distribution

A chemical species is a kinetic collection of the microscopic states of a molecular system.

For the chemical reaction

A \rightleftharpoons B

we use the running index i to label all the energy levels belonging to A, and j to label those belonging to B. Let n_i be the population per unit volume at energy level i, and so on.

At equilibrium, the forward rate of each step is equal to the reverse rate of that step (principle of detailed balance):

k_{i\to j}\, n_i = k_{j\to i}\, n_j

Dividing both sides by the total population n_t:

k_{i\to j}\,\frac{n_i}{n_t} = k_{j\to i}\,\frac{n_j}{n_t}


k_{i\to j}\, p_i = k_{j\to i}\, p_j

where p_i and p_j are the probabilities of finding the molecule at levels i and j, respectively. That is,

\frac{p_j}{p_i} = \frac{k_{i\to j}}{k_{j\to i}}

Because the quotient of the rate constants represents a kind of equilibrium constant, in analogy to the relation between \Delta G and the equilibrium constant we can write

\frac{p_j}{p_i} = e^{-(\epsilon_j - \epsilon_i)/kT}

Thus, we obtain

p_i = \frac{1}{Q}\, e^{-\epsilon_i/kT}

From the normalization \sum_i p_i = 1, we have

Q = \sum_i e^{-\epsilon_i/kT}

We have obtained the Boltzmann distribution from the principle of detailed balance. (Courtesy of C. Y. Mou)
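The converse can be illustrated numerically: in the sketch below (our own construction, with arbitrary level energies and kinetic prefactors), every pair of rates is built to satisfy detailed balance, k_{i->j}/k_{j->i} = exp[-(e_j - e_i)/kT], and integrating the master equation drives the populations to the Boltzmann distribution from any starting point:

```python
import numpy as np

k_B_T = 1.0                                   # work in units of kT
eps = np.array([0.0, 0.4, 1.0, 1.7])          # arbitrary level energies (units of kT)
n_levels = len(eps)

# Build rates obeying detailed balance: k[i, j] is the rate for i -> j.
rng = np.random.default_rng(0)
k = np.zeros((n_levels, n_levels))
for i in range(n_levels):
    for j in range(i + 1, n_levels):
        base = rng.uniform(0.5, 2.0)          # arbitrary kinetic prefactor
        k[i, j] = base * np.exp(-(eps[j] - eps[i]) / (2 * k_B_T))
        k[j, i] = base * np.exp(-(eps[i] - eps[j]) / (2 * k_B_T))

# Master equation dp_i/dt = sum_j (k[j, i] p_j - k[i, j] p_i), integrated by Euler steps.
p = np.array([1.0, 0.0, 0.0, 0.0])            # start with all molecules in level 0
dt = 0.01
for _ in range(20000):
    flow_in = k.T @ p                          # sum_j k[j, i] p_j
    flow_out = k.sum(axis=1) * p               # sum_j k[i, j] p_i
    p = p + dt * (flow_in - flow_out)

boltz = np.exp(-eps / k_B_T)
boltz /= boltz.sum()
print("steady-state p :", np.round(p, 4))
print("Boltzmann p    :", np.round(boltz, 4))
```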

1.8.3 Statistical Effects on Chemical Equilibrium

Consider the following isomerization reaction:

A \rightleftharpoons B

Suppose that some of the energy levels of A are lower than those of B, but that the levels of B are more closely spaced.

In statistical mechanics, we can take the point of view that a given molecule has accessible to it the full set of energy states of A + B, with the partition function


q = q_A + q_B = \sum_i^{A}\exp\!\left(-\frac{\epsilon_i^A}{kT}\right) + \sum_j^{B}\exp\!\left(-\frac{\epsilon_j^B}{kT}\right)

There is a single Boltzmann distribution of molecules among all the levels of A + B. Therefore, the fraction of molecules in all levels belonging to the subgroup A is

\frac{N_A}{N_A + N_B} = \frac{\sum_i^{A}\exp(-\epsilon_i^A/kT)}{q} = \frac{q_A}{q_A + q_B}

At equilibrium, we have

K_c = \frac{N_B}{N_A} = \frac{q_B}{q_A} = \frac{\sum_j^{B}\exp(-\epsilon_j^B/kT)}{\sum_i^{A}\exp(-\epsilon_i^A/kT)}

The equilibrium constant can thus be understood as a ratio of the accessible states of the products and the reactants.
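This statistical effect is easy to see numerically (a toy sketch with hypothetical, truncated level ladders: A starts lower but is widely spaced, B starts higher but is dense):

```python
import numpy as np

def q_sum(levels: np.ndarray, kT: float) -> float:
    """Partition function as a direct sum over a finite set of levels."""
    return float(np.sum(np.exp(-levels / kT)))

# Hypothetical level ladders (arbitrary energy units):
eps_A = 0.0 + 1.0 * np.arange(20)       # A: 0.0, 1.0, 2.0, ...  (low but sparse)
eps_B = 0.5 + 0.1 * np.arange(200)      # B: 0.5, 0.6, 0.7, ...  (higher but dense)

for kT in (0.05, 0.2, 1.0, 5.0):
    K_c = q_sum(eps_B, kT) / q_sum(eps_A, kT)
    print(f"kT = {kT:4.2f}:  K_c = N_B/N_A = {K_c:.3f}")
```

At low kT the lower levels of A dominate (K_c << 1), while at higher kT the denser manifold of B wins (K_c > 1), illustrating how level spacing, not just level energy, controls the equilibrium constant.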