Entropy Definition


  • 8/9/2019 Entropy Definition

    1/28

    Lecture 5 Entropy

    James Chou

    BCMP201 Spring 2008

2/28

Some common definitions of entropy

• A measure of the amount of energy in a system that is unavailable for doing work; entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity.

• A measure of the disorder of a system.

The differential of heat, dQ, is not an exact differential and therefore cannot be integrated. We therefore introduce an integrating factor (1/T) such that dQ/T can be integrated; this dQ/T is called entropy.

3/28

4/28

An interesting observation

[Figure: histograms of v_x between −V_x and V_x, before and after random collisions randomly distribute the kinetic energy; the distribution becomes a Gaussian centered at ⟨v_x⟩ = 0.]

Random distribution of kinetic energy through random collisions. Consider the velocity along one axis, v_x:

f(v_x) = (1/√(2πσ²)) exp(−v_x²/(2σ²))

σ² = ⟨(v_x − ⟨v_x⟩)²⟩,   σ = √⟨v_x²⟩ = √(k_B T/m)

5/28

    Goal of this lecture:

Use the fundamental principle of maximum entropy to explain the physical properties of a complex system in equilibrium with the universe.

6/28

    Maximum Entropy = Minimum Bias

7/28

Example: Rolling a die with 6 possible outcomes.

The only constraint we have is

P(X = 1) + P(X = 2) + ... + P(X = 6) = 1

Without additional information about the die, the most unbiased distribution is such that all outcomes are equally probable:

P(X = 1) = P(X = 2) = ... = P(X = 6) = 1/6

Principle of Maximum Entropy in Statistics

Given some information or constraints about a random variable, we should choose the probability distribution for it which is consistent with the given information but otherwise has maximum uncertainty associated with it.

8/28

Shannon’s Measure of Uncertainty

Shannon [1948] suggested the following measure of uncertainty, which is commonly known as the statistical entropy:

H = −∑_{i=1}^{N} p_i ln p_i

1. H is a non-negative function of p_1, p_2, ..., p_N.
2. H = 0 if one outcome has probability of 1.
3. H is maximum when the outcomes are equally likely.

In the case of the die, you will find the maximum entropy to be

H = −∑_{i=1}^{6} p_i ln p_i = ln 6.

9/28

A quick review on logarithms

log_e x = ln x,   e ≈ 2.718

ln(AB) = ln A + ln B,   ln(A/B) = ln A − ln B,   d/dx ln x = 1/x

Stirling approximation: ln(N!) ≈ N ln N − N, for very large N
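It is worth seeing how quickly Stirling's approximation becomes accurate (a sketch; `math.lgamma(N + 1)` gives the exact ln(N!)):

```python
import math

def stirling_rel_err(N):
    """Relative error of ln(N!) ≈ N ln N - N against the exact value."""
    exact = math.lgamma(N + 1)          # ln(N!)
    approx = N * math.log(N) - N
    return abs(exact - approx) / exact

for N in (10, 100, 10_000):
    print(N, stirling_rel_err(N))  # error shrinks as N grows
```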

10/28

Shannon’s entropy in terms of the number of possible outcomes.

Example: the number of outcomes Ω from rolling the die N times:

Ω = N! / [(Np_1)! (Np_2)! ... (Np_6)!]

(N! counts the permutations of the N rolls; the factorials in the denominator factor out the redundant outcomes.)

ln Ω = ln N! − ∑_{i=1}^{6} ln (Np_i)!

Using Stirling’s approximation for very large N, ln N! ≈ N ln N − N, ln Ω becomes

ln Ω = N ln N − ∑_{i=1}^{6} Np_i ln(Np_i)
     = N ln N − ln N ∑_{i=1}^{6} Np_i − N ∑_{i=1}^{6} p_i ln p_i
     = −N ∑_{i=1}^{6} p_i ln p_i
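The Stirling step above can be checked numerically (a sketch; the biased distribution `p` is an arbitrary illustration): the exact ln Ω divided by N H approaches 1 as N grows.

```python
import math

def ln_omega(N, p):
    """ln of the multinomial count N! / prod_i (N p_i)!; N p_i assumed integer."""
    counts = [round(N * pi) for pi in p]
    return math.lgamma(N + 1) - sum(math.lgamma(c + 1) for c in counts)

def H(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.1, 0.1, 0.2, 0.2, 0.2, 0.2]  # an arbitrary die bias
for N in (100, 10_000, 1_000_000):
    print(N, ln_omega(N, p) / (N * H(p)))  # ratio approaches 1
```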

11/28

    ln! = " N pi ln  pi   =  NH i=1

    6

    #

     H    =  Const !  ln"

    Conclusion: ln!  is linearly proportional to H . Therefore, maximizing thetotal number of possible outcomes is equivalent to maximizing Shannon’sstatistical entropy.

    Statistical Entropy # of possible outcomes

12/28

Entropy in Statistical Physics

Definition of physical entropy:

S = const × ln Ω,   Ω = # of possible microstates of a closed system.

A microstate is the detailed state of a physical system.

Example: In an ideal gas, a microstate consists of the position and velocity of every molecule in the system. So the number of microstates is just what Feynman said: the number of different ways the inside of the system can be changed without changing the outside.

Principle of maximum entropy (the second law of thermodynamics)

If a closed system is not in a state of statistical equilibrium, its macroscopic state will vary in time, until ultimately the system reaches a state of maximum entropy. Moreover, at equilibrium, all microstates are equally probable.

13/28

S = Const × ln(# of velocity states × # of position states)

# of velocity states does not change; # of position states does change:

ΔS = S_2 − S_1 = Const × [ln Ω_2^v + ln Ω_2^r − ln Ω_1^v − ln Ω_1^r]

An example of maximizing entropy: a gas expanding to fill twice its original volume,

ΔS = Const × [ln (2V)^N − ln V^N] = Const × N ln 2
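Putting numbers into the expansion example (a sketch; taking the constant to be Boltzmann's constant k_B, in the physical units the lecture introduces shortly), one mole of gas doubling its volume gains:

```python
import math

k_B = 1.380649e-23   # J/K, Boltzmann's constant
N_A = 6.02214076e23  # molecules per mole

# dS = k_B N ln 2 for N = one mole of molecules
dS = k_B * N_A * math.log(2)
print(dS)  # ≈ 5.76 J/K (= R ln 2)
```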

14/28

What is temperature?

[Figure: two bodies with energies E_1 and E_2 in thermal contact; "Not in Equilibrium" vs "Equilibrium". Picture from hyperphysics.phy-astr.gsu.edu]

E = E_1 + E_2 = const,   dE_1 = −dE_2

S = Const × ln(Ω_1 Ω_2) = S_1(E_1) + S_2(E_2)

Maximize S:

dS/dE_1 = dS_1/dE_1 + (dS_2/dE_2)(dE_2/dE_1) = dS_1/dE_1 − dS_2/dE_2 = 0

At equilibrium,

dS_1/dE_1 = dS_2/dE_2

Temperature T is defined as 1/T = dS/dE. The temperatures of bodies in equilibrium with one another are equal.

15/28

Since T is measured at a fixed number of particles N and volume V, a more stringent definition is

T = (dE/dS)_{N,V}.

Thus far, S is defined to be const × ln Ω. If S is a dimensionless quantity, T has the dimensions of energy (e.g. in units of Joules (J)).

But J is too large a unit. Example: room temperature = 404.34 × 10⁻²³ J!

What is the physical unit of T?

It is more convenient to measure T in degrees Kelvin (K). The conversion factor between energy and degrees is Boltzmann’s constant, k_B = 1.38 × 10⁻²³ J/K. Hence we redefine S and T by incorporating the conversion factor:

S = k_B ln Ω   and   T → T/k_B.

16/28

What does T = (dE/dS)_{N,V} mean?

Same change in entropy, but more energy is given away by the system initially with higher T. Hence temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings.

[Figure: two five-particle systems with energy levels e, 2e, 3e, one at lower T and one at higher T.]

Lower T:  S_1 = k_B ln(5!/(2!2!)),  S_2 = k_B ln(5!/(3!2!));  ΔE = e gives ΔS = k_B ln 3

Higher T: S_1 = k_B ln(5!/(2!2!)),  S_2 = k_B ln(5!/(3!2!));  ΔE = 3e gives ΔS = k_B ln 3
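The factorial counts on this slide can be verified directly (a sketch): the two multiplicities are 30 and 10, so the entropy change is k_B ln 3 in both systems even though the energy given up differs.

```python
import math

omega_1 = math.factorial(5) // (math.factorial(2) * math.factorial(2))  # 5!/(2!2!)
omega_2 = math.factorial(5) // (math.factorial(3) * math.factorial(2))  # 5!/(3!2!)

dS_over_kB = abs(math.log(omega_2) - math.log(omega_1))
print(omega_1, omega_2, dS_over_kB)  # 30, 10, and ln 3
```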

17/28

Can we derive the equation of state of a gas (PV = nRT) from the concept of entropy?

Ω = Ω_v × Ω_r,   Ω_v = # of velocity states,   Ω_r = # of position states

Step 1: Evaluate S = k_B ln Ω.

One particle: Ω_v ∝ 4πv², where v is the speed in 3D.

N particles: Ω_v ∝ 4πv^{3N−1} ≈ 4πv^{3N} for large N.

Since v ∝ E^{1/2},

S = k_B ln Ω = (3k_B N/2) ln E + Const.

18/28

Step 2: Relate kinetic energy E to temperature.

S = k_B ln Ω = (3k_B N/2) ln E + Const.

1/T = dS/dE = 3k_B N/(2E)   ⇒   E = (3/2) k_B N T = (3/2) k_B (nN_0) T

n = # of moles, N_0 = 6.02 × 10²³ mol⁻¹ (Avogadro’s number)

R = k_B N_0 = 8.314 J mol⁻¹ K⁻¹ (the gas constant)

Energy of n moles of ideal gas: E = (3/2) nRT.

Energy of one ideal gas molecule: E = (3/2) k_B T.
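Plugging in numbers (a sketch; 300 K is an assumed round value for room temperature):

```python
R = 8.314        # J mol^-1 K^-1, gas constant
k_B = 1.381e-23  # J/K, Boltzmann's constant

T = 300.0                        # K
E_per_mole = 1.5 * R * T         # E = (3/2) nRT with n = 1 mol
E_per_molecule = 1.5 * k_B * T   # E = (3/2) k_B T
print(E_per_mole, E_per_molecule)  # ≈ 3.7 kJ and ≈ 6.2e-21 J
```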

19/28

Step 3: Relate temperature to pressure.

For a particle in a box, each collision with a wall occurs in a time interval of 2L/v_x, and the change in momentum (e.g. in the x direction), Δp_x, is 2mv_x.

F = Δp_x/Δt = 2mv_x/(2L/v_x) = mv_x²/L.

For N particles in a box,

F = ∑_{i=1}^{N} mv_{x,i}²/L = Nm⟨v_x²⟩/L.

Since

P = F/L² = Nm⟨v_x²⟩/V   and   E = N(1/2)m⟨v²⟩ = N(3/2)m⟨v_x²⟩   (⟨v²⟩ = ⟨v_x²⟩ + ⟨v_y²⟩ + ⟨v_z²⟩),

we obtain PV = (2/3)E.

Finally, since E = (3/2)nRT, we obtain PV = nRT.
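A small Monte Carlo check of Step 3 (a sketch; the nitrogen-like molecular mass, particle count, and box volume are assumptions for illustration): sample v_x from the equilibrium Gaussian with variance k_B T/m and compare P = Nm⟨v_x²⟩/V against the ideal-gas value N k_B T/V.

```python
import math
import random

k_B = 1.380649e-23  # J/K
m = 4.65e-26        # kg, roughly one N2 molecule (assumed)
T = 300.0           # K
N = 1_000           # particles (assumed)
V = 1e-6            # m^3 box volume (assumed)

random.seed(0)
sigma = math.sqrt(k_B * T / m)   # equilibrium spread of v_x
n_samples = 200_000
mean_vx2 = sum(random.gauss(0.0, sigma) ** 2 for _ in range(n_samples)) / n_samples

P_kinetic = N * m * mean_vx2 / V  # P = N m <v_x^2> / V from the wall collisions
P_ideal = N * k_B * T / V         # PV = N k_B T
print(P_kinetic / P_ideal)        # ≈ 1
```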

20/28

    How do we deal with the enormous complexity of a biological system?

21/28

Boltzmann and Gibbs Distribution

Goal: Describe the probability distribution of a molecule of interest, with energy E_a, in equilibrium with a macroscopic thermal reservoir with energy E_B.

[Figure: molecule of interest, a, inside its surroundings, B.]

The second law says that at equilibrium, or maximum entropy, all microstates are equally probable, with a probability P_0.

In the joint system, the probability of the molecule of interest being in a particular state, E_a, is

p(E_a) = Ω_B(E_B) × P_0 = exp(S_B(E_B)/k_B) × P_0,   where S_B(E_B) = k_B ln Ω_B(E_B).

22/28

We can use the first-order Taylor expansion to approximate S_B(E_B), because E_B is very near E_tot (E_tot = E_B + E_a ≈ E_B):

S_B(E_B) ≈ S_B(E_tot) − [dS_B(E_tot)/dE_B] E_a = S_B(E_tot) − (1/T) E_a

[Figure: sketch of S(E_B) versus E_B, with the tangent at E_tot and the small offset E_a marked.]

Hence we obtain

p(E_a) = P_0 exp(S_B(E_tot)/k_B) exp(−E_a/(k_B T)) = A exp(−E_a/(k_B T))

Boltzmann Distribution, also known as the Gibbs Distribution

IMPORTANT: The probability distribution of the molecule of interest in equilibrium with its surroundings depends only on the temperature of the surroundings.
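A minimal sketch of using the result (the three energy levels are hypothetical): the relative populations of the molecule's states depend only on E_a and the temperature of the surroundings.

```python
import math

k_B = 1.380649e-23  # J/K
T = 300.0           # K

# Hypothetical molecular states at 0, 1, and 2 units of k_B T:
energies = [0.0, 1.0 * k_B * T, 2.0 * k_B * T]
weights = [math.exp(-E / (k_B * T)) for E in energies]
Z = sum(weights)                   # normalization (partition function)
probs = [w / Z for w in weights]
print(probs)  # each level is down-weighted by a factor of e
```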

23/28

Now we can explain the velocity distribution at equilibrium (the random distribution of kinetic energy through random collisions) using the Boltzmann distribution:

P = A exp(−E/(k_B T)) = A exp(−(1/2)mv²/(k_B T)) = A exp(−v²/(2(k_B T/m)))

This is exactly the Gaussian from before,

f(v_x) = (1/√(2πσ²)) exp(−v_x²/(2σ²)),   σ = √⟨v_x²⟩ = √(k_B T/m).
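A consistency check (a sketch; the nitrogen-like mass is an assumption): the Boltzmann factor exp(−½mv_x²/(k_B T)) reproduces the Gaussian with σ² = k_B T/m, up to normalization.

```python
import math

k_B = 1.380649e-23  # J/K
m = 4.65e-26        # kg, roughly one N2 molecule (assumed)
T = 300.0           # K
sigma2 = k_B * T / m

def gaussian_ratio(vx):
    """f(vx)/f(0) for the Gaussian with variance sigma^2 = k_B T / m."""
    return math.exp(-vx ** 2 / (2 * sigma2))

def boltzmann_ratio(vx):
    """exp(-E/(k_B T)) relative to vx = 0, with E = (1/2) m vx^2."""
    return math.exp(-0.5 * m * vx ** 2 / (k_B * T))

for vx in (0.0, 100.0, 300.0, 500.0):  # m/s
    print(vx, gaussian_ratio(vx), boltzmann_ratio(vx))  # identical columns
```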

24/28

Example on rate constant and transition state

A → B (rate constant k)

[Figure: reaction coordinate diagram showing A, B, the transition state, the activation energy E_a, and the overall energy change ΔE.]

The reaction rate constant, k [s⁻¹], is proportional to exp(−E_a/RT), where E_a is the activation energy, or energy barrier, in units of J mol⁻¹:

k = A exp(−E_a/RT)   (Arrhenius equation)

25/28

Suppose E_a of a reaction is 100 kJ mol⁻¹ and a catalyst lowers this to 80 kJ mol⁻¹. Approximately how much faster will the reaction proceed with the catalyst?

k(catalyzed)/k(uncatalyzed) = exp(−80/RT)/exp(−100/RT) = e^{20/RT} ≈ e⁸ ≈ 3000, since RT ≈ 2.5 kJ mol⁻¹ at room temperature.

High energy barriers result in high specificity in a cellular signaling pathway.
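The same estimate in code (a sketch; 298 K is the assumed room temperature):

```python
import math

R = 8.314   # J mol^-1 K^-1, gas constant
T = 298.0   # K, room temperature (assumed)

Ea_uncat = 100e3  # J/mol
Ea_cat = 80e3     # J/mol

# Arrhenius ratio: prefactor A cancels, leaving exp(dEa / RT) ≈ e^8
speedup = math.exp((Ea_uncat - Ea_cat) / (R * T))
print(speedup)  # ≈ 3e3
```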

26/28

The role of the prion conformational switch in neurodegenerative diseases

[Figure: PrP^C converting to PrP^Sc over a barrier ΔE‡ ≈ 40 kcal mol⁻¹ = 167.4 kJ mol⁻¹, catalyzed by either mutation or binding of PrP^Sc.]

27/28

What about a low energy barrier?

The temperature-gated vanilloid receptor VR1, a pain receptor, is activated by heat (T > 45 °C).

[Figure: reaction coordinate diagram with closed state A, open state B, barrier E_a, and energy difference ΔE.]

It was found experimentally that the probability of VR1 being in the open state is 0.04 at 40 °C and 0.98 at 50 °C. What is the energy barrier?
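One way to attack the problem (a sketch, not the lecture's worked solution, assuming a simple two-state open/closed equilibrium): treat p_open/(1 − p_open) as a Boltzmann ratio at each temperature and solve for the energy difference.

```python
import math

R = 8.314               # J mol^-1 K^-1
T1, p1 = 313.15, 0.04   # 40 °C
T2, p2 = 323.15, 0.98   # 50 °C

K1 = p1 / (1 - p1)      # open/closed ratio at T1
K2 = p2 / (1 - p2)      # open/closed ratio at T2

# Two-state (van 't Hoff-style) estimate: ln(K2/K1) = (dE/R)(1/T1 - 1/T2)
dE = R * math.log(K2 / K1) / (1 / T1 - 1 / T2)
print(dE / 1000)  # kJ/mol: several hundred, an unusually steep T dependence
```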

28/28

Take home messages

T = a measure of the tendency of an object to spontaneously give up energy to its surroundings:

T = (dE/dS)_{N,V}

The Boltzmann & Gibbs distribution:

p(E_a) = A exp(−E_a/(k_B T))

Equilibrium = a system reaching a state of maximum entropy.
Equilibrium = all microstates are equally probable.

S = k_B ln Ω