
Lecture 3

Lecture 3. Entropy in statistical mechanics. Thermodynamic contacts: mechanical contact, heat contact, diffusion contact. Equilibrium. Chemical potential. The main distributions in statistical mechanics. A system in the canonical ensemble. Thermostat.


Presentation Transcript


  1. Lecture 3 • Entropy in statistical mechanics. • Thermodynamic contacts: mechanical contact, heat contact, diffusion contact. • Equilibrium. • Chemical potential. • The main distributions in statistical mechanics. • A system in the canonical ensemble. Thermostat. "Ice melting" is the classic example of entropy increase, described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.

  2. Entropy in Statistical Mechanics • From the principles of thermodynamics we learn that the thermodynamic entropy S has the following important properties: • dS is an exact differential and is equal to δQ/T for a reversible process, where δQ is the quantity of heat added to the system. • Entropy is additive: S = S1 + S2. The entropy of a combined system is the sum of the entropies of its two separate parts. • ΔS ≥ 0. If the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.
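
To make the first property concrete, here is a quick numerical sketch in Python of the ice-melting example from the title slide; the latent heat and melting temperature are standard handbook values, and the one-gram mass is just an illustrative choice:

```python
# Entropy gained when 1 g of ice melts reversibly at its melting point,
# using dS = deltaQ / T for a reversible process.
# Illustrative values: latent heat of fusion ~334 J/g, T = 273.15 K.
L_fusion = 334.0      # J per gram (approximate handbook value)
T_melt = 273.15       # K
mass = 1.0            # g (illustrative)

delta_Q = mass * L_fusion      # heat absorbed by the ice, J
delta_S = delta_Q / T_melt     # entropy change of the ice, J/K

print(f"dQ = {delta_Q:.1f} J, dS = {delta_S:.3f} J/K")  # ~1.22 J/K per gram
```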

  3. State Function This statement means that entropy is a state function, in that the value of the entropy does not depend on the past history of the system but only on the actual state of the system. One of the great accomplishments of statistical mechanics is to give us a physical picture of entropy. The entropy σ of a system (in classical statistical physics) in statistical equilibrium can be defined as σ = ln Γ (3.1) where Γ is the volume of phase space accessible to the system, i.e., the volume corresponding to energies between E − Δ and E + Δ.

  4. Let us show first that changes in the entropy are independent of the system of units used to measure Γ. As Γ is a volume in the phase space of N point particles, it has dimensions (Momentum × Length)^3N = (Action)^3N (3.2) Let D denote the unit of action; then Γ/D^3N is dimensionless. If we were to define σ = ln(Γ/D^3N) (3.3) we see that for changes Δσ = Δ ln Γ (3.4)

  5. independent of the system of units. Planck's constant ħ is a natural unit of action in phase space. It is obvious that the entropy σ, as defined by (3.1), has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. Once the ensemble is specified in terms of its spread in phase space, the entropy is known. We see that if Γ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy σ is also to be interpreted as a measure of the imprecision or randomness.
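
A small sketch of the point made in (3.3)-(3.4): recomputing σ with two different (arbitrarily chosen) units of action D shifts σ by a constant but leaves the change σ2 − σ1 untouched. All phase-space volumes here are toy values:

```python
import math

# sigma = ln(Gamma / D**(3*N)) shifts by a constant when the unit of
# action D changes, so differences sigma_2 - sigma_1 are unchanged.
N = 2                                # number of point particles (toy value)
Gamma_1, Gamma_2 = 1.0e10, 3.0e12    # two accessible phase-space volumes (arbitrary)

for D in (1.0, 6.626e-34):           # two different units of action
    s1 = math.log(Gamma_1 / D**(3 * N))
    s2 = math.log(Gamma_2 / D**(3 * N))
    print(f"D = {D:g}: sigma_2 - sigma_1 = {s2 - s1:.4f}")
# Both printouts give ln(Gamma_2/Gamma_1) = ln(300) ~ 5.7038
```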

  6. Entropy is additive It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N1 particles and the other with N2 particles. Then N = N1 + N2 (3.5) and the phase space of the combined system is the product space of the phase spaces of the individual parts: Γ = Γ1Γ2 (3.6) The additive property of the entropy follows directly: σ = ln Γ = ln Γ1Γ2 = ln Γ1 + ln Γ2 = σ1 + σ2 (3.7)
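
The product rule (3.6)-(3.7) can be checked numerically in a couple of lines; the two phase-space volumes below are arbitrary toy numbers:

```python
import math

# For a product phase space Gamma = Gamma1 * Gamma2 the entropies add.
Gamma1, Gamma2 = 2.5e8, 4.0e6
sigma1, sigma2 = math.log(Gamma1), math.log(Gamma2)
sigma_total = math.log(Gamma1 * Gamma2)
print(sigma_total, sigma1 + sigma2)   # identical up to floating-point rounding
```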

  7. Thermodynamic contacts between two systems We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy is a maximum when a closed system is in the equilibrium condition. The value of σ for a system in equilibrium will depend on the energy E (≡ ⟨E⟩) of the system; on the number Ni of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.

  8. Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other. Fig. 3.1: two subsystems separated by a barrier.

  9. Thermal contact In thermal contact the systems can exchange energy; in equilibrium there is no net flow of energy between the systems. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium. In thermal equilibrium the entropy of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,

  10. σ = σ1(E1) + σ2(E2), we have in equilibrium δσ = δσ1 + δσ2 = 0 (3.8) or δσ = (∂σ1/∂E1)δE1 + (∂σ2/∂E2)δE2 = 0 (3.9) We know, however, that δE1 = −δE2 (3.10) as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus [(∂σ1/∂E1) − (∂σ2/∂E2)]δE1 = 0 (3.11)

  11. As δE1 was an arbitrary variation, we must have ∂σ1/∂E1 = ∂σ2/∂E2 (3.12) in thermal equilibrium. If we define a quantity τ by 1/τ = ∂σ/∂E (3.13) then in thermal equilibrium τ1 = τ2 (3.14) Here τ is known as the temperature and will be shown later to be related to the absolute temperature T by τ = kT, where k is the Boltzmann constant, 1.380×10⁻²³ J/K.
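
A toy numerical sketch of this equilibrium condition: take two ideal-gas-like subsystems with σi(Ei) = (3Ni/2) ln Ei (an assumed model form, constants dropped), share a fixed total energy between them, and observe that the entropy maximum lands where the two temperatures defined by (3.13) coincide:

```python
import numpy as np

# Two subsystems exchange energy at fixed E_total; the total entropy
# sigma1(E1) + sigma2(E_total - E1) peaks where 1/tau1 = 1/tau2.
# For sigma = (3N/2) ln E we get 1/tau = 3N/(2E), i.e. tau = 2E/(3N).
N1, N2 = 100, 300          # particle numbers (toy values)
E_total = 8000.0           # fixed total energy (arbitrary units)

E1 = np.linspace(1.0, E_total - 1.0, 200001)
sigma = 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E_total - E1)

E1_star = E1[np.argmax(sigma)]             # energy split maximizing total entropy
tau1 = 2 * E1_star / (3 * N1)
tau2 = 2 * (E_total - E1_star) / (3 * N2)

print(f"E1* = {E1_star:.1f} (expected {E_total * N1 / (N1 + N2):.1f})")
print(f"tau1 = {tau1:.3f}, tau2 = {tau2:.3f}")   # equal at the entropy maximum
```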

  12. Mechanical contact In mechanical contact the systems are separated by a mobile barrier; equilibrium is reached in this case by the equalization of pressure on both sides of the barrier. We now imagine that the wall is allowed to move and also to transmit energy, but that it does not pass particles. The volumes V1, V2 of the two systems can then readjust so as to maximize the entropy. In mechanical equilibrium δσ = (∂σ1/∂V1)δV1 + (∂σ2/∂V2)δV2 + (∂σ1/∂E1)δE1 + (∂σ2/∂E2)δE2 = 0 (3.15)

  13. After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have (∂σ1/∂V1)δV1 + (∂σ2/∂V2)δV2 = 0 (3.16) Now the total volume V = V1 + V2 is constant, so that δV1 = −δV2 (3.17) We have then [(∂σ1/∂V1) − (∂σ2/∂V2)]δV1 = 0 (3.18)

  14. As V1 was an arbitrary variation we must have (3.19) in mechanical equilibrium. If we define a quantity by (3.20) we see that for a system in thermal equilibrium the condition for mechanical equilibrium is (3.21) We show now that has the essential characteristics of the usual pressure p.

  15. Material-transferring contact The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have N1i + N2i = const (3.22) For equilibrium δσ = (∂σ1/∂N1i)δN1i + (∂σ2/∂N2i)δN2i = 0 (3.23) or ∂σ1/∂N1i = ∂σ2/∂N2i (3.24)

  16. We define a quantity μi by the relation μi/τ = −(∂σ/∂Ni) (3.25) The quantity μi is called the chemical potential of the i-th species. For equilibrium at constant temperature μ1i = μ2i (3.26)
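
A toy sketch of diffusive equilibrium: N particles distributed over two lattice subsystems, with Γ = C(M, N) placements of N particles on M sites. Maximizing the total entropy drives the two filling fractions, and hence the chemical potentials, to equality. The site and particle numbers are arbitrary toy choices:

```python
import numpy as np
from math import lgamma

# sigma = ln C(M, N) for N particles on M sites; total entropy is maximal
# when the densities (hence the chemical potentials) of the two subsystems match.
def sigma(M, N):
    # ln of the binomial coefficient C(M, N) via log-gamma (avoids overflow)
    return lgamma(M + 1) - lgamma(N + 1) - lgamma(M - N + 1)

M1, M2, N_tot = 600, 200, 320
N1_values = range(N_tot - M2, N_tot + 1)       # keep 0 <= N2 <= M2
total_sigma = [sigma(M1, n) + sigma(M2, N_tot - n) for n in N1_values]
N1_star = list(N1_values)[int(np.argmax(total_sigma))]

print(f"N1* = {N1_star}: densities {N1_star / M1:.3f} vs {(N_tot - N1_star) / M2:.3f}")
# Both densities come out ~0.400, i.e. equal chemical potentials.
```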

  17. The Canonical Ensemble The microcanonical ensemble is a general statistical tool, but it is often very difficult to use in practice because of the difficulty of evaluating the volume of phase space or the number of states accessible to the system. The canonical ensemble, invented by Gibbs, avoids some of these difficulties and leads us easily to the familiar Boltzmann factor exp(−ΔE/kT) for the ratio of the populations of two states differing by ΔE in energy. We shall see that the canonical ensemble describes systems in thermal contact with a heat reservoir; the microcanonical ensemble describes systems that are perfectly insulated. In 1901, at the age of 62, Gibbs (1839-1903) published a book called Elementary Principles in Statistical Mechanics (Dover, New York).
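
A minimal sketch of the Boltzmann factor at work: the population ratio of two states separated by an assumed gap of 0.1 eV, evaluated at several temperatures:

```python
import math

# Population ratio N_upper/N_lower = exp(-dE/kT) for two states
# separated by dE. The 0.1 eV gap is an illustrative choice.
k_eV = 8.617e-5            # Boltzmann constant in eV/K
dE = 0.1                   # energy gap in eV (toy value)

for T in (100.0, 300.0, 1000.0):
    ratio = math.exp(-dE / (k_eV * T))
    print(f"T = {T:6.0f} K: N_upper/N_lower = {ratio:.3e}")
```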

  18. We imagine that each system of the ensemble is divided up into a large number of subsystems, which are in mutual thermal contact and can exchange energy with each other. We direct our attention (Fig. 3.2) to one subsystem, denoted s; the rest of the system will be denoted by r and is sometimes referred to as a heat reservoir. The total system is denoted by t and has the constant energy Et, as it is a member of a microcanonical ensemble. For each value of the energy we think of an ensemble of systems (and subsystems). Fig. 3.2: the total system (t) is made up of the subsystem (s) and the rest (r), the heat reservoir.

  19. The subsystems will usually, but not necessarily, be themselves of macroscopic dimensions. A subsystem may be a single molecule if, as in a gas, the interactions between molecules are very weak, thereby permitting us to specify accurately the energy of a molecule. In a solid, a single atom will not be a satisfactory subsystem, as the bond energy is shared with neighbors. Letting dwt denote the probability that the total system is in an element of volume dΓt of the appropriate phase space, we have for a microcanonical ensemble

  20. dwt = C dΓt (3.27) where C is a constant. Then we can write dΓt = dΓs dΓr (3.28) We ask now for the probability dws that the subsystem is in dΓs, without specifying the condition of the reservoir, but still requiring that the total system be in dΓt at energy Et. Then dws = C ΔΓr dΓs (3.29)

  21. where ΔΓr is the volume of the phase space of the reservoir which corresponds to the total system being in dΓt at energy Et. Our task is to evaluate ΔΓr; that is, if we know that the subsystem is in dΓs, how much phase space is accessible to the heat reservoir? The entropy of the reservoir is σr = ln ΔΓr (3.30) so that ΔΓr = exp(σr) (3.31) Note that σr = σr(Et − Es) (3.32)

  22. where we may take Es << Et, because the subsystem is assumed to be small in comparison with the total system. We expand σr(Et − Es) ≈ σr(Et) − Es(∂σr/∂Et) (3.33) Thus ΔΓr = exp[σr(Et) − Es(∂σr/∂Et)] (3.34) As Et is necessarily close to Er, we can write, using (3.13), ∂σr/∂Et = 1/τ (3.35) Here τ is the temperature characterizing every part of the system, as thermal contact is assumed. Finally, from (3.29), (3.34) and (3.35)

  23. dws = A exp(−Es/τ) dΓs (3.36) where A = C exp[σr(Et)] (3.37) may be viewed as a quantity which takes care of the normalization: ∫dws = 1 (3.38) Thus for the subsystem the probability density (distribution function) is given by the canonical ensemble ρ(p, q) = A exp(−E(p, q)/τ) (3.39)
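
A short sketch of (3.36)-(3.39) for an assumed discrete toy spectrum: the weight of each state is A·exp(−E/τ), with A fixed by the normalization (3.38):

```python
import numpy as np

# Canonical probabilities for a few discrete levels; energies and tau
# are arbitrary illustrative numbers in the same units.
E = np.array([0.0, 1.0, 2.0, 5.0])   # subsystem energy levels (toy values)
tau = 1.5                            # temperature

weights = np.exp(-E / tau)
A = 1.0 / weights.sum()              # normalization constant, eq. (3.38)
p = A * weights                      # canonical probabilities, eq. (3.39)

for Ei, pi in zip(E, p):
    print(f"E = {Ei:.1f}: p = {pi:.4f}")
print("sum of probabilities:", p.sum())   # 1.0 by construction
```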

  24. ln1=lnA1-E1/ ln2=lnA2-E2/ ln12=lnA1A2 - (E1+E2)/ where here and henceforth the subscriptsis dropped. We emphasize that E is the energy of the entire subsystem. We note thatln is additive for two subsystems in thermal contact: so that , with =12; A=A1A2 ; E=E1+E2, , we have ln=lnA-E/(3.40)

  25. for the combined system. This additive property is central to the use of the canonical ensemble. The average value of any physical quantity f(p, q) over the canonical distribution is given by ⟨f⟩ = ∫f(p, q)ρ(p, q)dΓ. We note that for a subsystem consisting of a large number of particles, the subsystem energy in a canonical ensemble is very well defined. This is because the density of energy levels, or the volume in phase space, is a strongly varying function of the energy, as is also the distribution function (3.39).
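
A numerical sketch of this last remark, for an assumed model of N independent two-level units (levels 0 and ε): the relative spread of the canonical energy falls off as 1/√N, so the subsystem energy becomes sharply defined for large N:

```python
import numpy as np

# For N independent two-level units the total energy is binomially
# distributed, so the relative fluctuation dE/<E> shrinks as 1/sqrt(N).
# eps and tau are arbitrary toy values.
eps, tau = 1.0, 1.0
p1 = np.exp(-eps / tau) / (1.0 + np.exp(-eps / tau))  # upper-level occupation

for N in (10, 1000, 100000):
    mean_E = N * eps * p1
    std_E = eps * np.sqrt(N * p1 * (1.0 - p1))        # binomial fluctuation
    print(f"N = {N:6d}: <E> = {mean_E:10.1f}, dE/<E> = {std_E / mean_E:.4f}")
```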
