
Conservation of Entropy ???



Presentation Transcript


1. Conservation of Entropy ??? • The question arose: is entropy conserved? After all, energy is. • But a great deal of experimental experience indicated that: • ΔS(system) + ΔS(surroundings) ≥ 0 • This is the Second Law of Thermodynamics. • Heat never spontaneously flows from a cold body to a hot one. • A cyclic process can never remove heat from a hot body and convert it completely into work.

2. Entropy is Not Conserved [Figure: gas at 2 atm and temperature T in volume V1, next to a vacuum; after expansion, 1 atm and T in V2 = 2V1] Two cases of expansion of an ideal gas: (1) expansion into a vacuum; (2) reversible expansion. For (1): w = 0, q_irreversible = 0, so ΔE = 0 and ΔS(surroundings) = 0. To calculate ΔS(system) we look to the second process (the two must be the same, since S is a state function).

3. Entropy is Not Conserved [Figure: the same expansion from V1 to V2 = 2V1, carried out along a reversible isothermal path] In the second process we follow an isothermal, reversible path. We know that ΔT = 0 and thus ΔE = 0. Now q_rev = ΔE − w = nRT ln(2), so ΔS(system) = q_rev/T = nR ln(2).

4. Entropy is Not Conserved For the reversible process we've already calculated q_rev = nRT ln(V2/V1) = nRT ln(2), so ΔS(system) = q_rev/T = nR ln(2). One way to make sure this is reversible is to make the outside temperature only differentially hotter. In this case, ΔS(surroundings) = −q_rev/T, and ΔS(total) = 0. Back to case (1), expansion into vacuum... For the total irreversible process, ΔS(surroundings) = 0 (nothing is exchanged), so ΔS(total) = 0 + nR ln(2) > 0. Question: if S is a state variable, how can ΔS(total) be path dependent? (Because ΔS(total) also counts the surroundings, whose final state does depend on the path.)
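The two cases can be checked numerically. A minimal sketch in plain Python (the function name and the choice of 1 mol are illustrative):

```python
from math import log

R = 8.314  # gas constant, J K^-1 mol^-1

def dS_isothermal(n, V1, V2):
    """Entropy change of n mol of ideal gas for an isothermal expansion V1 -> V2."""
    return n * R * log(V2 / V1)

dS_system = dS_isothermal(1.0, 1.0, 2.0)  # nR ln 2; the same for both paths

# Case (1), expansion into vacuum: nothing exchanged, dS(surroundings) = 0
dS_total_irrev = dS_system + 0.0          # > 0

# Case (2), reversible: surroundings lose q_rev at the same T
dS_total_rev = dS_system - dS_isothermal(1.0, 1.0, 2.0)  # exactly 0
```

The system's entropy change is identical in both cases; only the surroundings term distinguishes the paths.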

5. Disorder and Entropy It turns out that disorder and entropy are intimately related (Ludwig Boltzmann, c. 1875). We start out by considering the spontaneity of this process: why doesn't the gas spontaneously reappear back in the box? After all, w = 0, ΔE = 0, q = 0, ΔT = 0, but ΔS = nR ln(V2/V1) ≠ 0.
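The "why doesn't it reappear" question has a quantitative answer: the chance that M independent molecules all happen to sit in the original half of the doubled volume is (1/2)^M. A quick sketch:

```python
from math import log10

def log10_prob_all_in_half(M):
    """log10 of the probability that all M independent molecules sit in one chosen half."""
    return M * log10(0.5)

# For 10 molecules the spontaneous recompression is merely rare (~1 in 1000)...
few = log10_prob_all_in_half(10)         # ~ -3.0
# ...for a mole of gas it is, for all practical purposes, impossible.
mole = log10_prob_all_in_half(6.022e23)  # ~ -1.8e23
```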

6. Disorder and Entropy Let's break the box into N cells and consider the gas to be an ideal gas composed of M molecules. We ask: what is the probability that all M molecules will be in the cell labeled '*'? This obviously depends on both M and N; we assume N > M for this problem. The number of ways of distributing M indistinguishable balls in N boxes is approximately Ω ≈ N^M/M! (for N and M large). Boltzmann noted that an entropy could be defined as S = k ln(Ω) = R ln(Ω)/N_A. There are a number of reasons this is a good definition; one is that it connects to thermodynamics.
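The approximation Ω ≈ N^M/M! can be sanity-checked against the exact count C(N+M−1, M) for M indistinguishable balls in N boxes (a sketch; the particular N and M are arbitrary choices with N ≫ M):

```python
from math import comb, factorial

N, M = 10_000, 50  # cells and molecules, N >> M

exact = comb(N + M - 1, M)     # exact count for indistinguishable balls in boxes
approx = N**M // factorial(M)  # the approximation N^M / M!

ratio = exact / approx         # approaches 1 as N/M grows
```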

7. Disorder and Entropy So for a given state we have S = k ln(Ω) = R ln(Ω)/N_A = R ln(N^M/M!)/N_A. Let's say we change state by increasing the volume: for the same sized cells, N increases to N'. Then S' − S = (R/N_A)(ln(N'^M/M!) − ln(N^M/M!)) = (R/N_A) ln(N'^M/N^M). So ΔS = (R/N_A) ln(N'^M/N^M) = M(R/N_A) ln(N'/N). And since N is proportional to volume: ΔS = M(R/N_A) ln(V2/V1) = nR ln(V2/V1).
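The cancellation of the M! terms can be verified directly from S = k ln(Ω) with Ω = N^M/M!, using lgamma(M+1) for ln(M!) (a sketch; N and M are arbitrary):

```python
from math import lgamma, log

k = 1.380649e-23  # Boltzmann constant, J/K

def ln_omega(N, M):
    """ln(N^M / M!), computed via lgamma(M + 1) = ln(M!)."""
    return M * log(N) - lgamma(M + 1)

N, M = 10**6, 10**3
# Doubling the volume doubles the number of cells: N' = 2N.
dS = k * (ln_omega(2 * N, M) - ln_omega(N, M))

# The M! terms cancel, leaving exactly M * k * ln 2:
target = M * k * log(2)
```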

8. Feynman's Description of Entropy Entropy measures the number of ways the inside can be arranged so that, from the outside, the system looks the same. From a probabilistic point of view, it is then natural that entropy should tend to go up. Imagine an ideal gas, whose molecules do not interact with each other: how do they "know where to go"? Answer: they don't.

9. Fluctuations All the previous arguments relate to processes involving large numbers of molecules and averages over long periods of time. When the system is small and the observation time is short, fluctuations around the "maximum entropy" solution can be found.
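Such fluctuations are easy to see in a toy simulation: place M non-interacting molecules at random in a box and count how many land in the left half (a sketch; the trial counts and seed are arbitrary choices):

```python
import random

random.seed(42)

def left_half_counts(M, trials):
    """For each trial, count how many of M randomly placed molecules fall in the left half."""
    return [sum(random.random() < 0.5 for _ in range(M)) for _ in range(trials)]

# Small system: M = 10 shows large relative fluctuations; even "all 10 on the
# left" (the gas back in its original half) occurs roughly once per 1000 trials.
small = left_half_counts(10, 10_000)
p_all_left = small.count(10) / len(small)

# Larger system: M = 1000 stays tightly clustered around 500.
big = left_half_counts(1000, 100)
```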

10. Entropy of Materials Why does graphite have more entropy than diamond? S°298 (J/K mol): graphite 5.7; diamond 2.4. How about water in its different phases? S°298 (J/K mol): H2O(s, ice) 44.3; H2O(l) 69.91; H2O(g) 188.72.

  11. Entropy of Mixing What happens when you mix two ideal gases? What happens when you solvate a molecule?

12. Entropy Calculations Another example: dropping a 0 °C ice cube into boiling (100 °C) water. What happens? Is it reversible? Let's take the ice cube as the system and the hot water as the surroundings (i.e., there's enough hot water that it stays at 100 °C).

13. Entropy Calculations The direct path (ice at 0 °C → water at 100 °C in the 100 °C bath) is irreversible, so we compute ΔS along a reversible path with the same endpoints: (1) Ice, 0 °C → Water, 0 °C (reversible melting); (2) Water, 0 °C → Water, 100 °C (reversible heating). Let's calculate ΔS for the reversible path...

14. Entropy Calculations (1) Entropy of the phase transition (melting, for water): ΔS = ΔH_fus/T_fus = 6010 J mol⁻¹ / 273 K ≈ 22.0 J K⁻¹ mol⁻¹.

15. Entropy Calculations (2) Entropy of heating at constant pressure: ΔS = C_p ln(T2/T1) = (75.3 J K⁻¹ mol⁻¹) ln(373/273) ≈ 23.5 J K⁻¹ mol⁻¹.

16. Entropy Calculations Altogether, ΔS = ΔS(1) + ΔS(2) ≈ 22.0 + 23.5 = 45.5 J K⁻¹ mol⁻¹. Remember, this is only for the system (the ice cube). Is this process spontaneous? It had better be... How can we tell?
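A short numeric check of the two reversible steps (a sketch; it assumes the standard values ΔH_fus ≈ 6010 J mol⁻¹ and C_p(liquid) ≈ 75.3 J K⁻¹ mol⁻¹):

```python
from math import log

dH_fus = 6010.0        # J/mol, enthalpy of fusion of ice (assumed standard value)
Cp = 75.3              # J/(K mol), heat capacity of liquid water (assumed)
T1, T2 = 273.0, 373.0  # K

dS_melt = dH_fus / T1          # step (1): ~22.0 J/(K mol)
dS_heat = Cp * log(T2 / T1)    # step (2): ~23.5 J/(K mol)
dS_system = dS_melt + dS_heat  # ~45.5 J/(K mol), as on the slide
```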

17. Entropy Calculations Entropy change in the surroundings: how much total heat was exchanged with the surroundings? For melting, −6.01 kJ mol⁻¹. For heating, −7.53 kJ mol⁻¹ (this is just −C_p(T2 − T1)). Total: −13.5 kJ mol⁻¹. ΔS(surroundings) = −13540 J mol⁻¹ / 373 K = −36.3 J mol⁻¹ K⁻¹. Total entropy change for the universe: 45.5 − 36.3 = +9.2 J mol⁻¹ K⁻¹ > 0. Later, we will combine S and H into another state variable, G, which directly relates to spontaneity.

18. Entropy and Chemical Reactions Gas-phase reaction: more product molecules than reactant molecules implies entropy goes up. There are no restrictions on the positions of unbound atoms, so there are more possible configurations (larger Ω), and S = k ln(Ω) tells us entropy increases. There is therefore some tendency for a reaction to occur even if ΔH is positive.

19. Entropy and Chemical Reactions ΔS (J/K mol): CO3^2−(aq) + H^+(aq) → HCO3^−(aq): +148.1; HC2O4^−(aq) + OH^−(aq) → C2O4^2−(aq) + H2O(l): −23.1. It's hard to predict the change in entropy without considering solvent effects. Anything that imposes structure or organization on a molecular arrangement lowers the entropy (fewer available configurations). Solvation involves arranging solvent molecules around a solute; it is generally driven by ΔH at a cost with respect to ΔS.

20. The Third Law There is an absolute reference point for entropy. The Third Law states: "The entropy of a perfect crystal at 0 K is zero." More precisely, it states that the entropy of all perfect crystals at absolute zero is the same, so we might as well call it zero. Molecular (statistical mechanics) considerations show that zero is a good thing to call it.

21. The Third Law A couple of Third-Law points: • Just cooling something to 0 K is not enough. Example: carbon monoxide freezes with its molecules in random CO/OC orientations, so it is not a perfect crystal! • Elements, compounds: it's all zero! Unlike ΔE, the reference state for S is a perfect crystal of a pure substance, not necessarily an element.
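For carbon monoxide the residual entropy can even be estimated: each molecule can freeze in with either orientation, so Ω ≈ 2^N per crystal and the molar residual entropy comes out to about R ln 2 (a sketch):

```python
from math import log

R = 8.314  # gas constant, J K^-1 mol^-1

# Each CO molecule frozen in one of two orientations (CO or OC): Omega = 2^NA
# per mole, so S_residual = k ln(2^NA) = R ln 2.
S_residual = R * log(2)  # ~ 5.8 J/(K mol)
```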

22. The Third Law The Third Law makes sense from the molecular point of view: • A perfect crystal has only one possible arrangement of atoms. • At 0 K there is only one possible distribution of energy. • So Ω = 1, ln Ω = 0, and S = k ln(Ω) = 0.
