Review of Statistical Mechanics Schroeder Ch. 2; 3.1 – 3.4; 6.1 – 6.7; 7.1 – 7.3 Gould and Tobochnik Ch. 3.1 – 3.7; 4.1 – 4.11; 6.1 – 6.5
Outline • Principles of probability and combinatorics • Classical and quantum systems • The microcanonical ensemble and multiplicity • Entropy and the 2nd law • The 3rd law of thermodynamics • The canonical ensemble and the partition function • The ideal gas and kinetic theory • Real gases and the virial expansion • The grand canonical ensemble and the grand partition function • Degenerate Fermi gas
Basic Principles of Combinatorics • Combinatorics is the branch of mathematics studying the enumeration, combination, and permutation of sets of elements • The three fundamental combinatorial principles used in statistical mechanics are • Fundamental principle of counting • Permutations • Combinations
Fundamental Principle of Counting • Suppose that a task involves a sequence of k choices. • Let n1 be the number of ways the first event can occur, n2 be the number of ways the second event can occur, …, and let nk be the number of ways the kth event can occur. • Then the total number of different ways the task can occur is n1 · n2 · · · nk
Permutations • Consider a set of n distinct things lined up in a row • Q: How many ordered arrangements of r of them are there? • A: This result is called the number of permutations of n things taken r at a time and is denoted by nPr • It can be shown that the number of permutations of n things taken r at a time is nPr = n!/(n − r)!
Combinations • Suppose we have n people but committees of only r. • Q: How many committees of r people can be chosen from a group of n people? • Note that the order of the people in the committee is not important. • A: The result is called the number of combinations of n things taken r at a time and is denoted by nCr • It can be shown that the number of combinations of n things taken r at a time is nCr = n!/(r!(n − r)!)
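As a quick check of the two formulas above, here is a short Python sketch (the function names are illustrative, not from the texts):

```python
from math import factorial

# Permutations and combinations from the factorial formulas above.
def n_permutations(n, r):
    # nPr = n! / (n - r)!  -- ordered arrangements of r of n things
    return factorial(n) // factorial(n - r)

def n_combinations(n, r):
    # nCr = n! / (r! (n - r)!)  -- unordered selections of r of n things
    return factorial(n) // (factorial(r) * factorial(n - r))

print(n_permutations(5, 3))   # 60 ordered arrangements of 3 of 5 things
print(n_combinations(10, 4))  # 210 committees of 4 chosen from 10 people
```

Dividing nPr by r! removes the orderings of each selection, which is exactly the relation between the two formulas.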
Basic Principles of Probability • Probability is the branch of mathematics that studies the possible outcomes of given events together with the relative likelihoods and distributions of those outcomes • In common usage, probability describes the chance that a particular event (or set of events) will occur. • The fundamental principles of probability used in statistical mechanics are • Random variables • Mean value and variance • Binomial and Gaussian distributions • Central limit theorem
Discrete Random Variables • A discrete random variable is one which may take on only a countable number of distinct values. • Ex: Number of children in a family, the number of patients in a doctor’s surgery, etc. • The probability distribution of a discrete random variable is a list of probabilities associated with each of its possible values
Continuous Random Variable • A continuous random variable is one which may take on a continuous range of values • Ex: Time required to run a mile, position and velocity of a particle, the angle of a compass needle. • A continuous random variable is defined over an interval of values. • For continuous random variables, the probability distribution can be written as a probability density function
Mean Value and Variance • A convenient way to describe the distribution of the possible values of a random variable x is to specify the mean value. • The definition of the mean value of x is ⟨x⟩ = Σ_x x P(x) • If f(x) is a function of x, then the mean value of f(x) is given by ⟨f(x)⟩ = Σ_x f(x) P(x)
Mean Value and Variance • The mean value of x (also called the expectation value) is a measure of the central value about which the various values of x are distributed. • The deviation of x from its mean is given by Δx = x − ⟨x⟩ • A measure of the width of the probability distribution is given by ⟨(Δx)²⟩ = ⟨x²⟩ − ⟨x⟩² • This quantity is known as the dispersion or variance.
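These definitions translate directly into code. A minimal sketch, using a fair six-sided die as an assumed example distribution:

```python
# Mean and variance of a discrete random variable from its distribution.
# Example distribution (an assumption for illustration): a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean = sum(x * p for x, p in zip(values, probs))         # <x> = sum of x P(x)
mean_sq = sum(x * x * p for x, p in zip(values, probs))  # <x^2>
variance = mean_sq - mean ** 2                           # <x^2> - <x>^2
print(mean, variance)  # 3.5 and 35/12 (about 2.92)
```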
The Binomial Distribution • Suppose we examine the statistical properties of a two-state system. • Ex: Paramagnet and Einstein solid • Q: What is the probability of obtaining n outcomes of one type in N trials? • A: The probability distribution is given by the binomial distribution. • It can be shown that the binomial distribution is given by P(n) = [N!/(n!(N − n)!)] p^n (1 − p)^(N − n), where p is the probability of that outcome in a single trial
Gaussian Distribution • For large N, the binomial distribution approaches the Gaussian distribution • The Gaussian distribution is given by P(x) = (2πσ²)^(−1/2) e^(−(x − ⟨x⟩)²/2σ²) • The most important feature of the Gaussian probability distribution is that its relative width σ/⟨x⟩ decreases as 1/√N
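The approach of the binomial distribution to the Gaussian can be checked numerically. A sketch for N = 100 coin flips (the parameter choices are illustrative):

```python
from math import comb, exp, pi, sqrt

# Binomial distribution for N trials with success probability p,
# compared with its Gaussian approximation (mean Np, variance Np(1-p)).
N, p = 100, 0.5
mean, var = N * p, N * p * (1 - p)

def binomial(n):
    return comb(N, n) * p**n * (1 - p)**(N - n)

def gaussian(n):
    return exp(-(n - mean)**2 / (2 * var)) / sqrt(2 * pi * var)

# Near the peak the two distributions agree closely
print(binomial(50), gaussian(50))
```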
Central Limit Theorem • The central limit theorem states that the probability distribution of the value of the sum of a large number of random variables is approximately Gaussian. • In the limit N → ∞, the probability density of the sum S = x1 + ⋯ + xN is a Gaussian with mean N⟨x⟩ and variance Nσ² • If a random process is related to a sum of a large number of microscopic processes, the sum will be distributed according to the Gaussian distribution independently of the nature of the distribution of the microscopic processes • The central limit theorem implies that macroscopic systems have well-defined macroscopic properties even though their constituent parts are changing rapidly.
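A simulation makes this concrete: summing many uniform random variables (which are not at all Gaussian individually) yields a sum whose mean, variance, and shape match the Gaussian prediction. The sample sizes below are arbitrary choices for illustration:

```python
import random
from math import sqrt

# Sum of many independent uniform random variables; by the central limit
# theorem the sum is approximately Gaussian with mean n*mu and variance
# n*sigma^2. For Uniform(0,1): mu = 1/2, sigma^2 = 1/12.
random.seed(0)
n, trials = 1000, 2000
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

sample_mean = sum(sums) / trials                                  # expect n/2
sample_var = sum((s - sample_mean) ** 2 for s in sums) / trials   # expect n/12
# For a Gaussian, about 68% of samples fall within one standard deviation
within_1sigma = sum(1 for s in sums if abs(s - n / 2) < sqrt(n / 12)) / trials
print(sample_mean, sample_var, within_1sigma)
```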
Formulation of the Statistical Problem • The state of the system is the set of all variables one needs to describe that system at any given time. • The macrostate of a system is the set of all macroscopic variables (e.g. pressure, volume, temperature, etc.) • The microstate of a system is the set of all variables of the system required for a complete mechanical description of the system (e.g. positions and momenta of each particle) • The multiplicity is the number of microstates in a given macrostate.
Methodology of Statistical Mechanics • The essential methodology of statistical mechanics can be summarized as follows: • Specify the macrostate and accessible microstates of the system • Choose the statistical ensemble that is appropriate for the system • Determine the mean values and other statistical properties of the system.
Classical Systems • In order to fully describe the classical microstate of a system of particles, we need to specify all the coordinates and momenta of the particles of the system. • We can obtain a countable number of microstates by dividing phase space into cells and assuming that the position and momentum of a particle within a cell of area Δx Δp = h correspond to a single microstate • This is sometimes called the continuum approximation
Classical Systems • To find the multiplicity, we must sum over all the cells that lie within the accessible region of phase space; for N particles in three dimensions, Ω is the accessible phase-space volume divided by h^(3N) • Two common examples of classical systems are • Classical ideal gas • Classical harmonic oscillator
Quantum Systems • The quantum states of particles are usually defined by a set of quantum numbers. • A microstate of a quantum system is completely specified once its wavefunction is known, and the wavefunction is labeled by its quantum numbers • Two common examples of quantum systems are • Spin ½ paramagnet • Einstein solid
Statistical Ensemble • How can we determine the probability that a system finds itself in a given microstate at any given time? • We can perform the same experiment simultaneously on a large number of identical systems called an ensemble. • If there are N such identical systems and state r is found n_r times, then the probability that state r occurs in any experiment is P_r = n_r / N
Statistical Ensemble • Suppose we are interested in knowing about some property X of the system. • Since we cannot know the precise microstate the system is in, we generally ask for the mean value ⟨X⟩ and the fluctuation ΔX. • Based on the central limit theorem, when the system is sufficiently large, statistical fluctuations must be small compared with the average value • We will study three statistical ensembles in this course • The microcanonical ensemble (isolated systems) • The canonical ensemble (closed systems) • The grand canonical ensemble (open systems)
The Fundamental Postulate of Statistical Mechanics • In order to use statistical ensembles to determine the probability of a system being in a given microstate, we need to make a further assumption. • A relatively simple assumption that works for a wide variety of thermodynamic systems is the assumption of equal a priori probabilities: • In an isolated system in equilibrium, all accessible microstates are equally probable
Irreversibility • If the system is not in the most probable macrostate, it will rapidly and inevitably move toward that macrostate, because there are far more microstates in that direction than away from it. • The system will subsequently stay in that macrostate (or very near to it), in spite of random fluctuations of energy back and forth between the two solids. • When two solids are in thermal equilibrium with each other, completely random and reversible microscopic processes tend, at the macroscopic level, to push the solids inevitably toward an equilibrium macrostate. • Any random fluctuations away from the most likely macrostate are extremely small.
Multiplicity and the 2nd Law • Generally, for any thermodynamic system • Energy tends to rearrange itself until the multiplicity is near its maximum value • Any large system in equilibrium will be found in the macrostate with the greatest multiplicity • Multiplicity tends to increase • All of these statements are restatements of the second law of thermodynamics. • This can be shown explicitly for • Interacting Einstein solids • Interacting ideal gases
Entropy and the 2nd Law • Boltzmann defined the entropy as S = k ln Ω • An isolated system, being initially in a non-equilibrium state, will evolve from macrostates with lower multiplicity (lower probability, lower entropy) to macrostates with higher multiplicity (higher probability, higher entropy). • Once the system reaches the macrostate with the highest multiplicity (highest entropy), it will stay there. • Therefore, the entropy of an isolated system never decreases. • Thus, any process that increases the number of microstates will happen (if it is allowed by the 1st law).
Statistical Mechanics and Thermodynamics • The bridge between statistical mechanics and thermodynamics is through the multiplicity function Ω(E, V, N) • The procedure is • Find Ω(E, V, N) for a given thermodynamic system • Evaluate the entropy using S = k ln Ω • Evaluate the temperature using 1/T = (∂S/∂E)_V,N • Solve for E(T, V, N)
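This procedure can be carried out numerically for the Einstein solid, whose multiplicity has the closed form Ω(N, q) = C(q + N − 1, q) for N oscillators sharing q quanta. A minimal sketch, in units where k = 1 and each quantum carries one unit of energy (both assumptions for illustration):

```python
from math import comb, log

# Einstein solid: multiplicity Omega(N, q) = C(q + N - 1, q).
# Entropy S = k ln Omega; temperature from 1/T = dS/dE.
N = 50  # number of oscillators (illustrative choice)

def entropy(q):
    return log(comb(q + N - 1, q))  # S/k = ln Omega

def temperature(q):
    # finite-difference estimate of 1/T = dS/dE, with dE = 1 quantum
    return 2.0 / (entropy(q + 1) - entropy(q - 1))

# Adding energy to the solid raises its temperature
print(temperature(10), temperature(100))
```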
The 3rd Law of Thermodynamics • The statement of the third law of thermodynamics was originally postulated by Walther Nernst and later modified by Planck • At T = 0, the specific entropy (entropy per particle) is a constant, independent of all the extensive properties of the system • Since S = k ln Ω, the third law implies that at zero temperature, every isolated thermodynamic system should settle into its unique lowest-energy state, called the ground state. • The 3rd law also completes the thermodynamic definition of entropy by fixing its absolute value
Introduction to the Microcanonical Ensemble • A microcanonical ensemble is a collection of isolated systems that have achieved equilibrium in the same state. • Microcanonical ensembles are defined by the number of particles they contain and their total energy, together with other external constraints. • According to the fundamental postulate of statistical mechanics, the probability of finding a member of the ensemble in a given microstate r is P_r = 1/Ω
Absolute Temperature • Consider two systems A and A′ and the combined system obtained by placing A and A′ in thermal contact with each other. • Assume that the combined system is thermally isolated, with total energy E0. • The probability of finding a given value E for the energy of A is P(E) ∝ Ω(E) Ω′(E0 − E) • In thermal equilibrium, it can be shown that ∂ ln Ω/∂E = ∂ ln Ω′/∂E′ • Therefore, we write ∂ ln Ω/∂E = 1/kT, where T is the absolute temperature of the system.
First Law of Thermodynamics • Since the internal energy is U = Σ_r P_r E_r, its change in a quasi-static process is dU = Σ_r P_r dE_r + Σ_r E_r dP_r • The first term on the RHS is associated with the external work done by the system, whereas the second term on the RHS is associated with energy transfer by heat • Macroscopic work is related to changes in the state energies E_r, and heat is related to changes in the occupation probability distribution P_r
Generalized Forces • For an isolated system, if the system does an amount of work while always staying in the state r, then the external work done by the system must be equal to the loss in energy of that state: đW_r = −dE_r • The macroscopic external work done is the average of the above expression: đW = −Σ_r P_r dE_r • If the energies depend on an external parameter x, the quantity X = −∂E_r/∂x is called the generalized force conjugate to x. It can be shown that ⟨X⟩ = (1/β) ∂ ln Ω/∂x
Examples in the Microcanonical Ensemble • Using the microcanonical ensemble, one can • Derive the pressure and energy equation of state for a monatomic ideal gas • Examine the thermal behavior of a spin ½ paramagnet • Examine the thermal behavior of the Einstein solid
Introduction to the Canonical Ensemble • We have already described the microcanonical ensemble, which is defined as a collection of isolated systems. • Here we want to describe systems that are in contact with a heat reservoir, free to exchange energy by heat with it, but not particles. • An ensemble of systems in contact with a heat reservoir is characterized by the temperature T of the reservoir and the number of particles N, and is called a canonical ensemble.
Derivation of Boltzmann Distribution • What is the probability of finding a member of the ensemble in a given state r with energy E_r? • Consider a system A in equilibrium with a heat reservoir A′, so that the combined system is isolated. • For any state r of A, it can be shown that the probability of finding system A in a state with energy E_r is given by P_r ∝ e^(−E_r/kT)
Derivation of Boltzmann Distribution • If we define the constant of proportionality as 1/Z, then for any state r, the probability is given by P_r = e^(−βE_r)/Z, where β = 1/kT • The quantity Z = Σ_r e^(−βE_r) is called the partition function; it is the sum of all Boltzmann factors. • The partition function plays a role analogous to that of the multiplicity function in the microcanonical ensemble.
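The Boltzmann distribution above is easy to evaluate directly. A minimal sketch for an arbitrary set of energy levels, in units where k = 1 (an assumption for illustration):

```python
from math import exp

# Boltzmann probabilities for a system with discrete energy levels E_r:
# P_r = exp(-E_r / kT) / Z, with Z the sum of all Boltzmann factors.
def boltzmann_probs(energies, T):
    factors = [exp(-E / T) for E in energies]
    Z = sum(factors)              # partition function
    return [f / Z for f in factors]

# Illustrative two-level system with E = 0 and E = 1 at kT = 1
probs = boltzmann_probs([0.0, 1.0], T=1.0)
print(probs)  # probabilities sum to 1; the lower level is more populated
```

The ratio of the two probabilities is the Boltzmann factor e^(−ΔE/kT).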
Properties of the Partition Function • It can be shown that all the thermodynamic functions can be expressed in terms of the partition function and its derivatives. • The average internal energy is U = −∂ ln Z/∂β • The generalized forces are ⟨X⟩ = (1/β) ∂ ln Z/∂x • The entropy is given by S = k(ln Z + βU)
Properties of the Partition Function • The enthalpy is given by H = U + PV • The Helmholtz free energy is given by F = −kT ln Z • The Gibbs free energy is G = F + PV = −kT ln Z + PV
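The relation U = −∂ ln Z/∂β can be verified numerically for a system with a known Z. A sketch for a two-level system with levels 0 and eps (both the level spacing and units with k = 1 are assumptions for illustration):

```python
from math import exp, log

# Internal energy from the partition function, U = -d ln Z / d beta,
# checked against the exact result for a two-level system (E = 0, eps).
eps = 1.0

def lnZ(beta):
    return log(1 + exp(-beta * eps))

def U_numeric(beta, h=1e-6):
    # central finite difference for -d lnZ / d beta
    return -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

def U_exact(beta):
    # <E> = eps * P(upper level)
    return eps * exp(-beta * eps) / (1 + exp(-beta * eps))

print(U_numeric(1.0), U_exact(1.0))
```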
Properties of the Partition Function • Consider a system consisting of two parts, A and B, which interact weakly but are in contact with the same heat reservoir. • The partition function for the combined system is Z = Z_A Z_B • By induction, we can generalize the above result to a system consisting of n subsystems which interact weakly: Z = Z_1 Z_2 ⋯ Z_n
Partition Function for Common Thermal Systems • The partition function for a single quantum harmonic oscillator is Z = e^(−βℏω/2)/(1 − e^(−βℏω)) • The partition function for a single spin ½ particle with magnetic moment μ in an external magnetic field B is Z = 2 cosh(βμB) • The partition function for a single molecule in a classical ideal gas is Z = V (2πmkT/h²)^(3/2) • The partition function associated with the rotation of diatomic molecules is given by Z_rot = Σ_j (2j + 1) e^(−j(j+1)ℏ²/2IkT) ≈ 2IkT/ℏ² at high temperature
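For the quantum harmonic oscillator, the closed form above follows from summing the geometric series over the levels E_n = (n + ½)ℏω, which can be checked numerically. A sketch in units where ℏω = 1 and k = 1 (assumptions for illustration):

```python
from math import exp

# Partition function of a single quantum harmonic oscillator:
# Z = sum_n exp(-beta (n + 1/2)) = exp(-beta/2) / (1 - exp(-beta)),
# in units with hbar*omega = 1 and k = 1.
def Z_sum(beta, nmax=1000):
    # direct sum over the first nmax levels (tail is negligible)
    return sum(exp(-beta * (n + 0.5)) for n in range(nmax))

def Z_closed(beta):
    return exp(-beta / 2) / (1 - exp(-beta))

print(Z_sum(0.5), Z_closed(0.5))
```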
Kinetic Theory • One of the important uses of the canonical ensemble is the derivation of kinetic theory. • We will derive two basic principles concerning kinetic theory • Equipartition theorem • Maxwell speed distribution
The Equipartition Theorem • Suppose that the total energy of a classical system in thermal equilibrium with a heat reservoir at temperature T can be written as a sum of quadratic terms, E = Σ_i a_i q_i², where each q_i is a coordinate or momentum • The equipartition theorem states that the mean value of each quadratic contribution to the total energy equals ½kT
Applications of the Equipartition Theorem • For a system of N particles in three dimensions, the mean kinetic energy is (3/2)NkT • For a high-temperature crystalline solid, the mean energy is 3NkT • The heat capacity at constant volume for the high-temperature crystalline solid is C_V = 3Nk, known as the law of Dulong and Petit.
Maxwell Speed Distribution • The Maxwell distribution of speeds, f(v) dv = 4πn (m/2πkT)^(3/2) v² e^(−mv²/2kT) dv, represents the number of molecules of the gas per unit volume with speeds between v and v + dv • Note that the probability distribution of each velocity component is a Gaussian. • Note that the probability distribution of the velocity of a molecule in a particular direction is independent of the velocity in any other direction.
Maxwell Speed Distribution • Using the Maxwell speed distribution, it can be shown that the mean speed in a classical gas is ⟨v⟩ = (8kT/πm)^(1/2) • Likewise, the mean square velocity is ⟨v²⟩ = 3kT/m
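Because each velocity component is an independent Gaussian, Maxwell-distributed speeds can be sampled directly and the results above checked by simulation. A sketch in units where k = m = 1, with an arbitrary temperature and sample size:

```python
import random
from math import sqrt, pi

# Sample Maxwell speeds: each velocity component is an independent Gaussian
# with variance kT/m, so the speed is the magnitude of a 3D Gaussian vector.
# Units: k = m = 1 (an assumption for illustration).
random.seed(1)
T, n = 2.0, 100_000
sigma = sqrt(T)  # sqrt(kT/m)

speeds = [sqrt(sum(random.gauss(0, sigma) ** 2 for _ in range(3)))
          for _ in range(n)]

mean_speed = sum(speeds) / n
mean_sq_speed = sum(v * v for v in speeds) / n
predicted_mean = sqrt(8 * T / pi)   # <v> = sqrt(8kT/(pi m))
print(mean_speed, predicted_mean, mean_sq_speed)  # <v^2> should be ~3kT/m
```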
Real Gases • Now, we will study real gases by relaxing the assumptions that make an ideal gas • Non-zero molecular size • Molecular interactions • The energy of the system of molecules can always be written as E = Σ_i p_i²/2m + Σ_(i<j) u(r_ij), the sum of kinetic energies plus intermolecular pair potentials
Real Gases • For real gases, it can be shown that the equation of state can be written in terms of an expansion in number density n = N/V (called a virial expansion): PV/NkT = 1 + B₂(T) n + B₃(T) n² + ⋯ • The ideal gas equation of state corresponds to keeping only the 1st-order term of the virial expansion. • The van der Waals equation of state is based on the 2nd-order virial expansion.
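For a pair potential u(r), the second virial coefficient is B₂(T) = −2π ∫ (e^(−u(r)/kT) − 1) r² dr, and for a hard-sphere potential of diameter d it evaluates exactly to 2πd³/3, independent of T. A numerical sketch of this check, in units where k = 1 (the potential, cutoff, and grid size are illustrative choices):

```python
from math import pi, exp

# Second virial coefficient B2(T) = -2*pi * integral of (exp(-u(r)/kT) - 1) r^2 dr,
# evaluated by the midpoint rule. Units: k = 1.
def B2_numeric(u, T, rmax=5.0, n=100_000):
    dr = rmax / n
    total = 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        total += (exp(-u(r) / T) - 1) * r * r * dr
    return -2 * pi * total

d = 1.0  # hard-sphere diameter (illustrative)
def hard_sphere(r):
    # effectively infinite repulsion inside the core, zero outside
    return 1e9 if r < d else 0.0

B2 = B2_numeric(hard_sphere, T=1.0)
print(B2, 2 * pi * d**3 / 3)  # numerical vs exact hard-sphere result
```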
Introduction to the Grand Canonical Ensemble • We have already described the canonical ensemble, which is defined as a collection of closed systems. • If we relax the condition that no matter is exchanged between the system and its reservoir, we obtain the grand canonical ensemble. • In this ensemble, the systems are all in thermal equilibrium with a reservoir at some fixed temperature, but they are also able to exchange particles with this reservoir.
Derivation of Gibbs Distribution • What is the probability of finding a member of the ensemble in a given state r with energy E_r and containing N_r particles? • Consider a system A in thermal and diffusive contact with a reservoir A′, whose temperature T and chemical potential μ are effectively constant. • If A has energy E_r and N_r particles, it can be shown that the number of states accessible to the reservoir is Ω′(E0 − E_r, N0 − N_r), where E0 and N0 are the total energy and particle number
Derivation of Gibbs Distribution • Since the combined system belongs to a microcanonical ensemble, the probability of finding our system with energy E_r and N_r particles is given by P_r ∝ Ω′(E0 − E_r, N0 − N_r) • Since the reservoir is very large compared with our system, E_r ≪ E0 and N_r ≪ N0, and expanding to first order gives ln Ω′(E0 − E_r, N0 − N_r) ≈ ln Ω′(E0, N0) − βE_r + βμN_r, so that P_r ∝ e^(−β(E_r − μN_r))
Derivation of Gibbs Distribution • If we define the constant of proportionality as 1/Z_G, then for any state r, the probability is given by P_r = e^(−β(E_r − μN_r))/Z_G • The quantity Z_G = Σ_r e^(−β(E_r − μN_r)) is called the grand partition function; its form follows from requiring that the probabilities of all states sum to 1. • The grand partition function plays a role in the grand canonical ensemble analogous to that of the partition function in the canonical ensemble.
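A minimal worked example of the Gibbs distribution is a single adsorption site that is either empty (E = 0, N = 0) or occupied (E = eps, N = 1); the site model and units with k = 1 are assumptions for illustration:

```python
from math import exp

# Grand canonical ensemble for a single site: states are empty (E=0, N=0)
# and occupied (E=eps, N=1). Grand partition function:
# Z_G = sum_r exp(-beta (E_r - mu N_r)) = 1 + exp(-(eps - mu)/kT).
def occupation(eps, mu, T):
    x = exp(-(eps - mu) / T)
    Zg = 1 + x            # grand partition function
    return x / Zg         # <N> = probability the site is occupied

# When mu = eps the two states are equally probable: <N> = 1/2
print(occupation(1.0, 1.0, 0.5), occupation(1.0, 0.0, 0.5))
```

Raising the chemical potential of the reservoir increases the mean occupation, as expected for an open system.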