Lecture 10 PHYS 416

Lecture 10 PHYS 416, Thursday Sep 23, Fall 2021

1. Some HW 3 solutions
2. Review 3.2 The microcanonical ideal gas
3. Introduce 3.3 What is temperature?
4. Quiz 3

Transcript of Lecture 10 PHYS 416

Page 1: Lecture 10 PHYS 416

Lecture 10 PHYS 416 Thursday Sep 23 Fall 2021

1. Some HW 3 solutions
2. Review 3.2 The microcanonical ideal gas
3. Introduce 3.3 What is temperature?
4. Quiz 3

Page 2: Lecture 10 PHYS 416

To find the average ⟨O⟩ of a property in the microcanonical ensemble means averaging over this energy shell:

$$\langle O \rangle_E = \frac{1}{\Omega(E)\,\delta E} \int_{E < H(P,Q) < E+\delta E} O(P,Q)\, dP\, dQ$$

By defining our average as an integral where all regions of phase space (with the same energy) are equally weighted, we are making an important assumption: all possible configurations are included.

Remember how we said you will never see an ice cube get colder and larger if you drop it into hot tea? Since that type of configuration has the correct energy, we will count it too in deriving all ensemble properties.

Page 3: Lecture 10 PHYS 416

$$P_m = \frac{1}{2^{2N}} \binom{2N}{N+m} = \frac{1}{2^{2N}}\,\frac{(2N)!}{(N+m)!\,(N-m)!}$$

$$n! \sim \left(\frac{n}{e}\right)^n \sqrt{2\pi n} \sim \left(\frac{n}{e}\right)^n$$

Configuration Space

We consider a monatomic ideal gas. The total energy is fixed = microcanonical ensemble. Consider only the set of positions; ignore the momenta.

There are 2N atoms. In a box of volume V = L³, what is the probability that the right half has N+m of the atoms, with N−m in the left half?

How many ways can 2N particles be distributed between the two halves? 2²ᴺ. Then how many ways can N+m be on the right side? Use our standard combinatoric function:

We will use Stirling’s formula to evaluate these factorials:
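This counting is easy to check numerically. Below is a minimal Python sketch (function names are mine) comparing the exact binomial probability with its Gaussian approximation, and verifying that the variance of m is N/2:

```python
from math import comb, exp, pi, sqrt

def p_exact(N, m):
    """Exact probability that N+m of the 2N atoms sit in the right half."""
    return comb(2 * N, N + m) / 4**N      # 2^(2N) equally likely arrangements

def p_gauss(N, m):
    """Stirling-formula (Gaussian) approximation: exp(-m^2/N) / sqrt(pi N)."""
    return exp(-m**2 / N) / sqrt(pi * N)

N = 500
print(p_exact(N, 0), p_gauss(N, 0))       # nearly identical for large N

# Variance of m for the exact distribution is N/2, so sigma = sqrt(N/2):
var = sum(m * m * p_exact(N, m) for m in range(-N, N + 1))
print(var)                                 # ~ 250.0 = N/2
```

The exact variance N/2 is just the binomial variance of 2N fair coin flips, which is what the Gaussian's standard deviation on the next page reproduces.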

Page 4: Lecture 10 PHYS 416

$$P_m \approx \left(1 - \frac{m^2}{N^2}\right)^{-N} \left(e^{m/N}\right)^{-m} \left(e^{-m/N}\right)^{m} \approx P_0 \exp\!\left(-\frac{m^2}{N}\right)$$

$$P_m \approx \frac{1}{\sqrt{\pi N}} \exp\!\left(-\frac{m^2}{N}\right)$$

Voila!

Now recognize that this is a Gaussian function. We set its integral equal to unity, since it is a probability, which allows us to fix the prefactor. This sets the standard deviation to:

$$\sigma = \sqrt{N/2}$$

QUESTION: Is it possible for this standard deviation to depend on the speed of the particles, that is, the temperature?

We will see that the properties of configuration space can be separated from the momentum space properties IF the energy does not depend on position, i.e., there is no potential energy for the Hamiltonian.

Page 5: Lecture 10 PHYS 416

Momentum space

$$\sum_{\alpha=1}^{3N} \frac{1}{2} m_\alpha v_\alpha^2 = \sum_{\alpha=1}^{3N} \frac{p_\alpha^2}{2m_\alpha} = \frac{P^2}{2m}$$

$$\mu\left(S_R^{\,\ell-1}\right) = \frac{\pi^{\ell/2} R^{\ell}}{(\ell/2)!}$$

OK, so now let’s consider the momenta of the N particles. What is the probability that particle #1 has an x-component of momentum with the value p1?

The total kinetic energy is:

We will describe this as a 3N-dimensional sphere in momentum space with radius:

The volume of such a sphere is:

$$R = \sqrt{2mE}.$$
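The volume formula is easy to sanity-check against the familiar low-dimensional cases. A sketch (`sphere_volume` is my name; `math.gamma` supplies the factorial (ℓ/2)! for odd ℓ):

```python
from math import gamma, pi, sqrt

def sphere_volume(ell, R):
    """Volume of an ell-dimensional ball of radius R: pi^(ell/2) R^ell / (ell/2)!."""
    return pi**(ell / 2) * R**ell / gamma(ell / 2 + 1)   # gamma(n + 1) = n!

print(sphere_volume(2, 1.0))   # area of the unit disk, pi
print(sphere_volume(3, 1.0))   # volume of the unit ball, 4*pi/3

# For the momentum sphere of the ideal gas, ell = 3N and R = sqrt(2mE):
m, E, N = 1.0, 1.0, 2          # illustrative values, not from the lecture
print(sphere_volume(3 * N, sqrt(2 * m * E)))
```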

Page 6: Lecture 10 PHYS 416

$$\rho(p_1) = \frac{1}{\sqrt{2\pi m k_B T}} \exp\!\left(-\frac{p_1^2}{2 m k_B T}\right)$$

$$\rho(p_1) \approx \frac{1}{\sqrt{2\pi m\,(2E/3N)}} \exp\!\left(-\frac{p_1^2}{2m}\,\frac{3N}{2E}\right)$$

Shortly we will have a good definition of temperature for the ideal gas, which will then give us the Boltzmann distribution:

A powerful derivation of the momentum probability distribution function for the ideal gas (microcanonical ensemble).

Here we take advantage of the relation kT=2E/3N, to be derived soon for the ideal gas.
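This microcanonical-to-Gaussian result can be checked by direct sampling: draw points uniformly on the 3N-dimensional momentum sphere of radius √(2mE) and look at one component. A sketch (the numbers are illustrative choices of mine, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
N, m, E = 2000, 1.0, 3000.0                # particle number, mass, total KE
R = np.sqrt(2 * m * E)                      # radius of the momentum sphere

# A Gaussian vector rescaled to length R is uniform on the (3N-1)-sphere:
g = rng.standard_normal((5000, 3 * N))
P = R * g / np.linalg.norm(g, axis=1, keepdims=True)

p1 = P[:, 0]                                # x-momentum of particle #1
kT = 2 * E / (3 * N)                        # ideal-gas relation from the slide
print(p1.std(), np.sqrt(m * kT))            # widths should nearly agree
```

The sample standard deviation of p₁ matches √(mk_BT) with k_BT = 2E/3N, which is exactly the Boltzmann width claimed above.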

Page 7: Lecture 10 PHYS 416

Other observations:

• This completely describes the connection between momentum distributions and temperature.

• This effectively gives us the Boltzmann distribution, with exponent −ΔE/kT.

• We also get the equipartition theorem, in that the mean KE = kT/2. (Each harmonic degree of freedom is the same.)

• This result turns out to be valid for nearly all gases governed by classical physics.

$$\rho(p_1) = \frac{1}{\sqrt{2\pi m k_B T}} \exp\!\left(-\frac{p_1^2}{2 m k_B T}\right)$$

Page 8: Lecture 10 PHYS 416


Einstein’s model of solid (1907)

Each atom in a solid is connected by springs to immovable walls.

Energy:

$$K + U_{\rm vib} = \left(\frac{p_x^2}{2m} + \frac{1}{2}k_s s_x^2 + U_0\right) + \left(\frac{p_y^2}{2m} + \frac{1}{2}k_s s_y^2 + U_0\right) + \left(\frac{p_z^2}{2m} + \frac{1}{2}k_s s_z^2 + U_0\right)$$

3D oscillator: can separate motion into x, y, z components. Each component: energy is quantized in the same way.

Page 9: Lecture 10 PHYS 416

Which state is the most probable?

[Figure: money analogy: 4 dollar bills distributed between two sets of 3 pockets.]

Interacting atoms: 4 quanta and 2 atoms

quanta in 1 (# of ways)

quanta in 2 (# of ways)

(# of ways 1)× (# of ways 2)

0 (1) 4 (15) 1 x 15 =151 (3) 3 (10) 3 x 10 = 302 (6) 2 (6) 6 x 6 = 363 (10) 1 (3) 10 x 3 = 304 (15) 0 (1) 15 x 1 = 15

Total: 126

microstates vs macrostates?
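A quick Python check of this table, using the Einstein-solid counting formula Ω(q) = (q + n − 1)!/(q!(n − 1)!) with n = 3 oscillators per atom (a sketch; the function names are mine):

```python
from math import comb

def omega(q, n_osc=3):
    """Ways to distribute q quanta among n_osc oscillators (one 3D atom)."""
    return comb(q + n_osc - 1, q)

total = 0
for q1 in range(5):                 # 4 quanta shared between the two atoms
    ways = omega(q1) * omega(4 - q1)
    total += ways
    print(q1, 4 - q1, ways)         # reproduces the 15, 30, 36, 30, 15 column
print("total:", total)              # 126 microstates in the macrostate E = 4
```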

Page 10: Lecture 10 PHYS 416

Fundamental Assumption of Statistical Mechanics

The fundamental assumption of statistical mechanics is that, over time, an isolated system in a given macrostate (total energy) is equally likely to be found in any of its microstates (microscopic distribution of energy).

Thus, our system of 2 atoms is most likely to be in a microstate where energy is split up 50/50.

[Bar chart: probability of each way of splitting the 4 quanta between atom 1 and atom 2: 12%, 24%, 29%, 24%, 12% (i.e., 15/126, 30/126, 36/126, 30/126, 15/126).]

Page 11: Lecture 10 PHYS 416

Two Big Blocks Interacting

q1    q2 = 100−q1    Ω1          Ω2           Ω1·Ω2
0     100            1           2.78 E+81    2.78 E+81
1     99             300         9.27 E+80    2.78 E+83
2     98             4.52 E+4    3.08 E+80    1.39 E+85
3     97             4.55 E+6    1.02 E+80    4.62 E+86
4     96             3.44 E+8    3.33 E+79    1.15 E+88
…     …              …           …            …

Add 100 quanta of energy. Most microstates distribute the energy evenly.
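The table can be regenerated with the same counting function, taking each block to have 300 oscillators (100 atoms × 3), as the Ω1 column implies. A sketch:

```python
from math import comb

def omega(q, n_osc):
    """Einstein-solid multiplicity: (q + n_osc - 1)! / (q! (n_osc - 1)!)."""
    return comb(q + n_osc - 1, q)

n1 = n2 = 300                                 # 100 atoms x 3 oscillators each
qtot = 100
products = [omega(q1, n1) * omega(qtot - q1, n2) for q1 in range(qtot + 1)]

print(omega(2, n1))                           # 45150, the 4.52e4 in the table
print(products.index(max(products)))          # the even split q1 = 50 wins
```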


Page 12: Lecture 10 PHYS 416

Definition of Entropy

Entropy:

$$S \equiv k \ln\Omega$$

where Boltzmann's constant is k = 1.38 × 10⁻²³ J/K.


Page 13: Lecture 10 PHYS 416

Entropy: S ≡ k lnΩ, where k = 1.4 × 10⁻²³ J/K

We will see that in equilibrium, the most probable distribution of energy quanta is that which maximizes the total entropy. (Second Law of Thermodynamics).


Page 14: Lecture 10 PHYS 416

Entropy: S ≡ k lnΩ, where k = 1.4 × 10⁻²³ J/K

k ln(Ω1Ω2 ) = k ln(Ω1) + k ln(Ω2 )

Why entropy?

(property of logarithms)

When there is more than one “object” in a system, we can add entropies just like we can add energies:

Stotal=S1+S2, Etotal=E1+E2.

Remember:

$$\Omega = \frac{(q + N - 1)!}{q!\,(N-1)!}$$
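A tiny numerical check of this additivity, using exact integer arithmetic (a sketch; Ω here is the Einstein-solid count from the slide, applied to two small subsystems of my choosing):

```python
from math import comb, log

kB = 1.380649e-23                       # Boltzmann's constant, J/K

def Omega(q, N):
    """(q + N - 1)! / (q! (N - 1)!) microstates for q quanta, N oscillators."""
    return comb(q + N - 1, q)

O1, O2 = Omega(4, 3), Omega(2, 3)       # two small subsystems
S1, S2 = kB * log(O1), kB * log(O2)
S_total = kB * log(O1 * O2)
print(S_total, S1 + S2)                 # equal: entropies add because logs of products add
```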


Page 15: Lecture 10 PHYS 416

Approaching Equilibrium

• Assume the system starts in an energy distribution A far away from equilibrium (the most likely state).

• What happens as the system evolves?

• Crucial point: consider our system to contain both blocks. The macrostate of the system is specified by the total # quanta: 100.

• All microstates compatible with qtotal=100 are equally likely, but not all energy distributions (values of q1) are equally likely!

[Figure: energy distribution A (far from equilibrium) and energy distribution B.]


Page 16: Lecture 10 PHYS 416

Second Law of Thermodynamics

A mathematically sophisticated treatment of these ideas was developed by Boltzmann in the late 1800s.

The result of such considerations, confirmed by all experiments, is

THE SECOND LAW OF THERMODYNAMICS

If a closed system is not in equilibrium, the most probable consequence is that the entropy of the system will increase.

By “most probable” we don’t mean just “slightly more probable”, but rather “so ridiculously probable that essentially no other possibility is ever observed”. Hence, even though it is a statistical statement, we call it the “2nd Law” and not the “2nd Likelihood”.
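A toy Metropolis simulation of the two 300-oscillator blocks illustrates the point: started far from equilibrium, q1 climbs to the most probable split and then just jitters around it. This is a sketch of mine, not the lecture's code; moving one quantum at a time and accepting with probability min(1, Ω′/Ω) is one simple way to sample the equally weighted microstates.

```python
import random
from math import comb

def omega(q, n):
    return comb(q + n - 1, q)

random.seed(1)
n1 = n2 = 300
qtot, q1 = 100, 0                      # start: all 100 quanta in block 2
weight = lambda q: omega(q, n1) * omega(qtot - q, n2)

history = []
for _ in range(4000):
    q_new = q1 + random.choice((-1, 1))          # propose moving one quantum
    if 0 <= q_new <= qtot and random.random() < weight(q_new) / weight(q1):
        q1 = q_new                               # entropy-favored moves dominate
    history.append(q1)

print(sum(history[-1000:]) / 1000)               # hovers near the 50/50 split
```

Runs that start at q1 = 0 essentially never wander back there: the return is not forbidden, just ridiculously improbable, which is the content of the Second Law.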


Page 17: Lecture 10 PHYS 416

SUMMARY:

1. We learned to count the number of microstates in a certain macrostate, using the Einstein model of a solid.

2. We defined ENTROPY.

3. We observe that the 2nd Law of thermodynamics requires that interacting systems move toward their most likely macrostate (equilibrium reflects statistics!); explains irreversibility.

4. We defined TEMPERATURE to correspond to the observed behavior of systems moving into equilibrium, using entropy as a guide (plus 2nd law, statistics).

Temperature:

$$\frac{1}{T} \equiv \frac{\partial S}{\partial E_{\rm int}}$$

Entropy:

$$S \equiv k \ln\Omega$$


Page 18: Lecture 10 PHYS 416

3.3 What is temperature?

Consider a system with two subsystems, labelled 1 and 2. Each has fixed V and fixed N, and they are weakly coupled so that energy can be shared. The total energy is fixed (microcanonical ensemble). A particular state can be labelled (s1, s2), with E = E1 + E2.

The probability density for s1 to occur must follow

$$\rho(s_1) \propto \Omega_2(E - E_1)$$

where the phase-space volumes of the energy shells for the two subsystems are $\Omega_1(E_1)\,\delta E_1$ and $\Omega_2(E_2)\,\delta E_2$.

[This slide reproduces §3.3 of Sethna, Statistical Mechanics: Entropy, Order Parameters, and Complexity (Oxford University Press, 2006), pp. 58–59, eqns 3.20–3.25; the derivation is restated on the following slides. One footnote there is worth noting: a macroscopic system attached to the external world at its boundaries is usually weakly coupled, since the interaction energy is only important near the surfaces, a negligible fraction of the total volume.]

Page 19: Lecture 10 PHYS 416

$$\Omega(E) = \int dE_1\, \Omega_1(E_1)\, \Omega_2(E - E_1)$$

The total number of states in this ensemble is given by the phase-space volume with E = E1 + E2, integrating over all ways of partitioning the energy between the two subsystems:

This allows us to write the probability density for the subsystem 1 to have energy E1:

$$\rho(E_1) = \frac{\Omega_1(E_1)\,\Omega_2(E - E_1)}{\Omega(E)}$$

Page 20: Lecture 10 PHYS 416

$$\rho(E_1) = \frac{\Omega_1(E_1)\,\Omega_2(E - E_1)}{\Omega(E)}$$

Of course, for large N this is a very sharply peaked function with a maximum at some E1*. Equilibrium between these two subsystems is given by the maximum in the product Ω1Ω2. Set the derivative to zero:

$$\frac{d\Omega_1}{dE_1}\,\Omega_2 - \Omega_1\,\frac{d\Omega_2}{dE_2} = 0$$

which gives

$$\frac{1}{\Omega_1}\left.\frac{d\Omega_1}{dE_1}\right|_{E_1^*} = \frac{1}{\Omega_2}\left.\frac{d\Omega_2}{dE_2}\right|_{E - E_1^*}$$
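The maximization can be seen concretely with an assumed power-law density of states, Ω_i(E_i) ∝ E_i^{3N_i/2} (the ideal-gas form; the specific N1, N2, E values are illustrative choices of mine). A numerical sketch:

```python
import numpy as np

N1, N2, E = 40, 60, 1.0                        # subsystem sizes, total energy
E1 = np.linspace(1e-6, E - 1e-6, 200001)
log_omega = 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E - E1)

E1_star = E1[np.argmax(log_omega)]
print(E1_star)                                  # ~ E*N1/(N1+N2) = 0.4

# At the maximum, the two (1/Omega) dOmega/dE slopes agree:
print(1.5 * N1 / E1_star, 1.5 * N2 / (E - E1_star))
```

The most probable split gives each subsystem an energy proportional to its size, and the logarithmic slopes (which will become 1/T) match at that point.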

Page 21: Lecture 10 PHYS 416

Let's define the equilibrium entropy:

$$S_{\rm equil}(E) = k_B \log\big(\Omega(E)\big)$$

Hence:

$$\frac{dS}{dE} = k_B\,\frac{1}{\Omega}\,\frac{d\Omega}{dE}$$

Page 22: Lecture 10 PHYS 416

This…

$$\frac{1}{\Omega_1}\left.\frac{d\Omega_1}{dE_1}\right|_{E_1^*} = \frac{1}{\Omega_2}\left.\frac{d\Omega_2}{dE_2}\right|_{E - E_1^*}$$

…becomes this:

$$\frac{d}{dE_1}\Big(S_1(E_1) + S_2(E - E_1)\Big) = \left.\frac{dS_1}{dE_1}\right|_{E_1^*} - \left.\frac{dS_2}{dE_2}\right|_{E - E_1^*} = 0$$

Page 23: Lecture 10 PHYS 416

At equilibrium we therefore have this equality:

$$\frac{dS_1}{dE_1} = \frac{dS_2}{dE_2}$$

We also know that at equilibrium the two subsystems reach the same temperature, so we define:

$$\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}$$

Page 24: Lecture 10 PHYS 416

$$\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}$$

“Inverse temperature is the cost in entropy to buy one unit of energy.”
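This definition can be exercised on the Einstein solid itself, with ħω and k_B set to 1 (my normalization): S(q) = ln Ω(q), and 1/T = dS/dE estimated by a centered difference in q. More energy per oscillator means a smaller entropy payoff per added quantum, i.e., a higher temperature. A sketch:

```python
from math import comb, log

def S(q, n_osc):
    """Entropy (in units of k_B) of n_osc oscillators holding q quanta."""
    return log(comb(q + n_osc - 1, q))

def temperature(q, n_osc):
    """1/T = dS/dE; centered difference in q, with hbar*omega = 1."""
    return 2.0 / (S(q + 1, n_osc) - S(q - 1, n_osc))

for q in (50, 100, 200):
    print(q, temperature(q, 300))      # temperature rises with energy
```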