This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.


Course from University of Minnesota

Statistical Molecular Thermodynamics



From this lesson

Module 7

This module is relatively light, so if you've fallen a bit behind, you may have the opportunity to catch up. We examine the concept of the standard entropy made possible by the Third Law of Thermodynamics. The measurement of Third Law entropies from constant-pressure heat capacities is explained and, for gases, compared to values computed directly from molecular partition functions. The additivity of standard entropies is exploited to compute entropy changes for general chemical changes. Homework problems will provide you with the opportunity to demonstrate mastery in applying the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics


Well, now that we know how to compute third-law entropies, whether we measure them experimentally or compute them from first principles, let's work a bit on aligning those entropies with our intuition.

And so let me illustrate a few 298 kelvin standard molar entropies.

And we'll always use the SI units of joules per kelvin per mol.

And so, let's start with some solids. Here we have the entropies of carbon in two different allotropes: its diamond form and its graphite form. Both have very small entropies at 298 kelvin. If you remember, at this temperature nitrogen gas had a huge entropy of almost 200. But for the solids it's only a couple percent of that: 2.4 and 5.7, so a bit higher for graphite.

And then we also have some metals. These are not the atoms but the solid forms. So sodium metal, potassium metal, silver metal: 51.3, 64.7, 42.6. These are about tenfold larger than is the case for carbon. Well, so what about diamond compared to graphite, as a sort of intuitive test? We know that diamond is the hardest substance on the Mohs scale of hardness, if I remember back to my grade school science classes. And the reason it's hard is that it has an incredibly stiff lattice; the carbon atoms are arranged in a beautiful diamond lattice. Graphite, on the other hand, is not as stiff. Graphite is easily broken, and it's just a different form of carbon. So graphite has a somewhat higher entropy, because it can have a bit more disorder in its solid lattice. Meanwhile, the metals differ from carbon insofar as they are conductors as opposed to insulators. And in conductors there are accessible states for the electrons. A conductor allows electrons to flow, and those electrons can access states that are, in a sense, continuously available. That contributes to the entropies of conducting metals, so they just have higher values.

Let's do another sort of intuitive check and compare liquids to gases. If we look at water or bromine, both of which can be a liquid or a gas pretty readily at 298 kelvin, the liquid forms have substantially lower entropies than the gaseous forms. And that simply reflects that the condensed nature of the liquid reduces its entropy; there's less disorder when you condense everything, as opposed to allowing it to fill a large volume.

Now bromine, irrespective of phase, liquid or gas, has substantially higher entropy than water. Most of that is associated with just the greater mass of bromine. If you remember, and we'll see it again in a little bit, the expression for entropy involves the mass inside a logarithm.

So as the mass goes up, the entropy goes up.
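That mass-inside-a-logarithm dependence is the Sackur-Tetrode equation for the translational entropy of an ideal gas. As a sketch (my own illustration, not lecture code), here it is in Python; for a monatomic gas such as argon this translational term is essentially the whole standard entropy:

```python
import math

# Physical constants (SI)
k_B = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34      # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro's number, 1/mol
R = k_B * N_A           # gas constant, J/(K*mol)
amu = 1.66053907e-27    # atomic mass unit, kg

def translational_entropy(mass_amu, T=298.15, p=1.0e5):
    """Sackur-Tetrode molar translational entropy, J/(K*mol).

    The mass enters only inside the logarithm (through the thermal
    de Broglie wavelength), which is why entropy grows slowly,
    logarithmically, with mass.
    """
    m = mass_amu * amu
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal wavelength, m
    v_per_molecule = k_B * T / p                    # V/N for an ideal gas
    return R * (math.log(v_per_molecule / lam**3) + 2.5)

# For argon (39.948 amu) this reproduces the tabulated standard molar
# entropy, about 154.8 J/(K*mol).
s_ar = translational_entropy(39.948)
print(round(s_ar, 1))
```

Water and bromine have rotational and vibrational pieces on top of this, but the translational term alone already shows why heavier bromine carries more entropy than water in either phase.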

And then in the gas phase that difference becomes a little smaller: it's 82 joules per kelvin per mole for the liquids, and for the gases, I guess, it works out to roughly 57 joules per kelvin per mole. And one of the reasons for that is that water is a nonlinear molecule, unlike bromine.

And so water has an additional rotational degree of freedom. And you may recall that rotations can contribute significantly to entropy. So that's something not available to bromine, and that accounts for some of the difference.

What about gases? Here are a variety of gases, and it'll become clearer in a moment why I've tabulated them this way. Here we have the noble gases. They range in entropy at room temperature from 126.2 joules per kelvin per mole, through neon, argon, krypton, and xenon, up to 169.7. And so what's the reason for this variation? Well, again, I'll ask you to remember that the translational piece of the partition function, and its appearance within the expression for entropy, involves the mass of the molecule. So I've ordered these by increasing mass.

And you'll notice the difference as they go up: it increases 20, it increases 8, it increases 10, it increases 5. So the increase is getting smaller and smaller. And that's because the entropy is not linear in the mass; remember, it's the log of the mass. I'll show that to you graphically in just a moment. But that's the behavior we expect.
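A quick way to check that logarithmic behavior (my own check, not the lecturer's calculation): since only the mass changes across the noble-gas series, each step in entropy should be roughly (3/2) R ln(m2/m1):

```python
import math

R = 8.314  # gas constant, J/(K*mol)

# Noble gas atomic masses in amu: He, Ne, Ar, Kr, Xe
masses = [4.003, 20.18, 39.95, 83.80, 131.29]

# Predicted entropy increments between consecutive gases,
# (3/2) * R * ln(m2/m1), from the Sackur-Tetrode mass term
steps = [1.5 * R * math.log(m2 / m1)
         for m1, m2 in zip(masses, masses[1:])]
print([round(s, 1) for s in steps])  # → [20.2, 8.5, 9.2, 5.6]
```

These predicted increments track the roughly 20, 8, 10, 5 steps read off the slide: the same mass ratio buys ever less entropy as the gases get heavier.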

As things get heavier, there will be a decreasing influence of becoming heavier. Now, the next series of gases I have plotted over here are molecular fluorine, chlorine, bromine, and iodine. And again, these are increasing, from 202.8 up to 260.7.

So, by coincidence to some extent, if you were to compare the masses along these arrows that I've indicated: it turns out argon weighs almost exactly as much as molecular fluorine, krypton is very close to molecular chlorine, and xenon is very close to molecular bromine. However, the diatomic halogen gases have substantially more entropy, even though they have roughly the same masses as the noble gases. And why might that be?

Well, the noble gases can't rotate; they're just atoms. But the halogen gases have two rotational degrees of freedom. So there's a contribution to rotational entropy, which depends on the rotational temperature, and we're way above the rotational temperature at 298 kelvin. And so that's the reason for the extra entropy in these diatomics compared to their mass-similar monatomics.
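The size of that rotational contribution can be estimated from the high-temperature rotational partition function. As a rough sketch (my numbers, using the standard tabulated rotational temperature of about 0.351 K and symmetry number 2 for Cl2):

```python
import math

R = 8.314  # gas constant, J/(K*mol)

def rotational_entropy_linear(T, theta_rot, sigma):
    """High-temperature molar rotational entropy of a linear molecule:
    S_rot = R * [ln(T / (sigma * theta_rot)) + 1]."""
    return R * (math.log(T / (sigma * theta_rot)) + 1)

# Cl2: theta_rot ~ 0.351 K, symmetry number 2 (homonuclear)
s_rot_cl2 = rotational_entropy_linear(298.15, 0.351, 2)
print(round(s_rot_cl2, 1))  # roughly 59 J/(K*mol)

# Compare with tabulated S°(Cl2, g) - S°(Kr, g) = 223.1 - 164.1 ≈ 59:
# the mass-similar noble gas lacks essentially this rotational piece.
```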

Over here, finally, is the last set of gases: the hydrogen halides, we could say. HF, HCl, HBr, and HI. Again, increasing entropy as I go to increasing mass of the halogen. And if I were to look at similar masses, F2 is actually relatively close to HCl in mass, Cl2 to HBr, and Br2 to HI. What we observe is that the entropy decreases as we go from the dihalogens, the diatomic halogens, to the hydrogen halides. So, why might that be?

Now, you might be tempted, given what's going across the bottom of the slide, to say, aha: we've talked about the translational part, we've talked about the rotational part, maybe it has something to do with the vibration. But that probably would be a bad answer. There are two reasons it might not be such a great answer. One is that we know that, at room temperature, vibrations usually don't seem to contribute all that much to entropy. But in addition, you could ask: what's the vibrational frequency of one of these hydrogen halides compared to one of these dihalogens? Well, all right, one might not know that off the top of one's head. But in general, bromine, for instance, has a rather weak bond; it has a low vibrational frequency. As soon as you attach a hydrogen atom to something, though, you're talking about a very high vibrational frequency, because of the way the reduced mass plays a role in the vibration. So one would expect the vibrational entropy to be even smaller for the hydrogen-substituted halogens than for the dihalogens.

No, so in fact what's playing a role here is, I'll put it in brackets, not the vibrational component of the entropy; it's still the rotational component. It depends on the rotational temperature, and what does the rotational temperature depend on? It depends on the moment of inertia. These molecules have very large moments of inertia, especially as we get to the really heavy ones, because you've got big, heavy atoms on some sort of a rotor.

Hydrogen, on the other hand, is the lightest element in the periodic table, and so it contributes very little to a moment of inertia. So, because the hydrogen halides have lower moments of inertia, they have more widely spaced rotational energy levels. And there's less disorder, because there are fewer accessible levels at a given temperature. And that reduces the entropy, then. So really this is a great table for assessing one's appreciation of what's contributing, how much you would have expected each piece to contribute, and what trends you would expect to see.
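That moment-of-inertia argument can be made semi-quantitative with the same high-temperature rotational entropy formula. As a sketch (my comparison, using standard tabulated rotational temperatures of about 15.2 K for HCl and about 0.351 K for Cl2):

```python
import math

R = 8.314  # gas constant, J/(K*mol)

def rotational_entropy_linear(T, theta_rot, sigma):
    """S_rot = R * [ln(T / (sigma * theta_rot)) + 1] (high-T limit)."""
    return R * (math.log(T / (sigma * theta_rot)) + 1)

T = 298.15
# HCl: light H atom -> small moment of inertia -> large theta_rot
s_hcl = rotational_entropy_linear(T, 15.2, 1)   # heteronuclear, sigma = 1
# Cl2: two heavy atoms -> large moment of inertia -> tiny theta_rot
s_cl2 = rotational_entropy_linear(T, 0.351, 2)  # homonuclear, sigma = 2

print(round(s_hcl, 1), round(s_cl2, 1))
# HCl gets substantially less rotational entropy than Cl2, which is
# most of why the hydrogen halides sit below the dihalogens.
```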

So, let me just plot those now on a common scale. Those are all the numbers I showed on the last slide, in units of joules per kelvin per mole, plotted against the log of the mass of the molecule. And so here are the noble gases. You see that, sure enough, the entropy is roughly linear in the logarithm of the mass. Here are the dihalogen molecules, and they too are linear in the log of the mass. We're also varying the rotation a little, but it doesn't show up here; there is the expected increase in entropy associated with an increase in mass. There will also be some increase associated with the change in moment of inertia, but it must be increasing with log mass as well.

And then finally we have the HX series. So each of these series is unique, with comparisons being made within itself. And when we were focusing on things having similar mass, like these three, two molecules and an atom: they all have a similar mass, and yet the noble gas has the lowest entropy, because all it has is translational entropy. The hydrogen halide has more because it can rotate, but its moment of inertia is smaller than the dihalogen's, so the dihalogen gets a little extra rotational entropy and is highest here on the axis of entropy. So: variation within a series is primarily dictated by mass; relationships between series are differentiated by rotational entropy. Alright, well, having made those points, let me give you a chance to exercise your intuition on a series, and then we'll come back. Well, let's wrap up with just a few more comparisons. I want to look now at some polyatomic gases. So, still at 298 K, I've got carbon dioxide, nitrogen dioxide, methane, acetylene, ethylene, and ethane.

And one of the purposes of this slide is to illustrate, again, the amazing agreement between calculated and experimental entropies. Using only properties of the individual molecules, their mass, their rotational temperature, their vibrational temperature, and the degeneracy of their electronic ground state: these are the calculated entropies, and these are the experimental entropies. And you observe that, to within 0.1 joules per kelvin per mole, spot-on quantitative agreement. But our goal here was to look a bit more at trends. So let's look at CO2 versus NO2.

So CO2 and NO2 weigh very, very nearly the same; carbon has a mass of 12 in its most abundant isotope, nitrogen a mass of 14. But there's a difference of 26.5, I guess, if we want to be careful, in their entropies. So, why is that?

Well, the issue is that carbon dioxide is a linear molecule. Nitrogen dioxide is a bent molecule; it's nonlinear. And remember that a linear molecule has only two rotational degrees of freedom, while the nonlinear nitrogen dioxide has three. So that's an extra rotational degree of freedom where there can be a lot of disorder, because of closely spaced rotational energy states. And that's enough to contribute substantially higher entropy. If we now look at the various hydrocarbons here, methane, acetylene, ethylene, ethane, there is a steady increase in the entropy. It's not huge amongst, say, the C2 isomers, and mostly it's just associated with increasing mass. So I keep adding hydrogen or carbon atoms, and that adds to translational entropy.

And then there's also a small increase in the rotational moments of inertia for the heavier hydrocarbons. And so I'll just emphasize one more time that, you know, a real demonstration of the power of statistical thermodynamics is that it's possible to derive these third-law entropies from first principles, and to derive them with alarming accuracy.
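As an illustration of that first-principles accuracy (my own sketch, for N2 rather than the molecules on the slide, using standard tabulated molecular constants): combining the translational, rotational, and vibrational pieces reproduces the experimental standard entropy of nitrogen, about 191.6 J/(K·mol), essentially exactly.

```python
import math

# Physical constants (SI)
k_B, h, N_A = 1.380649e-23, 6.62607015e-34, 6.02214076e23
R = k_B * N_A
amu = 1.66053907e-27

def entropy_diatomic(mass_amu, theta_rot, theta_vib, sigma,
                     T=298.15, p=1.0e5):
    """Molar entropy of an ideal diatomic gas from its partition function."""
    m = mass_amu * amu
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)   # thermal wavelength
    s_trans = R * (math.log(k_B * T / (p * lam**3)) + 2.5)  # Sackur-Tetrode
    s_rot = R * (math.log(T / (sigma * theta_rot)) + 1)     # high-T rotor
    x = theta_vib / T
    s_vib = R * (x / math.expm1(x) - math.log(1 - math.exp(-x)))
    return s_trans + s_rot + s_vib

# N2: mass 28.014 amu, theta_rot ~ 2.88 K, theta_vib ~ 3374 K, sigma = 2
s_n2 = entropy_diatomic(28.014, 2.88, 3374, 2)
print(round(s_n2, 1))  # close to the tabulated 191.6 J/(K*mol)
```

Note how little the vibrational term contributes at 298 K for such a stiff bond, consistent with the earlier discussion.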

Let me do one more somewhat informative comparison, I think, and look at two other gases. Here we have acetone, or dimethyl ketone, and here we have trimethylene oxide, a four-membered ring with an oxygen. You could also call it oxetane; that's a common name for that four-membered ring. And they both have molecular formula C3H6O. Alright, so they have the same mass.

And yet if we look at their respective entropies at 298 kelvin, it's 298 joules per kelvin per mole for acetone (just a coincidence that it happens to be the same number, 298, as the temperature we're studying), and 274, so reduced by about 8%, for trimethylene oxide. So again, a good test of intuition: why might one expect that sort of behavior? And this one is a little tricky; maybe it tempts you to point to something simple, like rotation, or at least to thinking about the moments of inertia, but perhaps it's not obvious which one should have the higher or lower entropy. But certainly what is true is that when we tie this molecule together in a strained ring, it seems like there's going to be less freedom, less disorder, for motions of the atoms within the molecule. So if you think about acetone, for example, the CH3 groups attached to the carbonyl are nearly free rotors, to some extent. That is an internal degree of freedom where you might expect a lot of accessible energy levels at room temperature: this one rotating slowly, that one rotating more quickly. But I can't really rotate about any bond in trimethylene oxide, because I've tied all the bonds into a small, tight ring. And really, that shows up. There is less entropy in the molecule on the right than in the one on the left, because there is less freedom for internal motions.

The last item I'd like to look at, which again shows off the power of this first-principles analysis, is residual entropy. So let me tell you a story about carbon monoxide. Carbon monoxide is an interesting molecule. It has a dipole moment, illustrated as shown here: negative at the carbon end, positive at the oxygen. If that seems to violate electronegativity, well, it's an interesting molecule. It's a very small dipole moment, but it is oriented in the direction that I've indicated. That's because there's a pi cloud of electrons, and there's a sigma bond as well, but in the end it's slightly polarized towards carbon.

And if you look at the boiling point of carbon monoxide, 81.6 kelvin, and ask the question: if I compute, using statistical molecular thermodynamics, the molar entropy at that temperature, I'll get a value of 160.3 joules per kelvin per mole. But the measurement, made by starting very cold, measuring the heat capacity, and then integrating up, is 155.6.

And that is a residual entropy, a difference between calculated and experimental, of, it looks like, 4.7. That's much larger than anything we've seen before, where pretty much 0.1 was the largest deviation we saw.
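That "start very cold and integrate up" measurement is the third-law prescription S(T) = ∫ (Cp/T') dT' from near zero to T. As a sketch (my illustration with a hypothetical Debye-like heat capacity, Cp = a·T³, not real CO data), where the integral can be checked analytically because S(T) then equals Cp(T)/3:

```python
# Third-law entropy by numerically integrating Cp/T.
# Hypothetical Debye-like solid: Cp(T) = a * T**3 (illustration only).
a = 5.0e-4  # J/(K^4 * mol), made-up coefficient

def cp(T):
    return a * T**3

def third_law_entropy(T_max, n=20000):
    """Trapezoidal integral of Cp(T)/T from near 0 up to T_max."""
    dT = T_max / n
    s, prev = 0.0, cp(dT) / dT
    for i in range(2, n + 1):
        T = i * dT
        cur = cp(T) / T
        s += 0.5 * (prev + cur) * dT
        prev = cur
    return s

# Analytic check: integral of a*T'^2 dT' = a*T^3/3 = Cp(T)/3
T = 20.0
s_num = third_law_entropy(T)
print(round(s_num, 4), round(cp(T) / 3, 4))  # both ~1.3333
```

In a real measurement one also adds a ΔH/T term at each phase transition and uses a low-temperature extrapolation below the coldest accessible point.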

So, what might be going on there? Is this a failure of molecular thermodynamics? I certainly hope not.

And actually, the issue is an interesting one. Because of this very small dipole moment, if we were to make a perfect crystal of carbon monoxide, we would expect it to organize itself so that every dipole was opposed to the dipole next to it (because that's the best orientation for dipoles), and then opposed again, and opposed again, and opposed again. But the interaction between these dipoles is really very, very small. And the problem is that as we're cooling it, we can't really cool it slowly enough that it settles into that final perfect crystal. Instead it sort of freezes (that's really the right term here, I guess) into a state having higher disorder, where we'll have some aligned dipoles, maybe a few in a row even.

And then there's an opposed dipole, and we just didn't get all the way there. So there's residual entropy left in the thing that we're experimentally measuring. We're assigning it as though it's zero, but it's not; it's something a little higher.

Well, how much might that be? Let's think about it from the standpoint of statistical entropy. If you think of it as though each dipole can be either up or down in the crystal, and we just call those two orientations degenerate (we say they're not actually interacting with each other), then there is a degeneracy of two for every one of the molecules. So the total degeneracy W for a mole would be two to the power of Avogadro's number, because the molecules are all independent and each can be up or down. And so the molar entropy S equals k ln W; that's k ln 2 to the Avogadro's-number power, and taking Avogadro's number out front as a multiplier, it's R ln 2. And that says, actually, we're cooling it down to absolute zero, but because we got trapped in this highly degenerate crystal, we've got, in that crystal, not zero entropy, but about 5.7 joules per kelvin per mole.

So if I take the 155.6 that I measured as the increase in entropy, and I add it, not to zero (because I never got to zero) but to 5.7, I would be at 161.3. That certainly is close to 160.3. It's a bit high, and that suggests we actually don't have complete degeneracy: we did manage to get many of the dipoles oriented favorably one to another, we just didn't get all the way there. And so that's residual entropy.

And it's kind of a nice indication that the statistical molecular thermodynamics is potentially more accurate than a measurement, because the measurement requires us to get to the perfect crystal as a starting point. Okay. Well, that covers a lot of intuition, as well as some interesting thinking about things near absolute zero. In the next video we'll consider additivity of entropies.