**Notice**: __2ndlaw.com__ is now http://2ndlaw.oxy.edu. Please update your links and bookmarks.

**Entropy and Gibbs free energy, ΔG = ΔH - TΔS**

This page is for students who have wrestled with some problems involving the Gibbs equation, ΔG = ΔH - TΔS, and think that the ΔH in it has nothing to do with entropy.

Prof: The whole Gibbs relationship or function is about entropy change.

Student: You’re wrong. Just that last term, TΔS, is entropy change. There’s always a conflict between the enthalpy and the entropy terms. Only at high temperatures does the entropy part win. Ha.

P: Divide the Gibbs deal by T. What’s the nature of the terms now? Doesn't each one look like entropy change?

S: ΔG/T = ΔH/T - ΔS. Whaddya mean, look like entropy change? ΔS is q/T.

P: Sure, but q is the transfer of thermal energy (that we often, too loosely, call "heat"). So isn’t ΔH really a "q", a thermal energy transfer? Also, ΔG/T has the form of an energy transfer/T just as that ΔH/T does. All ΔSs! Therefore, the Gibbs equation really is

"Entropy change (1) = Entropy change (2) – Entropy change (3)"

BUT we’d better be a lot more specific and talk about what those three entropy changes really mean.
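That term-by-term claim can be checked with a quick sketch (Python; the ΔH, ΔS, and T values below are illustrative assumptions, not figures from this dialogue):

```python
# Illustrative values only: an endothermic reaction at 298 K.
T = 298.0         # kelvin
dH = 40_000.0     # J/mol, the "q" of the reaction at constant P
dS = 50.0         # J/(mol*K)

dG = dH - T * dS  # the Gibbs equation

# Divide every term by T: each quotient now carries entropy units, J/(mol*K).
assert abs(dG / T - (dH / T - dS)) < 1e-9
```

Dividing through by T changes nothing algebraically; it only makes every term wear entropy's units.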

S: Darn right. Divide by T and I admit everything in that Gibbs __looks__ like
entropy change. But that just confuses me. What happens to the fight between enthalpy and
entropy if enthalpy turns into entropy? Do I have to learn another mysterious phys chem
equation?

P: No way, no mystery. Let’s give it the full court press – you’ll be amazed at how neat everything comes out (because now that "fight" between enthalpy and entropy will make sense). It'll give you a much better feel for entropy itself.

To start, let’s think about a system in which a chemical reaction is occurring. There can be thermal energy transferred ("heat") from the system to its surroundings or vice versa. Well, let’s really think big by saying that nothing else is happening in the entire universe but the reaction in our system. Look at the entropy changes involved:

ΔS(universe) = ΔS(surroundings) + ΔS(system) (1)

Now, if chemicals are mixed in the system (at constant T and, normally, constant P) and a reaction occurs, some thermal energy transfer ("heat"), q, takes place in the reaction. How much q? That's easy: q is the change in enthalpy; q = ΔH(system). (The sign of ΔH can be + or - , but we're just talking broadly and generally, so let's start simply with + ΔH.)

However, from the viewpoint of the surroundings, the sign of ΔH changes: whatever thermal energy the system gains, the surroundings lose, and vice versa. A +ΔH(system) corresponds to a -ΔH(surroundings), and, of course, a -ΔH(system) corresponds to a +ΔH(surroundings). As a general equation, simply to express that change of sign, here's (2):

-ΔH(surroundings) = ΔH(system) (2)

What does this have to do with entropy? To answer that, let’s divide equation (2) by T:

-ΔH/T(surroundings) = ΔH/T(system) (3)

Then, since ΔS = q/T, and the only q in the surroundings right now is -ΔH(system), it follows that -ΔH/T(surroundings) = -ΔS(surroundings). Therefore, inserting this result in equation (3):

(Assuming that the surroundings are far larger than the system, i.e., reversible conditions.)

-ΔS(surroundings) = ΔH/T(system) (4)

or, changing signs merely so we have a + ΔS to work with in a moment,

ΔS(surroundings) = - ΔH/T(system) (5)

Now, replacing ΔS(surroundings) in equation (1) with -ΔH/T(system), as just justified by (5):

ΔS(universe) = - ΔH/T(system) + ΔS(system) (6)

ΔS(universe)?? Just to say that
aloud seems like a really big mouthful -- and head-full! But remember that we started out
originally by saying that the __only__ reaction happening in the whole universe was the
one in our constant T, constant P system. So any ΔS(universe) would be perfectly measured by what happens only __in our system__.
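The sign bookkeeping in equations (2) through (6) can be traced in a few lines (Python sketch; the numbers are illustrative assumptions, not from this page):

```python
T = 298.0              # kelvin
dH_system = 40_000.0   # J/mol: a +dH(system), thermal energy enters the system
dS_system = 50.0       # J/(mol*K): entropy change of the reaction itself

# Eq (2): whatever thermal energy the system gains, the surroundings lose.
dH_surroundings = -dH_system
# Eq (5): dS(surroundings) = q(surroundings)/T = -dH/T(system)
dS_surroundings = dH_surroundings / T
# Eq (1) with that substitution gives eq (6):
dS_universe = dS_surroundings + dS_system
assert abs(dS_universe - (-dH_system / T + dS_system)) < 1e-9
```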

Let's see. To put it in the most general terms, the energy change that occurred in the reaction in the system -- and which entropy measures by ΔS = q/T -- has been spread out, some remaining in the system as ΔS and some being transferred to the surroundings and identifiable as the ΔH/T(surroundings). (It can be used for work, but right now we're looking at it either as though it was all dissipated as unavailable thermal energy or as though the work itself had resulted in the same amount of waste energy, i.e., as ΔS(surroundings).)

But exactly what was the original energy change that came out of the reaction in our system? We don't know, but let's give it a symbol of ΔG(system). OK, since that was the only event that occurred in the universe, and because energy flow (thermal energy transfer, "heat") divided by T is entropy, ΔG/T(system) is equal to the total entropy change in the entire universe due to this reaction! ΔS(universe) = ΔG/T(system)

Oops! Watch it! We have to switch algebraic signs when looking at this energy/T change
from a __universe's__ viewpoint rather than from the __system's__. So

ΔS(universe) = - ΔG/T(system)

and equation (6) -- derived from that "way-back-there" equation (1) -- becomes

- ΔG/T(system) = - ΔH/T(system) + ΔS(system)

Multiply through by –T and what do you get?
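Carrying that last step out numerically (Python sketch; values are illustrative assumptions):

```python
T = 298.0        # kelvin
dH = 40_000.0    # J/mol
dS = 50.0        # J/(mol*K)

dS_universe = -dH / T + dS   # equation (6), system quantities only
dG = -T * dS_universe        # multiply through by -T

# Multiplying -dG/T = -dH/T + dS by -T recovers the familiar form:
assert abs(dG - (dH - T * dS)) < 1e-9
# And a spontaneous change (dS_universe > 0) is exactly dG < 0.
```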

S: How about that! The old Gibbs, ΔG = ΔH - TΔS. But wait. What kind of fast shuffle are you dealing me when you start with all entropy and end up with enthalpy and a ΔG that’s called "free energy"? Is entropy enthalpy? Is entropy free energy? I’m REALLY confused now.

P: Slow down. Don’t panic.

Entropy is never enthalpy, nor free energy. A system’s enthalpy is only entropy change (__after ΔH is divided by T__) if it is transferred to the surroundings __and no work of any sort is done there in the surroundings__. A surroundings’ enthalpy is only entropy change (after ΔH is divided by T) when it is transferred to the system and __no work is then performed in the system__. Gibbs free energy change, ΔG, is only considered entropy change (after being divided by T) when __no useful work of any kind is done by the heat transfer in the system or in the surroundings__.

Starting with that Equation (1), we have found the breadth of the concept we
call entropy. Entropy change, in the classical "macrothermodynamics" we've been
looking at, is the ratio-measure, q/T, of the driving force for __every__ spontaneous
chemical reaction in the universe. Even when we focus on a little chemical system in our
lab on earth by means of the Gibbs equation, we see the driving force -- the tendency of
energy formerly bound in reactants to be spread out in the products. From information in
other areas of chemistry than thermodynamics, we now know that the ΔH of Gibbs primarily comes from the difference in
electronic binding energy between products and reactants. Usually in exothermic reactions,
the products have stronger bonds and so some of the greater binding energy in the less
strongly bound reactants is released as heat. From the energetics of molecules (again,
outside of macrothermodynamics) we know that this increased thermal energy is due to
molecules moving more rapidly and colliding more forcefully with one another. When the
resultant ΔH(system) gets to the surroundings, it can be used to boil water or run a steam engine or charge a battery. But if it __isn’t__ used for doing any work, it is simply dissipated __in the surroundings__ as ΔH/T(surroundings), i.e., as ΔS(surroundings). It becomes entropic "waste heat", no longer capable of causing change because it is at the same temperature as the surroundings (even though it may have raised the temperature of the surroundings an infinitesimal amount).

The Gibbs equation is only concerned with
macrothermodynamics, with what we can measure in the lab rather than what is happening
down there inside the molecules during chemical reactions. What IS entropy change from the
viewpoint of a molecule? Boltzmann and "microthermodynamics" deal with that.
(You've heard a lot about it in other sections of this Web site.) To begin with, depending
on their temperature and solid/liquid/gas state, all molecules need a certain amount of
energy to move and rotate and vibrate internally, in addition to the large amount of
electronic energy in their bonds that holds them together. S(298 K, formation) in your textbook tables -- that's really ΔS(from absolute zero, formation), because entropies are assumed to be 0 at absolute zero -- measures all those kinds of energy "in a bundle". So TΔS in the Gibbs equation is the amount of energy a
mol of product molecules needs to exist at a temperature T compared to one of reactant
molecules. (Give them more energy and they scoot around faster because the added energy
goes into their translational/rotational/ vibrational modes. Heat them up to a high enough
temperature and you even boost them to an excited electronic state where bond breaking can occur.)

The confusion -- about "If it's
enthalpy, how can that be entropy???" -- comes in because the BIG equation (1) is
actually the __ultimate__ scenario for energy transfer in any chemical change. (Or
"physical change" too, but let's skip that in concentrating on Gibbs.) We may be
able to sneak some work out of the ΔH
when or as that energy transfer from system to surroundings occurs and that's fine.
Perhaps we can use it for human purposes. Good. Go ahead and calculate how much work, w.

But equation (1) is saying that ultimately any kind of work is going to become diffused (or at least tend to) and will then be unusable dispersed thermal energy in just as many molecules in the system and surroundings as possible. Entropy measures energy transfers from "concentrated" to "spread out", and that's the overall trend in the physical universe.

So you now have a better picture of fundamentals and of the Gibbs equation. There is no "conflict" between ΔH and TΔS. They're really just different aspects of entropy change: ΔH will become entropy in the surroundings if it does no work, TΔS represents the energy the product molecules must keep in order to exist at temperature T, and what little energy is left over (the ΔG) becomes dissipated (as increased entropy in the surroundings) after that ΔS(formation) of the product molecules has been taken care of.

S: Ho ho ho! Now I see better what that "crossover T" point in the problems I
worked really means. At low temperatures many reactions, say the old one of water being
broken into hydrogen and oxygen gas, just don't go to any appreciable extent. Why not?
Well, you simply don't have enough energy around in the system to put it into any new
hydrogen and oxygen molecules so they can exist with their own individual allotments of ΔS even in their minimal quantum levels. However,
as you raise the temperature more and more from outside the system -- and that means more and more __intense__ energy, not just more energy -- there comes a temperature where you have enough flowing into the system to give the H₂ and O₂ molecules adequate energy for them to exist. Then any temperature above that is more than they need: they show it by flying around faster and hitting each other harder and harder.

Entropy is the key, especially when I can see it interpreted by what you call "microthermodynamics". That micro stuff says that entropy measures how much energy is needed to fill those many more electronic (plus vibrational, etc.) quantum levels in the hydrogen and oxygen compared to the water. Neat. If there's not enough intense energy to fill them? Forget it. No reaction. Even though the text's talking about "positional entropy" and "the more molecules, the more entropy" made it easy to answer questions about increased entropy, I never could see how having more molecules in the products than the reactants had anything to do with the reaction not going at low T and going at high T.

P: Hooray. Ya got it.
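The crossover temperature the student describes can be estimated with round, textbook-style numbers for 2 H₂O(g) → 2 H₂(g) + O₂(g) (the ΔH and ΔS values below are approximate assumptions for illustration, not figures from this page):

```python
# Approximate standard values for 2 H2O(g) -> 2 H2(g) + O2(g):
dH = 483_600.0   # J: roughly +483.6 kJ (endothermic)
dS = 90.0        # J/K: roughly +90 J/K (more gas molecules in the products)

T_crossover = dH / dS   # temperature at which dG = dH - T*dS = 0

# Below the crossover, dG > 0 and the reaction doesn't go appreciably;
# above it, the T*dS term dominates and dG < 0.
assert 5000 < T_crossover < 6000
assert dH - 298.0 * dS > 0   # no appreciable water splitting at room temperature
```

With these rough numbers the crossover lands above 5000 K, which is why water doesn't spontaneously split at ordinary temperatures.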

References

Scholarly and helpful analysis of the Gibbs function, showing that it is more closely related to entropy than to energy and building on Planck’s function (of total entropy change in a universe) that is used in this Web page: Professor Laurence E. Strong and H. Frank Halliwell, *Journal of Chemical Education*, 1970, 47 (5), 347-352.

An excellent introduction to Professor Norman C. Craig’s procedure of attacking entropy problems and to his short book *Entropy Analysis* (John Wiley, New York, 1992), the best text on the subject, is in "Entropy Analyses of Four Familiar Processes", *Journal of Chemical Education*, 1988, 65 (9), 760-764.

Professor John P. Lowe’s explanation of the importance of the occupancy of energy levels as a genuine basis for entropy (rather than "randomness" or "disorder") via Q and A is superb in *Journal of Chemical Education*, 1988, 65 (5), 403-406.