Thermodynamic potentials

Basics

From Callen, we have two equivalent principles of thermodynamics:

  1. Entropy maximum principle. The equilibrium value of any unconstrained internal parameter is such as to maximize the entropy for the given value of the total internal energy.
  2. Energy minimum principle. The equilibrium value of any unconstrained internal parameter is such as to minimize the energy for the given value of the total entropy.

The basic formula for the internal energy as a function of S, V and the mole numbers Ni is

$latex U = TS - pV + \mu N$

where the term $latex \mu N$ is shorthand for the sum over the different chemical species

$latex \sum_{i} \mu_{i}N_{i}$

representing the quasi-static chemical work, and

$latex dU = TdS - pdV + \mu dN$

Depending on the problem at hand, it may be simpler to solve for the equilibrium state if the formula has different independent variables. That is what Legendre transformations accomplish.

Legendre transformations

Given a formula

$latex Y = Y(X)$

then

$latex P=\frac{\partial Y}{\partial X}$

is the slope of the curve at a given point. We would like an equivalent representation of the system, but one in which the independent variable is P rather than X. We will use ψ, the intercept on the Y axis of the line tangent to the curve at X.

Then the slope is given by

$latex P = \frac{Y-\psi}{X-0}$

which can be arranged to give the Legendre transformation

$latex \psi = Y - PX$

Then, since $latex dY = PdX$,

$latex d\psi = dY - PdX - XdP = -XdP$

or

$latex -X = \frac{\partial \psi}{\partial P}$

so the inverse transformation is just

$latex Y = XP + \psi$

In general, for

$latex Y = Y(X_0, X_1, \dots, X_t)$

the partial slope of this hypersurface is given by

$latex P_k = \frac{\partial Y}{\partial X_k}$

and the Legendre transformation is

$latex \psi = Y - \sum_{k} P_kX_k$

Example from Callen, problem 5.2-1:

$latex y = \frac{x^2}{10}$    gives    $latex P = \frac{\partial y}{\partial x} = \frac{x}{5}$

so

$latex \psi(P) = y - Px = -\frac{5}{2}P^2$

Inversely,

$latex -x = \frac{\partial \psi}{\partial P} = -5P$

so

$latex y(x) = \psi + xP = \frac{x^2}{10}$
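For readers who like to check such manipulations by machine, here is a minimal sketch using the sympy symbolic-algebra library (the variable names are mine, not Callen's):

import sympy as sp

x, P, X = sp.symbols('x P X')

y = x**2 / 10                               # the original relation Y(X)
slope = sp.diff(y, x)                       # P = dY/dX = x/5
x_of_P = sp.solve(sp.Eq(P, slope), x)[0]    # invert the slope: x = 5P
psi = (y - slope * x).subs(x, x_of_P)       # Legendre transform: psi = Y - PX
print(sp.simplify(psi))                     # -5*P**2/2

# Inverse transform: -X = d(psi)/dP, then Y = psi + XP
x_back = -sp.diff(psi, P)                   # X = 5P
P_of_X = sp.solve(sp.Eq(X, x_back), P)[0]   # P = X/5
y_back = (psi + X * P).subs(P, P_of_X)
print(sp.simplify(y_back))                  # X**2/10, as expected

Running it prints -5*P**2/2 and X**2/10, reproducing both directions of the example.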

Principal thermodynamic potentials

The most-used thermodynamic potentials are the following; all are Legendre transforms of the internal energy U, as the formulas show.

  • Internal energy, U = TS - pV + Σ μiNi. Natural variables: S, V, Ni. Interpretation: total internal energy.
  • Helmholtz free energy, F (or A) = U - TS. Natural variables: T, V, Ni. Held constant: T. Interpretation: maximum energy available at constant T and V after the system pays the “entropy tax”.
  • Enthalpy, H = U + pV. Natural variables: S, p, Ni. Held constant: p. Interpretation: total heat (caloric) energy available in the system at constant P after volume change.
  • Gibbs free energy, G = U + pV - TS. Natural variables: T, p, Ni. Held constant: T and p. Interpretation: maximum energy available at constant T and P after volume change and the “entropy tax”.
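Taking the differential of each definition and substituting the expression for dU above shows why the listed variables are “natural”; these are the standard textbook results:

$latex dF = -SdT - pdV + \mu dN$

$latex dH = TdS + Vdp + \mu dN$

$latex dG = -SdT + Vdp + \mu dN$

For instance, dF = dU - TdS - SdT; replacing dU by TdS - pdV + μdN cancels the TdS terms and leaves the first line.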

We can identify each term with a type of energy:

  • TS with heat (or caloric) energy,
  • pV with volume-change energy (as when a piston is pushed) and
  • the sum $latex \sum_{i} \mu_{i}N_{i}$ as the energy of chemical bonding.

We will ignore such things as electrical energy.
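As a toy numerical illustration, here is a sketch computing all four potentials from made-up state values (the numbers describe no real substance; they are placeholders only):

T, S = 300.0, 2.5          # temperature (K), entropy (J/K)
p, V = 101325.0, 0.024     # pressure (Pa), volume (m^3)
muN = 5000.0               # chemical term: sum of mu_i * N_i (J)

U = T * S - p * V + muN    # Euler relation for the internal energy
F = U - T * S              # Helmholtz free energy
H = U + p * V              # enthalpy
G = U + p * V - T * S      # Gibbs free energy
print(U, F, H, G)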

The above-cited energy-minimum principle applies to all these potentials under specific circumstances.

Helmholtz Potential Minimum Principle. The equilibrium value of any unconstrained internal parameter in a system in diathermal contact with a heat reservoir minimizes the Helmholtz potential over the manifold of states for which T = Tr [at constant T]. (Callen, 155)

Enthalpy Minimum Principle. The equilibrium value of any unconstrained internal parameter in a system in contact with a pressure reservoir minimizes the enthalpy over the manifold of states of constant pressure (equal to that of the pressure reservoir). (Callen, 156)

Gibbs Potential Minimum Principle. The equilibrium value of any unconstrained internal parameter in a system in contact with a thermal and a pressure reservoir minimizes the Gibbs potential at constant temperature and pressure (equal to those of the respective reservoirs). (Callen, 157)

We can attribute physical meanings to these potentials.

In the case of enthalpy, a system at constant P can expand or contract, spending some of its energy on volume-change work against its surroundings. The pV term added to U accounts for this work in advance, so that a change in H at constant P measures heat alone. Enthalpy is thus the energy available for release as heat by the system at constant P. Or, in the other direction,

…heat added to a system at constant pressure and at constant values of all the remaining extensive parameters (other than S and V) appears as an increase in the enthalpy. (Callen, 161)

As for the Helmholtz free energy,

… the work delivered in a reversible process, by a system in contact with a thermal reservoir [so at constant T], is equal to the decrease in the Helmholtz potential of the system. (Callen, 159)

The Helmholtz free energy is thus the available work at constant temperature and volume. The term TS represents the “entropy tax” which the system must pay so that the total entropy of the universe does not decrease.

The Gibbs free energy is useful when T and P are constant, which is the usual case for chemical reactions open to the atmosphere, which acts as a reservoir of T and P. Gibbs free energy is thus much beloved of chemists. Callen gives a more interesting example, pointing out that it may also apply “… in a small subsystem of a larger system that acts as both a thermal and a pressure reservoir (as in the fermentation of a grape in a large wine vat).” (Callen, 167)

Spontaneous reactions

Suppose the universe consists of just one thermodynamic system plus its surroundings, and that the system’s T and V are constant. Then the sum of the entropy changes of the system and of its surroundings is the entropy change of the universe:

$latex \Delta S_{univ} = \Delta S_{surr} + \Delta S_{sys}$     (1)

If the energy of the system changes by an amount ΔU, then the energy of the surroundings changes by -ΔU, and

$latex \Delta S_{surr} = -\frac{\Delta U}{T}$

so

$latex \Delta S_{univ} = \Delta S_{sys} - \frac{\Delta U}{T}$

which means

$latex T\Delta S_{univ} = -(\Delta U - T\Delta S_{sys}) = -\Delta F$

So the entropy maximum principle says that, at constant T and V, the change in the Helmholtz free energy must be negative, so that $latex \Delta S_{univ} = -\frac{\Delta F}{T} > 0$.

Similarly, if only a spontaneous chemical reaction occurs at constant T and P, then a change in the heat energy (enthalpy) of the system means the surroundings change by the negative of that, i.e., $latex -\Delta H_{sys}$. So equation (1) gives

$latex \Delta S_{univ} = -\frac{\Delta H_{sys}}{T} + \Delta S_{sys}$

The only change in the universe is due to the dispersal of energy in a quantity we may call -ΔG. Therefore

$latex \Delta S_{univ} = -\frac{\Delta G}{T} = -\frac{\Delta H_{sys}}{T} + \Delta S_{sys}$

which may be rewritten as

$latex \Delta G = \Delta H_{sys} - T\Delta S_{sys}$

which is the change in the Gibbs free energy.
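Here is a small sketch of the spontaneity criterion in code, using the melting of ice as a worked example (the function name is mine; ΔH ≈ +6.01 kJ/mol and ΔS ≈ +22.0 J/(mol·K) are the commonly quoted values for water):

def gibbs_change(dH, dS, T):
    """Delta G = Delta H - T * Delta S, in J/mol; spontaneous when negative."""
    return dH - T * dS

# Melting of ice: dH ~ +6.01e3 J/mol, dS ~ +22.0 J/(mol K)
for T in (263.15, 273.15, 283.15):
    dG = gibbs_change(6.01e3, 22.0, T)
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:.2f} K: dG = {dG:+.1f} J/mol -> {verdict}")

The output shows melting is not spontaneous at -10 °C, essentially balanced at 0 °C (the melting point), and spontaneous at +10 °C, just as experience says.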

Bibliography

Atkins, Peter. The Laws of Thermodynamics: A Very Short Introduction. Oxford: Oxford University Press, 2010. Print.

Callen, Herbert B. Thermodynamics and an Introduction to Thermostatistics. New York: John Wiley and Sons, 1985, 2005. Print.

Shankar, R. Fundamentals of Physics. Mechanics, Relativity, and Thermodynamics. New Haven: Yale UP, 2014. Print.


Simpler overviews

This thread gives simpler and necessarily less complete overviews of the subjects of the main articles. As such, they may not contain enough information to be quite convincing. Sorry about that. These are sometimes counter-intuitive subjects.

So we suggest that you start with the full version and only come here in case of need or, maybe, to review. Appropriate links exist in the main articles.

We will start with the Big Three theories of physics.

Thermodynamics

Thermodynamics is the branch of physics which started out talking about heat energy, in the early days of steam engines. Then other forms of energy, like electromagnetic or nuclear energy, came up and were included, so today it is the general theory of energy. We will be talking a lot about energy, so thermodynamics is important to us.

Thermodynamics posits three laws:

  1. The energy of the universe does not change; it is always conserved.
  2. In a physical process, the entropy of the universe never decreases.
  3. The entropy of a system at a temperature of absolute zero is zero.

Law number one says that energy is always conserved. In fact, like speed limits, this one can be violated provided the perp does not get caught. This happens in Quantum Mechanics and we will look at that shortly. For anything bigger than an atom, though, energy is always conserved.

Law number two says in any process involving transformation of energy, you are probably going to lose some. This is equivalent to saying that the universe goes from a relatively ordered state (such as large plates stacked together, small plates stacked together, no large ones mixed with small ones) to a relatively unordered state (all the plates stacked together in any old order). Take a look at the Entropy entry on the main menu bar for more (easy) details. Most simply put, entropy is a measure of disorder. Nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time.

A standard example is the breaking of an egg, in which the egg becomes more disordered and therefore in a state of higher entropy. To do the opposite, for the broken egg to come back together, would require a decrease in entropy. Have you ever seen a broken egg re-form itself?

We are victims of entropy too. We eat food to provide energy to maintain our bodies in their highly ordered state. Whenever metabolism stops, rot sets in and … we return to unordered dust.

We will use entropy in the following pages to explain why lots of things happen the way they do, rather than the opposite way.

Law number three just gives a base value for entropy measurement. Zero kelvin is really cold, so cold we have never been able to get there in a laboratory. Precisely, it is 0 K, or -273.15° Celsius. We don’t do Fahrenheit.

Maybe you would like to check the full version now. If not, just go on.

Quantum Mechanics

Quantum mechanics is the theory we use to talk about atoms and elementary particles, i.e., about what happens at very small dimensions, on the order of 10⁻³⁰ meters or less! On that scale, things are very unusual.

There is no way to make QM intuitive. So here goes…

According to quantum mechanics, what is “out there” is a vast amount of space – not an empty backdrop, but actually something. This space is filled with particles so small that the distance between them is huge compared to their own sizes. Not only that, but these particles are actually waves, or something else which acts sometimes like waves and sometimes like particles. Light sometimes behaves like waves (think of a prism spreading white light into colors) and sometimes leaves traces like particles.

As if that were not bad enough, it is impossible to measure simultaneously where they are and how fast they are moving (or how much energy they possess and when). This last effect is referred to as indeterminacy, or the Uncertainty Principle, one of the more uncomfortable and, simultaneously, fruitful results of the theory. We already have mentioned this exception to the First Law of Thermodynamics.

The four main difficulties most people have with QM are the following:

  1. the so-called wave-particle duality;
  2. the existence of discrete quanta for values of physical parameters;
  3. the Uncertainty Principle;
  4. the Exclusion Principle.

We have already mentioned numbers 1 and 3. Number 2 comes from the math. In fact, QM is a mathematical formalism with an equation one can (try to) solve for any given object of study. In general, the equation only has solutions for certain values of the parameters of the system. These might be the energy or the angular momentum or other things. For an atom, only certain energies are possible. Such allowed values are called quanta.

The Exclusion Principle says that certain particles known as fermions are constrained in such a way that no two of them can occupy the same QM state. Electrons are fermions. This principle is at the root of solid-state physics and therefore of the existence of transistors and all the technologies dependent thereupon – portable computers, mobile telephones, space exploration and the Internet, to mention just a few examples. So QM has indeed revolutionized modern life, for the better and for the worse (think of wasteful and dangerous nuclear-bomb proliferation).

The exclusion principle is also responsible for the fact that electrons in a collapsing super-dense star cannot all be in the same state, so there is a pressure effectively keeping them from being compressed any further. We will read more about that in the cosmology chapter.

Maybe you would like to check the full version now. If not, just go on.

Relativity

Relativity is not the idea that “everything is relative.” It is about relative motion. The first, or Special, Theory of Relativity says that if you are in a moving train and I am standing still outside, we will both use the same equations to describe what is going on, although the values we get may be different. I think you are moving, but if you have just woken up from a deep nap, you may think you are standing still and I am moving. But the oddball thing is that if you and I both measure the speed of light, we will come up with the same value, independently of how fast either of us is moving with respect to the other.

Normally, if you walk along your train car from the back to the front of the train, you will figure you are walking at something like 4 km per hour. But if the train is moving at 100 km per hour, I will see you moving at 104 km per hour. But if it is a beam of light from a laser which you shine down the length of the train, we will both measure the same value, about 300,000 km per second. Bizarre, non?
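A minimal sketch of the velocity-addition rule behind this (function name and units are my choice): in relativity, speeds combine as (u + v)/(1 + uv/c²), which is effectively plain addition at walking pace but leaves the speed of light unchanged.

C = 299_792.458                       # speed of light, km/s

def combine(u, v):
    """Relativistic addition of two speeds u and v, in km/s."""
    return (u + v) / (1 + u * v / C**2)

train = 100 / 3600                    # 100 km/h expressed in km/s
walker = 4 / 3600                     # 4 km/h expressed in km/s

print(combine(train, walker) * 3600)  # ~104 km/h: everyday speeds just add
print(combine(train, C))              # exactly C: light ignores the train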

The other odd thing about Special Relativity is that time and space are not independent, but go together in what is called space-time. And only massless objects, like photons, the particles of light, can travel at the speed of light.

If a massive object like my sister traveled at high speeds, near the speed of light, it would seem to me that she was getting heavier but thinner (along the direction of her motion) and her clock would run more slowly. This includes her bodily clock, her heart. This is the heart of the Twin Paradox. If she takes a very fast spaceship on a joy ride, when she gets back, she will be younger than any hypothetical twin she may have left behind on earth. This has been tested (with high-speed particles) and is definitely true.
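The slowdown is governed by the Lorentz factor γ = 1/√(1 - v²/c²); here is a quick sketch (the function name is mine):

import math

def gamma(beta):
    """Lorentz factor for a speed given as a fraction of c."""
    return 1 / math.sqrt(1 - beta**2)

for beta in (0.1, 0.5, 0.9, 0.99):
    print(f"v = {beta}c: moving clocks tick {gamma(beta):.2f}x slower")

# At 10% of c the effect is tiny (about 1.005); at 99% of c it is sevenfold.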

The other Relativity theory is called General Relativity. Whereas Special Relativity is the theory of space-time and light, General Relativity is the theory of gravity. GR says that space-time is curved and that it is more curved where gravitational forces are stronger. In fact, gravity is the curvature of space-time. Think of a plane surface with a depression in it. Put a ball on it and the ball will roll into the depression. Try to visualize that in four dimensions (Good luck!) and you’ve got GR.

Please take note: QM, SR and GR are all tested and confirmed theories. They are not hypotheses, they are true — at least for the moment. Any subsequent theory which amends them will have to include or explain their results.

Maybe you would like to check the full version now.

For the moment, I do not see how to simplify the next subject any more than I have already.  So go on to the Standard Model of Elementary Particles.

Thermodynamics

Thermodynamics is the science of forms of energy, one of our most important and interesting topics, as it underlies other sciences. Energy is the capacity for doing work. Among forms of “work” for which energy is required are lifting a heavy object, making electrons flow in a wire (electric current), streaming electrically charged ions in the nervous system or effecting contraction of muscle fibers (to lift that heavy object).

Over time, thermodynamics has evolved, adapting itself to new discoveries. First elaborated to deal with steam engines and the boring of cannons (the double entendre is intended), it has been adapted to include such phenomena as electromagnetism and even information.

Thermodynamics is summarized in the form of three laws (the French, ever alert to nuance, call them principles, not laws, because there is no way to prove them from theory, although there are convincing reasons for accepting them as valid):

  1. The energy of the universe does not change; it is always conserved. (There is a problem with this concept in General Relativity: in curved space, we don’t really know how to compare energy at different points in spacetime.)
  2. In a physical process, the entropy of the universe never decreases.
  3. The entropy of a system at a temperature of absolute zero (0 kelvin, or -273.15° Celsius) is zero.

Law number 1 is the justly famous law of conservation of energy. Law number 2 explains why physical processes don’t play out backwards in time. Number 3 is somewhat more obscure, but important.

Actually, after these three, someone noticed that there should be another, yet more fundamental law; it was therefore named the zeroth law, making the laws of thermodynamics a trinity of four:

0. If A is a system in thermal equilibrium with system B, and B is in thermal equilibrium with system C, then systems A and C are also in thermal equilibrium.

This provides a way to measure the thermal state of a system. If system B is simply a thermometer, this says that if A and C are each in thermal equilibrium with it, they both have the same temperature.

Entropy is a concept, a mathematical construction, in some contexts calculated as the quotient of two other quantities: in a process where heat energy is transferred to an object, the entropy change of the process is the amount of heat divided by the temperature (S = Q / T). We shall soon see another formula for it. The point is that you cannot hold a grain of entropy in your hand or feel it, like warmth or cold. There is no such thing as a simple entropy meter. But the abstract nature of entropy does not keep it from being a useful, in fact an essential, idea.
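A one-line worked example of that quotient (the numbers are made up):

Q = 1200.0      # heat transferred to the object, in joules
T = 300.0       # temperature at which the transfer happens, in kelvin
print(Q / T)    # entropy change: 4.0 J/K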

Most simply stated, entropy is a measure of disorder. As every housewife or mother knows, nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time.

From a statistical viewpoint, entropy can be defined as follows.

Entropy is the probability (in fact, the logarithm of the probability) that something occurs in a certain state, i. e., with a certain configuration or order of its components.

Currently, many physicists and chemists prefer to speak of entropy as a measure of energy dispersal, the dispersed energy being rendered unusable (Frank Lambert, Entropy Site, http://entropysite.oxy.edu/entropy_isnot_disorder.html). An example of such dispersed energy is that lost to friction as heat, like that which renders a perpetual-motion machine impossible.

Physical, chemical and biological sciences explain that everything is composed of other smaller things, with a few exceptions, such as some of the most elementary particles, quarks. Even an electromagnetic field like light or radio waves is composed of photons. To understand the word “composed”, one can imagine an assembly of molecules or grains of something. If you interchange two identical molecules, you have a different system but one which you cannot distinguish from the first. One macro-state (like a bowl of sugar) is said to be composed of different, indistinguishable micro-states (because we do not know where the individual grains are – nor do we care). The number of interchangeable micro-states can be quite large. The probability of a macro-state’s occurring is a function of the number of indistinguishable states and so is its entropy.

S = k log W

where S is the entropy, k is the so-called Boltzmann constant, and W is the number of possible micro-systems which correspond to the same macro-system. (Originally, W stood for Wahrscheinlichkeit, German for probability.) In other words, the greater the number of micro-systems, the greater the entropy, since it is proportional to the logarithm of the number of micro-states. (In case you have forgotten, the logarithm to base 10 of a number is the power of 10 which corresponds to that number. E.g., log(100) = 2, because 10² = 100.) Greater entropy corresponds to a higher probability, in turn reflecting a greater degree of disorder, since more component particles can be exchanged without our noticing.
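A minimal sketch of the formula in code (physicists use the natural logarithm here; the function name is mine):

import math

K_B = 1.380649e-23    # Boltzmann constant, in J/K

def boltzmann_entropy(W):
    """S = k ln W for a macro-state with W equivalent micro-states."""
    return K_B * math.log(W)

# Each enormous jump in W adds only logarithmically to S:
for W in (10, 10**6, 10**23):
    print(W, boltzmann_entropy(W))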

Imagine building a sand castle. Small lumps of wet sand are placed in a precisely ordered fashion to create the various structures of the castle – walls, moats, turrets and so forth. But later, the tide does its job, reducing the well-ordered castle to a shapeless heap of wet sand. The grains are no longer in order, but are as if they had been thrown down just anywhere, leaving a formless heap, with more indistinguishable micro-states and in complete disorder – and therefore greater entropy.

So we should notice two things:

  • Nature selects the state of greater disorder; in doing so, it follows the Second Law and increases the entropy of the universe.

  • Building the ordered sand castle takes work, i. e., expenditure of energy. Although the castle’s construction in an ordered form reduces by a small amount the entropy of the universe, the work it took has consumed energy and therefore increased the entropy elsewhere, so the total entropy of the universe has increased. In general, one can say that increasing complexity requires more energy input to keep entropy change positive.

That is why the Second Law and entropy are important. It takes work, i. e., energy, to reduce the entropy locally, although it nevertheless increases the entropy of the universe as a whole.

Another much-cited example of increased disorder, and therefore entropy, is the breaking of an egg. The whole egg is a well-ordered structure; the broken egg is much less organized, so in a higher state of entropy.

As for us, entropy = death! Our bodies’ cells, tissues and organs are highly ordered biological objects. We maintain this order by expending energy from the food we eat, which our bodies and cells digest and reduce to elements of energy, especially ATP (adenosine triphosphate), which then transmits the energy it contains to cells which use it to maintain the metabolism of the body. But with age, disorder comes in the form of small pains and dysfunctions. Old entropy’s got us (to be sung to the tune of “Old Rockin’ Chair”).

The universe itself is not exempt from the Second Law. Galaxies, stars, our solar system and good old Earth are all ordered at the expense of nuclear energy from the stars or space itself (gravitational or electromagnetic fields or other more exotic or less-well-understood fields like the Higgs field or dark energy). One day, entropy will prevail there too. Even if the human species should find refuge elsewhere before our sun explodes in some five billion years, the entire universe will ultimately become cold, dark and diffuse. (This scenario is not quite certain and there are other possibilities, but it is the most serious one, as well as the gloomiest.) Carpe Diem!

This still leaves us time to realize (or not) our dreams of glory or wealth, art or sexual prowess, knowledge or humanity – to prove to ourselves that we can shake off our evolutionary inheritance and by our own efforts reverse – temporarily – the entropy of our little corner of the universe.

Now you can go read about something stranger yet, quantum mechanics.



Why entropy?

Entropy, you said “entropy”? What’s that?

Well, I’ll try to explain. And since I replied poorly to this same question once, I’ll try to do it right this time. We will see that entropy is essential for understanding why certain things happen and others do not. For example, a whole egg can become a broken egg (all too easily), while the contrary has never been known to happen. And this concept goes a long way.

Here is a spoiler: Entropy is a measure of disorder. Nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time. If you have already understood that, you can skip the rest of this article. Otherwise, let’s go on.

Entropy is a concept, a mathematical construction, often calculated as the quotient of two other figures. (For instance, in a process where heat energy is transferred to an object, the entropy of the process is the amount of heat divided by the temperature: S = Q / T. But you do not have to remember that.) The thing which makes it difficult is that you must calculate entropy. You cannot hold a grain of entropy in your hand, or even feel it, like warmth or cold. There is no such thing as an entropy meter. But despite the abstract nature of entropy, we can have a qualitative and usable understanding of it.

Across history, entropy has been studied at different times for different purposes, for example, to build the best possible heat engine. In a word, the notion of entropy has evolved. Different goals have sometimes brought out different aspects of entropy, but it has been shown that they are all compatible. There is even an entropy matching information, which I admit I do not understand.

Entropy is a central concept in the branch of physics called thermodynamics. Thermodynamics is the science of all forms of energy, so it underlies all sciences. Energy is the capacity for doing work, such as moving or lifting a heavy object. Another form of “work” for which energy is required is the movement of electrons in a wire (called electric current) or the movement of electrically charged ions in the nervous system. There are many others.

Biological, chemical and physical sciences explain that everything is composed of other smaller things, with a few exceptions (the most elementary particles, such as quarks). Even an electromagnetic field, such as light, is composed of photons. To understand the word “composed”, one can imagine an assembly of molecules or grains of something. Food lovers can think of a sauce with water and wine molecules, ground peppercorns, salt ions, aromatic molecules of mushroom, basil and other good things. Yum! (“Yum” in French is “miam”. Thought you’d like to know.)

Now, back to entropy. There are several ways to explain entropy, but the following is certainly the most intuitive version.

Entropy is (approximately) the probability that something composed occurs in a certain state, i. e., with a certain configuration or order of its components.

Paul’s socks

Examples, you want examples. All right, OK

I have a friend whose mother was very hard on him during his childhood. I’ll explain. She insisted that he place his pale blue socks on the left in a drawer of his dresser and his pink socks on the right in the same drawer. You understand immediately how much it upset poor Paul. So what does entropy have to do with it? Simple, my friend.

For simplicity, imagine that Paul has but three pairs of socks, three pale blue socks and three pink ones. (He always wore socks of two different colors. (He did not say whether he had a specific color to each foot and I did not ask.)) At the end of wash day, he can put away his socks in the following manner.

Let’s start with the pale blue socks and assume they are numbered so that we can refer to them separately. (This calculation is not rigorous, but it gives the idea.) For the first sock, Paul can choose number one, number two or number three, having therefore three possible choices. After this difficult selection, he has left only two pale blue socks, i. e., only two choices, sock number two or number three, assuming he has already chosen number one. And finally, for the third there is only one choice, the last remaining sock. So there are three choices for the first. For each of these, there are two choices for the second, making a total of six choices (two for each of the three, 3 times 2 is 6). And for the last, there is only one sock and therefore one choice. The same goes for the pink socks. So in all, Paul has twelve choices, six blue and six pink. There are 12 ways to store his socks in order. Note that, since the socks are not really numbered, Paul’s mother, checking up on her son’s obedience, has no way of knowing which of the 12 choices he made, but does see that the socks are arranged the way she wants.

One day, in a fit of despair, Paul pulled out the drawer and shook it violently for a minute or two before calming down. In what state were the socks after that? All mixed together, n’est-ce pas? So when Paul shook the drawer, entropy struck! Now the socks are in a fairly complete mess. Let’s count the ways the socks could be put away in this free-for-all, starting over on the same wash day. Now any sock can go anywhere in the drawer. So for the first, Paul may take any of the six socks. For the second choice, he still has five socks to choose from for each choice of the first sock. That makes 6 times 5 or 30 choices for just the first two socks. For the third choice, there remain four socks, so the total number of choices so far is 6 times 5 times 4 = 120. We still have 3 socks, and we already know that 3 socks may be chosen in six ways, so the total is 120 times 6 or 720 choices!

So there are 12 possible choices for storing socks in order by pale blue or pink and 720 ways to do it in disorder. Which is more likely? Obviously, disorder, by 720 against 12. And entropy is a measure of this disorder.
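Here is a brute-force check of the 720 figure, as a sketch (the sock names are mine; note that the text’s 12 comes from adding the two per-colour counts of 6, the admittedly non-rigorous bookkeeping flagged above):

from itertools import permutations
from math import factorial

socks = ["blue1", "blue2", "blue3", "pink1", "pink2", "pink3"]

# Unrestricted: any sock may occupy any of the six positions.
unordered_ways = sum(1 for _ in permutations(socks))
print(unordered_ways, factorial(6))   # 720 720

# One colour kept on its own side: 3 socks into 3 positions.
print(factorial(3))                   # 6 ways for the blues, 6 for the pinks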

Of course, physicists are serious people and do not speak (much) of the entropy of a sock drawer, but they do talk about the entropy of molecules, for example, those of a gas. For a gas, a sample of which may contain something like 10²³ molecules, disorder wins every time. Physicists talk about the number of different micro-states (pink sock here, blue sock there) corresponding to the same macro-state.

Raoul’s marbles

Raoul has a box full of identical marbles. Suppose that we exchange two marbles. You can specify the location of each marble individually and thereby talk about their detailed states (micro-states) before and after the exchange of the two marbles. These micro-states are different. But to little Raoul’s eyes, the box of marbles (the macro-state) has not changed at all; Raoul is unable to distinguish between the two micro-states. In this way, a lot of different micro-states are taken as being the same macro-state. The greater the number of marbles, the greater the number of combinations that seem identical to Raoul and the greater the entropy of the box of marbles.

How simple it really is!

The laws of thermodynamics

OK, this part is a tiny bit hairy; then we get back to fun stuff. So hang on to your seat.

The physicist who came up with this way of defining entropy was named Ludwig Boltzmann. On his tombstone is engraved the formula that he introduced to the world:

S = k log W

where S is the entropy, k is a constant named after Herr Dr Boltzmann, and W is the number of possible micro-systems (12 or 720 for the socks) which correspond to the same macro-system (the sock drawer). (W is for Wahrscheinlichkeit, German for probability.) If you have forgotten, the logarithm to base 10 of a number is the power of 10 which corresponds to that number. E.g., log(100) = 2, because 10² = 100. In other words, the greater the number of micro-systems, the greater the entropy, since it is proportional to the logarithm of the number of micro-states. Greater entropy corresponds to a greater degree of disorder.

But why are we talking about entropy, what’s the point? Well, because the second law of thermodynamics says:

In a physical process, the entropy of the universe always increases.

It’s pretty obvious when you think about the story of Paul when he shakes the sock drawer. For your information, the first law says:

The energy of the universe does not change; it is always conserved.

This is the justly famous law of conservation of energy.

There is a third law, which would take longer to explain, but it goes like this:

The entropy of a system at a temperature of absolute zero (0 Kelvin, or -273.15° Celsius) is zero.

But there’s one more. After all that, someone noticed that there should be another law more fundamental than the three others, making the laws of thermodynamics a trinity of four laws; it was therefore named the zeroth law:

If A is a system in thermal equilibrium with system B, and B is in thermal equilibrium with system C, then systems A and C are also in thermal equilibrium.

So what, you ask? Well, this means that there is a way to measure the thermal state of a system. System B could simply be a thermometer in thermal equilibrium with A and C, where “in thermal equilibrium” means that both have the same temperature. This is very important. What would we talk about if we did not have the zeroth law? “I think that the logarithm of the entropy will increase tomorrow.” No way!

Thermodynamics is assumed to be valid everywhere in the universe except for very small objects such as elementary particles (protons, electrons, quarks, etc.). Physicists are well aware that today’s theories will be modified and augmented by those of tomorrow… except for the second law of thermodynamics. They believe firmly in it!

More on disorder

Paul’s wife, Drusilla, often reminisces about her childhood holidays at the seaside and about the sand castles she loved to build. Beautiful castles with moats, drawbridges, battlements and a huge tower in the center. Let’s think about that.

When Drusilla built a sand castle, she placed the grains exactly where she wanted them. Since she was very exacting (or finicky), it was as if she placed each grain in a specific location of the castle and not anywhere else. Nothing was left to chance. But when the mother of Drusilla called her to come and eat and go to bed, she got into bed with sadness, because she knew what she would discover the next morning. Getting up early and running in pajamas to the beach with the morning sunlight in her eyes and the strident noise of screeching seagulls in her ears, she came to her castle – or what was left of it – and found that the tide (the assistant of entropy) had done its job. The beautiful, well-ordered castle had been reduced to a shapeless heap of wet sand. Mud. But Drusilla was brave – or stubborn, if you prefer – she ate her breakfast and began again.

After having seen the case of Paul’s socks, it is clear that the choice of destination of Drusilla’s sand grains was similar, although the grains were much more numerous than Paul’s socks and the destinations (wall, keep, etc., but not the moats) outnumbered the two sides of Paul’s dresser drawer. She placed the grains, as already stated, to create order. If she had just thrown the grains anywhere on the site of the future non-castle, she would have formed a heap quite similar to that left by the tide, in complete disorder – and therefore of greater entropy.

So we should notice two things:

  • Nature has selected the state of greater disorder; in doing so, it follows the Second Law and increases the entropy of the universe.
  • Drusilla had to work, i. e., to expend energy, to build the ordered castle. Although she reduces by a small amount the entropy of the universe in the form of the ordered castle, the work she has done has helped to increase the entropy elsewhere, so the total entropy of the universe has increased.

That is why the Second Law and entropy are important to us. It takes work, i. e., energy (work capacity) to reduce the entropy of a bit of the universe, while increasing the entropy of the whole universe. But in the end, nature always finds a way to thwart our plans. It is permitted to find that a little sad.

And the egg

We can now see the role of entropy in everyday events, such as an egg that breaks. Or rather, one that I break because I don’t pay enough attention to what I am doing. The whole egg is a well-ordered structure; the broken egg is much less organized, so in a higher state of entropy.

But if I break an egg on purpose in order to make an omelet, the well-stirred egg is even more disorganized and therefore in an even greater state of entropy.

We can say that time never backs up, but always runs forwards, because that way entropy may increase, as in the following example …

The human body

Entropy = death! Alas, yes. Our bodies’ cells, tissues and organs are highly ordered biological objects. We maintain this order by expending energy, just like Drusilla did for her castles. The energy comes from the food we eat, which our bodies and cells digest and reduce to elements of energy, especially ATP (Adenosine triphosphate, since you asked.), which transmits the energy it contains to cells in order to maintain the metabolism of the whole body, electricity in neurons and flexing of the muscles. But with age, disorder comes in the form of small pains, arthritis and more serious things. And death is followed by disintegration (total disorder = higher entropy). Old rockin’ chair may get us, but entropy will get us last.

More abstract – information

We have seen that entropy is a measure of the number of indistinguishable micro-states which correspond to the same macro-state of something. Exchange two blue socks or two grains of sand and we do not notice the difference. So entropy corresponds to missing information, data that we could imagine knowing but don’t, like whether grain of sand 2,143,768 is on top of grain 3,472,540 or the other way around. So the higher the entropy, the less information we have about the underlying micro-state.

The entropy is thus a measure of the hidden information contained in a system.

But we will not go into information theory here.

And finally, the universe

Galaxies, stars, our solar system and good old Earth (Carl Sagan’s “pale blue dot”) are all ordered by the expense of nuclear energy from the stars or space itself (gravitational fields or electromagnetic fields of quantum origin). One day, entropy will “prevail” there too. Even if the human species finds refuge elsewhere before our sun explodes in five billion years or so, the entire universe will eventually become diffuse, cold and dark. Carpe Diem!

This still leaves us time to realize (or not) our dreams of glory or wealth, art or sexual prowess, knowledge or humanity – to prove to ourselves that we can shake off our evolutionary inheritance and by our own efforts reverse – temporarily – the entropy of our little corner of the universe.