What atomic physics and chemistry tell us

I am, reluctantly, a self-confessed carbon chauvinist. Carbon is abundant in the Cosmos. It makes marvelously complex molecules, good for life. I am also a water chauvinist. Water makes an ideal solvent system for organic chemistry to work in and stays liquid over a wide range of temperatures. But sometimes I wonder. Could my fondness for materials have something to do with the fact that I am made chiefly of them?

– Carl Sagan, Cosmos

The early stages of the universe and the lives of stars are the matter of physics and astronomy and their offspring, astrophysics and cosmology. By the time the first living things showed up on Earth, processes were occurring which require knowledge of the phenomena described by the science of chemistry. QM is the basis of atomic physics, and atomic physics is the basis of chemistry, so we are ready for it.

Even to begin a comprehensive survey of chemistry is well beyond the scope of this document. We will illustrate its usefulness and some of its fruits by considering two subjects of great importance not only to Carl Sagan but to all of us – carbon and water.

In order to do that, it is necessary to know about several subjects:

Then we move on to consider, first, the past, starting almost 14 Gya.

Standard model of elementary particles

This is as good a place as any to mention the Standard Model. It is not a theory like QM or GR, but it uses QM to build a “model”, i. e., a proposed explanation, of the constitution and properties of the elementary particles of which everything else is constituted. The most familiar particles are the proton, the neutron and the electron, but there are many more. In fact, neutrons and protons (as well as mesons) are not elementary at all; they are composed of quarks.

Warning: This presentation necessarily presents lots of new vocabulary. In addition, it sounds like numerology. But it works – with one notable exception, the force of gravity. However, gravity is thought to be negligible at atomic scales (except when discussing black holes, which we are not).

Basically, there are two types of particles, fermions and bosons.

Bosons have spin (intrinsic angular momentum, as if the particle were spinning on an axis) which takes on only integral values – for instance, 0 or 1. They are the virtual particles which carry the forces between other particles.

Fermions all have half-integral values of spin – ½, 1½ and so on. They are the basic components of matter, the blocks from which matter is built. There are two types of fermions – quarks and leptons. They are all shown in the figure, the particle zoo, which is composed of six quarks (shown in purple), six leptons (green) and four bosons (orange). (The Higgs is special and will be discussed later – maybe.)

Standard Model particle zoo, from Wikimedia Commons


The quarks are said to differ by their flavor – up, down, charm, strange, top or bottom – which obviously is just an arbitrary term and has nothing to do with taste. They are arranged in three columns called generations. Quarks have charges of +2/3 or -1/3; leptons, 0 or -1. Each of the six flavors of quark exists in three versions indicated (by analogy) by the colors red, green and blue¹, for a total of 18. When forming matter particles, the quarks must group together in such a way that the result is “colorless”, so they occur in the combination R+G+B for baryons like the neutron or proton, or as a color–anticolor pair such as R+R̄ for mesons, where the bar above the letter indicates an antiquark carrying the anticolor. Particles composed of quarks are called hadrons.

Note that the mass of, for instance, a proton (about 938 MeV/c²) is far greater than the sum of the masses of its constituent quarks (2 × 2.3 MeV/c² + 4.8 MeV/c² ≈ 9.4 MeV/c²). The excess, almost 99% of the proton’s mass, is potential energy of the strong force which binds the quarks together into the proton.

Matter is made up of atoms with nuclei containing protons and neutrons (also called nucleons), with electrons forming a negatively-charged cloud around the nucleus. Such long-lived particles are made up of quarks from the first column, called the first generation. A proton is composed of two up quarks and a down quark. The former have a charge of +2/3, the latter of -1/3, so the total is +1. A neutron is composed of an up and two downs, for a total charge of +2/3-1/3-1/3 = 0. And so on. (This is the numerology part that was mentioned.)
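The charge bookkeeping above is easy to check with exact fractions. A minimal sketch (the names `UP`, `DOWN` and `total_charge` are illustrative, not standard):

```python
from fractions import Fraction

# Charges of the first-generation quarks, in units of the elementary charge e.
UP = Fraction(2, 3)
DOWN = Fraction(-1, 3)

def total_charge(quarks):
    """Sum the charges of a list of quarks."""
    return sum(quarks, Fraction(0))

proton = total_charge([UP, UP, DOWN])     # two ups and a down
neutron = total_charge([UP, DOWN, DOWN])  # an up and two downs

print(proton)   # 1
print(neutron)  # 0
```

Using `Fraction` avoids floating-point noise, so the thirds add up exactly – which is the whole point of the “numerology”.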

As stated, bosons are virtual, force-carrying particles. Modern physics recognizes four forces in nature.

  1. Gravity is thought to be conveyed by a yet-to-be-observed particle called the graviton (not shown in the figure, because it has never been observed and is not part of the Standard Model – yet). It is a weak force (the weakest) but works across enormous, interstellar distances. We shall see that it is responsible for the formation of stars, galaxies and planets – no less. We owe a lot to gravity.

  2. The electromagnetic force between charges or magnets is conveyed by the photon, which is the particle of light. Like gravity, it has infinite range, but it is stronger. Since its sources can be either positive or negative charges, the two tend to cancel each other out over large distances, making the net force weaker. The quantum theory of the electromagnetic force is called quantum electrodynamics, or QED.

  3. The strong force is the strongest, but is very short-ranged. It holds quarks together inside protons and neutrons, and holds the nucleus together in spite of the repulsive electric forces between the protons’ charges. It is conveyed by the appropriately-named gluon. The quantum theory of strong interactions is called quantum chromodynamics, or QCD.

  4. The weak force is weaker than the strong or EM forces, but still stronger than gravity. It too is a very short-range force, responsible for the decay of various radioactive particles. It is conveyed by the W and Z bosons.

So there are two infinite-range forces, gravity and EM, and two short-range ones, the strong and the weak. In order of strength, from strongest to weakest, they are strong, EM, weak and, last but in many ways not least, gravity.

In principle, EM should be as far-reaching as gravity, but the farther you get from, say, a positive charge, the more it is shielded by negative charges, so that the net charge seen is about zero. Gravity has only an attractive “charge”, so its effects extend over a very large range, across galaxies and beyond. As we shall soon see, that is a Good Thing!

Now let’s get away from this weird physics for a while and look at evolution.

Notes

1 Not the same as the colors on the chart.

Relativity

Special relativity

Just as QM is the theory of the very small, Relativity is the theory of very large dimensions.

As usual, if this page gets to be too much, try the simpler version. This one is recommended, though.

In fact, there are two theories of relativity. The simpler, first one, called Special Relativity (SR), is about space-time, relative movement and the speed of light. It says:

  1. the equations of physics are the same (invariant, in math-physics speak) for all observers who are moving with uniform steady motion relative to each other¹;
  2. the speed of light in a vacuum is a constant, the same for all observers.

The first statement, the principle of relativity, means that if someone goes by you in a high-speed train on a straight track, then either you or she can consider herself to be stationary and the other moving. If you have traveled by train, you may have wondered sometimes whether your train was advancing or the one next to you was moving backwards.

The second statement means that if both you and your friend in the train measure the speed of a light beam, you both will find the same answer, about 300,000 km/sec. This is not intuitive.

From these two statements, lots of things follow, including:

  • the equivalence of mass and energy, expressed by everybody’s favorite (and perhaps only) equation, E = mc²;

  • the fact that only bodies without mass can travel at the speed of light;

  • curious distortions: the faster someone is moving relative to us,

    • the heavier she gets,

    • the shorter she gets in the direction of motion and

    • the slower her clock runs, including her body clock, the heart.

This last point is the origin of the famous twin “paradox”. If your twin takes off in a space ship and travels fairly fast compared to the speed of light, then when she gets back to Earth, she will be younger than you, the twin she left behind. Strange, but this has been tested (with particles) and found to be true.
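The size of the effect follows from the Lorentz factor γ = 1/√(1 − v²/c²), which is implicit in the discussion above. A small sketch, with an assumed cruising speed of 80% of the speed of light:

```python
import math

def gamma(v_fraction):
    """Lorentz factor for a speed given as a fraction of the speed of light."""
    return 1.0 / math.sqrt(1.0 - v_fraction**2)

# A twin travelling at 0.8 c while 10 years pass on Earth
earth_years = 10.0
traveller_years = earth_years / gamma(0.8)
print(round(traveller_years, 1))  # 6.0 -- she comes back 4 years younger
```

At everyday speeds γ is so close to 1 that the effect is unmeasurable without atomic clocks, which is why none of this is intuitive.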

All this comes out of the mathematics for describing relative motion, which is in turn due to the fact that space and time are not independent, but form a four-dimensional “thing” called space-time. What we see as space and time are aspects of space-time and can appear differently for different observers. And they can get mixed up together, especially if you are traveling at high speeds.

General relativity

The other Relativity theory is called General Relativity (GR). Whereas Special Relativity is the theory of space-time and light, General Relativity is the theory of gravity. GR says that space-time is curved and that it is more curved where gravitational forces are stronger. In fact, gravity is the curvature of space-time. Think of a plane surface with a depression in it. Put a ball on it and the ball will roll into the depression. Try to visualize that in four dimensions (Good luck!) and you’ve got GR. Uh, sort-of …

There are also lots of things which follow from GR. One of the more interesting is that gravity can change the direction of light, since light travels through the space deformed by gravity. This and many other predictions have been observed to occur just as GR predicts.

Curiously enough, clocks run slower in a stronger gravitational field. A clock runs faster on top of a high building or in a satellite than it does on the surface of the Earth. For a satellite, this is opposite to the effect of high speeds on fast-moving clocks in SR, which makes them run slower. The two corrections are in opposite directions but do not cancel each other completely, so both must be applied to GPS systems in order for them to function correctly.
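The two competing corrections can be estimated in a few lines. The constants below are round textbook values (Earth’s gravitational parameter and radius, and the roughly 20,200 km GPS altitude), so this is a back-of-the-envelope sketch, not a GPS specification:

```python
import math

# Rough estimate of the two relativistic clock corrections for a GPS satellite.
GM = 3.986e14              # Earth's gravitational parameter, m^3/s^2
c = 2.998e8                # speed of light, m/s
R_earth = 6.371e6          # Earth's radius, m
r_sat = R_earth + 2.02e7   # orbital radius at ~20,200 km altitude, m

v = math.sqrt(GM / r_sat)                      # circular orbital speed
sr_rate = -v**2 / (2 * c**2)                   # SR: the moving clock runs slow
gr_rate = (GM / c**2) * (1/R_earth - 1/r_sat)  # GR: the higher clock runs fast

day = 86400  # seconds
print(f"SR:  {sr_rate * day * 1e6:+.1f} microseconds/day")   # about -7
print(f"GR:  {gr_rate * day * 1e6:+.1f} microseconds/day")   # about +46
print(f"net: {(sr_rate + gr_rate) * day * 1e6:+.1f} microseconds/day")
```

The net drift is tens of microseconds per day; left uncorrected, it would translate into positioning errors of kilometers within a day.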

More recently, we have discovered that not only is space-time curved, but space is expanding. This is because a gravitational field exerts not just a force, but a pressure. This pressure is considered to come from a term in the equations of GR called the Cosmological Constant. Unlike the force of gravity, which is always attractive, the pressure can be negative, in which case it is not attractive but repulsive. It is the outward push of this negative pressure which drives the expansion of space. For the last 7 Gy², the expansion of space has been accelerating. More about that in the cosmology chapter.

QM and GR — an unhappy marriage

As far as they go, SR and GR are as true to nature as can be detected within their separate domains, and so constitute bodies of knowledge – theories. They must be taken into account in practice, for example in the technology of GPS systems.

The problem is that QM does not work on the scale of GR, nor the other way around. This is probably the biggest problem in contemporary physics, the resolution of which is the grail now pursued by theorists.

In summary:

We live in a world where matter sometimes behaves like a particle and sometimes like a wave and where only probabilities can be calculated. All this is taking place in a curved four-dimensional space-time which is expanding at an accelerating rate! And this space is not an empty vacuum, it is something.

Got that?

Now on to the standard model of elementary particles.

Notes

1 Such observers are said to be in inertial systems.
2 Gy means giga-years: 10⁹ years, or one billion years.

Quantum mechanics

Quantum mechanics is the theory of what happens at very small dimensions, on the order of 10⁻¹⁰ meters or less! It is therefore the theory which must be used in order to understand atoms and elementary particles.

According to quantum mechanics, what is “out there” is a vast amount of space – not an empty backdrop, but actually something. This space is filled with particles so small that the distance between them is huge compared to their own sizes. Not only that, but they are waves, or something else which acts sometimes like waves and sometimes like particles. The modern interpretation of this is in terms of fields, things which have a value (and perhaps a direction) at every point in space. “Every particle and every wave in the Universe is simply an excitation of a quantum field that is defined over all space and time.”¹

Nobody can actually measure simultaneously where a particle is and how fast it is moving (or how much energy it possesses and when). This effect is referred to as indeterminacy, or the Uncertainty Principle, one of the more uncomfortable and, simultaneously, fruitful results of the theory.

As a result of this indeterminacy, energy need not be conserved, regardless of thermodynamics, for very short periods of time, giving rise to all sorts of unexpected phenomena, such as radiation from black holes. But that is another subject.

Time-dependent non-relativistic Schrödinger equation, from Wikipedia

QM is expressed by a mathematical formalism based on an equation, generally referred to as the Schrödinger equation, although it exists in several forms (differential, matrix, bra-ket, tensor). The solution to this equation is called the wave function, represented by the Greek letter ψ. The wave function serves to predict the probability that the system under study is in a given state; it gives only a probability for the state. (In fact, the probability is not the wave function itself, but the square of its absolute value.) This knowledge only of probabilities really irks some people, and nobody really understands what it means (dixit Richard Feynman, one of the greatest of quantum theorists). But the mathematics works.

According to QM, some parameters of a system, such as energy or wavelength, can only take on certain values; any values in between are not allowed. Such allowed values are called eigenvalues. The eigenvalues are separated by minimal “distances” called quanta and the system is said to be quantized. We will see a good example of them when we look at atomic structure.
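As an illustration of eigenvalues (not an example from the text), here is the standard “particle in a box” formula, E_n = n²h²/(8mL²), for an electron confined to a box of assumed width 1 nanometre. Only these discrete energies are allowed; everything in between is forbidden:

```python
# Energy eigenvalues of a particle in a 1-D box: E_n = n^2 h^2 / (8 m L^2).
h = 6.626e-34    # Planck's constant, J*s
m_e = 9.109e-31  # electron mass, kg
L = 1e-9         # assumed box width: 1 nanometre, roughly atomic scale

def energy(n):
    """Allowed energy of level n, converted from joules to electron-volts."""
    E_joules = n**2 * h**2 / (8 * m_e * L**2)
    return E_joules / 1.602e-19

for n in (1, 2, 3):
    # Energies grow as n^2, so the quanta (gaps) get bigger, not uniform.
    print(n, round(energy(n), 3))
```

The ladder of levels is exactly the kind of quantization we will meet again in atomic structure.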

An important result of QM is that certain particles known as fermions are constrained so that two of them can never occupy the same QM state. This phenomenon, called the Exclusion Principle, is at the root of solid-state physics and therefore of the existence of transistors and all the technologies dependent thereupon – portable computers, mobile telephones, space exploration and the Internet, to mention just a few examples. So QM has indeed revolutionized modern life, for the better and for the worse.

The exclusion principle is also responsible for the fact that electrons in a collapsing super-dense star cannot all be in the same state, so there is a pressure effectively keeping them from being compressed any further. We will read more about that in the cosmology chapter. Closer to home, fermions constitute matter, including us.

An important subject of study and discussion in current theoretical physics is the interpretation of QM, such as in the many-worlds hypothesis, but that subject is beyond the scope of this article.

Go on to read about relativity, because it’s probably not what you thought it was.

Notes

1 Blundell, 1.

Thermodynamics

Thermodynamics is the science of forms of energy, one of our most important and interesting topics, as it underlies other sciences. Energy is the capacity for doing work. Among forms of “work” for which energy is required are lifting a heavy object, making electrons flow in a wire (electric current), streaming electrically charged ions in the nervous system or effecting contraction of muscle fibers (to lift that heavy object).

Over time, thermodynamics has evolved, adapting itself to new discoveries. First elaborated to deal with steam engines and boring cannons¹, it has been adapted to include such phenomena as electromagnetism and even information.

Thermodynamics is summarized in the form of three laws².

  1. The energy of the universe does not change; it is always conserved.³
  2. In a physical process, the entropy of the universe never decreases.
  3. The entropy of a system at a temperature of absolute zero⁴ is zero.

Law number 1 is the justly famous law of conservation of energy. Law number 2 explains why physical processes don’t play out backwards in time. Number 3 is somewhat more obscure, but important.

Actually, after these three, someone noticed that there should be another, yet more fundamental law; it was therefore named the zeroth law, making the laws of thermodynamics a trinity of four:

0. If A is a system in thermal equilibrium with system B, and B is in thermal equilibrium with system C, then systems A and C are also in thermal equilibrium.

This provides a way to measure the thermal state of a system. If system B is simply a thermometer, this says that if A and C are each in thermal equilibrium with it, they both have the same temperature.

Entropy is a concept, a mathematical construction, in some contexts calculated as the quotient of two other quantities: In a process where heat energy is transferred to an object, the entropy of the process is the amount of heat divided by the temperature (S = Q / T). We shall soon see another formula for it. The point is that you cannot hold a grain of entropy in your hand or feel it, like warmth or cold. There is no such thing as a simple entropy meter. But the abstract nature of entropy does not keep it from being a useful — in fact, an essential — idea.
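The quotient S = Q/T can be put to work in a tiny sketch: when heat flows from a hot body to a cold one, the cold body gains more entropy than the hot one loses, so the total always increases:

```python
# Entropy change when heat Q flows from a hot body to a cold one (S = Q / T).
def entropy_change(Q, T_hot, T_cold):
    """Total entropy change of the universe for heat Q (joules) moving hot -> cold."""
    dS_hot = -Q / T_hot    # the hot body loses entropy (small, T is large)
    dS_cold = Q / T_cold   # the cold body gains entropy (large, T is small)
    return dS_hot + dS_cold

# 1000 J flowing from a body at 400 K to one at 300 K:
print(round(entropy_change(1000, 400, 300), 3))  # 0.833 J/K, positive
```

Because T_cold < T_hot, the result is always positive for spontaneous heat flow, which is the Second Law in miniature.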

Most simply stated, entropy is a measure of disorder. As every housewife or mother knows, nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time.

From a statistical viewpoint, entropy can be defined as follows.

Entropy is the probability⁵ that something occurs in a certain state, i. e., with a certain configuration or order of its components.

Currently, many physicists and chemists prefer to speak of entropy as a measure of energy dispersal, the dispersed energy being rendered unusable.⁶ An example of such dispersed energy is that lost to friction as heat, which is what renders a perpetual-motion machine impossible.

Physical, chemical and biological sciences explain that everything is composed of other smaller things, with a few exceptions, such as some of the most elementary particles, quarks. Even an electromagnetic field like light or radio waves is composed of photons. To understand the word “composed”, one can imagine an assembly of molecules or grains of something. If you interchange two identical molecules, you have a different system but one which you cannot distinguish from the first. One macro-state (like a bowl of sugar) is said to be composed of different, indistinguishable micro-states (because we do not know where the individual grains are – nor do we care). The number of interchangeable micro-states can be quite large. The probability of a macro-state’s occurring is a function of the number of indistinguishable states and so is its entropy.

S = k log W

where S is the entropy, k is the so-called Boltzmann constant, and W is the number of possible micro-systems which correspond to the same macro-system. (Originally, W stood for Wahrscheinlichkeit, German for probability.) In other words, the greater the number of micro-systems, the greater the entropy, since it is proportional to the logarithm of the number of micro-states⁷. Greater entropy corresponds to a higher probability, in turn reflecting a greater degree of disorder, since more component particles can be exchanged without our noticing.
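A quick numerical illustration of S = k log W (in physics the logarithm here is the natural one, ln). The key property is that multiplying the number of micro-states by a fixed factor always adds the same fixed amount of entropy:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k ln W for W equally probable micro-states."""
    return k_B * math.log(W)

# Doubling W adds the same increment of entropy each time:
print(boltzmann_entropy(2) / k_B)                            # ln 2, about 0.693
print((boltzmann_entropy(16) - boltzmann_entropy(8)) / k_B)  # also ln 2
```

This additivity is exactly why the logarithm appears: combining two independent systems multiplies their W values but adds their entropies.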

Imagine building a sand castle. Small lumps of wet sand are placed in a precisely ordered fashion to create the various structures of the castle – walls, moats, turrets and so forth. But later, the tide does its job, reducing the well-ordered castle to a shapeless heap of wet sand. The grains are no longer in order, but are as if they had been thrown down just anywhere, leaving a formless heap, with more indistinguishable micro-states and in complete disorder – and therefore greater entropy.

So we should notice two things:

  • Nature selects the state of greater disorder; in doing so, it follows the Second Law and increases the entropy of the universe.

  • Building the ordered sand castle takes work, i. e., expenditure of energy. Although the castle’s construction in an ordered form reduces by a small amount the entropy of the universe, the work it took has consumed energy and therefore increased the entropy elsewhere, so the total entropy of the universe has increased. In general, one can say that increasing complexity requires more energy input to keep entropy change positive.

That is why the Second Law and entropy are important. It takes work, i. e., energy, to reduce the entropy locally, although it nevertheless increases the entropy of the universe as a whole.

Another much-cited example of increased disorder, and therefore entropy, is the breaking of an egg. The whole egg is a well-ordered structure; the broken egg is much less organized, so in a higher state of entropy.

As for us, entropy = death! Our bodies’ cells, tissues and organs are highly ordered biological objects. We maintain this order by expending energy from the food we eat, which our bodies and cells digest and reduce to elements of energy, especially ATP (adenosine triphosphate), which then transmits the energy it contains to cells which use it to maintain the metabolism of the body. But with age, disorder comes in the form of small pains and dysfunctions. Old entropy’s got us.⁸

The universe itself is not exempt from the Second Law. Galaxies, stars, our solar system and good old Earth are all ordered at the expense of nuclear energy from the stars or space itself (gravitational or electromagnetic fields or other more exotic or less-well-understood fields like the Higgs field or dark energy). One day, entropy will prevail there too. Even if the human species should find refuge elsewhere before our sun explodes in some five billion years, the entire universe will ultimately become cold, dark and diffuse.⁹ Carpe Diem!

This still leaves us time to realize (or not) our dreams of glory or wealth, art or sexual prowess, knowledge or humanity – to prove to ourselves that we can shake off our evolutionary inheritance and by our own efforts reverse – temporarily – the entropy of our little corner of the universe.

Now you can go read about something stranger yet, quantum mechanics.

Notes

1 The double entendre is intended.
2 The French, ever alert to nuance, call them principles, not laws, because there is no way to prove them from theory, although there are convincing reasons for accepting them as valid.
3 There is a problem with this concept in General Relativity. In curved space, we don’t really know how to compare energy at different points in spacetime.
4 Absolute zero is 0 Kelvin, or -273.15° Celsius.
5 In fact, the logarithm of the probability.
6 Frank Lambert, Entropy site. http://entropysite.oxy.edu/entropy_isnot_disorder.html
7 In case you have forgotten, the logarithm to base 10 of a number is the power of 10 which corresponds to that number. E.g., log(100) = 2, because 10² = 100.
8 To be sung to the tune of “Old rockin’ chair”.
9 This scenario is not quite certain, so there are other possibilities, but this is the most serious one, as well as the gloomiest.

Why entropy?

Entropy, did you say “entropy”? What is that?

I am going to answer you. And since I once answered this same question badly, I will try to do it well this time. We will see that entropy is essential for understanding why some things happen and others do not. For example, a whole egg can become a broken egg (all too easily), whereas nobody has ever seen the reverse. And this concept goes very far indeed.

To give away the ending, entropy serves to measure disorder. Nature seeks disorder, and therefore the greatest entropy. Entropy thus serves to predict in which direction a process will unfold. Anyone who has understood that can skip the rest of this article. Otherwise, let us continue.

Entropy is a concept, a mathematical construction, often calculated as the quotient of two other quantities or by the formula we will see shortly. (For example, in a process where heat energy is transferred to an object, the entropy of the process is the amount of heat divided by the temperature (S = Q/T).) You cannot hold a bit of entropy in your hand, nor even feel it, like heat or cold. There is no entropy meter. But despite its abstract character, one can have a qualitative and useful understanding of entropy.

Entropy has been studied with different goals in mind, for example, building the best possible heat engine. Different goals have sometimes brought out different aspects of entropy, but it has been shown that they are all compatible. There is even an entropy which corresponds to information. I confess I do not understand that one.

Entropy is a central notion of the branch of physics called thermodynamics. Thermodynamics is the science of all forms of energy, and therefore it underlies all the sciences. As for energy, it is the capacity for doing work, for example, moving or lifting a heavy object. Another form of “work” for which energy is necessary is the movement of electrons in a wire (that is called electric current) or the movement of charged chemicals in the nervous system. There are many others.

The biological, chemical and physical sciences show that everything is composed of other, smaller things, with a few exceptions (the most elementary particles, such as quarks). Even an electromagnetic field, like light, is composed of photons. To understand the word “composed”, one can imagine a collection of molecules or grains of something. Food lovers can think of a sauce, with molecules of water and wine, grains of pepper, ions of salt, and molecules of the aromas of mushrooms, basil and other good things. Yum!

Now, let us return to entropy. There are several ways of explaining entropy, but the most intuitive is surely the following.

Entropy is the probability⁵ that something composite exists or occurs in a certain state, that is, with a certain configuration or order of its components.

Paul’s socks

Examples, you want examples. All right, OK…

I have a friend whose mother was very hard on him during his childhood. Let me explain. She insisted that he store his pale blue socks on the left side of a drawer in his dresser and his pink socks on the right side of the same drawer. You understand right away how much this tormented poor Paul. So, where is the entropy in all this? Simple, my friend.

To simplify, let us imagine that Paul has three pairs of socks: three pale blue socks and three pink ones. (He always wore socks of two different colors. He did not tell me whether he wore a specific color on each foot, and I did not dare ask.) On laundry day, he can put away his clean socks as we will now explain.

Let us begin with the pale blue socks and suppose that they are numbered so that we can speak of them individually. (This calculation is not rigorous, but it gives the idea.) For the first one to put away, Paul can choose number one, number two or number three – three possible choices. After this difficult choice, only two pale blue socks remain, so there are only two choices, number two or number three, if he has already taken number one. And for the third, he has only one choice, the one that remains. So there are three choices for the first; for each of those, two choices for the second; and one for the last: 3 × 2 × 1 = 6 possible arrangements. It is the same for the pink socks: six more. Since any arrangement of the blues can be combined with any arrangement of the pinks, there are in all 6 × 6 = 36 ways of putting the socks away in order. Note well that, since the socks are not really numbered, Paul’s mother, when checking her son’s obedience, cannot know which of the 36 arrangements he has made; she sees only that the socks are neatly ordered as she wishes.

One day, in a fit of despair, Paul took out the drawer and shook it violently for a minute or two before calming down. What state were the socks in afterwards? All over the place, all mixed up, right? When Paul shook the drawer, entropy struck! Now the socks are in fairly complete disorder. Let us count how this could be done from the start, on the same laundry day. Now any sock can go anywhere. So for the first, Paul can take any of the six – six choices. For the second, five socks remain, so five choices for each of the six choices of the first sock. That already makes 6 × 5, or 30 choices for just the first two socks. Obviously, for the third, four socks remain, hence four choices. We are at 6 × 5 × 4 = 120 choices already, and three socks remain. We already know that three socks can be arranged in six ways, so the total is 120 × 6, or 720 choices!

So there are 36 possible ways to arrange the socks in order by color, pale blue or pink, and 720 ways to arrange them without restriction. Paul’s mother does not distinguish between different ordered arrangements or between different disordered ones, but she sees all too well the difference between the two states of the socks. Which is more probable? Obviously, disorder, by 720 to 36. And entropy is a measure of this disorder.
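The sock counting can be checked with factorials: the three blue socks can be ordered among themselves in 3! = 6 ways and the pinks likewise, and since any blue arrangement can pair with any pink one, there are 6 × 6 = 36 ordered configurations, against 6! = 720 unrestricted ones:

```python
import math

# Paul's drawer: 3 pale blue socks and 3 pink socks.
# Ordered: blues permuted among blues, pinks among pinks.
ordered = math.factorial(3) * math.factorial(3)
# Disordered: any of the 6 socks anywhere.
disordered = math.factorial(6)

print(ordered, disordered)    # 36 720
print(disordered // ordered)  # 20 times as many disordered micro-states
```

With a mere six socks disorder already wins 20 to 1; with the 10²³ molecules of a gas sample, the imbalance becomes astronomical.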

Of course, physicists are serious people and do not talk (much) about the entropy of a sock drawer, but they certainly do talk about the entropy of molecules, for example those of a gas. There, where a sample may contain something like 10²³ molecules, disorder wins out over order by far. Again, we speak of the number of different micro-states which correspond to the same macro-state.

Raoul’s marbles

Raoul has a box full of identical marbles. Suppose he exchanges two of them. We can specify the location of each marble individually and thus speak of the detailed state (the micro-state) before and after the exchange of the two marbles. The micro-states are different. But as seen by the proud owner of the box of marbles (the macro-state), nothing has changed: he is incapable of distinguishing between the two micro-states. In this way, a great number of different micro-states can be seen from the outside as the same macro-state. The more marbles there are, the greater the number of combinations which are identical for Raoul, and the greater the entropy of the box of marbles.

How simple it is!

Les principes de la thermodynamique

Le physicien qui a conçu cette façon de définir l’entropie s’appelait Ludwig Boltzmann et sur sa pierre tombale il y a le formule qu’il a présenté au monde :

S = k logW

où S est l’entropie, k est un constant nommé après Mr Boltzmann, et W est le nombre de façons possibles de micro-systèmes (12 ou 720 pour les chaussettes) qui correspondent au même macro-système (le tiroir à chaussettes). Autrement dit, plus le nombre de micro-systèmes est grand, plus l’entropie est grande, puisqu’elle est proportionnelle au logarithme du nombre de micro-états. Une plus grande entropie correspond à un plus grand degré de désordre.

Mais pourquoi parle-t-on de l’entropie, à quoi ça sert ? Eh bien, parce que la deuxième principe de la thermodynamique dit :

Dans un processus physique, l’entropie de l’univers augmente toujours.

C’est assez évident quand on pense à l’aventure de Paul lorsqu’il secoue le tiroir à chaussettes. Pour info, la première principe dit :

L’énergie de l’univers ne change pas ; elle est toujours conservée.

Ceci est la très renommée loi de la conservation de l’énergie. (En anglais, on dit plutôt « lois », mais en français on dit « principes » pour rappeler qu’elles ne peuvent pas être prouvées.) Il y a une troisième principe, qui serait plus longue à expliquer, mais citons le :

L’entropie d’un système à une température de zéro absolu est zéro.

(Au cas où, zéro absolu est zéro Kelvin, ou -273,15° Celsius.) Mais ce n’est pas tout. Après tout ça, quelqu’un a remarqué qu’il devrait exister une autre principe encore plus fondamentale que les trois autres, ce qui fait des principes de la thermodynaique une trinité de quatre principes ; on l’a donc baptisé la zéroième principe :

Si un système A est en équilibre thermique avec le système B, et le système B est en équilibre thermique avec le système C, alors les systèmes A et C sont aussi en équilibre thermique.

Eh alors, tu dis ? Cela veut dire qu’il y a un moyen de savoir, de mesurer quel est l’état thermique d’un système. Ça pourrait être le système B, où B est tout simplement un thermomètre, qui est en équilibre thermique avec A et C, où « être en équilibre thermique » signifie que les deux ont la même température. C’est très important. De quoi parlerait-on si l’on n’avait pas la zéroième principe ? « Je pense que le logarithme de l’entropie va augmenter demain. » O, non !

La thermodynamique est supposée être valide partout dans l’univers sauf pour les objets très petits comme les particules élémentaires (les protons, électrons, quarks, etc). Les physiciens savent parfaitement que les théories d’aujourd’hui seront modifiées et amplifiées par celles de demain … sauf pour la deuxième principe de la thermodynamique. On y croit dur comme le fer !

Encore plus sur le désordre

La femme de Paul, Drusilla, se souvient souvent de ses vacances d’enfance au bord de la mer et des châteaux de sable qu’elle aimait y construire. De beaux châteaux, avec des douves, des pont-levis, des mâchicoulis et une énorme donjon au centre. Considérons ça.

Lorsque Drusilla construisait un château de sable, elle plaçait des petites quantités de grains exactement là où elle les voulait. Puisqu’elle est très minutieuse, c’est comme si elle plaçait chaque grain à un endroit spécifique de l’édifice et pas n’importe où. Rien n’était laissé au hasard. Mais lorsque la mère de Drusilla l’a appelée pour venir manger et se coucher, elle se mettait au lit avec tristesse, parce qu’elle savait ce qui l’attendrait le lendemain matin. Se levant tôt et courant en pyjama à la plage avec la lumière du soleil matinal aux yeux et les cris stridents des mouettes aux oreilles, elle arrivait à son château … ou ce qui en restait … et trouvait que la marée (assistant de l’entropie) avait fait son boulot. Le beau château si bien ordonné était réduit à un tas difforme de sable mouillé. De la boue. Mais Drusilla était courageuse – ou têtue, si on préfère – elle mangeait son petit déjeuner et elle recommençait.

Après avoir vu le cas des chaussettes de Paul, on comprend bien que les choix de destination des grains de sable de Drusilla était similaire, même si les grains étaient bien plus nombreux que les chaussettes et les destinations (mur, donjon, etc. mais pas les douves) plus nombreux que les deux côtés du tiroir. Elle plaçait les grains, comme déjà dit, afin de créer de l’ordre. Par contre, si elle avait simplement jeté les grains n’importe où sur le site du futur non-château, elle aurait formé un tas assez similaire à celui laissé par la marée, en désordre – et donc de plus grande entropie.

Remarquons donc :

  • La nature a choisi l’état du plus grand désordre ; en ce faisant, elle a suivi la deuxième principe et a fait augmenter l’entropie de l’univers.
  • Drusilla a du travailler, c’est à dire, a dépenser de l’énergie, pour construire la version ordonnée du château. Même si elle a réduit un peu l’entropie de l’univers dans la forme ordonnée du château, le travail qu’elle a fait a contribué à augmenter l’entropie de l’univers total.

C’est pourquoi la deuxième principe et l’entropie sont importantes pour nous. Il faut travailler, c’est à dire dépenser de l’énergie (la capacité de travail) pour réduire l’entropie d’un coin de l’univers, tout en augmentant l’entropie de l’univers entier. Mais à la longue, la nature trouvera toujours le moyen de faire échouer nos projets. On est permis de trouver ça un peu triste.

Et l’œuf

On peut voir maintenant le rôle joué par l’entropie dans les événements journaliers, comme un œuf qui se casse. Ou plutôt que je casse, par maladresse. L’œuf entier est une structure bien ordonné, l’œuf cassé est dans un état bien moins organisé, donc, de plus grande entropie.

Mais si je casse l’œuf exprès pour faire une omelette, l’œuf bien remué est encore plus désorganisé et donc dans un état d’entropie encore plus grande.

On peut dire que le temps ne recule pas, mais qu’il avance, parce que l’entropie peut ainsi augmenter encore, comme dans l’exemple suivant…

Le corps humain

Entropie = mort ! Et oui. Nos corps, avec leurs cellules, tissus et organes sont des objets biologiques très ordonnés. Nous ne maintenons cet ordre que par des dépenses d’énergie. Celle-ci vient de la nourriture que nous mangeons, que nos organes et cellules digèrent et réduisent en éléments énergétiques, notamment en ATP12, qui sert à transmettre l’énergie qu’il contient aux cellules pour faire fonctionner le métabolisme de tout le corps, l’ingénierie électrique des neurones et la flexion des muscles. Mais avec l’âge, le désordre arrive en forme de petites douleurs, d’arthroses et de choses plus graves. Et la mort est suivi par la désintégration (désordre) totale.

Plus abstrait – de l’information

Nous avons vu que l’entropie est une mesure du nombre de micro-états indistinguables qui corréspondent au même macro-état d’un système. Echanger deux chaussettes roses ou deux grains de sables et on ne remarque pas la différence. L’entropie corréspond donc à une manque d’information, des données qu’on pourrait connaître mais qu’on ignore. Comme si le grain de sable numéro 2.143.768 est au-dessus du numéro 3.472.540 ou le contraire. Donc, le plus élévé l’entropie, le moins d’information qu’on a sur le micro-état qui sous-tend le macro-état.

L’entropie est donc une mesure de l’information cachée dans un système.

Mais on ne va pas entrer dans la théorie de l’information ici.

Et enfin, l’univers

Les galaxies, les étoiles, notre système solaire, notre bonne vieille terre13 – tous des systèmes ordonnées par la dépense d’énergie d’origine nucléaire dans les étoiles ou dans l’espace même (champs gravitationnels ou électromagnétiques, champs d’origine quantique). Un jour, l’entropie va gagner là aussi. Même si l’espèce humaine trouve refuge ailleurs avant que notre soleil explose dans cinq milliard d’années14, l’univers entier deviendra à la longue diffus, froid et ténébreux. Carpe diem !

Cela nous laisse quand même le temps de réaliser (ou pas) nos rêves de gloire ou de richesses, d’art ou de prouesses sexuelles, de connaissances ou d’humanité afin de prouver à nous-mêmes que nous sommes capables de dépasser notre héritage évolutionnaire15 et, par notre propre énergie, de faire reculer – provisoirement – l’entropie de notre petit coin de l’univers.1

Why entropy?

Entropy, you said “entropy”? What’s that?

Well, I’ll try to explain. And since I once replied poorly to this very question, I’ll try to do it right this time. We will see that entropy is essential for understanding why certain things happen and others do not. For example, a whole egg can become a broken egg (all too easily), while the contrary has never been known to happen. And this concept goes a long way.

Here is a spoiler: Entropy is a measure of disorder. Nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time. If you have already understood that, you can skip the rest of this article. Otherwise, let’s go on.

Entropy is a concept, a mathematical construction, often calculated as the quotient of two other quantities. (For instance, in a process where heat energy is transferred to an object, the entropy change of the process is the amount of heat divided by the temperature: S = Q / T. But you do not have to remember that.) The thing which makes entropy difficult is that you must calculate it. You cannot hold a grain of entropy in your hand, or even feel it, like warmth or cold. There is no such thing as an entropy meter. But despite its abstract nature, we can have a qualitative and usable understanding of it.
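For readers who like to see the arithmetic, the quotient S = Q / T can be played with in a few lines of Python. This is only a sketch with made-up numbers; the function name is my own invention, not standard physics software:

```python
# Entropy change when heat flows into an object at a fixed temperature:
# delta_S = Q / T. Temperatures must be absolute (kelvin), not Celsius.

def entropy_change(heat_joules, temperature_kelvin):
    """Entropy change (in J/K) for heat Q transferred at temperature T."""
    return heat_joules / temperature_kelvin

# The same 1000 J of heat "counts for more" entropy at a lower temperature.
cold = entropy_change(1000.0, 273.15)  # delivered to ice water
hot = entropy_change(1000.0, 373.15)   # delivered to boiling water

print(f"{cold:.3f} J/K vs {hot:.3f} J/K")
```

Note that the colder recipient gains more entropy from the same heat, which is why heat spontaneously flows from hot to cold and never the reverse.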

Over its history, entropy has been studied at different times for different purposes, for example, to build the best possible heat engine. In a word, the notion of entropy has evolved. Different goals have sometimes brought out different aspects of entropy, and it has been shown that they are compatible. There is even an entropy corresponding to information, which I admit I do not fully understand.

Entropy is a central concept in the branch of physics called thermodynamics. Thermodynamics is the science of all forms of energy, so it underlies all sciences. Energy is the capacity for doing work, such as moving or lifting a heavy object. Another form of “work” for which energy is required is the movement of electrons in a wire (called electric current) or the movement of electrically charged ions in the nervous system. There are many others.

Biological, chemical and physical sciences explain that everything is composed of other, smaller things, with a few exceptions (the most elementary particles, such as quarks). Even an electromagnetic field, such as light, is composed of photons. To understand the word “composed”, one can imagine an assembly of molecules or grains of something. Food lovers can think of a sauce with water and wine molecules, ground peppercorns, salt ions, aromatic molecules of mushroom, basil and other good things. Yum! (“Yum” in French is “miam”. Thought you’d like to know.)

Now, back to entropy. There are several ways to explain entropy, but the following is certainly the most intuitive version.

Entropy is (approximately) a measure of the probability that something composite occurs in a certain state, i.e., with a certain configuration or order of its components.

Paul’s socks

Examples, you want examples? All right, OK.

I have a friend whose mother was very hard on him during his childhood. I’ll explain. She insisted that he place his pale blue socks on the left in a drawer of his dresser and his pink socks on the right in the same drawer. You understand immediately how much it upset poor Paul. So what does entropy have to do with it? Simple, my friend.

For simplicity, imagine that Paul has but six socks: three pale blue and three pink. (He always wore socks of two different colors. He did not say whether he assigned a specific color to each foot, and I did not ask.) At the end of wash day, he can put away his socks in the following manner.

Let’s start with the pale blue socks and assume they are numbered so that we can refer to them separately. (This calculation is not rigorous, but it gives the idea.) For the first sock, Paul can choose number one, number two or number three: three possible choices. After this difficult selection, he has only two pale blue socks left, i.e., only two choices, sock number two or number three, assuming he has already chosen number one. So there are three choices for the first sock; for each of these, there are two choices for the second, making a total of six orderings (two for each of the three, 3 times 2 is 6); and for the last, there is only one sock and therefore one choice. The same goes for the pink socks: six orderings. Since any blue ordering can be combined with any pink ordering, Paul has 6 times 6, or 36 ways to store his socks in order. Note that, since the socks are not really numbered, Paul’s mother, checking up on her son’s obedience, has no way of knowing which of the 36 arrangements he made, but does see that the socks are arranged the way she wants.

One day, in a fit of despair, Paul pulled out the drawer and shook it violently for a minute or two before calming down. In what state were the socks after that? All mixed together, n’est-ce pas? When Paul shook the drawer, entropy struck! Now the socks are in a fairly complete mess. Let’s count the ways this could happen, starting from the same wash day. Now any sock can go anywhere in the drawer. So for the first, Paul may take any of the six socks: six choices. For the second, he still has five socks to choose from, for each choice of the first sock. That makes 6 times 5, or 30 choices for just the first two socks. For the third, there remain four socks, so we are at 6 times 5 times 4 = 120 choices already, with 3 socks left. We already know that 3 socks may be arranged six ways, so the total is 120 times 6, or 720 choices!

So there are 36 arrangements that store the socks in order, pale blue on the left and pink on the right, and 720 ways to arrange them without any constraint, i.e., in disorder. Paul’s mother does not distinguish between different ordered arrangements, or between different disordered ones, but she sees all too well the difference between the two states of the drawer. Which is more likely? Obviously, disorder, by 720 against 36. And entropy is a measure of this disorder.
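The counting above is just factorials: the socks on each side of the tidy drawer can be permuted among themselves, while the shaken drawer allows all permutations of all six socks. A short Python sketch reproduces the numbers:

```python
from math import factorial

# Tidy drawer: the 3 blue socks can be permuted among the 3 left-hand
# places (3! ways) and, independently, the 3 pink socks among the 3
# right-hand places (3! ways).
ordered = factorial(3) * factorial(3)  # 6 * 6 = 36

# Shaken drawer: any of the 6 socks may land in any of the 6 places.
disordered = factorial(6)              # 6 * 5 * 4 * 3 * 2 * 1 = 720

print(ordered, disordered)  # 36 720
```

The ratio of the two counts, 720 to 36, is what makes the disordered macro-state so much more probable.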

Of course, physicists are serious people and do not speak (much) of the entropy of a sock drawer, but they do talk about the entropy of molecules, for example, those of a gas. For a gas, where a sample may contain something like 10²³ molecules, disorder wins overwhelmingly. Physicists talk about the number of different micro-states (pink sock here, blue sock there) corresponding to the same macro-state.

Raoul’s marbles

Raoul has a box full of identical marbles. Suppose that we exchange two marbles. You can specify the location of each marble individually and thereby talk about their detailed states (micro-states) before and after the exchange of the two marbles. These micro-states are different. But to little Raoul’s eyes, the box of marbles (the macro-state) has not changed at all; Raoul is unable to distinguish between the two micro-states. In this way, a lot of different micro-states are taken as being the same macro-state. The greater the number of marbles, the greater the number of combinations that seem identical to Raoul and the greater the entropy of the box of marbles.

How simple it really is!

The laws of thermodynamics

OK, this part is a tiny bit hairy; then we get back to the fun stuff. So hang on to your seat.
The physicist who came up with this way of defining entropy was named Ludwig Boltzmann. On his tombstone is engraved the formula that he introduced to the world:

S = k log W

where S is the entropy, k is a constant named after Herr Dr Boltzmann, and W is the number of possible micro-states (36 or 720 for the socks) which correspond to the same macro-state (the sock drawer). (W is for Wahrscheinlichkeit, German for probability.) If you have forgotten, the logarithm to base 10 of a number is the power of 10 which corresponds to that number; e.g., log(100) = 2, because 10² = 100. (In Boltzmann’s formula the logarithm is actually the natural logarithm, but the idea is the same.) In other words, the greater the number of micro-states, the greater the entropy, since the entropy is proportional to the logarithm of the number of micro-states. Greater entropy corresponds to a greater degree of disorder.
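Plugging the sock counts into Boltzmann’s formula makes the point concrete. A small Python sketch, using the natural logarithm and the SI value of Boltzmann’s constant (the sock numbers are of course only a toy example):

```python
import math

k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(micro_states):
    """S = k ln W: entropy of a macro-state with W micro-states."""
    return k * math.log(micro_states)

s_ordered = boltzmann_entropy(36)   # the tidy sock drawer
s_shaken = boltzmann_entropy(720)   # after Paul shakes it

# More micro-states means more entropy; the shaken drawer wins.
print(s_shaken > s_ordered)  # True
```

Because k is so tiny, these entropies are absurdly small; real thermodynamic entropies come from the 10²³-molecule samples mentioned above, where W is astronomically large.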

But why are we talking about entropy, what’s the point? Well, because the second law of thermodynamics says:

In a physical process, the entropy of the universe always increases.

It’s pretty obvious when you think about the story of Paul when he shakes the sock drawer. For your information, the first law says:

The energy of the universe does not change; it is always conserved.

This is the justly famous law of conservation of energy.

There is a third law, which would take longer to explain, but it goes like this:

The entropy of a perfect crystal at a temperature of absolute zero (0 kelvin, or −273.15° Celsius) is zero.

But there’s one more. After all that, someone noticed that there should be another law more fundamental than the three others, making the laws of thermodynamics a trinity of four laws; it was therefore named the zeroth law:

If A is a system in thermal equilibrium with the system B, and B is in thermal equilibrium with the system C, then the systems A and C are also in thermal equilibrium.

So what, you ask? Well, this means that there is a way to measure the thermal state of a system. It could be measured via a system B, where B is simply a thermometer, in thermal equilibrium with A and C, and where “in thermal equilibrium” means that both have the same temperature. This is very important. What would we talk about if we did not have the zeroth law? “I think that the logarithm of the entropy will increase tomorrow.” No way!

Thermodynamics is assumed to be valid everywhere in the universe except for very small objects such as elementary particles (protons, electrons, quarks, etc.). Physicists are well aware that today’s theories will be modified and augmented by those of tomorrow… except for the second law of thermodynamics. They believe firmly in it!

More on disorder

Paul’s wife, Drusilla, often reminisces about her childhood holidays at the seaside and about the sand castles she loved to build. Beautiful castles with moats, drawbridges, battlements and a huge tower in the center. Let’s think about that.

When Drusilla built a sand castle, she placed small amounts of sand exactly where she wanted them. Since she was very exacting (or finicky), it was as if she placed each grain in a specific location of the castle and nowhere else. Nothing was left to chance. But when Drusilla’s mother called her to come and eat and go to bed, she went to bed sadly, because she knew what she would find the next morning. Getting up early and running in her pajamas to the beach, with the morning sunlight in her eyes and the strident cries of seagulls in her ears, she came to her castle – or what was left of it – and found that the tide (entropy’s assistant) had done its job. The beautiful, well-ordered castle had been reduced to a shapeless heap of wet sand. Mud. But Drusilla was brave – or stubborn, if you prefer: she ate her breakfast and began again.

After seeing the case of Paul’s socks, it is clear that the choice of destination for Drusilla’s sand grains was similar, although the grains were far more numerous than Paul’s socks and the destinations (wall, keep, etc., but not the moats) far outnumbered the two sides of Paul’s drawer. She placed the grains, as already stated, to create order. If she had just thrown the grains anywhere on the site of the future non-castle, she would have formed a heap quite similar to the one left by the tide: in complete disorder, and therefore of greater entropy.

So we should notice two things:

  • Nature selected the state of greater disorder; in doing so, it followed the Second Law and increased the entropy of the universe.
  • Drusilla had to work, i.e., to expend energy, to build the ordered castle. Although she reduced the entropy of the universe by a small amount in the form of the ordered castle, the work she did increased the entropy elsewhere, so the total entropy of the universe went up.

That is why the Second Law and entropy are important to us. It takes work, i.e., the expenditure of energy (the capacity to do work), to reduce the entropy of one bit of the universe, while increasing the entropy of the universe as a whole. But in the end, nature always finds a way to thwart our plans. We may be permitted to find that a little sad.

And the egg

We can now see the role of entropy in everyday events, such as an egg that breaks. Or rather, that I break, because I don’t pay enough attention to what I am doing. The whole egg is a well-ordered structure; the broken egg is much less organized, and so in a state of higher entropy.

But if I break an egg on purpose in order to make an omelet, the well-stirred egg is even more disorganized and therefore in an even greater state of entropy.

We can say that time never backs up, but always runs forward, because that way entropy can keep increasing, as in the following example…

The human body

Entropy = death! Alas, yes. Our bodies’ cells, tissues and organs are highly ordered biological objects. We maintain this order by expending energy, just as Drusilla did for her castles. The energy comes from the food we eat, which our organs and cells digest and reduce to energy carriers, especially ATP (adenosine triphosphate, since you asked), which delivers the energy it contains to cells in order to run the metabolism of the whole body, the electrical engineering of the neurons and the flexing of the muscles. But with age, disorder arrives in the form of small pains, arthritis and more serious things. And death is followed by total disintegration (total disorder = higher entropy). Old rockin’ chair may get us, but entropy will get us last.

More abstract – information

We have seen that entropy is a measure of the number of indistinguishable micro-states which correspond to the same macro-state of something. Exchange two blue socks or two grains of sand and we do not notice the difference. So entropy corresponds to missing information, data that we could imagine knowing but don’t, such as whether grain of sand 2,143,768 is on top of grain 3,472,540 or the other way around. The higher the entropy, the less information we have about the underlying micro-state.

The entropy is thus a measure of the hidden information contained in a system.

But we will not go into information theory here.
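Still, the counting itself fits in one line of Python. The hidden information, in bits, is the base-2 logarithm of the number of micro-states behind a macro-state (a sketch using the sock-drawer numbers from above):

```python
import math

def hidden_bits(micro_states):
    """Bits of information needed to pin down one micro-state among W."""
    return math.log2(micro_states)

# The shaken drawer (720 micro-states) hides more information
# than the tidy one (36 micro-states).
print(hidden_bits(36))   # about 5.17 bits
print(hidden_bits(720))  # about 9.49 bits
```

More micro-states, more bits we lack: the two notions of entropy point the same way.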

And finally, the universe

Galaxies, stars, our solar system and good old Earth (Carl Sagan’s “pale blue dot”) are all systems ordered by the expenditure of energy, nuclear energy in the stars or the energy of space itself (gravitational and electromagnetic fields, fields of quantum origin). One day, entropy will “prevail” there too. Even if the human species finds refuge elsewhere before our sun swells into a red giant in five billion years or so, the entire universe will eventually become diffuse, cold and dark. Carpe diem!

This still leaves us time to realize (or not) our dreams of glory or wealth, art or sexual prowess, knowledge or humanity – to prove to ourselves that we can shake off our evolutionary inheritance and, by our own energy, temporarily push back the entropy of our own little corner of the universe.