Thermodynamics

Thermodynamics is the science of the forms of energy, one of our most important and interesting topics, as it underlies the other sciences. Energy is the capacity for doing work. Among the forms of “work” for which energy is required are lifting a heavy object, making electrons flow in a wire (an electric current), streaming electrically charged ions through the nervous system, or effecting the contraction of muscle fibers (to lift that heavy object).

Over time, thermodynamics has evolved, adapting itself to new discoveries. First elaborated to deal with steam engines and boring cannons¹, it has since been extended to include such phenomena as electromagnetism and even information.

Thermodynamics is summarized in the form of three laws².

  1. The energy of the universe does not change; it is always conserved.³
  2. In a physical process, the entropy of the universe never decreases.
  3. The entropy of a perfect crystal at a temperature of absolute zero⁴ is zero.

Law number 1 is the justly famous law of conservation of energy. Law number 2 explains why physical processes don’t play out backwards in time. Number 3 is somewhat more obscure, but important.

Actually, after these three, someone noticed that there should be another, yet more fundamental law; it was therefore named the zeroth law, making the laws of thermodynamics a trinity of four:

0. If A is a system in thermal equilibrium with system B, and B is in thermal equilibrium with system C, then systems A and C are also in thermal equilibrium.

This provides a way to measure the thermal state of a system. If system B is simply a thermometer, this says that if A and C are each in thermal equilibrium with it, they both have the same temperature.
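To make the thermometer picture concrete, here is a minimal Python sketch of the zeroth law treated as a transitive relation; the systems, the readings and the tolerance are all invented for illustration.

```python
# A toy model of the zeroth law: "in equilibrium with" behaves like
# equality of temperature. All readings below are hypothetical.

def in_equilibrium(temp_a: float, temp_b: float, tol: float = 1e-6) -> bool:
    """Two systems count as being in thermal equilibrium when a
    thermometer reads the same temperature for both (within tol)."""
    return abs(temp_a - temp_b) < tol

# System B plays the role of the thermometer.
t_A, t_B, t_C = 293.15, 293.15, 293.15  # kelvin

if in_equilibrium(t_A, t_B) and in_equilibrium(t_B, t_C):
    # The zeroth law guarantees that A and C are in equilibrium too.
    assert in_equilibrium(t_A, t_C)
    print(f"A and C share the same temperature: {t_A} K")
```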

Entropy is a concept, a mathematical construction, in some contexts calculated as the quotient of two other quantities: in a process where heat energy is transferred to an object at absolute temperature T, the change in entropy is the amount of heat divided by the temperature (ΔS = Q / T). We shall soon see another formula for it. The point is that you cannot hold a grain of entropy in your hand or feel it, like warmth or cold. There is no such thing as a simple entropy meter. But the abstract nature of entropy does not keep it from being a useful — in fact, an essential — idea.
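As a quick illustration of that quotient, here is a small Python sketch; the figures are made up. A thousand joules of heat absorbed by a body at room temperature, about 293 K, raises its entropy by roughly 3.4 J/K.

```python
# Entropy change dS = Q / T for heat transferred to a body at absolute
# temperature T. The numbers below are invented for illustration.

def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Return the entropy change in J/K; Q > 0 means heat absorbed."""
    if temperature_kelvin <= 0:
        raise ValueError("absolute temperature must be positive")
    return heat_joules / temperature_kelvin

dS = entropy_change(1000.0, 293.0)  # 1000 J into a body at about 20 °C
print(f"Entropy change: {dS:.3f} J/K")  # about 3.413 J/K
```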

Most simply stated, entropy is a measure of disorder. As anyone who has ever tried to keep a house tidy knows, nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time.

From a statistical viewpoint, entropy can be defined as follows.

Entropy is the probability⁵ that something occurs in a certain state, i.e., with a certain configuration or order of its components.

Currently, many physicists and chemists prefer to speak of entropy as a measure of energy dispersal, the dispersal rendering that energy unusable.⁶ An example of such dispersed energy is the heat lost to friction, the same loss that makes a perpetual-motion machine impossible.

Physical, chemical and biological sciences explain that everything is composed of other, smaller things, with a few exceptions such as the most elementary particles, the quarks. Even an electromagnetic field like light or radio waves is composed of photons. To understand the word “composed”, one can imagine an assembly of molecules or grains of something. If you interchange two identical molecules, you have a different system, but one which you cannot distinguish from the first. One macro-state (like a bowl of sugar) is said to be composed of different, indistinguishable micro-states (because we do not know where the individual grains are – nor do we care). The number of interchangeable micro-states can be quite large. The probability of a macro-state’s occurring is a function of the number of indistinguishable micro-states, and so is its entropy. Boltzmann expressed this relation in a famous formula:

S = k log W

where S is the entropy, k is the so-called Boltzmann constant, and W is the number of possible micro-states which correspond to the same macro-state. (Originally, W stood for Wahrscheinlichkeit, German for probability.) In other words, the greater the number of micro-states, the greater the entropy, since entropy is proportional to the logarithm of the number of micro-states⁷. Greater entropy corresponds to a higher probability, in turn reflecting a greater degree of disorder, since more component particles can be exchanged without our noticing.
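Here is a small Python sketch of Boltzmann’s formula at work, using a toy system of my own choosing: N coins, where the macro-state is simply the number of heads and each exact arrangement of the coins is one micro-state. (In Boltzmann’s formula the logarithm is the natural one.)

```python
import math

# S = k * ln(W) for a toy system of N coins. The macro-state is the
# number of heads; each exact arrangement of the coins is a micro-state.

K_BOLTZMANN = 1.380649e-23  # J/K (exact, by SI definition)

def boltzmann_entropy(num_microstates: int) -> float:
    """Boltzmann's formula, with the natural logarithm."""
    return K_BOLTZMANN * math.log(num_microstates)

N = 100
for heads in (0, 10, 50):
    W = math.comb(N, heads)  # number of arrangements with this many heads
    S = boltzmann_entropy(W)
    print(f"{heads:3d} heads: W = {W:.3e}, S = {S:.3e} J/K")
```

The all-heads macro-state has exactly one arrangement and zero entropy; the half-and-half macro-state has about 10²⁹ arrangements and the greatest entropy, which is why the “disordered” outcome is the overwhelmingly probable one.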

Imagine building a sand castle. Small lumps of wet sand are placed in a precisely ordered fashion to create the various structures of the castle – walls, moats, turrets and so forth. But later, the tide does its job, reducing the well-ordered castle to a shapeless heap of wet sand. The grains are no longer in order, but are as if they had been thrown down just anywhere, leaving a formless heap, with more indistinguishable micro-states and in complete disorder – and therefore greater entropy.

So we should notice two things:

  • Nature selects the state of greater disorder; in doing so, it follows the Second Law and increases the entropy of the universe.

  • Building the ordered sand castle takes work, i.e., expenditure of energy. Although constructing the castle in an ordered form reduces the entropy of the universe by a small amount, the work consumed energy and therefore increased the entropy elsewhere, so the total entropy of the universe has still increased. In general, the more order or complexity one creates locally, the more energy must be expended, and the more the entropy of the rest of the universe grows.

That is why the Second Law and entropy are important. It takes work, i.e., energy, to reduce entropy locally, and doing that work nevertheless increases the entropy of the universe as a whole.
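A back-of-the-envelope Python sketch of this bookkeeping, using a freezer as the local order-maker: the latent-heat figure for freezing a kilogram of water is standard, while the 100 kJ of compressor work is a number assumed purely for illustration.

```python
# Local entropy goes down, total entropy goes up: freezing 1 kg of water.
# The 100 kJ of compressor work is an assumed, illustrative figure.

T_COLD = 273.0          # K, water turning to ice
T_ROOM = 293.0          # K, the kitchen
Q_REMOVED = 334_000.0   # J, latent heat of fusion for 1 kg of water
W_INPUT = 100_000.0     # J, hypothetical work done by the compressor

dS_water = -Q_REMOVED / T_COLD             # local decrease in entropy
dS_room = (Q_REMOVED + W_INPUT) / T_ROOM   # heat dumped into the room
dS_total = dS_water + dS_room

print(f"Water: {dS_water:+8.1f} J/K")  # about -1223.4
print(f"Room:  {dS_room:+8.1f} J/K")   # about +1481.2
print(f"Total: {dS_total:+8.1f} J/K")  # positive, as the Second Law demands
```

The water’s entropy falls, but only because the freezer pumps heat (plus the energy of the work done) into the room, raising the room’s entropy by more than the water’s fell.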

Another much-cited example of increased disorder, and therefore entropy, is the breaking of an egg. The whole egg is a well-ordered structure; the broken egg is much less organized, and so is in a higher state of entropy.

As for us, entropy = death! Our bodies’ cells, tissues and organs are highly ordered biological objects. We maintain this order by expending energy from the food we eat, which our bodies and cells digest and convert into carriers of energy, especially ATP (adenosine triphosphate), which then transmits the energy it contains to the cells, which use it to maintain the body’s metabolism. But with age, disorder comes in the form of small pains and dysfunctions. Old entropy’s got us.⁸

The universe itself is not exempt from the Second Law. Galaxies, stars, our solar system and good old Earth are all ordered at the expense of nuclear energy from the stars or of space itself (gravitational or electromagnetic fields, or other more exotic or less-well-understood fields like the Higgs field or dark energy). One day, entropy will prevail there too. Even if the human species should find refuge elsewhere before our sun swells into a red giant in some five billion years, the entire universe will ultimately become cold, dark and diffuse.⁹ Carpe Diem!

This still leaves us time to realize (or not) our dreams of glory or wealth, art or sexual prowess, knowledge or humanity – to prove to ourselves that we can shake off our evolutionary inheritance and by our own efforts reverse – temporarily – the entropy of our little corner of the universe.

Now you can go read about something stranger yet, quantum mechanics.

Notes
1 The double entendre is intended.
2 The French, ever alert to nuance, call them principles, not laws, because there is no way to prove them from theory, although there are convincing reasons for accepting them as valid.
3 There is a problem with this concept in General Relativity. In curved space, we don’t really know how to compare energy at different points in spacetime.
4 Absolute zero is 0 Kelvin, or -273.15° Celsius.
5 In fact, the logarithm of the probability.
6 Frank Lambert, Entropy site. http://entropysite.oxy.edu/entropy_isnot_disorder.html
7 In case you have forgotten, the logarithm to base 10 of a number is the power of 10 which corresponds to that number. E.g., log(100) = 2, because 10² = 100.
8 To be sung to the tune of “Old rockin’ chair”.
9 This scenario is not quite certain, and there are other possibilities; but this is the most serious one, as well as the gloomiest.