Entropy, you said “entropy”? What’s that?
Well, I’ll try to explain. And since I replied poorly to the same question once, I’ll try to do it right this time. We will see that entropy is essential for understanding why certain things happen and others do not. For example, a whole egg can become a broken egg (all too easily), while the contrary has never been known to happen. And this concept goes a long way.
Here is a spoiler: Entropy is a measure of disorder. Nature seeks disorder, and therefore increased entropy. Entropy thus serves to predict which direction a process will take, forwards or backwards in time. If you have already understood that, you can skip the rest of this article. Otherwise, let’s go on.
Entropy is a concept, a mathematical construction, often calculated as the quotient of two other quantities. (For instance, in a process where heat energy is transferred to an object, the entropy change is the amount of heat divided by the temperature: S = Q / T. But you do not have to remember that.) The thing which makes it difficult is that you must calculate entropy. You cannot hold a grain of entropy in your hand, or even feel it, like warmth or cold. There is no such thing as an entropy meter. But despite the abstract nature of entropy, we can have a qualitative and usable understanding of it.
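If you like to see the arithmetic spelled out, here is a minimal sketch in Python. The numbers are my own illustration, nothing from the story: say 1000 joules of heat flow into an object held at 300 kelvin.

```python
# Sketch of the S = Q / T idea: heat flowing into an object held at a fixed temperature.
Q = 1000.0          # heat transferred, in joules (illustrative value)
T = 300.0           # absolute temperature, in kelvin (illustrative value)
delta_S = Q / T     # entropy change of the object receiving the heat
print(f"Entropy change: {delta_S:.2f} J/K")   # about 3.33 J/K
```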
Across history, entropy has been studied at different times for different purposes, for example, to build the best possible heat engine. In a word, the notion of entropy has evolved. Different goals have sometimes brought out different aspects of entropy, and it has been shown that they are compatible. There is even an entropy associated with information, which I admit I do not understand.
Entropy is a central concept in the branch of physics called thermodynamics. Thermodynamics is the science of all forms of energy, so it underlies all sciences. Energy is the capacity for doing work, such as moving or lifting a heavy object. Another form of “work” for which energy is required is the movement of electrons in a wire (called electric current) or the movement of electrically charged ions in the nervous system. There are many others.
Biological, chemical and physical sciences explain that everything is composed of other, smaller things, with a few exceptions (the most elementary particles, such as quarks). Even an electromagnetic field, such as light, is composed of photons. To understand the word “composed”, one can imagine an assembly of molecules or grains of something. Food lovers can think of a sauce with water and wine molecules, ground peppercorns, salt ions, aromatic molecules of mushroom, basil and other good things. Yum! (“Yum” in French is “miam”. Thought you’d like to know.)
Now, back to entropy. There are several ways to explain entropy, but the following is certainly the most intuitive version.
Entropy is (roughly) a measure of the probability that something composed of many parts is found in a certain state, i.e., with a certain configuration or order of its components.
Paul’s socks
Examples, you want examples? All right, OK.
I have a friend whose mother was very hard on him during his childhood. I’ll explain. She insisted that he place his pale blue socks on the left in a drawer of his dresser and his pink socks on the right in the same drawer. You understand immediately how much it upset poor Paul. So what does entropy have to do with it? Simple, my friend.
For simplicity, imagine that Paul has but three pairs of socks: three pale blue socks and three pink ones. (He always wore socks of two different colors. He did not say whether he assigned a specific color to each foot, and I did not ask.) At the end of wash day, he can put away his socks in the following manner.
Let’s start with the pale blue socks and assume they are numbered so that we can refer to them separately. (This calculation is not rigorous, but it gives the idea.) For the first sock, Paul can choose number one, number two or number three, having therefore three possible choices. After this difficult selection, he has only two pale blue socks left, i.e., only two choices, sock number two or number three, assuming he has already chosen number one. And finally, there is only one choice for the third, the last remaining sock. So there are three choices for the first. For each of these, there are two choices for the second, making a total of six choices (two for each of the three, 3 times 2 is 6). And for the last, there is only one sock and therefore one choice. The same goes for the pink socks: six ways. Since each of the six blue arrangements can go with any of the six pink ones, the choices multiply, and Paul has 6 times 6, or 36 ways in all to store his socks in order. Note that, since the socks are not really numbered, Paul’s mother, checking up on her son’s obedience, has no way of knowing which of the 36 arrangements he chose, but she does see that the socks are arranged the way she wants.
One day, in a fit of despair, Paul pulled out the drawer and shook it violently for a minute or two before calming down. In what state were the socks after that? All mixed together, n’est-ce pas? So when Paul shook the drawer, entropy struck! Now the socks are in a fairly complete mess. Let’s count how many ways this could happen, starting from the same wash day. Now any sock can go anywhere in the drawer. So for the first, Paul may take any of the six socks. For the second choice, he still has five socks to choose from for each choice of the first sock. That makes 6 times 5, or 30 choices for just the first two socks. For the third choice, there remain four socks, so the total number of choices so far is 6 times 5 times 4 = 120. We still have 3 socks, and we already know that 3 socks may be chosen six ways, so the total is 120 times 6, or 720 choices!
So there are 36 possible ways to store the socks in order, pale blue on one side and pink on the other, and 720 ways to do it in disorder. Which is more likely? Obviously, disorder, by 720 against 36. And entropy is a measure of this disorder.
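If you prefer to let a machine do the counting, here is a tiny sketch (mine, following the same simplified counting as above) that reproduces the two numbers:

```python
from math import factorial

# Ordered drawer: the 3 blue socks can be arranged among themselves in 3! ways,
# and independently the 3 pink socks in 3! ways, so the choices multiply.
ordered = factorial(3) * factorial(3)    # 6 * 6 = 36

# Shaken drawer: any of the 6 socks can end up in any position, so 6! arrangements.
disordered = factorial(6)                # 720

print(ordered, disordered)               # 36 720
```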
Of course, physicists are serious people and do not speak (much) of the entropy of a sock drawer, but they do talk about the entropy of molecules, for example, those of a gas. For a gas, a sample of which may contain something like 10²³ molecules, disorder wins every time. Physicists talk about the number of different micro-states (pink sock here, blue sock there) corresponding to the same macro-state.
Raoul’s marbles
Raoul has a box full of identical marbles. Suppose that we exchange two marbles. You can specify the location of each marble individually and thereby talk about their detailed states (micro-states) before and after the exchange of the two marbles. These micro-states are different. But to little Raoul’s eyes, the box of marbles (the macro-state) has not changed at all; Raoul is unable to distinguish between the two micro-states. In this way, a lot of different micro-states are taken as being the same macro-state. The greater the number of marbles, the greater the number of combinations that seem identical to Raoul and the greater the entropy of the box of marbles.
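Here is a tiny sketch, my own illustration within that same picture (marbles distinguishable in principle but identical to Raoul’s eye), showing how fast that count grows with the number of marbles:

```python
from math import factorial

# For N marbles, every relabeling of which marble sits where is a different
# micro-state, but Raoul sees only one macro-state: "a box of marbles".
for N in (5, 10, 20):
    W = factorial(N)                         # number of micro-states
    print(f"{N} marbles -> {W} micro-states")
```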
How simple it really is!
The laws of thermodynamics
OK, this part is a tiny bit hairy; then we get back to the fun stuff. So hang on to your seat.
The physicist who came up with this way of defining entropy was named Ludwig Boltzmann. On his tombstone is engraved the formula that he introduced to the world:
S = k log W
where S is the entropy, k is a constant named after Herr Dr. Boltzmann, and W is the number of possible micro-states (36 or 720 for the socks) which correspond to the same macro-state (the sock drawer). (W is for Wahrscheinlichkeit, German for probability.) If you have forgotten, the logarithm to base 10 of a number is the power of 10 which corresponds to that number. E.g., log(100) = 2, because 10² = 100. (On the tombstone the logarithm is the natural one, base e, but the story is the same in any base.) In other words, the greater the number of micro-states, the greater the entropy, since entropy is proportional to the logarithm of the number of micro-states. Greater entropy corresponds to a greater degree of disorder.
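Purely to show the arithmetic (real physical systems have unimaginably larger W; the sock numbers are just our toy example), here is a small sketch of Boltzmann’s formula with the natural logarithm:

```python
from math import log

k = 1.380649e-23                 # Boltzmann's constant, in joules per kelvin
for W in (36, 720):              # ordered drawer vs. shaken drawer
    S = k * log(W)               # S = k log W, natural logarithm
    print(f"W = {W:>3}  ->  S = {S:.2e} J/K")
```

The resulting entropies are tiny because k is tiny; what matters is the comparison: the shaken drawer, with more micro-states, gets the larger entropy.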
But why are we talking about entropy, what’s the point? Well, because the second law of thermodynamics says:
In a physical process, the entropy of the world always increases.
It’s pretty obvious when you think about the story of Paul shaking his sock drawer. For your information, the first law says:
The energy of the universe does not change; it is always conserved.
This is the justly famous law of conservation of energy.
There is a third law, which would take longer to explain, but it goes like this:
The entropy of a perfect crystal at a temperature of absolute zero (0 kelvin, or −273.15 °C) is zero.
But there’s one more. After all that, someone noticed that there should be another law more fundamental than the three others, making the laws of thermodynamics a trinity of four laws; it was therefore named the zeroth law:
If A is a system in thermal equilibrium with the system B, and B is in thermal equilibrium with the system C, then the systems A and C are also in thermal equilibrium.
So what, you ask? Well, this means that there is a way to measure the thermal state of a system. That way is the system B, where B is simply a thermometer in thermal equilibrium with A and with C, and “in thermal equilibrium” means that both have the same temperature. This is very important. What would we talk about if we did not have the zeroth law? “I think that the logarithm of the entropy will increase tomorrow.” No way!
Thermodynamics is assumed to be valid everywhere in the universe except for very small objects such as elementary particles (protons, electrons, quarks, etc.). Physicists are well aware that today’s theories will be modified and augmented by those of tomorrow… except for the second law of thermodynamics. They believe firmly in it!
More on disorder
Paul’s wife, Drusilla, often reminisces about her childhood holidays at the seaside and about the sand castles she loved to build. Beautiful castles with moats, drawbridges, battlements and a huge tower in the center. Let’s think about that.
When Drusilla built a sand castle, she placed the grains exactly where she wanted them. Since she was very exacting (or finicky), it was as if she placed each grain in a specific location of the castle and not anywhere else. Nothing was left to chance. But when Drusilla’s mother called her to come and eat and go to bed, she went to bed sad, because she knew what she would discover the next morning. Getting up early and running in pajamas to the beach, with the morning sunlight in her eyes and the strident noise of screeching seagulls in her ears, she came to her castle – or what was left of it – and found that the tide (the assistant of entropy) had done its job. The beautiful, well-ordered castle had been reduced to a shapeless heap of wet sand. Mud. But Drusilla was brave – or stubborn, if you prefer – so she ate her breakfast and began again.
After having seen the case of Paul’s socks, it is clear that the choice of destination of Drusilla’s sand grains was similar, although the grains were much more numerous than Paul’s socks and the destinations (wall, keep, etc. but not the moats) outnumbered both sides of Paul’s dresser drawer. She placed the grains, as already stated, to create order. If she had just thrown the grains anywhere on the site of the future non-castle, it would have formed a heap quite similar to that left by the tide, in complete disorder – and therefore greater entropy.
So we should notice two things:
- Nature has selected the state of greater disorder; in doing so, it follows the Second Law and increases the entropy of the universe.
- Drusilla had to work, i.e., to expend energy, to build the ordered castle. Although she reduced by a small amount the entropy of one corner of the universe, in the form of the ordered castle, the work she did increased the entropy elsewhere (her muscles gave off heat, for one thing), so the total entropy of the universe still increased.
That is why the Second Law and entropy are important to us. It takes work, i.e., energy (the capacity for work), to reduce the entropy of one small bit of the universe, while increasing the entropy of the universe as a whole. But in the end, nature always finds a way to thwart our plans. We are allowed to find that a little sad.
And the egg
We can now see the role of entropy in everyday events, such as an egg that breaks. Or rather, one that I break because I don’t pay enough attention to what I am doing. The whole egg is a well-ordered structure; the broken egg is much less organized, and so in a state of higher entropy.
But if I break an egg on purpose in order to make an omelet, the well-stirred egg is even more disorganized and therefore in an even greater state of entropy.
We can say that time never backs up, but always runs forwards, because that is the direction in which entropy can increase, as in the following example …
The human body
Entropy = death! Alas, yes. Our bodies’ cells, tissues and organs are highly ordered biological objects. We maintain this order by expending energy, just like Drusilla did for her castles. The energy comes from the food we eat, which our bodies and cells digest and convert into carriers of energy, especially ATP (adenosine triphosphate, since you asked), which transmits the energy it contains to cells in order to maintain the metabolism of the whole body, the electricity in neurons and the flexing of the muscles. But with age, disorder comes in the form of small pains, arthritis and more serious things. And death is followed by disintegration (total disorder = higher entropy). Old rockin’ chair may get us, but entropy will get us last.
More abstract – information
We have seen that entropy is a measure of the number of indistinguishable micro-states which correspond to the same macro-state of something. Exchange two blue socks or two grains of sand and we do not notice the difference. So entropy corresponds to missing information, data that we could imagine knowing but don’t: whether grain of sand 2,143,768 is on top of grain 3,472,540 or the other way around, for example. So the higher the entropy, the less we know about the underlying micro-state.
Entropy is thus a measure of the hidden information contained in a system.
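Just to make that reading concrete with the sock numbers from earlier (a sketch of my own, not a real detour into information theory):

```python
from math import log2

# If W micro-states all look like the same macro-state, pinning down which one
# we actually have would take about log2(W) bits: the information we are missing.
for W in (36, 720):
    print(f"W = {W:>3}  ->  about {log2(W):.1f} bits of missing information")
```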
But we will not go into information theory here.
And finally, the universe
Galaxies, stars, our solar system and good old Earth (Carl Sagan’s “pale blue dot”) are all ordered at the expense of nuclear energy from the stars or of space itself (gravitational fields or electromagnetic fields of quantum origin). One day, entropy will “prevail” there too. Even if the human species finds refuge elsewhere before our sun swells into a red giant in five billion years or so, the entire universe will eventually become diffuse, cold and dark. Carpe diem!
This still leaves us time to realize (or not) our dreams of glory or wealth, art or sexual prowess, knowledge or humanity – to prove to ourselves that we can shake off our evolutionary inheritance.
Entropy is not a measure of disorder
The late great Professor Lambert went to a great deal of trouble to get our textbooks corrected:
http://www.esalq.usp.br/lepse/imgs/conteudo_thumb/Entropy-Is-Simple—If-We-Avoid-The-Briar-Patches.pdf
And this YouTube video gives a really good visual summary:
https://www.youtube.com/watch?v=YM-uykVfq_E
I do know about Lambert’s ideas and refer to them on my (other) page, Thermodynamics. Thanks for your comment. Nice video.