The Irony of Entropy

If you’ve studied or read much physics, or science in general—or, more recently, information theory—you’ve probably come across the subject of entropy.  Entropy is one of the most prominent and respected, if not revered, concepts in all of science.  It’s often roughly characterized as a measure of the “disorder” in a system, but this doesn’t refer to disorder in the sense of “chaos,” in which outcomes depend so sensitively on initial states, through such high degrees of feedback and internal interaction, that no reasonable amount of information can tell you what those specific outcomes will be.  Entropy is, in a sense, just the opposite of that.  A state of maximal entropy is a state in which the future is always—or near enough that it doesn’t matter—going to look pretty much the same as the present.  A cliché way of demonstrating this is to compare a well-shuffled deck of cards to one with the cards “in order”.  Each possible shuffled configuration is unique, but for practical purposes nearly all of them are indistinguishable, and there are vastly greater numbers of ways for a deck to be “out of order” than “in order”.

Let’s quickly do that math.  The number of orders into which a deck of fifty-two cards can be randomly shuffled is 52 × 51 × 50 × … × 3 × 2 × 1, traditionally notated as 52!  It’s a big number.  How big?

80658175170943878571660636856403766975289505440883277824000000000000.

To quote Stephen Fry on Q.I., “If every star in our galaxy had a trillion planets, and each planet had a trillion people living on it, and each person had a trillion packs of cards, which they somehow managed to shuffle simultaneously at 1000 times per second, and had done this since the Big Bang, they would only just, in 2012, be starting to get repeat shuffles.”  Now, how many ways are there to have a deck of cards arranged with the cards in increasing order (Ace to King) within each suit, even without worrying about the ordering between the suits?  If my math is correct, there are only 4! ways to do that, which is 4 x 3 x 2 x 1, or 24.  To call that a tiny fraction of the above number is a supreme understatement.  This comparison should give you an idea of just how potent the tendencies are with which we’re dealing.
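If you’d like to check that arithmetic yourself, a few lines of Python (my choice here; any language with big-integer arithmetic would do) reproduce both numbers and the ratio between them:

```python
from math import factorial

# Number of distinct orderings of a 52-card deck
total_shuffles = factorial(52)   # 52! -- roughly 8.07 x 10^67

# "Ordered" decks: each suit running Ace to King in one contiguous block,
# counting all possible orderings of the four suit blocks
ordered_decks = factorial(4)     # 4! = 24

print(total_shuffles)
# 80658175170943878571660636856403766975289505440883277824000000000000
print(ordered_decks)                   # 24
print(ordered_decks / total_shuffles)  # ~3.0e-67, a supremely tiny fraction indeed
```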

You could describe entropy as a measure of how much of a system’s energy has become “useless,” no longer available to do useful work.  Entropy is, famously, the subject of the Second Law of Thermodynamics, and that law states that, in any closed system, entropy always tends to stay the same or increase, usually the latter.  (The First Law is the one that says that in a closed system total energy is constant.)

When energy is “partitioned”—say you have one part of a room that’s hot and another part of a room that’s cold—there’s generally some way to harness that energy’s tendency to equilibrate, and to get work done that’s useful for creatures like us.  Entropy is the measure of how far that energy has gone toward equilibrium, the state in which there’s no useful difference between one part of the room and the other.
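To put a rough number on that idea of harnessing a difference, here’s a minimal sketch using the textbook Carnot limit on how much of the heat flowing from a hot region to a cold one can be turned into work; the specific temperatures and the 1000-joule figure are just made-up illustrations:

```python
# Carnot limit: the largest fraction of heat flowing from a hot region
# (T_hot) to a cold region (T_cold) that can be converted into useful work
# is 1 - T_cold / T_hot, with temperatures in kelvins.

def max_work_from_gradient(heat_joules: float, t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on the work extractable from heat flowing down a temperature gradient."""
    efficiency = 1.0 - t_cold_k / t_hot_k
    return heat_joules * efficiency

# Illustrative numbers: one side of the room at 320 K, the other at 290 K.
print(max_work_from_gradient(1000.0, 320.0, 290.0))  # 93.75 J of the 1000 J, at best

# Once the room has equilibrated, there is no gradient left to harness.
print(max_work_from_gradient(1000.0, 300.0, 300.0))  # 0.0
```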

This draws attention to the irony of entropy.  The tendency of systems to become more entropic drives the various chemical and physical processes on which life depends.  Energy tends to flow down gradients until there’s no gradient left, and it’s this very tendency that creates the processes that life uses to make local order.  But that local order can only be accomplished by allowing, and even encouraging, the entropy of the overall world to increase, often leading to a more rapid general increase than would have happened otherwise.  Think of burning gasoline to make a car go.  You achieve useful movement that can accomplish many desired tasks, but in the process you convert the fuel into smaller, simpler, less organized, higher-entropy products than it would have reached, if left alone, for a far longer time.  The very processes that sustain life—that are life—can only occur by harnessing and accelerating the increase of net entropy in the world around them.

Although it seems like the principle best embodied in Yeats’s “The Second Coming,” wherein he states, “Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world,” entropy is held in highest regard—or at least unsurpassed respect—by physicists.  Sir Arthur Eddington famously pointed out that, if your ideas seem to contradict most of the accepted laws of physics, or seem to go against experimental evidence, it’s not necessarily disreputable to maintain them, but if your ideas contradict the Second Law of Thermodynamics, you’re simply out of luck.  And Einstein said of the laws of thermodynamics, and of entropy in particular, “It is the only physical theory of universal content, which I am convinced, that within the framework of applicability of its basic concepts will never be overthrown.”

The reason the Second Law of Thermodynamics is so indisputable is that, at root, it owes its character to basic mathematics, to probability and statistics.  As I demonstrated in the playing card example, there are vastly more ways for things to be “disordered”—more arrangements of reality that are indistinguishable one from another—than there are configurations that contain gradients or differences that can give rise to patterns and “useful” information.  WAAAAAAY more.
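The same lopsided counting shows up with something even simpler than cards, say a row of coins; here’s a quick sketch (the hundred coins are just an arbitrary stand-in for any collection of two-state “things”):

```python
from math import comb

N = 100  # a hundred coins, each either heads or tails

all_heads     = 1                # exactly one perfectly "ordered" arrangement
half_and_half = comb(N, N // 2)  # arrangements with exactly 50 heads and 50 tails
total         = 2 ** N           # every possible arrangement

print(half_and_half)          # 100891344545564193334812497256, about 1.0e29
print(half_and_half / total)  # ~0.08: this single most "mixed" count alone covers
                              # roughly 8% of all arrangements, and the counts
                              # near it cover nearly all of the rest
```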

The Second Law isn’t at its heart so much a law of physics as it is a mathematical theorem, and mathematical theorems don’t change.  You don’t need to retest them, because logic demands that, once proven, they remain correct.  We know that, in a flat plane, the squares of the lengths of the two shorter sides of a right triangle add up to the square of the length of the longest side.  (You can prove this for yourself relatively easily; it’s worth your time, if you’re so inclined.)  We know that the square root of two is an irrational number (one that cannot be expressed as a ratio of any two whole numbers, no matter how large).  We know that there are infinitely many prime numbers, and that the infinity of the “real” numbers is a much larger infinity than the one that describes the integers.  These facts have been proven mathematically, and we need no longer doubt them, for the very logic that makes doubt meaningful sustains them.  It’s been a few thousand years since most of these facts were first demonstrated, and no one has needed to update those theorems (though they might put them in other forms).  Once a theorem is done, it’s done.  You’re free to try to disprove any of the facts above, but I would literally bet my life that you will fail.
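For the second fact on that list, the classic argument is short enough to sketch right here, in case you’d like to see how such a proof goes (the Pythagorean one I’ll leave to you, as suggested above):

```latex
% Classic proof by contradiction that sqrt(2) is irrational.
% Suppose sqrt(2) = p/q, where p and q are whole numbers with no common factor.
\sqrt{2} = \frac{p}{q}
\;\Longrightarrow\; p^2 = 2q^2
\;\Longrightarrow\; p^2 \text{ is even}
\;\Longrightarrow\; p \text{ is even, say } p = 2k
\;\Longrightarrow\; 4k^2 = 2q^2
\;\Longrightarrow\; q^2 = 2k^2
\;\Longrightarrow\; q \text{ is even as well.}
% But then p and q share the factor 2, contradicting our assumption,
% so no such ratio exists: sqrt(2) is irrational.
```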

The Second Law of Thermodynamics has a similar character, because it’s at bottom a statement comparing the number of arrangements of “things” that are indistinguishable from one another with the number of arrangements that either carry useful information or can be harnessed to be otherwise useful in supporting—or in being—lifeforms such as ourselves.  Entropy isn’t the antithesis of life, for without its tendency to increase, life could neither exist nor sustain itself.  But its nature demands that, in the long run, all relative order will come to an end, in the so-called “heat death” of the universe.

Of course, entropy is probabilistic in character, so given a universe-sized collection of random elementary particles, if you wait long enough, they will eventually come together into some arrangement we would recognize as a universe like ours.  Likewise, if you shuffle a deck of cards often enough, you will occasionally shuffle them into a state of ordered suits, and if you play your same numbers in the Powerball lottery often enough, for long enough, you will eventually win.
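Just how long is “long enough”?  A quick back-of-the-envelope sketch gives a feel for it; note that the Powerball figures below assume the current American format of five white balls drawn from 69 plus one red ball from 26, and two drawings a week, both of which are my assumptions rather than anything stated above:

```python
from math import comb, factorial

# Chance that a single fair shuffle lands in one of the 24 "ordered" arrangements
p_ordered = 24 / factorial(52)
print(1 / p_ordered)       # ~3.4e66 shuffles expected before you see one

# Powerball jackpot odds (assuming 5 white balls from 69 and 1 red ball from 26)
combinations = comb(69, 5) * 26
print(combinations)        # 292201338 -- about 1 chance in 292 million per ticket
print(combinations / 104)  # ~2.8 million years of waiting, on average,
                           # playing one ticket at two drawings per week
```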

Want my advice?  Don’t hold your breath.

Odds are we should teach probability and statistics

Among the many educational reforms that I think we ought to enact in the United States, and probably throughout the world, one of the most useful would be to begin teaching all students about probability and statistics.  These should be taught at a far younger age than that at which most people begin to learn them—those who ever do, that is.  Most of us don’t get any exposure to the concepts until we go to university, if we get it even there.  My own first real, deep exposure to probability and statistics took place when I was in medical school…and I had a significant scientific background even before then.

Why should we encourage young people to learn about such seemingly esoteric matters?  Precisely because they seem so esoteric to us.  Statistics are quoted with tremendous frequency in the popular press, in advertising, and in social media of all sorts, but the general public’s understanding of them is poor.  This appears to be an innate human weakness, not merely a failure of education.  We learn basic arithmetic with relative ease, and even the fundamentals of Newtonian physics don’t seem too unnatural when compared with most people’s intuitions about the matter.  Yet in the events of everyday life, statistics predominate.  Even so seemingly straightforward a relation as the ideal gas law (PV=nRT, relating the pressure, volume, amount, and temperature of a gas) is the product of the statistical effects of innumerable molecules interacting with each other.  In this case, the shorthand works well enough, because the numbers involved are so vast, but in more ordinary interactions of humans with each other and with the world, we do not have numbers large enough to produce reliable, simplified formulae.  We must deal with statistics and probability.  If we do not, then we will fail to deal with reality as accurately as we could, which cannot fail to have consequences, usually bad ones.  As I often say (paraphrasing John Mellencamp), “When you fight reality, reality always wins.”

In defense of scientism

[Originally posted on robertelessar.com on July 20th, 2017]

On this 48th anniversary of the Apollo 11 moon landing, I want to talk a little bit about science, and how it, in principle, can apply to nearly every subject in life.

The word science is derived from the Latin scientia (“knowledge”), which comes in turn from the verb scire, “to know.”  I am, as you might have guessed, a huge fan of science, and have in the past even been a practitioner of it.  But science is not just a collection of facts, as many have said before me.  Science is an approach to information, and more generally to reality itself, a blend of rationalism and empiricism that calls on us to apply reason to the phenomena we find in our world and to understand, with increasing completeness, the rules by which our world operates.  Personally, I think there are few—and possibly no—areas to which the scientific method cannot be applied to give us a greater understanding of, insight into, and control of, our world and our experience.

What Are Black Holes?

Originally posted February 23, 2012

A very old friend of mine—that is, one I’ve known a long time; he’s no older than I am, and I hope I don’t yet count as “very” old—suggested that I write an article about what exactly black holes are.  So I thought about it, for all of about two seconds, and realized that black holes would be a great topic for a general science article.

Diabetes For Beginners – Part 1

Diabetes is an illness of which I suspect almost all adults in America are aware. I also suspect that most people know that it has something to do with high blood sugar and that having high blood sugar is a bad thing. Still, I imagine there are a fair few people out there who haven’t really got a lot more understanding of it than that—including some people who have the disease—because they haven’t really had it explained to them in terms they can follow. After all, doctors—of which I am one—don’t often take the time necessary to make sure that their patients fully understand the ins and outs of a disease process. Partly this is because, when one understands something on a very complex level, it seems like it’s going to take serious effort to explain it to someone who doesn’t have the same educational background. However, I think this is a failure of imagination and a bit of mental laziness on our part as doctors. The Nobel-Prize-winning physicist Richard Feynman used to prepare “freshman lectures” about physics subjects when laypeople asked him about topics they didn’t understand. If he found that he couldn’t prepare one, he recognized that failure as an indication that the subject wasn’t well-enough understood!