The Irony of Entropy

If you’ve studied or read much physics, or science in general—or, more recently, information theory—you’ve probably come across the subject of entropy.  Entropy is one of the most prominent and respected, if not revered, concepts in all of science.  It’s often roughly characterized as a measure of the “disorder” in a system, but this doesn’t refer to disorder in the sense of “chaos”, where outcomes depend so sensitively on initial states, through such high degrees of feedback and internal interaction, that no reasonable amount of information will tell you what those specific outcomes will be.  Entropy is, in a sense, just the opposite of that.  A state of maximal entropy is a state where the outcome is always—or near enough that it doesn’t matter—going to be pretty much the same as it is now.  A cliché way of demonstrating this is to compare a well-shuffled deck of cards to one with the cards “in order”.  Each possible shuffled configuration is unique, but for practical purposes nearly all of them are indistinguishable, and there are vastly more ways for a deck to be “out of order” than “in order”.

Let’s quickly do that math.  The number of orders into which a deck of fifty-two cards can be randomly shuffled is 52 × 51 × 50 × … × 3 × 2 × 1, traditionally notated as 52!  It’s a big number.  How big?

80658175170943878571660636856403766975289505440883277824000000000000, or roughly 8 × 10^67.

To quote Stephen Fry on QI, “If every star in our galaxy had a trillion planets, and each planet had a trillion people living on it, and each person had a trillion packs of cards, which they somehow managed to shuffle simultaneously at 1000 times per second, and had done this since the Big Bang, they would only just, in 2012, be starting to get repeat shuffles.”  Now, how many ways are there to arrange a deck so that each suit is grouped together and sorted in increasing order (Ace up through King), even without worrying about the ordering between the suits?  If my math is correct, there are only 4! ways to do that, which is 4 × 3 × 2 × 1, or 24: one for each way of ordering the four suit blocks.  To call that a tiny fraction of the above number is a supreme understatement.  This comparison should give you an idea of just how potent the tendencies are with which we’re dealing.
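As a quick sanity check, here’s that arithmetic in a few lines of Python (a minimal sketch using only the standard library; nothing here comes from the post beyond the numbers it already quotes):

```python
import math

# Number of distinguishable orderings of a 52-card deck.
total = math.factorial(52)        # 52! ~ 8.07e67

# Orderings in which each suit is grouped and sorted Ace..King:
# only the order of the four suit blocks is free to vary.
sorted_decks = math.factorial(4)  # 4! = 24

print(f"52! = {total}")
print(f"fraction of decks that are 'in order': {sorted_decks / total:.2e}")
```

That fraction comes out to about 3 × 10^-67, which is what the Second Law is quietly counting.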

You could describe entropy as a measure of how “useless” a system’s energy has become.  Entropy is, famously, the subject of the Second Law of Thermodynamics, which states that, in any closed system, entropy always stays the same or increases, usually the latter.  (The First Law is the one that says that in a closed system total energy is constant.)

When energy is “partitioned”—say one part of a room is hot and another part is cold—there’s generally some way to harness that energy’s tendency to equilibrate, and to get work done that’s useful for creatures like us.  Entropy is the measure of how far that energy has progressed toward equilibrium, the state in which there’s no useful difference between one part of the room and the other.
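In statistical mechanics this counting picture is made exact by Boltzmann’s entropy formula (a standard textbook result, stated here rather than derived), which ties the entropy S of a macroscopic state directly to the number Ω of microscopic arrangements that look the same from the outside:

$$ S = k_B \ln \Omega, \qquad k_B \approx 1.38 \times 10^{-23}\ \mathrm{J/K} $$

The more indistinguishable microscopic arrangements a state has, the higher its entropy, which is exactly why the well-shuffled deck stands in for the equilibrated room.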

This draws attention to the irony of entropy.  The tendency of systems to become more entropic drives the various chemical and physical processes on which life depends.  Energy tends to flow down gradients until there’s no energy gradient left, and it’s this very tendency that creates the processes that life uses to make local order.  But that local order can only be accomplished by allowing, and even encouraging, the entropy of the overall world to increase, often more rapidly than it would have increased otherwise.  Think of burning gasoline to make a car go.  You achieve useful movement that can accomplish many desired tasks, but in the process you break the fuel down into smaller, simpler, less organized, higher-entropy products that, left alone, it would not have reached for a far longer time.  The very processes that sustain life—that are life—can only occur by harnessing and accelerating the increase of net entropy in the world around them.

Although it seems like the principle best embodied in Yeats’s “The Second Coming”, wherein he writes, “Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world”, entropy is held in the highest regard—or at least unsurpassed respect—by physicists.  Sir Arthur Eddington famously pointed out that if your ideas seem to contradict well-established laws of physics, or seem to go against experimental evidence, it’s not necessarily disreputable to maintain them, but if your ideas contradict the Second Law of Thermodynamics, you’re simply out of luck.  And Einstein said of the laws of thermodynamics, and of entropy in particular, “It is the only physical theory of universal content, which I am convinced, that within the framework of applicability of its basic concepts will never be overthrown.”

The reason the Second Law of Thermodynamics is so indisputable is that, at root, it owes its character to basic mathematics, to probability and statistics.  As I demonstrated in the playing card example, there are vastly more ways for things to be “disordered”—more arrangements of reality that are indistinguishable one from another—than there are configurations that contain gradients or differences that can give rise to patterns and “useful” information.  WAAAAAAY more.
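You can watch that statistical character in action with a toy model (my own illustrative sketch, a version of the classic Ehrenfest urn, not anything from this post): N particles in a room, and at each step one randomly chosen particle hops to the other half.  However lopsided the starting arrangement, the count drifts toward fifty-fifty, simply because there are overwhelmingly more near-balanced arrangements than lopsided ones:

```python
import random

# Ehrenfest urn: all N particles start in the left half of the room.
# Each step, pick one particle uniformly at random and move it to
# the other half. No force pushes toward balance; only counting does.
N = 1000
left = N

for step in range(1, 20001):
    if random.random() < left / N:
        left -= 1   # the chosen particle was on the left; it hops right
    else:
        left += 1   # it was on the right; it hops left
    if step % 5000 == 0:
        print(f"step {step:6d}: {left:4d} of {N} particles on the left")
```

Run it and the left-hand count falls from 1000 to hover around 500, then just rattles near there; that rattling neighborhood is equilibrium.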

The Second Law isn’t at its heart so much a law of physics as it is a mathematical theorem, and mathematical theorems don’t change.  You don’t need to retest them, because logic demands that, once proven, they remain correct.  We know that, in a flat plane, the squares of the lengths of the two shorter sides of a right triangle add up to the square of the length of the longest side.  (You can prove this for yourself relatively easily; it’s worth your time, if you’re so inclined.)  We know that the square root of two is an irrational number (one that cannot be expressed as a ratio of any two whole numbers, no matter how large).  We know that there are infinitely many prime numbers, and that the infinity of the “real” numbers is a much larger infinity than the one that describes the integers.  These facts have been proven mathematically, and we need no longer doubt them, for the very logic that makes doubt meaningful sustains them.  It’s been a few thousand years since most of these facts were first demonstrated, and no one has needed to update those theorems (though they might put them in other forms).  Once a theorem is done, it’s done.  You’re free to try to disprove any of the facts above, but I would literally bet my life that you will fail.
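To give the flavor of why such facts stay settled, here’s the classic argument for the second claim, compressed (the standard proof by contradiction, nothing novel):

$$ \sqrt{2} = \tfrac{p}{q} \text{ in lowest terms} \;\Rightarrow\; p^2 = 2q^2 \;\Rightarrow\; p = 2r \;\Rightarrow\; q^2 = 2r^2 \;\Rightarrow\; q \text{ even}, $$

so both p and q would be even, contradicting “lowest terms.”  No such fraction can exist, and no future experiment can change that.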

The Second Law of Thermodynamics has a similar character, because at bottom it’s just a statement about how many arrangements of “things” are indistinguishable from one another, compared with how few carry useful information or can be harnessed to be otherwise useful in supporting—or in being—lifeforms such as us.  Entropy isn’t the antithesis of life, for without its tendency to increase, life could neither exist nor sustain itself.  But its nature demands that, in the long run, all relative order will come to an end, in the so-called “heat death” of the universe.

Of course, entropy is probabilistic in character, so given a universe-sized collection of random elementary particles, if you wait long enough, they will eventually come together in some arrangement we would recognize as a universe.  Likewise, if you shuffle a deck of cards often enough, you will occasionally shuffle them into a state of ordered suits, and if you play your same numbers in the Powerball lottery often enough, for long enough, you will eventually win.
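Just how long is “long enough”?  Here’s the back-of-the-envelope version for the deck, reusing the numbers from earlier and treating each shuffle as an independent uniform draw (an idealization, of course):

```python
import math

# Expected shuffles before hitting one of the 24 suit-sorted decks.
shuffles_needed = math.factorial(52) / math.factorial(4)

seconds = shuffles_needed / 1000              # at 1000 shuffles per second
age_of_universe_s = 13.8e9 * 365.25 * 86400   # ~4.35e17 seconds

print(f"expected wait: {seconds:.2e} seconds")
print(f"~ {seconds / age_of_universe_s:.1e} times the age of the universe")
```

That works out to nearly 10^46 lifetimes of the universe so far, for a single deck.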

Want my advice?  Don’t hold your breath.

5 thoughts on “The Irony of Entropy”

  1. A curious implication of entropy is that it begs the question of how life could ever develop. If the tendency of life or order is to become disorder, then how could life/order arise in the first place? Entropy presupposes that we ‘started’ with a fixed amount of order to progressively break down for energy. That is a very undesirable (even implausible) condition of theoretic validity, a paradox.

    As with all paradoxes, we should assume that our understanding or conceptualisation of order and energy is fundamentally flawed, rather than accept the paradoxical consequence of the present understanding. Some additional principle must be at play. This could mean that the law of conservation of energy is false, that energy (and order) can in fact be created, ad infinitum, which perhaps entails rejection of materialism.
