A cosmic perspective in everyday life

It can be intimidating to consider the size, scale, and scope of the universe in space and time, and to compare it to the size and length of our everyday lives.  It can make many of our daily concerns seem not merely small and trivial, but utterly irrelevant on the scale of all that happens.  If we’re not careful, it can even drive us into nihilism, or something close to it.  On a cosmic scale, nothing we do seems, at first glance, to matter or to have any real impact.  This can be daunting and disheartening.

While I think it’s not useful to go so far as to conclude that everything that happens to everyone is truly meaningless, I do think that taking a larger perspective—even a cosmic perspective—can be both illuminating and useful, and it might even lead us to approach life more rationally and more productively.

The Irony of Entropy

If you’ve studied or read much physics, or science in general—or, more recently, information theory—you’ve probably come across the subject of entropy.  Entropy is one of the most prominent and respected, if not revered, concepts in all of science.  It’s often roughly characterized as a measure of the “disorder” in a system, but this doesn’t refer to disorder in the sense of “chaos”, where outcomes are so dependent upon initial states, at such high degrees of feedback and internal interaction, that there’s no way to know based on any reasonable amount of information what those specific outcomes will be.  Entropy is, in a sense, just the opposite of that.  A state of maximal entropy is a state where the outcome is always—or near enough that it doesn’t matter—going to be pretty much the same as it is now.  A cliché way of demonstrating this is to compare a well-shuffled deck of cards to one with the cards “in order”.  Each possible shuffled configuration is unique, but for practical purposes nearly all of them are indistinguishable, and there are vastly greater numbers of ways for a deck to be “out of order” than “in order”.

Let’s quickly do that math.  The number of orders into which a deck of fifty-two cards can be randomly shuffled is 52 × 51 × 50 × … × 3 × 2 × 1, traditionally notated as 52! (“fifty-two factorial”).  It’s a big number.  How big?

80658175170943878571660636856403766975289505440883277824000000000000.

To quote Stephen Fry on QI, “If every star in our galaxy had a trillion planets, and each planet had a trillion people living on it, and each person had a trillion packs of cards, which they somehow managed to shuffle simultaneously at 1000 times per second, and had done this since the Big Bang, they would only just, in 2012, be starting to get repeat shuffles.”  Now, how many ways are there to have a deck arranged suit by suit, with the cards in increasing order (Ace to King) within each suit, without worrying about the ordering of the suit blocks themselves?  If my math is correct, there are only 4! ways to do that (one for each ordering of the four suits), which is 4 × 3 × 2 × 1, or 24.  To call that a tiny fraction of the above number is a supreme understatement.  This comparison should give you an idea of just how potent the tendencies are with which we’re dealing.
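The arithmetic above is easy to check directly.  Here is a quick sketch in Python (the choice of language and the variable names are mine, not the post’s):

```python
import math

# Number of distinct orderings of a 52-card deck
shuffles = math.factorial(52)
print(shuffles)
# 80658175170943878571660636856403766975289505440883277824000000000000

# Orderings that count as "sorted": each suit in Ace-to-King order,
# with the four suit blocks themselves in any of 4! arrangements
sorted_decks = math.factorial(4)
print(sorted_decks)  # 24

# The fraction of all possible shuffles that are "in order"
print(sorted_decks / shuffles)  # ≈ 3e-67
```

Python’s integers are arbitrary-precision, so the full 68-digit value of 52! is computed exactly rather than rounded.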

You could describe entropy as a measure of how much of a system’s energy has become “useless.”  Entropy is, famously, the subject of the Second Law of Thermodynamics, which states that, in any closed system, entropy always tends to stay the same or increase, usually the latter.  (The First Law is the one that says that in a closed system total energy is constant.)

When energy is “partitioned”—say you have one part of a room that’s hot and another part that’s cold—there’s generally some way to harness that energy’s tendency to equilibrate, and to get work done that’s useful for creatures like us.  Entropy is the measure of how far that energy has come toward equilibrium, a state in which there’s no useful difference between one part of the room and the other.

This draws attention to the irony of entropy.  The tendency of systems to become more entropic drives the very chemical and physical processes on which life depends.  Energy tends to flow down gradients until no gradient is left, and it’s this very tendency that powers the processes life uses to make local order.  But that local order can only be achieved by allowing, and even encouraging, the entropy of the wider world to increase, often more rapidly than it would have otherwise.  Think of burning gasoline to make a car go.  You achieve useful movement that can accomplish many desired tasks, but in the process you convert the fuel into smaller, simpler, less organized, higher-entropy products, states it would not have reached for a far longer time if left alone.  The very processes that sustain life—that are life—can only occur by harnessing and accelerating the increase of net entropy in the world around them.

Although it seems like the principle best embodied in Yeats’s “The Second Coming,” wherein he writes, “Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world,” entropy is held in the highest regard—or at least unsurpassed respect—by physicists.  Sir Arthur Eddington famously pointed out that, if your ideas seem to contradict most understood laws of physics, or seem to go against experimental evidence, it’s not necessarily disreputable to maintain them, but if your ideas contradict the Second Law of Thermodynamics, you’re simply out of luck.  And Einstein said of the laws of thermodynamics, and of entropy in particular, “It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown.”

The reason the Second Law of Thermodynamics is so indisputable is because, at root, it owes its character to basic mathematics, to probability and statistics.  As I demonstrated in the playing card example, there are vastly more ways for things to be “disordered”—more arrangements of reality that are indistinguishable one from another—than there are configurations that contain gradients or differences that can give rise to patterns and “useful” information.  WAAAAAAY more.

The Second Law isn’t at its heart so much a law of physics as it is a mathematical theorem, and mathematical theorems don’t change.  You don’t need to retest them, because logic demands that, once proven, they remain correct.  We know that, in a flat plane, the squares of the lengths of the two shorter sides of a right triangle add up to the square of the length of the longest side.  (You can prove this for yourself relatively easily; it’s worth your time, if you’re so inclined.)  We know that the square root of two is an irrational number (one that cannot be expressed as a ratio of any two whole numbers, no matter how large).  We know that there are an infinite number of prime numbers, and that the infinity of the “real” numbers is a much larger infinity than that which describes the integers.  These facts have been proven mathematically, and we need no longer doubt them, for the very logic that makes doubt meaningful sustains them.  It’s been a few thousand years since most of these facts were first demonstrated, and no one has needed to update those theorems (though they might put them in other forms).  Once a theorem is done, it’s done.  You’re free to try to disprove any of the facts above, but I would literally bet my life that you will fail.
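To see why a theorem, once proven, stays proven, here is the classical argument that the square root of two is irrational, compressed into a few lines (my sketch, not the post’s):

```latex
\text{Assume } \sqrt{2} = \tfrac{p}{q}, \text{ with integers } p, q \text{ sharing no common factor. Then}
p^2 = 2q^2 \;\Rightarrow\; p^2 \text{ is even} \;\Rightarrow\; p \text{ is even, say } p = 2k.
\text{Substituting: } (2k)^2 = 2q^2 \;\Rightarrow\; q^2 = 2k^2 \;\Rightarrow\; q \text{ is even as well,}
\text{contradicting the assumption that } p \text{ and } q \text{ share no common factor.}
```

No experiment can overturn this; any “counterexample” would have to break the logic used to state it in the first place.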

The Second Law of Thermodynamics has a similar character, because it’s at bottom a statement about how many arrangements of “things” are indistinguishable from one another, compared to how many carry useful information or can be harnessed to be otherwise useful in supporting—or in being—lifeforms like us.  Entropy isn’t the antithesis of life, for without its tendency to increase, life could neither exist nor sustain itself.  But its nature demands that, in the long run, all relative order will come to an end, in the so-called “heat death” of the universe.

Of course, entropy is probabilistic in character, so given a universe-sized collection of random elementary particles, if you wait long enough, they will eventually come together in some arrangement we would recognize as a universe.  Likewise, if you shuffle a deck of cards often enough, you will occasionally shuffle it into a state of ordered suits, and if you play the same numbers in the Powerball lottery often enough, for long enough, you will eventually win.
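To put rough numbers on “often enough,” a back-of-the-envelope sketch (my arithmetic, not the post’s; it treats each shuffle as an independent uniform draw, so the expected wait is 1/p for a geometric distribution):

```python
import math

# Expected number of uniform random shuffles before hitting one of the
# 4! "sorted" arrangements (geometric distribution: mean = 1/p, and
# 4! divides 52! exactly, so integer division is exact here)
expected_shuffles = math.factorial(52) // math.factorial(4)
print(expected_shuffles)  # ≈ 3.4e66 shuffles on average

# At one shuffle per second, how many current ages of the universe
# (~13.8 billion years) would that expected wait amount to?
seconds_per_year = 365.25 * 24 * 3600
age_of_universe_years = 13.8e9
print(expected_shuffles / (seconds_per_year * age_of_universe_years))
# roughly 8e48 universe-ages
```

Hence the advice that follows.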

Want my advice?  Don’t hold your breath.

What is it with gravitons and black holes?

I occasionally wonder what physicists really think about the hypothetical particles called gravitons, the carriers of the gravitational force mandated by the need* to quantize all the forces of nature.  Specifically, I wonder how they behave in and around black holes.

I know, from my understanding of General Relativity, that the influence of gravity travels at the speed of light, and the recent LIGO results, and all other experimental results of which I’ve heard, are consistent with that.  This must surely mean that the proposed gravitons travel at the speed of light and are thus massless particles.  And if they carry a force, they must have some form of inherent energy, which means that, according to Einstein at least, their paths would be affected by gravity.  This seems contradictory in some ways, but it’s my understanding that the electrical force produced by a moving electron also acts backward on itself, so I guess that’s not completely unreasonable…though here I’m veering further away from any deep knowledge, much to my sorrow.

My real question applies to the surface of an event horizon, that boundary in space-time within which all things are separated from the outside by the strength of the gravitational force – more particularly, according to Einstein, by the degree of curvature of space-time.  If gravitons are particles carrying the gravitational force, are they constrained by the effects of the event horizon, or – presumably because they aren’t self-interacting – do they simply pass through it, the horizon being irrelevant to their motion, unlike for all other things with finite speeds…which means everything?  That sometimes seems contradictory to me, though by no means am I certain that I’m thinking correctly about this.  Could it be that the gravitons within and outside an event horizon are two separate populations, with the external ones somehow being generated at the horizon?  If not, then how can a particle ignore such extreme gravity, unless, as I mentioned above, gravitons are not self-interacting – which wouldn’t be unusual, since, if I understand correctly, photons also don’t interact with other photons.  But photons obviously would interact with gravitons, otherwise they wouldn’t be affected by gravity, as we know they are…the most extreme example of this being at a black hole.

I know that a possible explanation for this might be found in M theory, in which we exist in a 3-brane that floats in a larger, higher-dimensional “bulk,” and that gravitons, unlike all the more “ordinary” particles, are not constrained to remain within that brane but can pass above and below it, so to speak, thus bypassing any barrier that is exclusive to the brane.  But I don’t know if this really deals with the issue.

And, of course, how can the idea of gravity as a force, mediated by a quantum particle, be reconciled with the convincing and highly fruitful model of gravity as the consequence of the curvature of space-time?  Obviously, I don’t expect anyone to know the deep answer to this question, since it’s the biggest, most fundamental problem in modern physics:  our two best, most powerful theories of the world don’t work when brought together.  But if anyone out there has any idea of at least the form such a reconciliation might take – i.e., do proponents of quantum gravity think that it will eliminate the notion of curved space-time, or do they think, somehow, that it will be an expression thereof – I would be delighted to hear from you.  My best reading to date on things like string theory hasn’t given me any real insight into the possible shape of such a unification.

Anyway, these are some of the thoughts that are troubling me this Monday morning.  I’d love to know any of your thoughts in response, or if you have any recommendations on further study materials, I would welcome those as well.


* due to the Uncertainty Principle, among other things.