The wavefunction, many “worlds”, and Fourier analysis…some light-hearted morning thoughts

Okay, I had a bit of a bee in my bonnet over this subject this morning (this isn’t the first time), and I thought I’d just get my thoughts out and into the digital universe…hopefully to get someone else thinking, and maybe to get someone talking to me about the subject, if they’re interested.  Here you go, wherever you are:

Here’s a link to Sean Carroll’s latest book Something Deeply Hidden

Here’s a link to Max Tegmark’s book Our Mathematical Universe

Here’s a link to David Deutsch’s book The Fabric of Reality

Here’s a link to David Deutsch’s book The Beginning of Infinity

Human civilization is just not at equilibrium


I was riding in to work this morning, listening to volume two of the Audible version of the Feynman Lectures on Physics (he was discussing symmetry in physical laws).  An accident ahead had closed down all but one lane of traffic, so I got to listen to the lecture much longer than I would have normally; even on a Saturday morning, to have only one lane of traffic open meant that things were very much backed up and slowed down.  In such cases, it’s truly a blessing to be able to listen to one of the great minds of the twentieth century sharing his understanding of the nature of the world.  It almost made the traffic jam pleasant.

As I rode along, listening to Feynman, I thought about how cool and interesting I’ve always found physics to be, especially when explained by one of its greatest practitioners and teachers.  And because Feynman mentioned some specifics of biology and chemistry, such as how the chirality of biological molecules is an accident of history, not a law of nature, I realized how cool and interesting I found the complexity of biology and chemistry, too.  From that take-off point, I thought about how cool and interesting the mathematics is that underlies physics, and thus underlies chemistry and biology.  And mathematics is much broader and more complex than just what’s used for physics, and it can all be tremendously interesting.  Even the stuff that’s way beyond my expertise* is fascinating when it’s explained by people who are experts, as in the videos of “Numberphile” and “3Blue1Brown”, for instance.

Even human psychology, with all its biases and heuristics, its “system one” and “system two”, its knee-jerk reactions and all the irrationality it entails, is fascinating.  Though it frequently seems irreducibly silly, we can often discern why it’s silly, as a system that evolved under a particular set of circumstances that didn’t necessarily require it always to be fully rational.

So why, then, I wondered, is human sociology—and its compatriot, human history—so unfascinating?  Not to say that the details can’t be interesting, but the “pattern” it plays out, especially at the level of politics, popular entertainment, social mores, celebrity, and nowadays social media and the rest of the internet and web, is such a muddle.  The movement of vast flocks of starlings and of immense schools of fish can seem eerily precise, and we know that such epiphenomena can be produced merely by having each unit follow a few simple rules.  But large-scale human interactions are almost never reducible to anything consistent.  If there are mathematical patterns, they are difficult to discern.

I don’t know about you, but I find much of human interaction at the largish scale to be simply irritating and stupid.  It’s muddy.  It’s just a mess.  There’s no fractal-level chaos here, with hidden, self-similar intricacy.  It’s just the chaos of an untended garbage dump.

Then, suddenly**, it occurred to me:  human society is such a mess at least partly because it’s not in any kind of equilibrium.  It’s an unstable system whose parameters are constantly changing.  The human population has been growing for millennia, for most of that time at an ever-increasing rate.  New technologies, from the beginnings of agriculture, to the invention of money, to the creation of the wheel and of weapons, on into the modern age of deep science and potent technology, produce an ever-changing background set of assumptions in the system.  New methods of interaction and exchange of information, from spoken language, to written language, to moveable type, to the telegraph, to radio, to television, to the internet, have—especially in recent centuries—irrevocably changed the state of what had come before, producing new and ever-messier epiphenomena.

There has not been anything like the time needed for any kind of sociological and civilizational natural selection to take place.  There’s been no way for long-term evolutionarily stable strategies*** to be selected amongst the phenomena of human interactions at large scales, because before any such selection could happen, something fundamental in the driving parameters of the system changed radically.

So I guess maybe we shouldn’t feel too bad about the fact that politics is such an insane mess, that fashion and celebrity and entertainment are bastions of such goofiness, that we have trouble working out the best economic system (if there is such a thing, and if we are even able to define “best”: by what measure, for what purpose?), that social media is such a nightmare of infantile behavior, and that history is such a catalog of tragedy and horror.

The weather may be a chaotic system that’s all but impossible to predict in specifics beyond a few days, but the physics of it is at least consistent****.  The “physics” of human interaction is being subjected to ever-changing constants of nature (if you’ll allow the metaphor).  They may not change by that much at any given time, but we know that even tiny changes in the true “constants of nature” would lead to radically different universes, in most of which we would not be able to exist even for a microsecond.  We should probably be in awe of the fact that civilization survives at all.

This doesn’t mean we should stop trying to solve all the many and varied problems of human civilization.  We must solve them.  But perhaps we (meaning I) should be more forgiving of just how stupid and inefficient and counterproductive and puerile and horrible so many of our institutions seem so often to be.  We’re still in the primordial soup here, and the primordial soup is cloudy.  We must work on it; we must strive to make it ever clearer.  If we don’t, natural selection will do what it seems to do best, which is to wipe out things that don’t have an evolutionarily stable strategy.  But we (meaning I) can at least perhaps try not to be so judgmental about the idiocy of our institutions.  You don’t judge the mind of a trilobite by the standards of the nervous system of a naked house ape.

But you still must address your problems and try to reach some semblance of an evolutionarily stable strategy (or set thereof) for society and civilization.  The trilobites are extinct and have been for a very long time, and the same thing could easily happen to human civilization, and to human individuals.  Nature would not care, would not give us any second chances, would not bend its rules in the slightest for us.  As far as we can tell, it never has, and it never will.

Still, though it seems any mathematical and modellable science of sociology is at a tremendous disadvantage (Asimov’s “psychohistory” is a long way away) and may never be able to become formalized until after humans have reached a Trantor-level of equilibrium, maybe the problems of trying to reach that stage—trying to survive long enough to reach that stage—can be interesting enough in and of themselves.  In any case, interesting or not, they’re problems that can’t be dodged.  Sometimes you just have to shut up and work.


*The physics being discussed at that point in the Feynman lectures was not beyond my expertise, for what it’s worth.  I was a Physics major for a little over a year as an undergrad at Cornell, so I’m not quite a layman in the field.  And as an M.D., my familiarity with biology—at least parts of it—meets the legal definition of “expert”.

**Really.  It was honestly like an epiphany.

***I’m not referring to biological evolution here, but to a broader form of natural selection of sociological states.

****And thus, predictive climate science can be done.

If you want to play the game well, you need to learn the rules


Today I’m going to deal with something that’s a bit of a pet peeve of mine, albeit one that may seem nebulous at first.  The thing to which I refer is the common attitude that math and science, as taught in primary and secondary school, are not relevant to most people’s daily lives.

I don’t mean by this that most people don’t recognize how important math and science are to technology and to civilization.  I doubt there are many people with so limited a worldview.  I’m thinking of the people who may recognize how powerful and useful mathematics and science are, and who see all the things they’ve done for humanity, but who think that their own time spent learning anything beyond the basics of arithmetic was wasted, and that they’ve never used any of it since they learned it in school.

This is silly.  This is foolish.  I have a few rejoinders to such claims.

First, I want to point out that people who exercise—by running, biking, lifting weights, doing push-ups, whatever—don’t complain that they’ve never had to do push-ups in their day-to-day life or as part of their jobs.  They don’t whine that they never need to flee predators or chase down prey to survive.  The reason you don’t hear such nonsense is that most people know that the purpose of most exercise isn’t to perfect one’s ability to, say, lift weights beautifully, but to be healthier, stronger, and more physically able.  This attitude can be applied at least as well to mental exercise, whether or not you use the specific tasks with which you exercise your mind anywhere else.  Indeed, mental exercise is probably even more useful and beneficial than bodily workouts are, for a few reasons.  For one, the brain has greater and more enduring plasticity and freedom to improve than the body does, and that ability of your brain to grow “stronger” can continue throughout your life, barring neurological illness.

Let’s be honest:  humans don’t rule the world because of our physical acumen.  Numerous animals are stronger, faster, and more fearsome physically than humans (though our endurance is world class).  The reason we are so powerful is because of our outrageously overgrown brains, with which we’ve created external memories and communications that allow us to be social in ways that make honeybees and termites look like hikikomori.  So, if you want to maximize and optimize what makes humans strong, you need to maximize and optimize your mind.  Math and science are among the most rigorous and effective ways to do that.

At a deeper level, though, math and science are fundamental to reality itself in ways that all other endeavors are not.  Galileo famously said that the book of nature is written in the language of mathematics, and he was not the first or the last great mind to proclaim such Pythagorean sentiments, because it is a very deep and true statement about the universe.  I like to think of it as follows:  mathematics is the programming language of reality.  Everything that happens obeys it, and it cannot be contradicted, because such contradictions—in reality—simply cannot exist.  And science is our best attempt to learn the specifics of the program in which we live.

Any good gamer will tell you, if you want to win a game—or thrive, or get as high a score as you’re able, or last as long as you can, whatever your goal might be when playing any given game—you need to know the rules.  Math and science are the rules, ultimately, that our game follows.  Human laws and customs are parochial and provincial; laws of nature are absolute.  If you try to go against them, you’ll eventually collide with them, and when you collide with the laws of nature, it’s always you that breaks.

Also, mathematics and science are a lot of fun if you give them a chance.  Contrary to popular belief, math and science can be tremendously captivating and inspiring to understand and enjoy, and they require and stimulate the imagination in ways that mere human stories and cultural creations never could.  As J. B. S. Haldane said, the universe is not only queerer than we suppose, but queerer than we can suppose.  The only thing big enough really to simulate the universe is the universe itself, so we can never completely predict all of what we’re going to find before we find it, but math and science allow us to understand as much of it as possible.

That’s pretty exciting.  And it’s deeper and more real than anything else you’ll ever encounter.  Empires rise and fall; fashion (to borrow Oscar Wilde’s quip) is a form of ugliness so severe that we have to change it every few months; religions come and go; politics is notoriously ephemeral.  But physics is here for good, and as the saying goes, physicists defer only to mathematicians.

And mathematicians defer only to God…if ever.

Audio Blog #4: Disprove Your Theorems by Night

In this audio blog, I discuss the advice (featured in the excellent book How Not To Be Wrong) that you should try to prove your theorems right during the day and try to prove them wrong by night.  I liken this to the very nature of scientific epistemology and to the notions of free expression championed by John Stuart Mill.  I decry the tendency of true believers to try to shut down dissent, a tendency by which they fail both themselves and their own arguments…among other problems.

The Irony of Entropy

If you’ve studied or read much physics, or science in general—or, more recently, information theory—you’ve probably come across the subject of entropy.  Entropy is one of the most prominent and respected, if not revered, concepts in all of science.  It’s often roughly characterized as a measure of the “disorder” in a system, but this doesn’t refer to disorder in the sense of “chaos”, where outcomes are so dependent upon initial states, at such high degrees of feedback and internal interaction, that there’s no way to know based on any reasonable amount of information what those specific outcomes will be.  Entropy is, in a sense, just the opposite of that.  A state of maximal entropy is a state where the outcome is always—or near enough that it doesn’t matter—going to be pretty much the same as it is now.  A cliché way of demonstrating this is to compare a well-shuffled deck of cards to one with the cards “in order”.  Each possible shuffled configuration is unique, but for practical purposes nearly all of them are indistinguishable, and there are vastly greater numbers of ways for a deck to be “out of order” than “in order”.

Let’s quickly do that math.  The number of orders into which a deck of fifty-two cards can be randomly shuffled is 52 × 51 × 50 × … × 3 × 2 × 1, notated traditionally as 52!  It’s a big number.  How big?

80658175170943878571660636856403766975289505440883277824000000000000.

To quote Stephen Fry on Q.I., “If every star in our galaxy had a trillion planets, and each planet had a trillion people living on it, and each person had a trillion packs of cards, which they somehow managed to shuffle simultaneously at 1000 times per second, and had done this since the Big Bang, they would only just, in 2012, be starting to get repeat shuffles.”  Now, how many ways are there to have a deck of cards arranged with the cards in increasing order (Ace to King) within each suit, even without worrying about the ordering between the suits?  If my math is correct, there are only 4! ways to do that, which is 4 × 3 × 2 × 1, or 24.  To call that a tiny fraction of the above number is a supreme understatement.  This comparison should give you an idea of just how potent the tendencies are with which we’re dealing.
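
For anyone who’d like to check the arithmetic, here’s a minimal sketch in Python (the numbers come from the text above; the code is just my illustration):

```python
import math

# Total possible orderings of a standard 52-card deck.
total = math.factorial(52)
print(f"52! = {total}")  # ~8.0658e67

# Arrangements with each suit internally ordered Ace..King:
# only the 4! = 24 orderings of the suits themselves remain.
ordered = math.factorial(4)
print(f"4!  = {ordered}")

# The fraction of all shuffles that land in such an ordered state.
print(f"fraction ordered: {ordered / total:.3e}")  # ~2.976e-67
```

That fraction, roughly 3 in 10⁶⁷, is the sense in which “tiny” is a supreme understatement.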

You could describe entropy as a measure of how much of a system’s energy is “useless”.  Entropy is, famously, the subject of the Second Law of Thermodynamics, and that law states that, in any closed system, entropy always tends to stay the same or increase, usually the latter.  (The First Law is the one that says that in a closed system total energy is constant.)

When energy is “partitioned”—say you have one part of a room that’s hot and another part of a room that’s cold—there’s generally some way to harness that energy’s tendency to equilibrate, and to get work done that’s useful for creatures like us.  Entropy is the measure of how much that energy has achieved a state of equilibrium, in which there’s no useful difference between one part of the room and the other.
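
There’s even a precise limit on how much of that partitioned energy can be harnessed.  As a worked line, assuming an ideal (Carnot) engine running between a hot region at temperature T_h and a cold one at T_c:

```latex
% Maximum fraction of heat convertible to useful work (Carnot limit):
\eta_{\max} = 1 - \frac{T_c}{T_h}
% e.g., between 400 K and 300 K: \eta_{\max} = 1 - 300/400 = 25\%.
% Once the room equilibrates (T_c = T_h), \eta_{\max} = 0: no gradient,
% no work -- which is exactly what maximal entropy means here.
```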

This draws attention to the irony of entropy.  The tendency of systems to become more entropic drives the various chemical and physical processes on which life depends.  Energy tends to flow down gradients until there’s no gradient left, and it’s this very tendency that creates the processes that life uses to make local order.  But that local order can only be accomplished by allowing, and even encouraging, the entropy of the overall world to increase, often leading to a more rapid general increase than would have happened otherwise.  Think of burning gasoline to make a car go.  You achieve useful movement that can accomplish many desired tasks, but in the process, you convert the fuel into smaller, simpler, less organized, higher-entropy states that it would not otherwise have reached, if left alone, for a far longer time.  The very processes that sustain life—that are life—can only occur by harnessing and accelerating the increase of net entropy in the world around them.

Although it seems like the principle best embodied in Yeats’s The Second Coming, wherein he states, “Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world,” entropy is held in highest regard—or at least unsurpassed respect—by physicists.  Sir Arthur Eddington famously pointed out that, if your ideas seem to contradict most understood laws of physics, or seem to go against experimental evidence, it’s not necessarily disreputable to maintain them, but if your ideas contradict the Second Law of Thermodynamics, you’re simply out of luck.  And Einstein said of the laws of thermodynamics, and of entropy in particular, “It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown.”

The reason the Second Law of Thermodynamics is so indisputable is that, at root, it owes its character to basic mathematics, to probability and statistics.  As I demonstrated in the playing card example, there are vastly more ways for things to be “disordered”—more arrangements of reality that are indistinguishable one from another—than there are configurations that contain gradients or differences that can give rise to patterns and “useful” information.  WAAAAAAY more.
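
If you’d like to watch that statistical tendency in action, here’s a minimal sketch of the classic Ehrenfest urn model in Python (my own illustration, not anything from the sources above):

```python
import random

# Ehrenfest urn model: N particles sit in two boxes; at each step a
# randomly chosen particle hops to the other box.  Whatever the
# starting split, the counts drift toward 50/50 and then merely
# fluctuate there, because vastly more microstates sit near an even
# split than near a lopsided one -- the Second Law as bookkeeping.
N = 1000
left = N  # start with every particle in the left box (low entropy)

for step in range(10_001):
    if random.randrange(N) < left:
        left -= 1  # the chosen particle was on the left; it hops right
    else:
        left += 1  # it was on the right; it hops left
    if step % 2_000 == 0:
        print(f"step {step:>6}: left={left:>4}  right={N - left:>4}")
```

Nothing forbids the particles from all wandering back into the left box; it’s just so overwhelmingly improbable that you will never see it happen.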

The Second Law isn’t at its heart so much a law of physics as it is a mathematical theorem, and mathematical theorems don’t change.  You don’t need to retest them, because logic demands that, once proven, they remain correct.  We know that, in a flat plane, the squares of the lengths of the two shorter sides of a right triangle add up to the square of the length of the longest side.  (You can prove this for yourself relatively easily; it’s worth your time, if you’re so inclined.)  We know that the square root of two is an irrational number (one that cannot be expressed as a ratio of any two whole numbers, no matter how large).  We know that there are infinitely many prime numbers, and that the infinity of the “real” numbers is a much larger infinity than the one that describes the integers.  These facts have been proven mathematically, and we need no longer doubt them, for the very logic that makes doubt meaningful sustains them.  It’s been a few thousand years since most of these facts were first demonstrated, and no one has needed to update those theorems (though they might put them in other forms).  Once a theorem is done, it’s done.  You’re free to try to disprove any of the facts above, but I would literally bet my life that you will fail.
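
To give a taste of how airtight these proofs are, here is the classic argument for the second fact, sketched in a few lines of LaTeX:

```latex
% The standard proof, by contradiction, that \sqrt{2} is irrational.
Suppose $\sqrt{2} = p/q$, where $p$ and $q$ are whole numbers sharing
no common factor.  Squaring gives $p^2 = 2q^2$, so $p^2$ is even,
which forces $p$ itself to be even: $p = 2k$ for some whole number $k$.
Substituting, $4k^2 = 2q^2$, hence $q^2 = 2k^2$, so $q$ is even too.
But then $p$ and $q$ share the factor $2$, contradicting our starting
assumption.  No such fraction can exist: $\sqrt{2}$ is irrational.
```

Every link in that chain is forced; there’s nothing to retest and nowhere for doubt to hide.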

The Second Law of Thermodynamics has a similar character, because it’s just a statement comparing the number of indistinguishable ways “things” can be arranged with the number of arrangements that either carry useful information or can be harnessed to be otherwise useful in supporting—or in being—lifeforms such as we.  Entropy isn’t the antithesis of life, for without its tendency to increase, life could neither exist nor sustain itself.  But its nature demands that, in the long run, all relative order will come to an end, in the so-called “heat death” of the universe.

Of course, entropy is probabilistic in character, so given a universe-sized collection of random elementary particles, if you wait long enough, they will come together in some way that would be a recognizable universe to us.  Likewise, if you shuffle a deck of cards often enough, you will occasionally shuffle them into a state of ordered suits, and if you play the same numbers in the Powerball lottery often enough, for long enough, you will eventually win.
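
Just how long is “long enough”?  A quick back-of-the-envelope sketch in Python, assuming the current Powerball format (match five white balls drawn from 69, plus one red ball from 26):

```python
import math

# Jackpot odds for the assumed format: choose 5 of the 69 white
# balls, times the 26 possible red balls.
odds = math.comb(69, 5) * 26
print(f"1 in {odds:,}")  # 1 in 292,201,338

# Playing the same numbers twice a week, the expected wait for a
# first win is about `odds` draws -- on the order of millions of years.
years = odds / 2 / 52
print(f"expected wait: ~{years:,.0f} years")
```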

Want my advice?  Don’t hold your breath.

Odds are we should teach probability and statistics


Among the many educational reforms that I think we ought to enact in the United States, and probably throughout the world, one of the most useful would be to begin teaching all students about probability and statistics.  These should be taught at a far younger age than that at which most people begin to learn them—those who ever do.  Most of us don’t get any exposure to the concepts until we go to university, if we get it even there.  My own first real, deep exposure to probability and statistics took place when I was in medical school…and I had a significant scientific background even before then.

Why should we encourage young people to learn about such seemingly esoteric matters?  Precisely because they seem so esoteric to us.  Statistics are quoted with tremendous frequency in the popular press, in advertising, and in social media of all sorts, but the general public’s understanding of them is poor.  This appears to be an innate human weakness, not merely a failure of education.  We learn basic arithmetic with relative ease, and even the fundamentals of Newtonian physics don’t seem too unnatural when compared with most people’s intuitions about the matter.  Yet in the events of everyday life, statistics predominate.  Even so seemingly straightforward a relation as the ideal gas law (PV=nRT, relating the volume, temperature, and pressure of a gas) is the product of the statistical effects of innumerable molecules interacting with each other.  In this case, the shorthand works well enough, because the numbers involved are so vast, but in more ordinary interactions of humans with each other and with the world, we do not have numbers large enough to produce reliable, simplified formulae.  We must deal with statistics and probability.  If we do not, then we will fail to deal with reality as accurately as we could, which cannot fail to have consequences, usually bad ones.  As I often say (paraphrasing John Mellencamp) “When you fight reality, reality always wins.” Continue reading “Odds are we should teach probability and statistics”