Noise in the Machine: the homogeneous chaos blues
[ opinion - january 09 ]
For Roger Carlson
Gilbert Ryle nailed Cartesian dualism by killing the ghost in the machine. Now someone named Carl Zimmer wants to use noise in the machine to kill a straw man standing in for genetic determinism. This mushy-headed blather arises as an attempt to simulate science-talk for people inured to comic-book encapsulation of the most complex ideas. Who knows what the author intended to convey, or why, but the premise demands deconstruction like Lon Chaney Jr. demanded a dew claw.
I don't know squat about Zimmer, having run across this article in a roundabout manner. A Brazilian in an Orkut community opened a topic with the heading 'Fim do determinismo genético' ('The end of genetic determinism'). He posted a link to a Portuguese translation of an article appearing in a Brazilian online newspaper.
Intrigued, I went to the link and began as I usually do, reading the opening sentence claiming that humans differ from one another in an infinity of aspects. It made me hope there had been a mistranslation. No respectable science writer would claim the existence of an actual infinity of physical anythings (though it isn't clear aspects need be physical).
This led me to look for the original source from The New York Times, which I found online, dated April 22, 2008. The author wrote that humans differ in too many ways to count, which is a far cry from differing in an infinity of aspects.
The original essay is entitled 'Expressing our individuality, the way E. Coli do.' Catchy, no? Were it not already penned, I'd have to invent it.
If you take the time to read this short bit, the straw man pops right up into your headlights, assuming you have them on. Otherwise you might miss him, disguised as he is: Zimmer contends we put a bigger premium on nature than nurture when it comes to our individuality. I'm not sure where he gets this idea, unless it's his own weltanschauung. It's not mine.
Nor is it the view I ever run across, excepting the bizarre commercial that instructs me I get my cholesterol from my aunt. Most of the people I know believe that when they get a disease or condition, it's their own fault, usually dietary, and not that of their genes. They like to feel in control, I think. Sort of the inverse of conspiracy theory or Existentialism as substitute for God.
As you read along, you find this Zimmer trying to convince you that you think bacteria like E. coli (proper name Escherichia coli) are machines. Which is amusing. The only human I know of who said any kind of animals (I'm loosening the notion of animal here) were machines was René Descartes, whom Bertrand Russell claimed didn't believe it but wanted to avoid the physical duress of the Inquisition's enforced insistence that humans were the only souled creatures. Another kind of ghost in the not-machine, so to speak.
I personally would have been surprised if all bacteria in a colony behaved alike, or that the behavior was predictable. But that is not the most interesting aspect of what Zimmer writes. After informing us that bacteria "are not simple machines," he brings in the idea of noise changing the way the E. coli bacteria behave. He says that unlike transistors and wires, "E. coli molecules are floppy, twitchy and unpredictable." This he contrasts to the deterministic behavior of electronic devices.
So right off the bat here, Zimmer misrepresents electronic devices in the way that Descartes misrepresented animals, though my guess is Zimmer did it for money and not fear of torture (he might simply be ignorant). Anyone who has worked with electronic gizmos on spacecraft will have experienced these mechanisms getting out of hand. In the early days of GPS when the satellites disappeared from view for a few hours, on-board atomic clocks might decide to leap in time. When they reappeared they would be so far off that the Kalman filter at the Master Control Station could be falsely persuaded by new measurements that the satellite had hopped to a lunar orbit.
Even more interesting is metastability in certain binary electronic devices known as flip-flops. With only two possible states, the device can become confused and take an indeterminate (random) time to decide whether to flip or flop.
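A toy simulation conveys the flavor. This is my own cartoon, not a circuit model: a bistable system balanced at its unstable origin, jiggled by noise until it commits to one of its two stable states, with the time to commit coming out random.

```python
import random

random.seed(2)

def settle_time(noise=0.05, dt=0.01):
    # Toy bistable "latch": x' = x - x**3, balanced at the unstable
    # origin, kicked by noise. The time it takes to commit to one of
    # the two stable states (+1 or -1) is random -- a cartoon of
    # flip-flop metastability, not a circuit model.
    x, t = 0.0, 0.0
    while abs(x) < 0.9:
        x += (x - x ** 3) * dt + random.gauss(0.0, noise) * dt ** 0.5
        t += dt
    return t

# Same device, same setup, different waits every run.
times = [settle_time() for _ in range(200)]
```

Balance the thing more finely and the tail of long, undecided waits only grows.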
And who is the culprit for these and too many other aspects of electronic misbehavior to count?
Noise, as it turns out. Which is in fact like a weed. That is, akin to the chicory in my garden that is currently out of hand, inedible and blooming and propagating even as I continually pull it up all summer so other plants can grow. But is it a weed? Not in the winter, when the leaves darken red and purple to become radicchio.
Noise is a random process, a not well-defined thingy in the real world from what I can gather. Random processes do have precise mathematical meaning, however fraught with difficulties in the quotidian world swept under the rug of operational definition. If you doubt the difficulties, read chapters two and four of Leo Breiman's classic text Probability. Or at least the discussion at the end of the chapters, though the discourse on conditional probability is particularly mind-bending.
Here is the real deal. It seems the "real world" we live in is a world of aggregates: averages of random stuff at the microscopic level. At least that seems to be the world according to quantum theory. Or statistical mechanics. So to say that electronic or mechanical or electro-mechanical devices behave at the atomic level like machines, that is mechanistically, is specious.
The examples cited above (and numerous others) provide counterexamples to Zimmer's quasi-example, at least as represented. E. coli behaves oddly at the macroscopic level because it is not predictable at the microscopic level, even given identical genes and identical situations. Well, there is a problem with that identical situation bit, since it is not clear there is ever any identical situation. But just as E. coli get trapped in various deviant feedback loops and other aberrant behavior, so can electronic devices. And not predictably, though one might try to replicate a situation exactly.
In fact with noise the idea of replication is fraught with difficulty. In a computer simulation it is possible to use a pseudo-random number generator and begin it with the same seed, getting the exact same pseudo-random sequence of numbers (which is why it is not a random number generator). However, the simulation is not the device itself. This point has been well demonstrated by John Searle with regard to the problem of consciousness in strong artificial intelligence, wherein he points out that the idea of a machine that simulates consciousness is not equivalent to the machine being conscious. Consciousness, argues the materialist Searle, is a physical process akin to digestion, and simulating digestion is not digesting (but consider for argument's sake Wim Delvoye's Cloaca). And pseudo-random noise in a software-controlled machine is not random noise in a software-controlled machine.
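The seed business is easy to see for yourself. A minimal sketch in Python, using the standard library's generator and nothing fancier:

```python
import random

# Two generators fed the same seed march through the identical
# "random" sequence -- which is precisely why it is called
# pseudo-random and not random.
gen_a = random.Random(42)
gen_b = random.Random(42)

seq_a = [gen_a.random() for _ in range(5)]
seq_b = [gen_b.random() for _ in range(5)]
# seq_a and seq_b are identical, number for number
```

Replication on demand: exactly what honest noise refuses to give you.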
The mathematical idea of random process is not likely what an engineer might imagine in any case. The idea is that the manifestation is not itself random, but is in fact determined. What is random is the picking of that particular manifestation. As if some infinite dice roll hands us the result. Of course, not being privy to the outcome beforehand, we must contend with the manifestation as if it itself unfolded randomly in time.
So where are we? Starting with an attempt to befuddle us by claiming we confuse phenotype with genotype, we end up with the claim that bacteria are different from machines because of noise. More sleight of hand, it seems, illustrated in the essay's conclusion: "Living things are more than just programs run by genetic software."
And yet electronic machines run by software can become confused as to whether an input is yes or no. (There was a time when engineers believed such behavior from a digital device to be impossible, causing all manner of difficulty in searching for a non-existent software bug, another kind of ghost in the machine.) And noise is the culprit.
The classical feedback-loop known as the phase-locked loop, essential for tracking signals in all sorts of radios and myriad other electronic devices, can suffer all manner of unspeakably unpredictable behavior with noise, from the audible clicks of cycle-slips in FM radio to false lock. More complex types of software feedback loops like the extended Kalman filter can become so confused they eventually insist on the spurious, divergence sometimes termed instability, driving the machines they control to irrational behavior. (The honest-to-God Kalman filter under certain circumstances cannot (in the long run) tell a lie, a condition known as stability.) What is more, the phase-locked loop has been taken as a model for the behavior of living cells.
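For the flavor of divergence, here is a deliberately crippled scalar filter, my own toy and nothing like the real GPS machinery: it believes the state never moves (process noise set to zero), so its gain withers away while the truth drifts out from under it.

```python
import random

random.seed(0)

# Crippled scalar Kalman filter: it assumes the state is constant
# (zero process noise), while the true state drifts. Its gain decays
# like 1/n, the estimate freezes into a running average, and the
# error grows without bound -- a toy of filter divergence.
x_true, x_hat, P, R = 0.0, 0.0, 1.0, 1.0
errors = []
for _ in range(500):
    x_true += 0.05                         # truth drifts away
    z = x_true + random.gauss(0.0, 1.0)    # noisy measurement
    K = P / (P + R)                        # gain, with no Q to prop it up
    x_hat += K * (z - x_hat)
    P *= (1.0 - K)
    errors.append(abs(x_true - x_hat))

early = sum(errors[:50]) / 50
late = sum(errors[-50:]) / 50
# late error dwarfs early error: the filter insists on the spurious
```

The cure, adding honest process noise, is exactly an admission that the world is not deterministic enough for the model.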
The real issue, it seems, is not machine versus living thing, but determinism versus randomness.
It brings to mind (the word only a metaphor, but for what?) a conversation with an engineer some years ago. Not a unique conversation, to be sure, because the idea she expressed is common. She said that nothing was ever random. If you knew all the initial conditions and all the forces in a coin toss, for example, you would be able to tell exactly how the coin would land. I replied that was a metaphysical assumption and could not be demonstrated experimentally. Much of the discussion above goes to the heart of showing that such an argument is metaphysical, not physical. It is a metaphysical ideology known as determinism.
To be sure, all evidence of which I am aware is contrary to determinism. No matter how careful a calculation is made, for example, the implementation always misses the mark. Approximate cause and effect seems the nub of perceived determinism. But let's see why causality is probably (in what sense here this word?) not demonstrable.
If I smash a plate with a hammer, I demonstrate a cause and its effect. Determinism at work: the hammer smashes the plate. Right? But what if I hit exactly the same plate, with exactly the same hammer, in exactly the same way again? Would the plate break into the same exact pieces? No one can say. There's the rub. If this cannot be repeated, then it cannot be predicted, and thus is not what could be called deterministic. Hence not really cause and effect except via the fog of imprecision; that is, so long as one doesn't look too closely.
In order to ascertain if this particular event is truly determined, in the sense that given the precise initial conditions one can exactly predict the outcome at whatever level one examines, one might decide to repeat the experiment. If we cannot ever know all the initial conditions, we ought still be able, by repeating the precise experiment, to get precisely the same outcome in a deterministic world.
But how can we repeat this experiment?
Suppose I build a machine to precisely smash a plate placed in precise position, a plate made of precisely the same materials in precisely the same way as its predecessor.
You could argue: this is not the same experiment. And what would make the pieces exactly the same? At what level? Molecular? Atomic? Subatomic? Is it possible to use the same exact hammer again, given that it had already smashed a plate? Is it possible to construct the exact same plate? In fact, would that original plate be the same exact plate a fractional step later? That is, if I smashed the plate a picosecond later would I be smashing the same plate that had existed a picosecond earlier? On a macro enough level, certainly. But do the micro properties make a difference even in this temporally loaded case?
This argument is along the lines of the famous saying of Heraclitus to the effect that one cannot step into the same river twice. Nothing new here. (Actually, we stand before a crossroad with a branch leading to questions of dependence on initial conditions, stability, and chaos theory, but we forgo the fork since it would take us too far afield. Suffice it to say the chaos of chaos theory is not random behavior, though it might appear so to the untrained eye. But look to Smale's horseshoe!)
The question I am putting is, if it were possible to exactly replicate the experiment, would the results be the same?
I see no reason to believe so. I foresee noise fucking things up, changing the outcome ever so slightly. Perhaps only detectable at microscopic level, but still and all different.
So what the hell is deterministic, anyway? Well, it isn't the opposite of random, that's for damned sure.
Deterministic is if you start at some place with specified conditions, you will without fail end up at a place at a later time that can be precisely predicted without deviation. You follow a trajectory, if you will, not in the sense of bogus-speak[1] as demonstrated by Petraeus and Crocker, but a trajectory in the sense of mechanics. Traditionally, determinism is behavior obtained from differential equations, be they ordinary or partial. The idea is that starting from some initial conditions, like the position and velocity of my hammer, the outcome is completely known once the differential equation describing the hammer blow is known.
As parenthetically hinted above, there is oversimplification here due in part to my own desire to avoid the complication of chaos and the confusion of chaotic bopping for randomness. The little book Chaotic Evolution and Strange Attractors by David Ruelle is a lovely side excursion if you have the time and fuel for it, with a mathematical description of a chaotic information-creating machine. (Akin to an information Cloaca?) However, for our purposes let's pretend to be happy Taoists following the trajectory of the wise man, avoiding struggle as we go with the flow, so to speak (flow in the precise sense).
Noise is a random process, and randomness is not the opposite of determinism. Already we've alluded to as much above, discussing noise at the micro level averaging up to what we experience at the macro level, the quasi-regularity engineers love to extrapolate to determinism fucked by imperfect information. This seems to be perhaps Ruelle's viewpoint, though I am not certain. (There are those who attempt to explain away noise by equating it with chaotic behavior and lack of information as part and parcel of the metaphysical assumption of determinism.) But random behavior is controlled by laws that allow predictions in the long run; of course, as Keynes noted, in the long run we're all dead.
If you're still along, grab a padded, well-secured seat and clutch something substantial. We face a wild-ass ride coming right up, so irregular it will shake loose anything not bolted down. Can't be helped, but you might like it. An honest-to-God Coney Island of the mind, so to speak.
In 1827, Scottish botanist Robert Brown noted peculiar behavior of small particles suspended in fluid. They were, it seems, floppy, twitchy and unpredictable. This jitterbug waltz gained the title Brownian motion. And though this erraticism came to be considered a manifestation of molecular motion, it wasn't until 1905 that Albert Einstein invented a suitable theory. Said kinetic theory got itself amended in the physical world later, since Einstein's own explanation was not completely satisfactory, leading to infractions of one or two laws of nature. The fun part came, however, not from any considerations of natural-law abiding physicists but from lawless mathematicians, beginning with Norbert Wiener.
Norbert ran with this illegal idea and created from it a fantastical creature representing Einstein's logically wanting physical theory. (This academic behavior is a source of irritation to physicists who cannot understand why it is necessary to create an elaborate ruse to make logically inconsistent ideas meet the rigor demanded by the mathematical artist. The most famous example might be Paul Dirac and Laurent Schwartz. Perhaps Emerson's phrase a foolish consistency is the hobgoblin of little minds ought be the motto of physicists.) What Norbert created lived in an infinite dimensional space of functions, and was indeed a way of measuring chunks of such large places, albeit in a rather roundabout way for the likes of most of us. He gave it names like the homogeneous chaos (which has nothing to do with the chaos alluded to above, and in fact predates it by half a century more or less), and his process came to be known as the Wiener process, though a lot of mathematicians still call it Brownian motion, being dubious of, or at least unconcerned with, physical reality.
And Norbert made the engineer's white noise a living mathematical fiction via his construction of Brownian motion.
Literary types love terms like white noise, though they seldom have a pig's-eye view of what the hell it is. To be somewhat carelessly careful, white noise is a random process wherein events are independent of one another, no matter how close together they may occur in time. In other words, if you see what this process did at some time, you will have no idea what it will do "next" or any time later. The white noise immediately forgets where it was. This implies for the engineer what is called a spectrum that is constant across all frequencies, which means in practice that such a process requires infinite power to run it. That's right: if you plugged your white noise machine into the wall socket you'd drain all energy that ever had been or would be created, and it would still not run. Which means you can't trot down to Best Buy and find one of these babies on the shelf. But it represents in some sense the most random behavior to be found in the universe. And it obeys laws.
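The forgetting can be checked numerically with a discrete stand-in (true continuous white noise, as just said, cannot be built): independent Gaussian samples whose correlation with their own past is nil.

```python
import random

random.seed(7)
N = 20000
# Discrete stand-in for white noise: independent Gaussian samples.
w = [random.gauss(0.0, 1.0) for _ in range(N)]

def autocorr(x, lag):
    # Sample autocorrelation at a given lag, normalized by lag-0 power.
    n = len(x) - lag
    num = sum(x[i] * x[i + lag] for i in range(n)) / n
    den = sum(v * v for v in x) / len(x)
    return num / den

r0 = autocorr(w, 0)    # perfect correlation with itself, of course
r1 = autocorr(w, 1)    # essentially zero: no memory one step back
r10 = autocorr(w, 10)  # and none ten steps back either
```

A flat correlation at every nonzero lag is the time-domain face of that flat spectrum across all frequencies.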
The law it follows when its paths are summed (meaning integration, for those who have a smattering of education, be it formal or self) is Wiener's process, the fictive pure Brownian motion without physical impediments like drag. Of course, as noted previously Einstein broke the law with this creation since it had no bounds on its velocity. In fact, it turned out to be so irregular in its trajectories, if you can call them such, that it covers an arbitrarily large distance in any finite interval of time. And all the while with hands never leaving the body, so to speak; that is to say, always turning sharply while remaining continuous. So of course this perky speed demon refused to obey the speed limit of light. Note the adjective for the trajectories was continuous, not smooth. Because these paths are so jagged they have corners on the corners, you might say. For a mathematician this means the paths have no derivatives anywhere. Moreover, if you look at the process's path differences over disjoint intervals of time, they will be independent of one another. If you consider smaller and smaller such disjoint time intervals, you begin to see where white noise gets its bad behavior. You can observe this process for any amount of time you wish; the observation is useless because the process will start from the last observed spot as if it were born afresh in that moment.
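A sketch of the summing: pile up Gaussian increments of variance dt and the endpoint spreads like the square root of elapsed time, the Brownian signature. (A discrete approximation, of course; the genuine article needs the full measure-theoretic apparatus.)

```python
import random

random.seed(0)

def brownian_endpoint(t, steps):
    # Sum independent Gaussian increments of variance t/steps:
    # a discrete approximation of Brownian motion run up to time t.
    dt = t / steps
    x = 0.0
    for _ in range(steps):
        x += random.gauss(0.0, dt ** 0.5)
    return x

# The endpoint's variance comes out near t (here t = 1): the spread
# grows like the square root of elapsed time, not like time itself.
samples = [brownian_endpoint(1.0, 100) for _ in range(4000)]
var = sum(s * s for s in samples) / len(samples)
```

Refine the mesh as much as you like; the variance stays pinned to t while the paths only get more jagged.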
And so you ask, How the hell can such a thing as this obey laws? Much like politicians and corporate executives (an interchangeable lot, actually) obey laws. Like when I wrote above that the integrated white noise trajectories are Wiener process sample paths; though formally true, that was a lie. There are a couple of lies here: one simple, due to Wiener, and one more complex, due to Kiyoshi Ito, made intuitive by Steve Rosencrans in some course notes. Think smoothing out second-order wiggles, at least in the Rosencrans experience of Ito. Still pretty jagged, even at that, and giving rise to pesky second-order wiggles engineers ignore at their peril, especially when unaccounted-for second-order wiggles fuck up the extended Kalman filter's gain, causing it to crash and burn.
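Those second-order wiggles can be exhibited: sum the squares of the increments over [0, 1]. For any smooth path that sum dies as the mesh shrinks; for the BM it converges to t, which is exactly the extra term Ito's formula has to carry.

```python
import random

random.seed(3)
# Quadratic variation: sum the SQUARES of the increments on [0, 1].
# For a smooth path this vanishes as the mesh shrinks; for Brownian
# motion it converges to t -- the wiggle behind Ito's extra term.
n = 100000
dt = 1.0 / n
qv = sum(random.gauss(0.0, dt ** 0.5) ** 2 for _ in range(n))
# qv lands near 1.0, the length of the time interval
```

Treat dW² as if it were zero, the way ordinary calculus would have you do, and that stubborn 1.0 is precisely what your filter gain ends up missing.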
And now as we finally peak, take a gander down upon erose terrain and behold an endless landscape of bottomless crevices and twisted precipices. Terrifying. But in the aggregate tamed. Because if we have enough riders on these trails, we can average them and get deterministic. That's right. This nearly totally random monster (Gaussian, for those who care about such things) of Wiener's via Einstein provides us a solution to a deterministic problem. Bound some region nicely and give it some functional preconceptions meeting a smattering of conditions and then release a passel of these maniacal wrigglers, capturing them as they try to cross the perimeter and averaging them with respect to those conditioned preconceptions and you solve a classical deterministic partial differential equation.
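A one-dimensional cartoon of releasing the wrigglers, with a simple random walk standing in for the BM and an interval standing in for the nicely bounded region:

```python
import random

random.seed(5)

def exit_point(start, left, right):
    # Simple random walk (a discrete rider) run until it leaves the
    # interval; report which endpoint it hit.
    x = start
    while left < x < right:
        x += random.choice((-1, 1))
    return x

# Dirichlet data on the boundary: u(0) = 0, u(10) = 1. Averaging the
# boundary value over many walkers released at x = 3 estimates the
# harmonic (deterministic!) solution u(x) = x/10 at that point.
trials = 5000
hits = sum(1 for _ in range(trials) if exit_point(3, 0, 10) == 10)
estimate = hits / trials
```

A passel of maniacal wrigglers, captured at the perimeter and averaged, quietly solving a deterministic boundary-value problem.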
How the hell does this happen? Well, let's skip the long story and just say that the friendly little BM (as the Brownian motion in its incarnation as Wiener process is often nicknamed) is related to an operator called the Laplacian, which (up to a factor of one half) generates the process. This operator has a long history in mathematics and physics. Moreover this operator can be related to the diffusion of heat, the heat equation which is supposed to model the flow of heat via a partial differential equation. Deterministically.
For example, take an infinite wire with perfect insulation and hit it with an outrageously hot torch at a single spot. Just for an instant. Then the heat distributes along this wire according to this heat equation. But there are some issues. For one, no matter how far away from the torch you hold this infinite wire, at torch touch you immediately feel some heat. That somehow seems wrong, but then no one has ever seen an infinitely long, perfectly insulated wire. What is curious is that the Brownian particle dances in the same manner as the heat distributes, which explains why it is compelled to get so far so fast.
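The law in question is the heat kernel, a Gaussian in disguise, which (up to the usual factor-of-two convention) doubles as the BM's probability density. It is positive everywhere the instant t > 0, hence that trace of heat felt arbitrarily far away:

```python
import math

def heat_kernel(x, t):
    # Fundamental solution of the heat equation on the infinite wire;
    # up to a factor-of-two convention, also the probability density
    # of Brownian motion at time t.
    return math.exp(-x * x / (4.0 * t)) / math.sqrt(4.0 * math.pi * t)

# An instant after the torch touches at x = 0, there is already a
# (vanishingly small but strictly positive) trace of heat far away.
near = heat_kernel(0.0, 0.01)
far = heat_kernel(1.0, 0.01)
```

The far value is absurdly tiny but never zero, which is the mathematical shape of that "somehow seems wrong" feeling.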
Actually the dance of the BM is described probabilistically by the solution to this heat equation. That is to say, the heat equation lays down the law to this floppy, twitchy, and unpredictable process. So this relationship isn't so surprising. Moreover, by choosing more general "elliptic" operators than the Laplacian one can get seemingly more exotic dancers. These are called diffusion processes, and what is amusing is the existence of people who apply this to finance. They have been involved in some remarkable disasters with their techniques, notwithstanding the Nobel prize for economics to the inventors.
The unpredictability of the noise process is kind of misleading anyway. One can make predictions but only in terms of the long run. It's a lot like when a misguided meteorologist claims that because climate scientists cannot predict the short term behavior, they cannot predict the long term either. That is simply false, as work with stable devices such as atomic clocks or gyroscopes demonstrates. Short term forecasts are notoriously bad with noise, but long term trends are better predicted because of averaging. There is a statistical tool developed by engineers for this sort of behavior in such devices, called the Allan variance.
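The Allan variance itself is a few lines. For white frequency noise it falls off like one over the averaging time: exactly the short-term-bad, long-term-good behavior just mentioned. (A bare-bones non-overlapping version, fed made-up data rather than a clock's.)

```python
import random

random.seed(11)

def allan_variance(y, m):
    # Non-overlapping Allan variance at averaging factor m: average the
    # data in blocks of m, then take half the mean squared difference
    # of successive block averages.
    blocks = [sum(y[i:i + m]) / m for i in range(0, len(y) - m + 1, m)]
    diffs = [(b - a) ** 2 for a, b in zip(blocks, blocks[1:])]
    return 0.5 * sum(diffs) / len(diffs)

# White frequency noise: the Allan variance falls like 1/m, so longer
# averaging buys better prediction -- noise tamed in the long run.
y = [random.gauss(0.0, 1.0) for _ in range(100000)]
a1 = allan_variance(y, 1)      # near 1.0
a100 = allan_variance(y, 100)  # near 0.01
```

Different noise colors trace different slopes on the Allan plot, which is how clock engineers tell one species of noise from another.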
Anyway, it turns out that noise can be used to construct the deterministic, and the deterministic lays down the law to noise. That was what I meant with the idea of weeds and noise: for me, that pesky Brownian motion is not noise so much as just another side of the regular world. Besides solving deterministic problems (albeit inefficiently in a numerical sense), it can also explore both the geometry and the topology of certain mathematical creations residing in arbitrary dimensions.
But for an engineer with a GPS receiver, noise can be a damned weed. And though we have been stuck on white noise, we also encounter what is sometimes called colored or pink noise. This noise has relationships with itself over time. One of the more bizarre variants is the so-called 1/f noise, which is related to fractals it seems. Benoit Mandelbrot is the guy to see about this, and all we can say here is that this noise is very unlike white noise. While white noise has no relationship to its own behavior at any time in the past, 1/f noise has self-relationships that go to the infinite past, whatever that is. It never forgets where it has been, so to speak, whereas white noise immediately forgets. Yet perhaps all these colored noises are the progeny of white noise, but that is way outside the realm of where we ought to go.
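One cheap way to manufacture something pink-ish is the Voss-McCartney trick (an approximation to 1/f, not the real article): several white sources, each refreshed half as often as the last, summed. Unlike white noise, the result remembers.

```python
import random

random.seed(13)

def voss_pink(n, octaves=8):
    # Voss-McCartney sketch of approximately 1/f ("pink") noise:
    # several white sources, each refreshed half as often as the
    # previous one, summed at every tick.
    rows = [random.gauss(0.0, 1.0) for _ in range(octaves)]
    out = []
    for i in range(1, n + 1):
        for k in range(octaves):
            if i % (1 << k) == 0:
                rows[k] = random.gauss(0.0, 1.0)
        out.append(sum(rows))
    return out

def autocorr(x, lag):
    # Sample autocorrelation at a given lag, mean removed.
    n = len(x) - lag
    mean = sum(x) / len(x)
    num = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n)) / n
    den = sum((v - mean) ** 2 for v in x) / len(x)
    return num / den

pink = voss_pink(50000)
white = [random.gauss(0.0, 1.0) for _ in range(50000)]
r_pink = autocorr(pink, 10)    # well away from zero: it remembers
r_white = autocorr(white, 10)  # essentially zero: no memory at all
```

The slowly-refreshed rows are what drag the past along; true 1/f noise drags it along all the way back.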
Noise permeates everything because so far as I can tell, everything is noise. Unless of course there is the not-noise and not-deterministic. That would be the haphazard, as my old friend and teacher Roger Carlson liked to call it. The opposite of determinism is haphazard, not random. For example, haphazard would be if your neighbors became rhinoceroses; worse, if the E. coli in your gut became rhinoceroses. Small ones, say. Statistical mechanics has no place for such events to occur often, though there is a famous theorem of Poincaré regarding events eventually recurring no matter how small their probability (that is, no matter how contrived they may have been originally), so long as it is positive and you can wait long enough. Sort of a mathematical version of Murphy's law. But we don't expect to see it on the macro scale outside Kafka or Ionesco or similar fantasists. Laws of averages hold sway, keeping Wiener's homogeneous chaos blues away. Hopefully.
[1] Bogus-speak is language with the intent of sounding scientific or precise. A terrifying example that has invaded the language is the use of impact to replace the noun effect and verb affect (and perhaps sometimes the verb effect, but this is not clear). Some have argued that this hideous abomination is akin to a fecal impaction of language (mouth turds refusing to budge), much like the use of awesome as an adjective to describe some triviality you might find slightly special. Impacted prose, particularly in the verb case, might be the result of broadcast journalists not being literate enough to grasp the difference between affect and effect, and hence choosing to smear the meaning of a once precise verb as substitute. That this is incorrect can be seen with a bit of reflection: media journalists (and perhaps print journalists, too) are not sufficiently literate to realize there is a difference between affect and effect as verbs (and have been known to use the noun affect when effect was meant).
The real culprit seems to be economists, who in their need for certification as scientists appropriated language and mathematical technique to become a modern cargo cult, emptily parodying physics like ceremonial magic with none of the result. Of course, the excuse is that they cannot control experiments as can physicists, in a lab, though I seem to remember neither Newton nor Einstein were able to bring our planetary system into a lab (or even the Sun and one planet). The actual difference is that when physical theories provide incorrect predictions (and the ability to predict is the hallmark of a theory), physicists replace those theories. Hence the perihelion of Mercury, and relativistic versus classical mechanics. When an economic "theory" makes a prediction, a rare event, and it is wrong, the economist blames reality, not his own ideas. Though these rare events are not easy to come by, consider the beautiful mathematics of Black-Scholes, based on the theory of random processes known as martingales and their integration via Kiyoshi Ito's formula, and the debacle of Long Term Capital Management. A failure of the "theory" of economic engineering. Not the first, nor the last, given the prevalence of derivatives in our financial web. And the culprit, as it turns out, was reality. Out of step with a mathematical model pawned off as an "economic theory."
At any rate, the testimony of Crocker and Petraeus before Congress provides an instructive example of the use of bogus-speech in its most common form. Words like trajectory are employed to give an aura of determinism, of control, as if things are going as expected along the path to which they have been steered. The terms sound precise, scientific, as if taken from automatic control theory, but here the "trajectory" is the conversion of the new Iraqi government into a satrap of the US, an outpost in which to base troops for the empire. Clearly this is not going quite as determined by the initial conditions, at least as seen from the Pentagon or the Bush White House.
Reality is that bogus-speak is an elaborate form of smoke-blowing. When you hear it in most contexts you can be certain someone is bullshitting you. In its most common form, as before Congress, it pits one or more actors against an appropriately august body divided into two or more sides engaged in a zero sum game for which the speakers are tokens, said speakers comporting with appropriate demeanor presenting appropriate language which no one really understands but designed to make everyone concerned delude themselves and wallow in the obvious lie, or else use the lie as an appropriate ass-cover and perhaps excuse for later mea culpas. It is also used to help other parties not part of the august body, such as citizen-consumers, be at ease with what everyone knows in their heart to be a fucking lie. That is, it is part and parcel of a mass self-delusion.