
This property gives rise to one explanation for the 'arrow of time', the curious fact that it is easy to scramble an egg but impossible to unscramble one. Time flows in the direction of increasing entropy. Scrambling an egg makes the egg more disordered - that is, increases its entropy - which is in accordance with the Second Law. Unscrambling the egg makes it less disordered, and decreases entropy, which conflicts with the Second Law. An egg is not a gas, mind you, but thermodynamics can be extended to solids and liquids, too.

At this point we encounter one of the big paradoxes of physics, a source of considerable confusion for a century or so. A different set of physical laws, Newton's laws of motion, predicts that scrambling an egg and unscrambling it are equally plausible physical events. More precisely, if any dynamic behaviour that is consistent with Newton's laws is run backwards in time, then the result is also consistent with Newton's laws. In short, Newton's laws are 'time-reversible'.
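Time-reversibility can be seen in a toy computation (our own sketch, not from the text). We integrate a simple harmonic oscillator with the velocity-Verlet scheme, which is time-reversible in the same sense as Newton's laws: run the motion forward, flip the velocity, run forward again, and the system retraces its path back to the start.

```python
# Sketch: Newton's laws run backwards are still Newton's laws.
# A harmonic oscillator (m*x'' = -k*x, m = 1) integrated with
# velocity Verlet, a time-reversible scheme.

def verlet(x, v, steps, dt=0.01, k=1.0):
    """Integrate the oscillator forward for the given number of steps."""
    a = -k * x
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt
        a_new = -k * x
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = verlet(x0, v0, 1000)   # run forward in time
x2, v2 = verlet(x1, -v1, 1000)  # reverse the velocity, run forward again
print(abs(x2 - x0), abs(v2 + v0))  # both differences are near rounding error
```

The reversed run lands back on the initial state to within floating-point rounding: the reversed trajectory is just as lawful as the forward one.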

However, a thermodynamic gas is really just a mechanical system built from lots of tiny spheres.

In this model, heat energy is just a special type of mechanical energy, in which the spheres vibrate but do not move en masse. So we can compare Newton's laws with the laws of thermodynamics. The First Law of Thermodynamics is simply a restatement of energy conservation in Newtonian mechanics, so the First Law does not contradict Newton's laws.

Neither does the Third Law: absolute zero is just the temperature at which the spheres cease vibrating. The amount of vibration can never be less than zero.

Unfortunately, the Second Law of Thermodynamics behaves very differently. It contradicts Newton's laws. Specifically, it contradicts the property of time-reversibility. Our universe has a definite direction for its 'arrow of time', but a universe obeying Newton's laws has two distinct arrows of time, one the opposite of the other. In our universe, scrambling eggs is easy and unscrambling them seems impossible. Therefore, according to Newton's laws, in a time-reversal of our universe, unscrambling eggs is easy but scrambling them is impossible. But Newton's laws are the same in both universes, so they cannot prescribe a definite arrow of time.

Many suggestions have been made to resolve this discrepancy. The best mathematical one is that thermodynamics is an approximation, involving a 'coarse-graining' of the universe in which details on very fine scales are smeared out and ignored. In effect, the universe is divided into tiny boxes, each containing (say) several thousand gas molecules. The detailed motion inside such a box is ignored, and only the average state of its molecules is considered.
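A toy sketch (ours, not the text's) of how coarse-graining makes entropy grow: start all the 'molecules' crowded into one box, let them diffuse, and measure only the box-occupancy distribution, ignoring all fine detail inside each box.

```python
# Sketch: coarse-grained entropy of diffusing particles in [0, 1),
# divided into boxes. Only box occupancy counts; positions within
# a box are the 'smeared out' fine detail.
import math
import random

random.seed(1)

def coarse_entropy(positions, n_boxes, size):
    """Entropy of the box-occupancy distribution (fine detail ignored)."""
    counts = [0] * n_boxes
    for x in positions:
        counts[int(x / size * n_boxes)] += 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts if c > 0)

SIZE, N_BOXES = 1.0, 10
# An ordered start: all molecules inside the first box.
pos = [random.uniform(0, SIZE / N_BOXES) for _ in range(500)]
h_start = coarse_entropy(pos, N_BOXES, SIZE)

# Crude stand-in for molecular motion: a clamped random walk.
for _ in range(2000):
    pos = [min(max(x + random.gauss(0, 0.01), 0.0), SIZE - 1e-9) for x in pos]
h_end = coarse_entropy(pos, N_BOXES, SIZE)
print(h_start, h_end)  # the coarse-grained entropy has gone up
```

Nothing in the underlying walk prefers a direction of time; the increase appears only in the coarse-grained description, which is the point of the paragraph above.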

It's a bit like a picture on a computer screen. If you look at it from a distance, you can see cows and trees and all kinds of structure. But if you look sufficiently closely at a tree, all you see is one uniformly green square, or pixel. A real tree would still have detailed structure at this scale - leaves and twigs, say - but in the picture all this detail is smeared out into the same shade of green.

In this approximation, once 'order' has disappeared below the level of the coarse-graining, it can never come back. Once a pixel has been smeared, you can't unsmear it. In the real universe, though, it sometimes can, because in the real universe the detailed motion inside the boxes is still going on, and a smeared-out average ignores that detail. So the model and the reality are different. Moreover, this modelling assumption treats forward and backward time asymmetrically. In forward time, once a molecule goes into a box, it can't escape. In contrast, in a time-reversal of this model it can escape from a box but it can never get in if it wasn't already inside that box to begin with.

This explanation makes it clear that the Second Law of Thermodynamics is not a genuine property of the universe, but merely a property of an approximate mathematical description.

Whether the approximation is helpful or not thus depends on the context in which it is invoked, not on the content of the Second Law of Thermodynamics. And the approximation involved destroys any relation with Newton's laws, which are inextricably linked to that fine detail.

Now, as we said, Shannon used the same word 'entropy' for his measure of the structure introduced by statistical patterns in an information source. He did so because the mathematical formula for Shannon's entropy looks exactly the same as the formula for the thermodynamic concept - except for a minus sign. So thermodynamic entropy looks like negative Shannon entropy: that is, thermodynamic entropy can be interpreted as 'missing information'. Many papers and books have been written exploiting this relationship - attributing the arrow of time to a gradual loss of information from the universe, for instance. After all, when you replace all that fine detail inside a box by a smeared-out average, you lose information about the fine detail. And once it's lost, you can't get it back. Bingo: time flows in the direction of information-loss.
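For concreteness, here are the two formulas in their standard modern forms (the notation is ours, not the book's; note that the probabilities p_i refer to symbols of a message in one case and microstates of a gas in the other):

```latex
% Shannon entropy of a source emitting symbol i with probability p_i:
H = -\sum_i p_i \log_2 p_i
% Gibbs (thermodynamic) entropy over microstates i with probability p_i:
S = -k_B \sum_i p_i \ln p_i
```

Identical in shape, up to Boltzmann's constant and the base of the logarithm; the minus sign at issue is the one introduced when thermodynamic entropy is reinterpreted as negative information. The contexts, however, are entirely different.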

However, the proposed relationship here is bogus. Yes, the formulas look the same ... but they apply in very different, unrelated, contexts. In Einstein's famous formula relating mass and energy, the symbol c represents the speed of light. In Pythagoras's Theorem, the same letter represents one side of a right triangle. The letters are the same, but nobody expects to get sensible conclusions by identifying one side of a right triangle with the speed of light. The alleged relationship between thermodynamic entropy and negative information isn't quite that silly, of course. Not quite.

As we've said, science is not a fixed body of 'facts', and there are disagreements. The relation between Shannon's entropy and thermodynamic entropy is one of them. Whether it is meaningful to view thermodynamic entropy as negative information has been a controversial issue for many years. The scientific disagreements rumble on, even today, and published, peer-reviewed papers by competent scientists flatly contradict each other.

What seems to have happened here is a confusion between a formal mathematical setting in which 'laws' of information and entropy can be stated, a series of physical intuitions about heuristic interpretations of those concepts, and a failure to understand the role of context. Much is made of the resemblance between the formulas for entropy in information theory and thermodynamics, but little attention is paid to the context in which those formulas apply. This habit has led to some very sloppy thinking about some important issues in physics.

One important difference is that in thermodynamics, entropy is a quantity associated with a state of the gas, whereas in information theory it is defined for an information source: a system that generates entire collections of states ('messages'). Roughly speaking, a source is a phase space for successive bits of a message, and a message is a trajectory, a path, in that phase space. In contrast, a thermodynamic configuration of molecules is a point in phase space. A specific configuration of gas molecules has a thermodynamic entropy, but a specific message does not have a Shannon entropy. This fact alone should serve as a warning. And even in information theory, the information 'in' a message is not negative information-theoretic entropy. Indeed the entropy of the source remains unchanged, no matter how many messages it generates.
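The distinction can be made concrete with a small sketch of our own (not from the text): Shannon entropy is computed from the probabilities that define the source, and is untouched by whichever particular messages the source happens to emit.

```python
# Sketch: entropy belongs to the source, not to any one message.
import math
import random

def shannon_entropy(probs):
    """Shannon entropy (bits per symbol) of a source with these probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The source: a biased coin.
source = {'H': 0.9, 'T': 0.1}
h = shannon_entropy(source.values())

random.seed(0)
# Two different messages drawn from the same source...
msg1 = ''.join(random.choices('HT', weights=[9, 1], k=20))
msg2 = ''.join(random.choices('HT', weights=[9, 1], k=20))
# ...yet the entropy is a property of the distribution alone.
print(h)           # about 0.469 bits per symbol
print(msg1, msg2)  # the individual messages carry no Shannon entropy of their own
```

Generate one message or a million: `h` never changes, because it is a number attached to the probability distribution, not to any trajectory through it.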