An English physicist, Lord Rayleigh (1842-1919), worked out an equation which showed exactly this. The radiation emitted by a black body increased steadily as one went up the frequencies. However, in actual practice, a frequency peak was reached after which, at higher frequencies still, the quantity of radiation decreased again.
Rayleigh's equation was interesting but did not reflect reality.
Physicists referred to this prediction of the Rayleigh equation as the "Violet Catastrophe": the fact that every body that had energy to radiate ought to radiate practically all of it in the ultraviolet and beyond.
Yet the whole point is that the Violet Catastrophe does not take place. A radiating body concentrated its radiation in the low frequencies. It radiated chiefly in the infrared at temperatures below, say, 1000° C., and radiated mainly in the visible region even at a temperature as high as 6000° C., the temperature of the solar surface.
Yet Rayleigh's equation was worked out according to the very best principles available anywhere in physical theory at the time. His work was an ornament of what we now call Classical Physics.
Wien himself worked out an equation which described the frequency distribution of black-body radiation in the high-frequency range, but he had no explanation for why it worked there, and besides it only worked for the high-frequency range, not for the low.
Black, black, black was the color of the physics mood all through the later 1890s.
But then arose in 1899 a champion, a German physicist, Max Karl Ernst Ludwig Planck. He reasoned as follows…
If beautiful equations worked out by impeccable reasoning from highly respected physical foundations do not describe the truth as we observe it, then either the reasoning or the physical foundations or both are wrong.
And if there is nothing wrong about the reasoning (and nothing wrong could be found in it), then the physical foundations had to be altered.
The physics of the day required that all frequencies of light be radiated with equal probability by a black body, and Planck therefore proposed that, on the contrary, they were not radiated with equal probability. Since the equal probability assumption required that more and more light of higher and higher frequency be radiated, whereas the reverse was observed, Planck further proposed that the probability of radiation ought to decrease as frequency increased.
In that case, we would now have two effects. The first effect would be a tendency toward randomness which would favor high frequencies and increase radiation as frequency was increased. Second, there was the new Planck effect of decreasing probability of radiation as frequency went up. This would favor low frequencies and decrease radiation as frequency was increased.
In the low-frequency range the first effect is dominant, but in the high-frequency range the second effect increasingly overpowers the first. Therefore, in black-body radiation, as one goes up the frequencies, the amount of radiation first increases, reaches a peak, then decreases again, exactly as is observed.
Next, suppose the temperature is raised. The first effect can't be changed, for randomness is randomness. But suppose that as the temperature is raised, the probability of emitting high-frequency radiation increases. The second effect, then, is steadily weakened as the temperature goes up. In that case, the radiation continues to increase with increasing frequency for a longer and longer time before it is overtaken and repressed by the gradually weakening second effect. The peak radiation, consequently, moves into higher and higher frequencies as the temperature goes up, precisely as Wien had discovered.
On this basis, Planck was able to work out an equation that described black-body radiation very nicely both in the low-frequency and high-frequency range.
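The text does not reproduce Planck's equation itself, but in its modern form it can be sketched numerically. The short program below is an illustration of my own, not anything from the text: it uses the present-day CGS values of the physical constants and the modern statement of Planck's law to show that the radiation curve rises, peaks, and falls with frequency, and that the peak moves to higher frequencies as the temperature rises, just as described above.

```python
import math

# Modern CGS values (assumptions, not figures quoted in the text):
h = 6.626e-27   # Planck's constant, erg-seconds
k = 1.381e-16   # Boltzmann's constant, ergs per kelvin
c = 2.998e10    # speed of light, centimeters per second

def planck(v, T):
    """Planck's law: spectral radiance at frequency v (1/second)
    for a black body at absolute temperature T (kelvins)."""
    return (2 * h * v**3 / c**2) / (math.exp(h * v / (k * T)) - 1)

def peak_frequency(T, lo=1e11, hi=1e16, steps=200000):
    """Crude numerical search (logarithmic sweep) for the
    frequency at which the radiation curve peaks."""
    best_v, best_b = lo, 0.0
    for i in range(steps):
        v = lo * (hi / lo) ** (i / steps)
        b = planck(v, T)
        if b > best_b:
            best_v, best_b = v, b
    return best_v

# The peak frequency rises in proportion to the temperature:
print(peak_frequency(1000))   # roughly 5.9e13 per second (infrared)
print(peak_frequency(6000))   # roughly 3.5e14 per second (visible light)
```

Note how the single expression contains both tendencies described above: the v³ factor is the "randomness" effect pushing radiation toward high frequencies, while the exponential in the denominator is Planck's decreasing probability of radiation, which weakens as T grows.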
However, it is all very well to say that the higher the frequency the lower the probability of radiation, but why?
There was nothing in the physics of the time to explain that, and Planck had to make up something new.
Suppose that energy did not flow continuously, as physicists had always assumed, but was given off in pieces.
Suppose there were "energy atoms" and these increased in size as frequency went up. Suppose, still further, that light of a particular frequency could not be emitted unless enough energy had been accumulated to make up an "energy atom" of the size required by that frequency.
The higher the frequency the larger the "energy atom" and the smaller the probability of its accumulation at any given instant of time. Most of the energy would be lost as radiation of lower frequency, where the "energy atoms" were smaller and more easily accumulated. For that reason, an object at a temperature of 400° C. would radiate its heat entirely in the infrared. So few "energy atoms" of visible light size would be accumulated that no visible glow would be produced.
As temperature went up, more energy would be generally available and the probabilities of accumulating a high-frequency "energy atom" would increase. At 6000° C. most of the radiation would be in "energy atoms" of visible light, but the still larger "energy atoms" of ultraviolet would continue to be formed only to a minor extent.
But how big is an "energy atom"? How much energy does it contain? Since this "how much" is a key question, Planck, with admirable directness, named the "energy atom" a quantum, which is Latin for "how much?" The plural is quanta.
For Planck's equation for the distribution of black-body radiation to work, the size of the quantum had to be directly proportional to the frequency of the radiation. To express this mathematically, let us represent the size of the quantum, or the amount of energy it contains, by e (for energy). The frequency of radiation is invariably represented by physicists by means of the Greek letter nu (v).
If energy (e) is proportional to frequency (v), then e must be equal to v multiplied by some constant. This constant, called Planck's constant, is invariably represented as h. The equation, giving the size of a quantum for a particular frequency of radiation, becomes: e = hv (Equation 1)
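As a quick illustration of Equation 1 (the numerical value of Planck's constant used below, about 6.626 × 10⁻²⁷ erg-seconds, is the modern measured value, which the text does not quote):

```python
h = 6.626e-27  # Planck's constant in erg-seconds (modern value, an assumption)

def quantum_energy(v):
    """Equation 1: energy in ergs of one quantum at frequency v (1/second)."""
    return h * v

# Green light has a frequency of roughly 6e14 per second:
print(quantum_energy(6e14))   # about 4e-12 erg -- a very small "energy atom"
```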
It is this equation, presented to the world in 1900, which is the Continental Divide that separates Classical Physics from Modern Physics. In Classical Physics, energy was considered continuous; in Modern Physics it is considered to be composed of quanta. To put it another way, in Classical Physics the value of h is considered to be 0; in Modern Physics it is considered to be greater than 0.
It is as though there were a sudden change from considering motion as taking place in a smooth glide, to motion as taking place in a series of steps.
There would be no confusion if steps were long galumphing strides. It would be easy, in that case, to distinguish steps from a glide. But suppose one minced along in microscopic little tippy-steps, each taking a tiny fraction of a second. A careless glance could not distinguish that from a glide. Only a painstaking study would show that your head was bobbing slightly with each step. The smaller the steps, the harder to detect the difference from a glide.
In the same way, everything would depend on just how big individual quanta were; on how "grainy" energy was.
The size of the quanta depends on the size of Planck's constant, so let's consider that for a while.
If we solve Equation 1 for h, we get: h = e/v (Equation 2) Energy is very frequently measured in ergs (see Chapter 13). Frequency is measured as "so many per second" and its units are therefore "reciprocal seconds" or "1/second."
We must treat the units of h as we treat h itself. We get h by dividing e by v; so we must get the units of h by dividing the units of e by the units of v. When we divide ergs by 1/second we are multiplying ergs by seconds, and we find the units of h to be "erg-seconds." A unit which is the result of multiplying energy by time is said, by physicists, to be one of "action." Therefore, Planck's constant is expressed in units of action.
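The unit bookkeeping of Equation 2 can be checked with a one-line computation. The sample energy and frequency below are illustrative values of my own choosing, not figures from the text:

```python
# Dividing an energy in ergs by a frequency in 1/seconds
# gives a quantity in erg-seconds, the units of "action."
e = 3.975e-12   # ergs (roughly one quantum of green light)
v = 6e14        # 1/second

h = e / v       # ergs divided by 1/second = erg-seconds
print(h)        # about 6.6e-27 erg-seconds
```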