Over the years, compelling arguments dismissing the existence of a heart-disease epidemic, like the 1957 AHA report, have been published repeatedly in medical journals. They have been ignored, however, not refuted. David Kritchevsky, who wrote the first textbook on cholesterol, published in 1958, called such articles “unobserved publications”: “They don’t fit the dogma and so they get ignored and are never cited.” Thus, the rise and fall of the coronary-heart-disease epidemic is still considered a matter of unimpeachable fact by those who insist dietary fat is the culprit. The likelihood that the epidemic was a mirage is not a subject for discussion.
“The present high level of fat in the American diet did not always prevail,” wrote Ancel Keys in 1953, “and this fact may not be unrelated to the indication that coronary disease is increasing in this country.” This is the second myth essential to the dietary-fat hypothesis—the changing-American-diet story. In 1977, when Senator George McGovern announced publication of the first Dietary Goals for the United States, this is the reasoning he invoked: “The simple fact is that our diets have changed radically within the last fifty years, with great and often very harmful effects on our health.” Michael Jacobson, director of the influential Center for Science in the Public Interest, enshrined this logic in a 1978 pamphlet entitled The Changing American Diet, and Jane Brody of the New York Times employed it in her best-selling 1985 Good Food Book. “Within this century,” Brody wrote, “the diet of the average American has undergone a radical shift away from plant-based foods such as grains, beans and peas, nuts, potatoes, and other vegetables and fruits and toward foods derived from animals—meat, fish, poultry, eggs, and dairy products.” That this changing American diet coincided with the appearance of a great American heart-disease epidemic underpinned the argument that meat, dairy products, and other sources of animal fats had to be minimized in a healthy diet.
The changing-American-diet story envisions the turn of the century as an idyllic era free of chronic disease, and then portrays Americans as brought low by the inexorable spread of fat and meat into the American diet. It has been repeated so often that it has taken on the semblance of indisputable truth—but this conclusion is based on remarkably insubstantial and contradictory evidence.
Keys initially formulated the argument based on Department of Agriculture statistics suggesting that Americans at the turn of the century were eating 25 percent more starches and cereals, 25 percent less fat, and 20 percent less meat than they would be in the 1950s and later. Thus, the heart-disease “epidemic” was blamed on the apparently concurrent increase in meat and fat in the American diet and the relative decrease in starches and cereals. In 1977, McGovern’s Dietary Goals for the United States would set out to return starches and cereal grains to their rightful primacy in the American diet.
The USDA statistics, however, were based on guesses, not reliable evidence. These statistics, known as “food disappearance data” and published yearly, estimate how much of any particular food we consume each year by calculating how much is produced nationwide, adding imports, deducting exports, and adjusting for estimated waste. The resulting numbers for per-capita consumption are acknowledged to be, at best, rough estimates.
The changing-American-diet story relies on food disappearance statistics dating back to 1909, but the USDA began compiling these data only in the early 1920s. The reports remained sporadic and limited to specific food groups until 1940. Only with World War II looming did USDA researchers estimate what Americans had been eating back to 1909, on the basis of the limited data available. These are the numbers on which the changing-American-diet argument is constructed. The USDA began publishing regular quarterly and annual estimates of food disappearance only in 1942; until then, the data were particularly sketchy for any foods that could be grown in a garden or eaten straight off the farm, such as animals slaughtered for local consumption rather than shipped to regional slaughterhouses. The same was true of eggs, milk, poultry, and fish. “Until World War II, the data are lousy, and you can prove anything you want to prove,” says David Call, a former dean of the Cornell University College of Agriculture and Life Sciences, who made a career studying American food and nutrition programs.
Historians of American dietary habits have invariably observed that Americans, like the British, were traditionally a nation of meat-eaters, suspicious of vegetables and expecting meat three to four times a day. One French account from 1793, according to the historian Harvey Levenstein, estimated that Americans ate eight times as much meat as bread. By one USDA estimate, the typical American was eating 178 pounds of meat annually in the 1830s, forty to sixty pounds more than was reportedly being eaten a century later. This appetite for meat was documented at the time by Fanny Trollope (mother of the novelist Anthony) in Domestic Manners of the Americans: her impoverished neighbor during the two summers she passed in Cincinnati, she wrote, lived with his wife and four children “with plenty of beef-steaks and onions for breakfast, dinner and supper, but with very few other comforts.”
According to the USDA food-disappearance estimates, by the early twentieth century we were living mostly on grains, flour, and potatoes, in an era when corn was still considered primarily food for livestock, pasta was known popularly as macaroni and “considered by the general public as a typical and peculiarly Italian food,” as The Grocer’s Encyclopedia noted in 1911, and rice was still an exotic item mostly imported from the Far East.
It may be true that meat consumption was relatively low in the first decade of the twentieth century, but, if so, this was likely a brief departure from the meat-eating that dominated the century before. The population of the United States nearly doubled between 1880 and 1910, but livestock production could not keep pace, according to a Federal Trade Commission report of 1919. The number of cattle increased by only 22 percent, pigs by 17 percent, and sheep by 6 percent. From 1910 to 1919, the population increased another 12 percent and livestock production lagged further behind. “As a result of this lower rate of increase among meat animals,” wrote the Federal Trade Commission investigators, “the amount of meat consumed per capita in the United States has been declining.” The USDA noted further decreases in meat consumption between 1915 and 1924—the years immediately preceding the agency’s first attempts to record food disappearance data—because of food rationing and the “nationwide propaganda” during World War I to conserve meat for “military purposes.”
Another possible explanation for the appearance of a low-meat diet early in the twentieth century was the publication in 1906 of Upton Sinclair’s The Jungle, his fictional exposé of the meatpacking industry. Sinclair graphically portrayed the Chicago abattoirs as places where rotted meat was chemically treated and repackaged as sausage, and where tubercular employees occasionally slipped on the bloody floors, fell into the vats, and were “overlooked for days, till all but the bones of them had gone out to the world as Durham’s Pure Leaf Lard!” The Jungle caused meat sales in the United States to drop by half. “The effect was long-lasting,” wrote Waverly Root and Richard de Rochemont in their 1976 history Eating in America. “Packers were still trying to woo their customers back as late as 1928, when they launched an ‘eat-more-meat’ campaign and did not do very well at it.” All of this suggests that the grain-dominated American diet of 1909, if real, may have been a temporary deviation from the norm.