Finally U.S. physicians are paid more than their counterparts in other countries. This isn’t, however, a large source of the difference in costs compared with administration, drugs, and other problems. The authors of that study comparing U.S. and Canadian administrative costs estimate that higher U.S. physicians’ salaries account for only about 2 percent of the difference in overall costs.
There’s one more terrible defect I should mention in the U.S. system: Insurers have little incentive to pay for preventive care, even when it would save large amounts in future medical costs. The most notorious example is diabetes, where insurers often won’t pay for treatment that might control the disease in its early stages but will pay for the foot amputations that are all too often a consequence of diabetes that gets out of control. This may seem perverse, but consider the incentives to the insurer: The insurer bears the cost when it pays for preventive care, but it’s unlikely to reap the benefits since people often switch insurers, or go from private insurance to Medicare when they reach sixty-five. So medical care that costs money now but saves money in the future may not be worth it from an individual insurance company’s perspective. By contrast, universal systems, which cover everyone for life, have a strong incentive to pay for preventive care.
So far I’ve made the U.S. system sound like a nightmare, which it is for many people. Nonetheless, about 85 percent of Americans do have health insurance, and most of them receive decent care. Why does the system work even that well?
Part of the answer is that even in America the government plays a crucial role in providing health coverage. In 2005, 80 million Americans were covered by government programs, mostly Medicare and Medicaid plus other programs such as veterans’ health care. This was less than the 198 million covered by private health insurance—but because both programs are largely devoted to the elderly, who have much higher medical costs than younger people, the government actually pays for more medical care than do private insurers. In 2004 government programs paid for 44 percent of health care in America, while private insurance paid for only 36 percent; most of the rest was out-of-pocket spending, which exists everywhere.
The rest of the reason why the American system works as well as it does is that the great majority of Americans who do have private health insurance get it through their employers. This is partly the result of history—during World War II companies weren’t allowed to raise wages to compete for workers, so many offered health benefits instead. It’s also in large part the result of a special tax advantage: Health benefits, unlike salary, aren’t subject to income or payroll taxes. In order to get this tax advantage, however, an employer has to offer the same health plan to all its employees, regardless of their health history. Employment-based coverage, then, mitigates to some extent the problem of insurers screening out those who really need insurance. Also, large employers to some extent stand up for their employees’ rights to treatment.
As a result of these advantages, employment-based insurance has long provided a workable solution to the health care problem for many Americans—a solution that was good enough to head off demands for a fundamental overhaul of the system. But now that solution, such as it was, is breaking down.
The basic outlines of the U.S. health care system haven’t changed much since 1965, when LBJ created Medicare and Medicaid. Government insurance for the elderly and the poor; employment-based insurance for workers with good jobs at good companies; personal insurance, if you can get it, for those not lucky enough to get employment-based coverage; a scary life without insurance for a significant number of Americans. While the outlines have remained the same, however, the numbers have changed. Employment-based insurance is gradually unraveling. Medicaid has taken up some but not all of the slack. And fear of losing health insurance has come to pervade middle-class America.
The slow-motion health care crisis began in the 1980s, went into brief remission for part of the nineties, and is now back with a vengeance. The core of the crisis is the decline in employment-based insurance. As recently as 2001, 65 percent of American workers had employment-based coverage. By 2006 that was down to 59 percent, with no sign that the downward trend was coming to an end.[9] What’s driving the decline in employment-based coverage is, in turn, the rising cost of insurance: The average annual premium for family coverage was more than eleven thousand dollars in 2006, more than a quarter of the median worker’s annual earnings.[10] For lower-paid workers that’s just too much—in fact, it’s close to the total annual earnings of a full-time worker paid the minimum wage. One study found that even among “moderate income” Americans, which it defined as members of families with incomes between twenty and thirty-five thousand dollars a year, more than 40 percent were uninsured at some point over a two-year period.[11]
Why is insurance getting more expensive? The answer, perversely, is medical progress. Advances in medical technology mean that doctors can treat many previously untreatable problems, but only at great expense. Insurance companies pay for these treatments but compensate by raising premiums.
The trend of rising medical costs goes back for many decades. Table 8 shows total U.S. health care spending as a percentage of GDP since 1960; except for one brief episode, of which more later, it has been rising steadily. As long as medical costs were relatively low, however, rising spending posed little problem: Americans shouldered the financial burden, and benefited from medical progress.
By the 1980s, however, medical costs had risen to the point where insurance was becoming unaffordable for many employers. As medical costs continued to rise, employers began dropping coverage for their employees, increasing the number of people without insurance, who often fail to receive even basic care. As Robin Wells and I wrote back in 2006:
Our health care system often makes irrational choices, and rising costs exacerbate those irrationalities. Specifically, American health care tends to divide the population into insiders and outsiders. Insiders, who have good insurance, receive everything modern medicine can provide, no matter how expensive. Outsiders, who have poor insurance or none at all, receive very little….
In response to new medical technology, the system spends even more on insiders. But it compensates for higher spending on insiders, in part, by consigning more people to outsider status—robbing Peter of basic care in order to pay for Paul’s state-of-the-art treatment. Thus we have the cruel paradox that medical progress is bad for many Americans’ health.[12]
Table 8. Health Care Spending

| Year | Percentage of GDP |
|---|---|
| 1960 | 5.2 |
| 1970 | 7.2 |
| 1980 | 9.1 |
| 1990 | 12.3 |
| 1993 | 13.7 |
| 2000 | 13.8 |
| 2005 | 16.0 |
9. Kaiser Family Foundation,

11.

12. Paul Krugman and Robin Wells, “The Health Care Crisis and What to Do About It,”