Marilena’s instruments are poking around the edges of a glistening yolk-colored blob. The blob is known among plastic surgeons as the malar fat pad. “Malar” means relating to the cheek. The malar fat pad is the cushion of youthful padding that sits high on your cheekbone, the thing grandmothers pinch. Over the years, gravity coaxes the fat from its roost, and it commences a downward slide, piling up at the first anatomical roadblock it reaches: the nasolabial folds (the anatomical parentheses that run from the edges of a middle-aged nose down to the corners of the mouth). The result is that the cheeks start to look bony and sunken, and bulgy parentheses of fat reinforce the nasolabial lines. During face-lifts, surgeons put the malar fat pad back up where it started out.
“This is great,” says Marilena. “Beautiful. Just like real, but no bleeding. You can really see what you’re doing.”
Though surgeons in all disciplines benefit from the chance to try out new techniques and new equipment on cadaveric specimens, fresh parts for surgical practice are hard to come by. When I telephoned Ronn Wade in his office in Baltimore, he explained that the way most willed body programs are set up, anatomy labs have first priority when a cadaver comes in. And even when there’s a surplus, there may be no infrastructure in place to get the bodies from the anatomy department of the medical school over to the hospitals where the surgeons are—and no place at the hospital for a surgical practice lab. At Marilena’s hospital, surgeons typically get body parts only when there’s been an amputation.
Given the infrequency of human head amputations, an opportunity like today’s would be virtually nonexistent outside of a seminar.
Wade has been working to change the system. He is of the opinion—and it’s hard to disagree with him—that live surgery is the worst place for a surgeon to be practicing a new skill. So he got together with the heads—sorry, chiefs—of surgery at Baltimore’s hospitals and worked out a system. “When a group of surgeons want to get together and try out, say, some new endoscopic technique, they call me and I set it up.” Wade charges a nominal fee for the use of the lab, plus a small per-cadaver fee.
Two-thirds of the bodies Wade takes in now are being used for surgical practice.
I was surprised to learn that even when surgeons are in residencies, they aren’t typically given an opportunity to practice operations on donated cadavers. Students learn surgery the way they have always learned: by watching experienced surgeons at work. At teaching hospitals affiliated with medical schools, patients who undergo surgery typically have an audience of interns. After watching an operation a few times, the intern is invited to step in and try his or her hand, first on simple maneuvers such as closures and retractions, and gradually at more complicated steps. “It’s basically on-the-job training,” says Wade. “It’s an apprenticeship.”
It has been this way since the early days of surgery, the teaching of the craft taking place largely in the operating room. Only in the past century, however, has the patient routinely stood to gain from the experience.
Nineteenth-century operating “theaters” had more to do with medical instruction than with saving patients’ lives. If you could, you stayed out of them at all costs.
For one thing, you were being operated on without anesthesia. (The first operations under ether didn’t take place until 1846.) Surgical patients in the late 1700s and early 1800s could feel every cut, stitch, and probing finger. They were often blindfolded—this may have been optional, not unlike the firing squad hood—and invariably bound to the operating table to keep them from writhing and flinching or, quite possibly, leaping from the table and fleeing into the street. (Perhaps owing to the presence of an audience, patients underwent surgery with most of their clothes on.)
The early surgeons weren’t the hypereducated cowboy-saviors they are today. Surgery was a new field, with much to be learned and near-constant blunders. For centuries, surgeons had shared rank with barbers, doing little beyond amputations and tooth pullings, while physicians, with their potions and concoctions, treated everything else. (Interestingly, it was proctology that helped pave the way for surgery’s acceptance as a respectable branch of medicine. In 1687, the king of France was surgically relieved of a painful and persistent anal fistula and was apparently quite grateful for, and vocal about, his relief.)
Nepotism, rather than skill, secured a post at early-nineteenth-century teaching hospitals. The December 20, 1828, issue of The Lancet contains excerpts from one of the earliest surgical malpractice trials, which centered on the incompetency of one Bransby Cooper, nephew of the famed anatomist Sir Astley Cooper. Before an audience of some two hundred colleagues, students, and onlookers, the young Cooper proved beyond question that his presence in the operating theater owed everything to his uncle and nothing to his talents. The operation was a simple bladder stone removal (lithotomy) at London’s Guy’s Hospital; the patient, Stephen Pollard, was a hardy working-class man. While lithotomies were normally completed in a matter of minutes, Pollard was on the table for an hour, with his knees at his neck and his hands bound to his feet while the clueless medic tried in vain to locate the stone. “A blunt gorget was also introduced, and the scoop, and several pair of forceps,” recalled one witness. Another described the “horrible squash, squash of the forceps in the perineum.” When a succession of tools failed to produce the stone, Cooper “introduced his finger with some force…” It was around this point that Pollard’s endurance[3] ran dry. “Oh! Let it go!”
he is quoted as saying. “Pray let it keep in!” Cooper persisted, cursing the man’s deep perineum (in fact, an autopsy showed it to be a quite normally proportioned perineum). After digging with his finger for some ungodly amount of time, he got up from his seat and “measured fingers with those of other gentlemen, to see if any of them had a longer finger.”
Eventually he went back to his toolkit and, with forceps, conquered the recalcitrant rock—a relatively small one, “not larger than a common Windsor bean”—brandishing it above his head like an Academy Award winner. The quivering, exhausted mass that was Stephen Pollard was wheeled to a bed, where he died of infection and God knows what else twenty-nine hours later.
Bad enough that some ham-handed fop in a waistcoat and bowtie was up to his wrists in your urinary tract, but on top of that you had an audience—not just the young punters from the medical school but, judging from a description of another lithotomy at Guy’s Hospital in an 1829 Lancet, half the city: “Surgeons and surgeons’ friends… French visitors, and interlopers filled the space around the table… There was soon a general outcry throughout the gallery and upper rows—‘hats off,’ ‘down heads,’ …was loudly vociferated from different parts of the theatre.”
The cabaret atmosphere of early medical instruction began centuries before, in the standing-room-only dissecting halls of the renowned Italian medical academies of Padua and Bologna. According to C. D. O’Malley’s biography of the great Renaissance anatomist Andreas Vesalius, one enthusiastic spectator at a crowded Vesalius dissection, bent on a better view, leaned too far out and tumbled from his bench to the dissecting platform below. “Because of his accidental fall… the unfortunate Master Carlo is unable to attend and is not very well,” read the note proffered at the next lecture. Master Carlo, one can be sure, did not seek treatment at the place he went for lectures.
[3] The human being of centuries past was clearly in another league, insofar as pain endurance went. The farther back you go, it seems, the more we could take. In medieval England, the patient wasn’t even tied down, but sat obligingly upon a cushion at the foot of the doctor’s chair, presenting his ailing part for treatment. In an illustration in