I watch Marilena gingerly probing the woman’s exposed tissue. What she is doing, basically, is getting her bearings: learning—in a detailed, hands-on manner—what’s what and what’s where in the complicated layering of skin, fat, muscle, and fascia that makes up the human cheek. While early face-lifts merely pulled the skin up and stitched it, tightened, into place, the modern face-lift lifts four individual anatomical layers. This means all of these layers must be identified, surgically separated from their neighbors, individually repositioned, and sewn into place—all the while taking care not to sever vital facial nerves. With more and more cosmetic procedures being done endoscopically—by introducing tiny instruments through a series of minimally invasive incisions—knowing one’s way around the anatomy is even more critical. “With the older techniques, they peeled everything down and they could see it all in front of them,” says Ronn Wade, director of the Anatomical Services Division of the University of Maryland School of Medicine. “Now when you go in with a camera and you’re right on top of something, it’s harder to keep yourself oriented.”
Marilena’s instruments are poking around the edges of a glistening yolk-colored blob. The blob is known among plastic surgeons as the malar fat pad. “Malar” means relating to the cheek. The malar fat pad is the cushion of youthful padding that sits high on your cheekbone, the thing grandmothers pinch. Over the years, gravity coaxes the fat from its roost, and it commences a downward slide, piling up at the first anatomical roadblock it reaches: the nasolabial folds (the anatomical parentheses that run from the edges of a middle-aged nose down to the corners of the mouth). The result is that the cheeks start to look bony and sunken, and bulgy parentheses of fat reinforce the nasolabial lines. During face-lifts, surgeons put the malar fat pad back up where it started out.
“This is great,” says Marilena. “Beautiful. Just like real, but no bleeding. You can really see what you’re doing.”
Though surgeons in all disciplines benefit from the chance to try out new techniques and new equipment on cadaveric specimens, fresh parts for surgical practice are hard to come by. When I telephoned Ronn Wade in his office in Baltimore, he explained that the way most willed body programs are set up, anatomy labs have first priority when a cadaver comes in. And even when there’s a surplus, there may be no infrastructure in place to get the bodies from the anatomy department of the medical school over to the hospitals where the surgeons are—and no place at the hospital for a surgical practice lab. At Marilena’s hospital, surgeons typically get body parts only when there’s been an amputation.
Given the infrequency of human head amputations, an opportunity like today’s would be virtually nonexistent outside of a seminar.
Wade has been working to change the system. He is of the opinion—and it’s hard to disagree with him—that live surgery is the worst place for a surgeon to be practicing a new skill. So he got together with the heads—sorry, chiefs—of surgery at Baltimore’s hospitals and worked out a system. “When a group of surgeons want to get together and try out, say, some new endoscopic technique, they call me and I set it up.” Wade charges a nominal fee for the use of the lab, plus a small per-cadaver fee.
Two-thirds of the bodies Wade takes in now are being used for surgical practice.
I was surprised to learn that even when surgeons are in residencies, they aren’t typically given an opportunity to practice operations on donated cadavers. Students learn surgery the way they have always learned: by watching experienced surgeons at work. At teaching hospitals affiliated with medical schools, patients who undergo surgery typically have an audience of interns. After watching an operation a few times, the intern is invited to step in and try his or her hand, first on simple maneuvers such as closures and retractions, and gradually at more complicated steps. “It’s basically on-the-job training,” says Wade. “It’s an apprenticeship.”
It has been this way since the early days of surgery, the teaching of the craft taking place largely in the operating room. Only in the past century, however, has the patient routinely stood to gain from the experience.
Nineteenth-century operating “theaters” had more to do with medical instruction than with saving patients’ lives. If you could, you stayed out of them at all cost.
For one thing, you were being operated on without anesthesia. (The first operations under ether didn’t take place until 1846.) Surgical patients in the late 1700s and early 1800s could feel every cut, stitch, and probing finger. They were often blindfolded—this may have been optional, not unlike the firing squad hood—and invariably bound to the operating table to keep them from writhing and flinching or, quite possibly, leaping from the table and fleeing into the street. (Perhaps owing to the presence of an audience, patients underwent surgery with most of their clothes on.)
The early surgeons weren’t the hypereducated cowboy-saviors they are today. Surgery was a new field, with much to be learned and near-constant blunders. For centuries, surgeons had shared rank with barbers, doing little beyond amputations and tooth pullings, while physicians, with their potions and concoctions, treated everything else. (Interestingly, it was proctology that helped pave the way for surgery’s acceptance as a respectable branch of medicine. In 1687, the king of France was surgically relieved of a painful and persistent anal fistula and was apparently quite grateful for, and vocal about, his relief.)
Nepotism, rather than skill, secured a post at early-nineteenth-century teaching hospitals. The December 20, 1828, issue of The Lancet contains excerpts from one of the earliest surgical malpractice trials, which centered on the incompetency of one Bransby Cooper, nephew of the famed anatomist Sir Astley Cooper. Before an audience of some two hundred colleagues, students, and onlookers, the young Cooper proved beyond question that his presence in the operating theater owed everything to his uncle and nothing to his talents. The operation was a simple bladder stone removal (lithotomy) at London’s Guy’s Hospital; the patient, Stephen Pollard, was a hardy working-class man. While lithotomies were normally completed in a matter of minutes, Pollard was on the table for an hour, with his knees at his neck and his hands bound to his feet while the clueless medic tried in vain to locate the stone. “A blunt gorget was also introduced, and the scoop, and several pair of forceps,” recalled one witness. Another described the “horrible squash, squash of the forceps in the perineum.” When a succession of tools failed to produce the stone, Cooper “introduced his finger with some force…” It was around this point that Pollard’s endurance ran dry.[3] “Oh! Let it go!”

[3] The human being of centuries past was clearly in another league, insofar as pain endurance went. The farther back you go, it seems, the more we could take. In medieval England, the patient wasn’t even tied down, but sat obligingly upon a cushion at the foot of the doctor’s chair, presenting his ailing part for treatment. In an illustration in The Medieval Surgery, we find a well-coiffed man about to receive treatment for a troublesome facial fistula. The patient is shown calmly, almost fondly, lifting his afflicted face toward the surgeon. Meanwhile, the caption is going: “The patient is instructed to avert his eyes and… the roots of the fistula are then seared by taking an iron or bronze tube through which is passed a red hot iron.” The caption writer adds, “The doctor would appear to be left-handed in this particular picture,” as if perhaps trying to distract the reader from the horrors just read, a palliative technique fully as effective as asking a man with a red-hot poker closing in on his face to “avert his eyes.”