Whatever the truth may be, the ups and downs of my own nervous system and my encounters with doctors illustrate the ambiguities of illness and diagnosis. The philosophical ideas that lie beneath calling one thing by one name and another by another often remain unexamined, and they may be determined more by intellectual fashions than by rigorous thought. The New York Times headline “Is Hysteria Real?” upholds the conventional belief: if you can see it, it’s real and physical. If you can’t, it’s unreal and mental. Or, rather, most scientists agree that what’s mental is really physical but can’t describe how that works. Then again, for other scientists, there is no physical reality that can be understood as if we could leap out of our own heads and become objective; everything we live is given to us via the mental. The world is mind. Whatever the case may be, on a more pedestrian “level,” there is no simple identifiable cause and effect to illuminate what exactly is wrong with me, no linear motion from one thing to another, but a number of factors that may or may not play a role in the vagaries of the shaking woman’s path.
A FRIEND OF MINE has a sister who has had epileptic seizures since she was a little girl. L. used to wake up at night to see her sister flailing and flapping in the bed next to her own. L. told me that her sister does not feel alienated from her auras and fits. In fact, they are so deeply part of her, she is reluctant to medicate them away. In his essay “Witty Ticcy Ray,” Oliver Sacks describes a Tourette’s patient who, after his tics vanished with a drug, missed them so much that he began taking weekend holidays from his pharmaceuticals, so he could tic again with happy abandon.181 The bipolar patient P. who produced the seven-thousand-page manuscript made it clear to me that she mourned her mania terribly. I felt sure that once the authorities released her from the hospital, she would stop taking her lithium. After his voices stopped, a schizophrenic patient felt alone for the first time in years and wasn’t sure he liked it. In her book The Midnight Disease, the neurologist Alice Flaherty describes and analyzes her postpartum hypergraphia, which began not long after she gave birth to twin boys who died. She was also visited by a host of metaphorical images that made the world around her feel uncommonly vivid but were intrusive and distracting. When a drug brought them to a halt, she writes, “the world became so dead that my psychiatrist and I lowered the doses until I could have at least some of my tyrannical metaphors back.”182 What if it is a disease? Prince Myshkin asked. I, too, have become curiously attached to my migraines and the various feelings that have accompanied them. I cannot really see where the illness ends and I begin; or, rather, the headaches are me, and rejecting them would mean expelling myself from myself.
None of us chooses chronic illness. It chooses us. Over time, L.’s sister did not merely accommodate her life to her tonic-clonic seizures; the seizures became woven into the very fabric of her conscious identity, her narrative self, as are my migraines, P.’s mania, and Dr. Flaherty’s metaphors and hypergraphia, for better and for worse. Perhaps because she was a late arrival, I have had a much harder time integrating the shaking woman into my story, but as she becomes familiar, she is moving out of the third person and into the first, no longer a detested double but an admittedly handicapped part of my self.
Exactly what a self is remains controversial. The neuroscientist Jaak Panksepp maintains that human beings have a core self that can be located in the brain, a mammalian self outside of language but crucial to a state of wakeful awareness; the periaqueductal gray (PAG) region of the brain is a very small spot indeed, but when it is damaged, wakeful consciousness goes with it.183 Antonio Damasio also proposes a core self, although he differs somewhat from Panksepp as to precisely where it is.184 Both would agree that this core being is not the autobiographical self, not the person saying or writing “I remember.”
Michael Gazzaniga, the scientist who worked with split-brain patients and coined the neat term “left-brain interpreter,” marshals evidence for a view of the self through selection theory: “all we do in life is discover what was built into our brains.”185 According to Gazzaniga, the environmental influences on a person select from options that are already there. This apparently innocuous idea of innate ability — people do not fly, except in their dreams and on airplanes, because we have no native capacity for it — becomes chillier as the theory is extended into the social realm. It leads serially to other beliefs: that parents have very little influence on children (they are immune to instruction) and that social programs designed to support people with various problems are counterproductive because what individuals really need is to be thrown into survival mode. Cancer patients should be encouraged to “fight” their illnesses because the combat will help them live longer. Gazzaniga is one of several scientists who, when writing books for a broad audience, have taken to crossing swords with the “blank slate” view of human beings.
Steven Pinker, a respected cognitive psychologist, has written several popular books about his field. He, too, rails against blank slaters.186 The blank slate idea, often attributed to John Locke, argues that human beings are born blanks and then are written on by experience. But Locke did not disavow native human capacities. He was arguing against Descartes’s theory of innate ideas, the notion that there are universal truths all people are born with. Whatever his philosophical flaws may be, in An Essay Concerning Human Understanding, Locke delineates a developmental, interactive view of life: you need to have had the experience of red before you know what red is. In truth, it would be very difficult to find any serious proponent of a radical blank slate theory, just as one could not find a sane advocate of absolute biological determinism. Even the most extreme constructionist doesn’t argue against genes. Even people who maintain that the self — or, rather, the “subject” — is a fiction founded in language, a figment that is constantly being recast in terms of the dominant ideology of a particular historical epoch, don’t believe that human beings have no inherent capacity for speech. To put it bluntly: what is at issue here is emphasis — genes over experience or experience over genes.
Gazzaniga, Pinker, and many others believe, with good reason, that inside academic institutions, some scholars have stressed human malleability to a degree that is unfounded. But their confidence that research has proven, for example, that parents have no effect on their offspring is remarkable. I would direct them to the growing laboratory research on mammals that indicates that genetic makeup is modified by environmental factors, including maternal care.187 Ideas quickly become beliefs, and beliefs quickly become bullets in ideological wars. What we are and how we’re made is surely a battleground in one of those wars. The tough and the tender are constantly firing artillery at each other. Near the end of a PowerPoint presentation and lecture on the brain I attended in February 2009, the Harvard neuroscientist Hans Breiter turned to the image on his screen: a huge blue rectangle. Inside that rectangle was a tiny red square. “That’s how much we know about the brain,” he said, referring not to the vast blue but to the minute red spot. What we know often becomes an excuse to extrapolate endlessly, but my hunch is that most of the time intellectual humility takes one further than arrogance.
In Buddhism, the self is an illusion. There is no self. Some cognitive scientists agree with that formulation. Others don’t. Freud’s model of the self was dynamic, complex, divided into three, and provisional. He truly believed that science would elaborate on his ideas, and it has, albeit in many conflicting directions. In psychoanalytic object relations theory as it developed, the self is also plural. The images of important others inhabit us forever. D. W. Winnicott let more air into the psychic room than did Freud, whose model of mental structures is more confined, more likely to deal with fantasies and identifications than with real other people and actual experiences. Winnicott believed that we all have a true self, as well as false selves. Our social selves necessarily have false aspects — the polite smile or the “I’m fine” response to “How are you?”188 I don’t know what a self is. Defining it, whatever it is, is clearly a semantic problem, a question of borders and perception, as well as any psychobiological truths we might be able to discover.