He wasn’t scoffing back. He was serious.
“You’re really asking me if it’s possible?” she said.
“I’m just saying, face transplants were considered impossible a few years ago. They’re doing them now. If you think about the medical advances that have been achieved in the last few years…it’s staggering. And the hits just keep on coming. We’ve mapped out the human genome. We’ve cloned a sheep. Heart tissue has just been successfully created from stem cells. So, I don’t know. Maybe this is possible.”
“Of course it isn’t,” Mia replied dismissively.
“I saw this documentary once. About this Russian scientist, back in the fifties — I think his name was Demikhov — he was researching head transplants. To prove it was doable, he grafted the head and upper body of a puppy onto a bigger mastiff and created a two-headed dog. The thing ran around happily and survived for six days.” He shrugged. “And that’s just one we know about.”
Mia leaned forward, her eyes blazing with conviction. “Transplants are about reconnecting nerves and veins and, yes, maybe even spinal cords one day. But this is different. This is about stopping the damage that happens to our cells, to our DNA, to our tissues and organs, with every breath we take. It’s about errors in DNA replication, it’s about molecules inside our body getting bombarded by free radicals and mutating wrongly and just degrading over time. It’s about wear and tear.”
“But that’s my point. It’s not the years, it’s the mileage,” he said pointedly. “You’re talking about cells getting damaged and breaking down, which is very different from saying they’re programmed to live a certain length of time, and then die. It’s like, if you buy a new pair of trainers. You wear them, you jog in them, the soles wear out and the shoes fall apart. If you don’t wear them, they don’t just disintegrate after a few years in their box. Wear and tear. It’s why we die, right? There’s no ticking clock that tells our body its time is up. We’re not programmed to die, are we?”
Mia shifted in her seat. “That’s one line of thought.”
“But it’s the one that’s carrying the day right now, isn’t it?”
Mia knew it was. It was a specialization she had flirted with, but she’d ultimately veered off in another direction, knowing that antiaging research was the embarrassing relative no one wanted to talk about. Biogerontology — the science of aging — had been having a tough time since, well, the Jurassic era.
In official circles, it wasn’t far removed from the quackery of alchemists and the charlatanry of the snake-oil salesmen of yesteryear. Serious scientists, clinging to the traditional belief that growing old is inevitable, were wary of pursuing something that was doomed to failure, and even warier of being ridiculed if they attempted to explore it. Governmental bodies wouldn’t fund it: They dismissed it as an unachievable pipe dream and were loath to be seen funding something that their electorate didn’t really believe — because of what they’d been told and taught — was achievable. Even when presented with compelling arguments and breakthroughs, the holders of the purse strings still wouldn’t go near it because of deeply held religious beliefs: Humans age and die. It’s the way of the world. It’s what God intended. It’s pointless and immoral to try to overcome that. Death is a blessing, whether we realize it or not. The good will become immortal, of course — but only in heaven. And don’t even think about arguing it with the President’s Council on Bioethics. The prevention of aging is, even more than Al Qaeda, an evil threat to our dignified human future.
Case closed.
And yet, in a broader context, scientists had been spectacularly successful in prolonging human life so far. Average life expectancy — the average number of years humans are expected to live — hovered between twenty and thirty years for most of human history. This average was skewed downwards due to one main cause: infant mortality. Three or four infants died for every person who managed to evade the plague, dodge the blade of a sword, and reach eighty. Hence the low average. Medical and hygienic advances — clean water, antibiotics, and vaccines — enabled babies to survive to adulthood, allowing this average to increase dramatically over the last hundred years in what is referred to as the first longevity revolution. It hit forty in the nineteenth century, fifty in 1900, and it was now around eighty in developed countries. Whereas early man had a one-in-twenty-million chance of living to a hundred, that chance was now one in fifty. In fact, since 1840, average life expectancy had been growing at a quarter of a year every year. Demographers predicting an upper limit to our expected life spans had consistently been proven wrong.
The crucial difference was that life extension had been achieved by developing vaccines and antibiotics that weren’t conceived with the aim of prolonging life, but rather of combating illnesses, an unarguably noble goal. The nuance was critical. And only recently had a paradigm shift occurred in the medical-research community’s attitude towards aging, from perceiving it as something inevitable and predestined, to considering it something far less draconian:
A disease.
A simple analogy was that, until recently, the term Alzheimer’s was only used when referring to sufferers of that form of dementia who were under a certain age — around sixty-five or so. Any older than that, and they didn’t have a disease — they were just senile, and there was no point in doing anything about it. It was part of growing old. This changed in the 1970s, when a demented ninety-year-old was treated no differently from a forty-year-old with Alzheimer’s — both were now equally considered to be suffering from a disease that medical researchers were working hard to understand and cure.
In much the same way, “old age” was now, more and more, being viewed as an illness. A highly complex, multifaceted, perplexing illness. But an illness nevertheless.
And illnesses can be cured.
The key realization that triggered this new approach was a deceptively simple answer to the fundamental question “Why do we age?” The answer was, simply put, that we age because, in nature, nothing ever did.
Or, more accurately, almost nothing ever did.
For thousands of years — throughout virtually all of human evolution — in the wild and away from the cosseting care and advances of the civilized world, humans and animals hardly ever reached old age. They were ravaged by predators, disease, starvation, and weather.
They didn’t get a chance to grow old.
And nature’s preoccupation has always been to make sure its organisms reproduce, to perpetuate the species — nothing more. All it asked of our bodies, all we were designed to do from an evolutionary point of view, was to reach reproductive age, have babies, and nurture them until they were old enough to survive in the wild on their own.
That’s it.
That was all nature cared about.
Beyond that, we were redundant — man and beast alike. All of the cells that made us up had no reason to keep us alive beyond that.
And since we didn’t stand a chance of surviving much beyond the age of reproduction, nature’s efforts were — rightly — concentrated on stacking the odds for us to reach that age and replicate. Natural selection only cared about our reaching reproductive age, and — rightly, though unfortunately for those of us who wanted to stick around a little longer — it chose a short life span for us to reproduce in because that was more efficient: It made for a shorter time between generations and more mixing of genes, which gave greater adaptability to threatening environments. All of which meant that a process — aging — that never actually manifested itself in nature, in the wild, couldn’t have evolved genetically.