NATURAL CAUSES
Barbara Ehrenreich
CONTENTS
INTRODUCTION
Chapter One: Midlife Revolt
Chapter Two: Rituals of Humiliation
Chapter Three: The Veneer of Science
Chapter Four: Crushing the Body
Chapter Five: The Madness of Mindfulness
Chapter Six: Death in Social Context
Chapter Seven: The War Between Conflict and Harmony
Chapter Eight: Cellular Treason
Chapter Nine: Tiny Minds
Chapter Ten: "Successful Aging"
Chapter Eleven: The Invention of the Self
Chapter Twelve: Killing the Self, Rejoicing in a Living World
NOTES
ABOUT THE AUTHOR
=====================
CHAPTER ELEVEN The Invention of the Self
We return now to a question raised earlier in this book. Who is in charge? We seek control over our bodies, our minds, and our lives, but who or what will be doing the controlling? The body can be ruled out because of its tendency to liquefy—or turn into dust—without artful embalming. So the entity we wish to enthrone must be invisible and perhaps immaterial—the mind, the spirit, the self, or perhaps some ineffable amalgam, as suggested by the phrase "mind, body, spirit" or the neologism "mindbody."
The spectacle of decomposition provides a powerful incentive to posit some sort of immaterial human essence that survives the body. Certainly there is very little talk of "mind-body unity" in the presence of a rotting corpse. In fact, the conversation is likely to take a different turn, to an emphasis on the existence of an immortal essence, or soul, that somehow carries on without the body.
Medieval Catholic artists and clerics deployed images of decomposing bodies—sometimes with maggots wiggling in the nostrils and eye sockets—to underscore the urgency of preparing the soul for the disembodied life that awaits it. Buddhist monks practice "corpse meditation" in the presence of corpses, both fresh and rotting, to impress on themselves the impermanence of life. The soul, in both Christian and Islamic philosophy, is the perfect vessel for the immortality that eludes us as fleshly creatures: It's immortal by virtue of the fact that it somehow participates in, or overlaps with, an immortal deity. Even nonbelievers today are likely to comfort themselves with the thought of a "soul," or spirit, or vague "legacy" that renders them impervious to decay. As Longfellow famously wrote, "Dust thou art, to dust returnest, was not spoken of the soul."1
But no one has detected this entity. There is in fact much firmer evidence for the existence of "dark matter," the hypothesized substance that is invoked to explain the shape of galaxies, than there is for any spirit or soul. At least dark matter can be detected indirectly through its gravitational effects. We can talk about someone's soul and whether it is capacious or shriveled, but we realize that we are speaking metaphorically. Various locations for an immaterial individual essence have been proposed—the heart, the brain, and the liver—but autopsies yield no trace of it, leading some to speculate that it is delocalized like the Chinese qi. In 1901, an American physician reported that the human body loses three-quarters of an ounce, or twenty-one grams, at the moment of death, arguing that this meant the soul is a material substance. But his experiment could not be replicated, suggesting that the soul, if it exists, possesses neither location nor mass. One can't even find the concept of the "immortal soul" in the Bible. It was grafted onto Christian teachings from the pagan Greeks long after the Bible was written.2
The idea of an immortal soul did not survive the Enlightenment unscathed. The soul depended on God to provide its immortality, and as his existence—or at least his attentiveness—was called into question, the immortal soul gave way to the far more secular notion of the self. While the soul was probably "discovered" by Christians (and Jews) reading Plato, the self was never discovered; it simply grew by accretion, apparently starting in Renaissance Europe. Scholars can argue endlessly about when exactly the idea of the self—or any other historical innovation—arose; precedents can always be claimed. But historians have generally agreed on the vague proposition that nothing like either the soul or the self existed in the ancient world. Ego, yes, and pride and ambition, but not the capacity for introspection and internal questioning that we associate with the self. Achilles wanted his name and his deeds remembered forever; he did not agonize over his motives or conflicted allegiances. That sort of thinking came later.
Lionel Trilling wrote that "in the late 16th and early 17th centuries, something like a mutation in human nature took place," which he took to be the requirement for what historian Frances Yates called "the emergence of modern European and American man."3 As awareness of the individual self took hold, the bourgeoisie bought mirrors, commissioned portraits, wrote autobiographies, and increasingly honored the mission of trying to "find" oneself among the buzz of thought engendered by a crowded urban social world. Today we take it for granted that inside the self we present to others, there lies another, truer self, but the idea was still fresh in the 1780s when Jean-Jacques Rousseau announced triumphantly:
I am forming an undertaking which has no precedent, and the execution of which will have no imitator whatsoever. I wish to show my fellows a man in all the truth of nature; and this man will be myself.
Myself alone. I feel my heart and I know men. I am not made like any of the ones I have seen; I dare to believe that I am not made like any that exist. If I am worth no more, at least I am different.4
Megalomania, or the proud claim of a rebellious political thinker? Contemporary thought has leaned toward the latter; after all, Rousseau was a major intellectual influence on the French Revolution, which, whatever its bloody outcome, was probably the first mass movement to demand both individual "Liberté" and "Fraternité," or solidarity within the collective. There is something bracing about Rousseau's assertion of his individual self, but the important thing to remember is that it was an assertion—no evidence was offered, not that it is easy to imagine what kind of evidence there might be. As historian John O. Lyons put it, the self was "invented."5
Another slippery abstraction was taking hold at around the same time as the self, and this was the notion of "society." Like the self, society is not something you can point to or measure; it is a concept that has to be taught or shared, a ghostly entity that arises from an aggregate of individual selves. In material terms, you can imagine a "super-being" composed of numerous subunits clumsily trying to coordinate their movements. It is no coincidence that the concept of society arose along with that of the self, if only because the newly self-centered individual seemed to be mostly concerned with the opinion of others: How do I fit in? How do I compare to them? What impression am I making? We do not look into mirrors, for example, to see our "true" selves, but to see what others are seeing, and what passes for inner reflection is often an agonizing assessment of how others are judging us.
A psychological "mutation" of this magnitude cries out for a historical explanation. Here, historians have generally invoked the social and economic changes accompanying the increasing dominance of a market economy. As fixed feudal roles and obligations lost their grip, it became easier for people to imagine themselves as individuals capable of self-initiated change, including upward mobility. You might be an artisan who learns to dress and speak like a merchant, or a merchant who takes on the airs of an aristocrat. Traditional bonds of community and faith loosened, even making it possible to assume the identity of another person, as in the famous case of the sixteenth-century adventurer who managed to convince the inhabitants of a village that he was their missing neighbor Martin Guerre. He took over the family inheritance and moved in with the real Guerre's wife, at least until the ruse was uncovered three years later.6 If you could move from village to village, from village to city, from one social class to another—and surely the disruptions of intra-European wars played a part in the new mobility—you had to constantly monitor the impression you were making on others. At the same time, those others were becoming less trustworthy; you could not be sure what true "self" lay behind the façade.
Related to the rise of capitalism—though how related has long been a subject of debate—was the religious innovation represented by Protestantism, which midwifed the soul's transformation into the modern notion of the self. Pre-Reformation Catholics could ensure a blissful postmortem existence by participating in the sacraments or donating large sums to the church, but Protestants, and especially Calvinists, were consigned to perpetual introspection in an attempt to make their souls acceptable to God. Every transient thought and inclination had to be monitored for the slightest sinful impulse. As science and secularism chipped away at the notion of God, the habit of introspection remained. Psychoanalyst Garth Amundson writes:
People continued to look inward, into the private life of the mind, so as to locate essential truths about their lives, though without the additional notion that these truths are the fruit of a dialogue with God's presence within the self. Hence, the Deity that Augustine thought that we discover by looking within the self was dethroned, and replaced by an invigorating confrontation with powerful private emotional states, fantasies, hopes, and needs. An authentic and immediate awareness of one's affective experience became the new center around which to create a life lived truthfully and "fully." In this way, the development of the private life of the self became something of an object of worship.7
Or, as somewhat more simply put by a Spanish historian, "the modern Rousseauist self, which feels and creates its own existence, would appear to be the heir to attributes previously assigned to God."8
In our own time, the language of self-regard has taken on a definite religious quality. We are instructed to "believe" in ourselves, "esteem" ourselves, be true to ourselves, and, above all, "love" ourselves, because otherwise how could anyone else love us? The endless cornucopia of "self-help" advice that began to overflow in the twentieth century enjoins us to be our own "best friends," to indulge ourselves, make time for ourselves, and often celebrate ourselves. If words like "believe" do not sufficiently suggest a religious stance, one site even urges us to "worship ourselves" by creating a shrine to oneself, which might include photos (probably "selfies"), favorite items of jewelry, and "nice smelling things such as perfume, candles or incense."9 The self may seem like a patently false deity to worship, but it is no more—and no less—false than the God enshrined in recognized religions. Neither the self nor God is demonstrably present to everyone. Both require the exertion of belief.
In today's capitalist culture the self has been further objectified into a kind of commodity demanding continual effort to maintain—a "brand." Celebrities clearly have well-defined "brands," composed of their talents, if any, their "personalities," and their physical images, all of which can be monetized and sold.
Even lowly aspirants toward wealth and fame are encouraged to develop a brand and project it confidently into the world, and never mind if it is indistinguishable from that of millions of other people—a cheerful, upbeat, "positive-thinking" persona has been a favorite since the 1950s, both for office workers and CEOs. If some darker self, containing fears, resentments, and doubts, remains under your carefully constructed exterior, it is up to you to keep it under wraps. Internal "affirmations"—"I am confident, I am lovable, and I will be successful"—are thought to do the trick.
What could go wrong? Of course, with the introduction of "self-knowledge" and "self-love," one enters an endless hall of mirrors: How can the self be known to the self, and who is doing the knowing? If we love ourselves, who is doing the loving? This is the inescapable paradox of self-reflection: How can the self be both the knower and the content of what is known, both the subject and the object, the lover and that which is loved? Other people can be annoying, as Sartre famously suggested, but true hell is perpetual imprisonment in the self. Many historians have argued that the rise of self-awareness starting in roughly the seventeenth century was associated with the outbreak of an epidemic of "melancholy" in Europe at about the same time, and subjective accounts of that disorder correspond very closely with what we now call "depression."10 Chronic anxiety, taking the form of "neurasthenia" in the nineteenth century, seems to be another disease of modernity. The self that we love and nurture turns out to be a fragile, untrustworthy thing.
Unlike the "soul" that preceded it, the self is mortal. When we are advised to "come to terms with" our mortal‑
ity, we are not only meant to ponder our decaying corpses, but the almost unthinkable prospect of a world without us
in it, or more precisely, a world without me in it, since I can, unfortunately, imagine a world without other people, even those I love most. A world without me, without a conscious "subject" to behold it, seems inherently paradoxical. As philosopher Herbert Fingarette writes:
Could I imagine this familiar world continuing in existence even though I no longer exist? If I tried, it would be a world imagined by me.... Yes, I can imagine a world without me in it as an inhabitant. But I can't imagine a world as unimagined by me. My consciousness of that world is ineliminable, and so, too, therefore, is my reaction to it. But this falsifies the meaning of my death, since its distinctive feature is that there won't be consciousness of, or reaction to, anything whatsoever.11
We are, most of the time, so deeply invested in the idea of an individual conscious self that it becomes both logically and emotionally impossible to think of a world without it. A physician who had narrowly escaped death more than once writes:
Whenever I've tried wrapping my mind around the concept of my own demise—truly envisioned the world continuing on without me, the essence of what I am utterly gone forever—I've unearthed a fear so overwhelming my mind has been turned aside as if my imagination and the idea of my own end were two magnets of identical polarity, unwilling to meet no matter how hard I tried to make them.12
We may all imagine that some trace of ourselves will persist in the form of children and others whom we have influenced, or through the artifacts and intellectual products we leave behind. At the same time I know, though, that the particular constellation of memories, fantasies, and ambitions that is, for example, me will be gone. The unique—or so I like to imagine—thrum of my consciousness will be silenced, never to sound again. "All too often," wrote philosopher Robert C. Solomon, "we approach death with the self-indulgent thought that my death is a bad thing because it deprives the universe of me" (italics in the original).13 Yet if we think about it, the universe survives the deaths of about fifty-five million unique individuals a year quite nicely.
In the face of death, secular people often scramble to expand their experiences or memorialize themselves in some lasting form. They may work their way through a "bucket list" of adventures and destinations or struggle to complete a cherished project. Or if they are at all rich or famous, they may dedicate their final years and months to the creation of a legacy, such as a charitable foundation, in the same spirit as an emperor might plan his mausoleum. One well-known public figure of my acquaintance devoted some of his last months to planning a celebration of his life featuring adulatory speeches by numerous dignitaries, including himself. Sadly, a couple of decades later, his name requires some explanation.
So the self becomes an obstacle to what we might call, in the fullest sense, "successful aging." I have seen accomplished people consumed in their final years with jockeying for one last promotion or other mark of recognition, or crankily defending their reputation against critics and potential critics. This is all that we in the modern world have learned how to do. And when we acquire painful neuroses from our efforts to promote and protect ourselves, we often turn to forms of therapy that require us to burrow even more deeply into ourselves. As Amundson writes, "the psychotherapy patient looks within for the truth, and comes away, not with anything that is considered universally valid or absolute in a metaphysical sense, but with a heightened and intensified devotion to such individualistic creeds as 'being true to oneself,' 'loving oneself,' and 'practicing self-care.'"14
There is one time-honored salve for the anxiety of approaching self-dissolution, and that is to submerge oneself into something "larger than oneself," some imagined super-being that will live on without us. The religious martyr dies for God, the soldier for the nation or, if his mind cannot encompass something as large as the nation, at least for the regiment or platoon. War is one of the oldest and most widespread human activities, and warriors are expected to face death willingly in battle, hoping to be memorialized in epics like the Iliad or the Mahabharata or in one of the war monuments that have sprung up since the nineteenth century. For frightened soldiers or, later, their grieving survivors, dying is reconfigured as a "sacrifice"—the "ultimate sacrifice"—with all the ancient religious connotations of an offering to the gods. And in case thoughts of eventual glory are not enough to banish fear, the US military is increasingly adopting the tools of alternative medicine, including meditation, dietary supplements, and reiki.15
The expectation, though, is that true soldiers die calmly and without regret. As Winston Churchill said of poet and World War I recruit Rupert Brooke:
He expected to die: he was willing to die for the dear England whose beauty and majesty he knew: and he advanced towards the brink in perfect serenity, with absolute conviction of the rightness of his country's cause and a heart devoid of hate for fellow-men.16
But you don't have to be a warrior to face death with equanimity. Anyone who lives for a cause like "the revolution" is entitled to imagine that cause being carried on by fresh generations, so that one's own death becomes a temporary interruption in a great chain of endeavor. Some stumble and fall or simply age out, but others will come along to carry on the work. As an old labor song about Joe Hill, a labor activist who was framed for murder and executed in 1915, tells us, it's as if death never happened at all:
I dreamed I saw Joe Hill last night
Alive as you or me
Says I, But Joe, you're ten years dead
I never died, says he
I never died, says he...
Where working men are out on strike
Joe Hill is at their side
Joe Hill is at their side...
From San Diego up to Maine
In every mine and mill
Where workers strike and organize
Says he, You'll find Joe Hill17
The revolutionary lives and dies for her people, secure in her belief that someone else will pick up the banner when she falls. To the true believer, individual death is incidental. A luta continua—the struggle continues.
The idea of a super-being that will outlive us as individuals is not entirely delusional. Human beings are among the most sociable of living creatures. Studies of orphaned infants in World War II showed that even if kept warm and adequately fed, infants who were not held and touched "failed to thrive" and eventually died.18 Socially isolated adults are less likely to survive trauma and disease than those embedded in family and community. We delight in occasions for unified, collective expression, whether in the form of dancing, singing, or chanting for a demagogue. Even our most private thoughts are shaped by the structure of language, which is of course also our usual medium of interaction with others. And as many have argued, we are ever more tightly entangled by the Internet into a single global mind—although in a culture as self-centric as ours, the Internet can also be used as a mirror, or a way to rate ourselves by the amount of attention we are getting from others, the number of likes.
It is the idea of a continuous chain of human experience and endeavor that has kept me going through an unexpectedly long life. I will stumble and fall; in fact, I already stumble a lot, but others will pick up the torch and continue the race.
It's not only "my work"—forgive the pompous phrase—that I bequeath to my survivors but all the mental and sensual pleasures that come with being a living human: sitting in the spring sunshine, feeling the warmth of friends, solving a difficult equation. All that will go on without me. I am content, in the time that remains, to be a transient cell in the larger human super-being.
But there are flaws in this philosophic perspective. For one thing, it is entirely anthropocentric. Why shouldn't our "great chain of being" include the other creatures with which we have shared the planet, the creatures we have martyred in service to us or driven out of their homes to make way for our expansion? Surely we have some emotional attachment to them, even if it is hard to imagine passing the figurative torch to dogs or, in one of the worst scenarios, insects and microbes.
Then there is a deeper, more existential problem with my effort to derive some comfort from the notion of an ongoing human super-being: Our species itself appears to be mortal and, in many accounts, imminently doomed, most likely to die by our own hand, through global warming or nuclear war. Some scientists put the chance of a "near extinction event," in which at least 10 percent of our species is wiped out, at a little over 9 percent within a hundred years.19 Others doubt our species will survive the current century. As environmentalist Daniel Drumright writes—and I can only hope he is an alarmist—with the growing awareness of extinction, "We're dealing with a discovery of such epic proportion that it simply reduces everything in existence to nothing." He goes on to say that our emerging circumstances require "a diabolic consciousness to which no living human being has ever had to bear witness. It is an awareness which requires a degree of emotional maturity that's almost indistinguishable from insanity within western culture."20
If your imagination is vigorous enough, you may take comfort from the likely existence of other forms of life throughout the universe. Earth-sized planets abound, potentially offering other habitats similar to our own, with reasonable temperatures and abundant water. In addition, sci-fi fans know that our vision of life based on carbon and water is likely to be far too provincial. There may be life forms based on other chemicals, or self-reproducing entities that do not even consist of conventional matter—patterns of energy bursts, oscillating currents, gluttonous black holes; already we have artificial life in the form of computer programs that can reproduce and evolve to meet changing circumstances. And—who knows?—some of these "life" forms may be suitable heirs for our species, capable of questing and loving.
But even here our yearning for immortality runs into a wall, because the universe itself will come to an end if current predictions are borne out, whether 2.8 or 22 billion years from now—which of course still gives us plenty of time to get our things in order. In one scenario there will be a "big rip," in which expansionist forces will tear even atoms apart. In another, the night sky will empty out; the huge void spaces now separating galaxies will grow until they swallow everything. Vacuum and perfect darkness will prevail. Both scenarios lead to the ultimate nightmare of a world "without us in it," and it is infinitely bleaker than a world without our individual selves—a world, if you can call it that, without anything in it, not the tiniest spark of consciousness or wisp of energy or matter. To cruelly paraphrase Martin Luther King, the arc of history is long, but it bends toward catastrophic annihilation.
======================================
CHAPTER TWELVE Killing the Self, Rejoicing in a Living World
Philosophically, we have painted ourselves into a corner. On the one hand, we posit a lifeless material world. As the twentieth-century biochemist Jacques Monod put it, in what I can only imagine was a tone of bitter triumph, "Man at last knows he is alone in the unfeeling immensity of the universe."1 On the other hand, we hold on to the perception of an endlessly fascinating self, bloated now by at least a century of self-love and self-absorption. We live like fugitives, always trying to keep one step ahead of the inevitable annihilation—one more meal, one more dollar or fortune to win, one more workout or medical screening. And we die... Well, we cannot die at all, because the death of the self is unthinkable.
The traditional solution to this existential dilemma has been to simply assert the existence of a conscious agency other than ourselves, in the form of a deity, an assertion that has often been backed up by coercion. For about two thousand years, large numbers of people—today a clear majority of the world's population2—have either insisted that this deity is a single all-powerful individual, or they have at least pretended to go along with the idea.
Perhaps to make this remote and solitary god more palatable, the "world religions" also assert that he is all-good and all-loving, although this bit of PR had the effect of making him seem preposterous, since a good and loving god would not unleash earthquakes or kill babies. Belief in such a deity takes considerable effort, as many Europeans discovered after the eighteenth-century earthquake that destroyed Lisbon. But it is an effort that most people are willing to make since the alternative is so ghastly: How can anyone live knowing that they will end up as a pile of refuse? Or, as atheists are often asked, how can we die knowing death is followed only by nothingness?
The rise of monotheism has been almost universally hailed by modern scholars as a great moral and intellectual step forward. In myth, the transition to monotheism sometimes occurred as a usurpation of divine power by a particular polytheistic deity within a larger pantheon: Yahweh, for example, had to drive out the earlier Canaanite gods like Asherah and Baal. Politically, the transition could occur suddenly by kingly decree, as in the cases of the pharaoh Akhenaten, the Hebrew king Saul, and the emperor Constantine. The single God's exclusive claim to represent perfect goodness (or, in the case of Yahweh, fierce tribal loyalty) proved, in turn, crucial in legitimating the power of the king, who could claim to rule by divine right. The system is ethically tidy: All morally vexing questions can be answered with the claim that the one deity is the perfection of goodness, even if his motives are inscrutable to us.
But the transition to monotheism can also be seen as a long process of deicide, a relentless culling of the ancient gods and spirits until no one was left except an abstraction so distant that it required "belief." The "primitive"—and perhaps original—human picture was of a natural world crowded with living spirits: animals that spoke and understood human languages, mountains and rivers that encapsulated autonomous beings and required human respect and attention. The nineteenth-century anthropologist Edward Tylor termed this view of an inspirited world "animism," and to this day, indigenous belief systems that seem particularly disorganized and incoherent compared to the great "world religions" like Islam and Christianity are also labeled—or perhaps we should say libeled—as animism.
Historically, animism was followed by polytheism. How the multitudinous spirits of animism congealed into distinct deities is not known, but the earliest polytheistic religion is thought to be Hinduism, arising in about 2500 BCE and still bearing traces of animism in the form of animal deities like Ganesh and Hanuman, as well as in rural shrines centered on rocks. The religions of the ancient Mediterranean world, the Middle East, and the southern part of the Western Hemisphere were all polytheistic, made possible by stratified societies capable of erecting temples and supporting a nonproductive priestly caste.
Not everyone went along cheerfully with the imposition of monotheism, which required the abandonment of so many familiar deities, animal gods, and spirits, along with their attendant festivities. The Egyptians reverted to polytheism as soon as Akhenaten died, while the Hebrew kings fought ruthlessly to suppress their subjects' constant backsliding to the old Canaanite religion. Within the monotheistic religions too, there was a steady drift back toward polytheism. The Christian God divided himself into the Trinity; saints proliferated within Christianity and Islam; and the remnants of animism flourish alongside Buddhism (which, strictly speaking, shouldn't be considered a form of theism at all).
In the last five hundred years "reform" movements rushed in to curb these deviations. In Europe the Reformation cracked down on the veneration of saints, downplayed the Trinity, and stripped churches of decoration, incense, and other special effects. Within Islam, Wahhabism suppressed Sufism, along with music and artistic depictions of living creatures. The face of religion became blank and featureless, as if to discourage the mere imagining of nonhuman agencies in the world.
It was the austere, reformed version of monotheism that set the stage for the rise of modern reductionist science, which took as its mission the elimination of agency from the natural world. Science did not set out to destroy the monotheistic deity; in fact, as Jessica Riskin explains, it initially gave him a lot more work to do. If nature is devoid of agency, then everything depends on a "Prime Mover" to breathe life into the world.3 But science pushed him into a corner and ultimately rendered him irrelevant. When an iconic 1966 Time magazine cover echoed Nietzsche by asking, "Is God Dead?" the word was out: We humans are alone in a dead universe, the last conscious beings left. This was the intellectual backdrop for the deification of the "self."
It is too late to revive the deities and spirits that enlivened the world of our ancestors, and efforts to do so are invariably fatuous. But we can begin to loosen the skeletal grip of the old, necrophiliac science on our minds. In fact, for the sake of scientific rationality, we have to. As Jackson Lears has written recently, the reductionist science that condemns the natural world to death "is not 'science per se' but a singular, historically contingent version of it—a version that depends on the notion that nature is a passive mechanism, the operations of which are observable, predictable, and subject to the law-like rules that govern inert matter."4
Only grudgingly has science conceded agency to life at the cellular level, where researchers now admit that "decisions" are made about where to go and what other cells to kill or make alliances with. This gradual change of mind about agency at the microscopic level is analogous to the increasing scientific acceptance of emotion, reasoning, and even consciousness in nonhuman animals—which was belatedly acknowledged at an international conference of neuroscientists in 2012.5 As for myself, I am not entirely satisfied with the notion of cellular decision making and would like to know more about how cells arrive at their decisions and how humans could perhaps intervene. But I no longer expect to find out that these decisions are "determined"—in the old Newtonian sense of, say, a rock falling in response to gravity—or perhaps by any forces or factors outside of the cell.
The question I started with has to do with human health and the possibility of our controlling it. If I had known that this is just part of a larger question about whether the natural world is dead or in some sense alive, I might have started in many other places—for example, with fruit flies, viruses, or electrons that, according to the scientists who study them, appear to possess "free will" or the power to make "decisions." Wherever we look, if we look closely enough, we find nature defying the notion of a dead, inert universe. Science has tended to dismiss the innate activities of matter as Brownian motion or "stochastic noise"—the fuzziness that inevitably arises when we try to measure or observe something, which is in human terms little more than a nuisance. But some of these activities are far more consequential, and do not even require matter to incubate them. In a perfect void, pairs of particles and antiparticles can appear out of nowhere without violating any laws of physics. As Stephen Hawking puts it, "We are the product of quantum fluctuations in the very early universe. God really does play dice."6 Most of these spontaneously generated particle pairs or "quantum fluctuations" are transient and flicker quickly out of existence. But every few billion years or so, a few occur simultaneously and glom together to form a building block of matter, perhaps leading, in a few billion more years, to a new universe.
Maybe then, our animist ancestors were on to something that we have lost sight of in the last few hundred years of rigid monotheism, science, and Enlightenment. And that is the insight that the natural world is not dead, but swarming with activity, sometimes perhaps even agency and intentionality. Even the place where you might expect to find quiet and solidity, the very heart of matter—the interior of a proton or a neutron—turns out to be animated with the ghostly flickerings of quantum fluctuations.7 I would not say that the universe is "alive," since that might invite misleading biological analogies. But it is restless, quivering, and juddering, from its vast vacant patches to its tiniest crevices.
I have done my feeble best here to refute the idea of dead matter. But the other part of the way out of our dilemma is to confront the monstrous self that occludes our vision, separates us from other beings, and makes death such an intolerable prospect. Susan Sontag, who spent her last couple of years "battling" her cancer, as the common military metaphor goes, once wrote in her journal, "Death is unbearable unless you can get beyond the 'I.'"8 In his book on her death, her son, David Rieff, commented, "But she who could do so many things in her life could never do that,"9 and she devoted her last years and months to an escalating series of medical tortures, each promising to add some extra months to her life.
Just a few years ago, I despaired of any critical discussion of the self as an obstacle to a peaceful death without getting mired in the slippery realm of psychoanalysis or the even more intimidating discourse of postmodern philosophy. But a surprising new line of scientific inquiry has opened up in an area long proscribed, and in fact criminalized—the study of psychedelic drugs. Reports of their use in treating depression, in particular the anxiety and depression of the terminally ill, started surfacing in the media about a decade ago. The intriguing point for our purposes here is that these drugs seem to act by suppressing or temporarily abolishing the sense of "self."
The new research has been masterfully summarized in a 2015 article by science writer Michael Pollan.10 In a typical trial, the patient—usually someone suffering from cancer—receives a dose of psilocybin, the active ingredient in "magic mushrooms," lies on a couch in a soothingly appointed room, and "trips" for several hours under the watchful eye of a physician. When the drug wears off, the patient is asked to prepare a detailed chronicle of his or her experience and submit to frequent follow-up interviews. Pollan quotes one of the researchers, a New York University psychiatrist, on the preliminary results:
People who had been palpably scared of death—they lost their fear. The fact that a drug given once can have such an effect for so long [up to six months] is an unprecedented finding. We have never had anything like it in the psychiatric field.11
When the subjective accounts of patients are supplemented with scans to localize brain activity, it turns out that the drug's effect is to suppress the part of the brain concerned with the sense of self, the "default-mode network." The more thoroughly this function of the brain is suppressed, the more the patient's reported experience resembles a spontaneously occurring mystical experience, in which a person goes through "ego dissolution," or the death of the self—which can be terrifying—followed by a profound sense of unity with the universe, at which point the fear of death falls away. And the more intense the psychedelic trip or mystical experience, the more strikingly anxiety and depression are abolished in the patient. A fifty-four-year-old TV news director with terminal cancer reported during his medically supervised psilocybin trip, "Oh God, it all makes sense now, so simple and beautiful." He added later, "Even the germs were beautiful, as was everything in our world and universe."12 He died, apparently contentedly, seventeen months later. This sense of an animate universe is confirmed by the subjective account of a psilocybin experience from a British psychologist who was otherwise well and not part of a laboratory study:
At a certain point, you are shifted into an animate, supernormal reality.... Beauty can radiate from everything that one sets one's eyes on, as though one had suddenly woken up more. Everything appears as if alive and in fluidic connection.13
In some ways, the ego or self is a great achievement. Certainly it is hard to imagine human history without this internal engine of conquest and discovery. The self keeps us vigilant and alert to threats; vanity helps drive some of our finest accomplishments. Especially in a highly competitive capitalist culture, how would anyone survive without a well-honed, highly responsive ego? But as Pollan observes:
The sovereign ego can become a despot. This is perhaps most evident in depression, when the self turns on itself and uncontrollable introspection gradually shades out reality.14
The same sort of thing can be said of the immune system. It saves us time and again from marauding microbes, but it can also betray us with deadly effect. The philosopher and immunologist Alfred Tauber wrote of the self as a metaphor for the immune system, but that metaphor can be turned around to say that the immune system is a metaphor for the self. Its ostensible job is the defense of the organism, but it is potentially a treacherous defender, like the Praetorian guard that turns its swords against the emperor. Just as the immune system can unleash the inflammations that ultimately kill us, the self can pick at a psychic scar—often some sense of defeat or abandonment—until a detectable illness appears, such as obsessive-compulsive disorder, depression, or crippling anxiety.
So what am I? Or, since individual personality is not the issue here, I might as well ask, what are you? First, the body: It is not a clumsy burden we drag around with us everywhere, nor is it an endlessly malleable lump of clay. Centuries of dissection and microscopy have revealed that it is composed of distinct organs, tissues, and cells, which are connected to form a sort of system—first conceived of as a machine, and more recently as a harmonious interlocking "whole." But the closer we look, the less harmonious and smooth-running the body becomes. It seethes with cellular life, sometimes even warring cells that appear to have no interest in the survival of the whole organism.
Then, the mind, the conscious mind, and here I am relying, appropriately I think, solely on subjective experience: We may imagine that the mind houses a singular self, an essence of "I-ness," distinct from all other selves and consistent over time. But attend closely to your thoughts, and you find they are thoroughly colonized by the thoughts of others, through language, culture, and mutual expectations.
The answer to the question of what I am, or you are, requires some historical and geographical setting.
Nor is there at the core of mind some immutable kernel. The process of thinking involves conflict and alliances between different patterns of neuronal activity. Some patterns synchronize with and reinforce each other. Others tend to cancel each other, and not all of them contribute to our survival. Depression, anorexia, and compulsive risk taking, for example, represent patterns of synaptic firing that carve deep channels in the mind (and brain), patterns not easily controlled by conscious effort and sometimes lethal for the organism as a whole, both body and mind. So of course we die, even without help from natural disasters or plagues: We are gnawing away at ourselves all the time, whether with our overactive immune cells or suicidal patterns of thought.
I began this book at a point where death was no longer an entirely theoretical prospect. I had reached a chronological status that could not be euphemized as "middle-aged," and the resulting age-related limitations were becoming harder to deny. Three years later, I continue to elude unnecessary medical attention and still doggedly push myself in the gym, where, if I am no longer a star, I am at least a fixture. In addition, I maintain a daily regimen of stretching, some of which might qualify as yoga. Other than that, I pretty much eat what I want and indulge my vices, from butter to wine. Life is too short to forgo these pleasures, and would be far too long without them.
Two years ago, I sat in a shady backyard around a table of friends, all over sixty, when the conversation turned to the age-appropriate subject of death. Most of those present averred that they were not afraid of death, only of any suffering that might be involved in dying.
I did my best to assure them that this could be minimized or eliminated by insisting on a nonmedical death, without the torment of heroic interventions to prolong life by a few hours or days. Furthermore, we now potentially have the means to make the end of life more comfortable, if not actually pleasant—hospices, painkillers, psychedelics, and even, in some places, laws permitting assisted suicide. At least for those who are able to access these, there is little personal suffering to fear. Regret, certainly, and one of my most acute regrets is that I will not be around to monitor scientific progress in the areas that interest me, which is pretty much everything. Nor am I likely to witness what I suspect is the coming deep paradigm shift from a science based on the assumption of a dead universe to one that acknowledges and seeks to understand a natural world shot through with nonhuman agency.
It is one thing to die into a dead world and, metaphorically speaking, leave one's bones to bleach on a desert lit only by a dying star. It is another thing to die into the actual world, which seethes with life, with agency other than our own, and, at the very least, with endless possibility. For those of us, which is probably most of us, who—with or without drugs or religion—have caught glimpses of this animate universe, death is not a terrifying leap into the abyss, but more like an embrace of ongoing life. On his deathbed in 1956, Bertolt Brecht wrote one last poem:
When in my white room at the Charité
I woke towards morning
And heard the blackbird, I understood
Better. Already for some time
I had lost all fear of death. For nothing
Can be wrong with me if I myself
Am nothing. Now
I managed to enjoy
The song of every blackbird after me too.15
He was dying, but that was all right. The blackbirds would keep on singing.
=====================
NOTES
CHAPTER ELEVEN: THE INVENTION OF THE SELF
1. Henry Wadsworth Longfellow, "A Psalm of Life," Poetry Foundation, www.poetryfoundation.org/poems-and-poets/poems/detail/44644.
2. Gary Petty, "What Does the Bible Say About the 'Immortal Soul'?," Beyond Today, July 15, 1999, www.ucg.org/the-good-news/what-does-the-bible-say-about-the-immortal-soul.
3. Lionel Trilling, Sincerity and Authenticity (Cambridge, MA: Harvard University Press, 1973), 19.
4. Jean-Jacques Rousseau, The Confessions and Correspondence, Including the Letters to Malesherbes, trans. Christopher Kelly (Hanover, NH: University Press of New England, 1995), ebook, location 693.
5. John O. Lyons, The Invention of the Self: The Hinge of Consciousness in the Eighteenth Century (Carbondale: Southern Illinois University Press, 1978).
6. "Martin Guerre," Wikipedia, https://en.wikipedia.org/wiki/Martin_Guerre.
7. Garth Amundson, "Psychotherapy, Religion, and the Invention of the Self," Therapy Views: Musings on the Work and Play of Psychotherapy, November 1, 2015, https://therapyviews.com/2015/11/01/do-psychiatric-drugs-offer-a-meaningful-resolution-of-human-suffering/.
8. Marino Perez-Alvarez, "Hyperreflexivity as a Condition of Mental Disorder: A Clinical and Historical Perspective," Psicothema 20, no. 2 (2008): 181-87.
9. "Worshiping Yourself," The Twisted Rope, March 6, 2014, https://thetwistedrope.wordpress.com/2014/03/06/worshiping-yourself/.
10. Barbara Ehrenreich, Dancing in the Streets: A History of Collective Joy (New York: Metropolitan Books, 2006).
11. Herbert Fingarette, Death: Philosophical Soundings (Chicago: Open Court, 1999), 34-35.
12. Alex Lickerman, "Overcoming the Fear of Death," Psychology Today, October 8, 2009, www.psychologytoday.com/blog/happiness-in-world/200910/overcoming-the-fear-death.
13. Robert C. Solomon, Spirituality for the Skeptic: The Thoughtful Love of Life (Oxford: Oxford University Press, 2006), 120.
14. Amundson, "Psychotherapy, Religion, and the Invention of the Self."
15. Noah Shachtman, "Troops Use 'Samurai' Meditation to Soothe PTSD," Wired, October 8, 2008, www.wired.com/2008/10/samurai-soldier/.
16. "Rupert Brooke's Obituary in The Times," http://exhibits.lib.byu.edu/wwi/poets/rbobituary.html.
17. "Joe Hill," Union Songs, http://unionsong.com/u017.html.
18. Daniel Goleman, "The Experience of Touch: Research Points to a Critical Role," New York Times, February 2, 1988, www.nytimes.com/1988/02/02/science/the-experience-of-touch-research-points-to-a-critical-role.html?pagewanted=all.
19. Robinson Meyer, "Human Extinction Isn't That Unlikely," Atlantic, April 29, 2016, www.theatlantic.com/technology/archive/2016/04/a-human-extinction-isnt-that-unlikely/480444/.
20. "The Irreconcilable Acceptance of Near-Term Extinction," Nature Bats Last, April 28, 2013, https://guymcpherson.com/2013/04/the-irreconcilable-acceptance-of-near-term-extinction/.
CHAPTER TWELVE: KILLING THE SELF, REJOICING IN A LIVING WORLD
1. "Jacques Monod," Today in Science History,
https : //todayinsci.com/M/Monodjacques/MonodJacques-Quotations.htm.
1. "The Triumph of Abrahamic Monotheism?," Religion Today, November 2, 2011, http://religion-today.blogspot.com/201 1/11/ triumph-of-abrahamic-monotheism.html.
3. Jessica Riskin, The Restless Clock: A History ofthe Centuries-Long Argument over What Makes Things Tick (Chicago: University of Chicago Press, 2016), 3.
1. Jackson Lears, "Material Issue," The Baffler, no. 32 (September 2016), https://thebaffler.com/salvos/material-issue-lears.
George Dvorsky; "Prominent Scientists Sign Declaration That Animals Have Conscious Awareness, Just Like Us," Gizmodo, August 23, 2012, http://io9.gizmodo.com/5937356/prominent-scientists-sign-declaration-that-animals-have-conscious-awareness-just-like-us.
Stephen Hawking, "The Origin of the Universe," Hawking.org.uk, wwhawking.org.uk/the-origin-of-the-universe.html.
Rolf Ent, Thomas Ullrich, and Raju Venugopalan, "The Glue That Binds Us," ScientficAmeri can, May 2015, www.bnLgov/physics/ NTG/linkablejiles/pdf/SciAm-Glue-Final.pdf.
5. David Rieff, Swimming in a Sea ofDeath: A Son's Memoir (New York, Simon & Schuster, 2008), 167.
6. Ibid.
7. Michael Pollan, "The Trip Treatment," New Yorker, February 9, 2015, www.newyorker.com/magazine/2015/02/09/trip-treatment.
Ibid.
8. Ibid.
9. Simon G. Powell, Magic Mushroom Explorer: Psilocybin and the AwakeningEarth (South Paris, ME: Park Street Press, 2015), 30.
Pollan, "The Trip Treatment."
"Bertolt Brecht: When in My White Room at the Charité," reproduced at Tom Clark Beyond the Pale, January 12, 2012, http://tomclarkblog.blogspot.com/2012/01/bertolt-brecht-when-in-my-white-room-at.html.
ABOUT THE AUTHOR
Barbara Ehrenreich is the author of more than a dozen books, including the New York Times bestseller Nickel and Dimed. She has a PhD in cellular immunology from Rockefeller University and writes frequently about health care and medical science, among many other subjects. She lives in Virginia.