In This Is All

Memory, meaning, and the self.

By Jim Holt

Reminiscence of Jinling, by Wang Gai, 1686. The Metropolitan Museum of Art, Gift of Matthew J. Edlund, 2016.

“It is better to be a human being dissatisfied than a pig satisfied; better to be Socrates dissatisfied than a fool satisfied.” So wrote John Stuart Mill in his 1861 treatise Utilitarianism. And most of us are inclined to agree with him. But why? What great thing do humans have that pigs lack? Whatever it is, it had better be something good if it is to compensate us for missing out on the pig’s felicity.

Pigs, so far as we know, are conscious. They have inner experiences, just like us. But are they self-aware? Are they conscious of their own existence? Do they have an inner “I”? Perhaps. There is evidence that some animals have such self-awareness: chimpanzees, dolphins, elephants, even magpies. All these species have passed what is known as the “mirror test” for self-awareness. If you put them in front of a mirror after applying a red spot to their forehead, they react by trying to rub away the spot. They seem to recognize themselves in the reflection. From that, scientists have inferred that these animals share something of our capacity for self-awareness.

Human children become able to pass the mirror test sometime between eighteen and twenty-four months of age. So, like the chimps, etc., babies seem to have a rudimentary sense of self by this age. And like many animals, baby humans can remember things. There is evidence that memory begins even in the womb, since newborns recognize sounds they heard in utero, like their mother’s voice.

The kind of memory that young children share with animals is called episodic memory: the memory of having undergone experiences. This enables children (and animals) to learn from experience: to use knowledge of what has happened to them to guide their behavior in the future.

I have a terrible memory; I never forget a thing.

—Edith Konecky, 1976

But there is another kind of memory that develops considerably later in human children, and never (as far as we know) in nonhuman animals. This is called autobiographical memory. What is the difference between episodic and autobiographical memory? In autobiographical memory, you appear in the frame of the memory. Not only do you remember how you felt on the first day of school, you see yourself going to school and having those feelings. It’s not just a matter of what happened, as with episodic memory; it’s a matter of what happened to me.

When you summon up an autobiographical memory, you engage in a kind of mental time travel. You identify your current “remembering self” with your past “experiencing self”—like the self that was feeling apprehensive on the first day of school. These two selves are connected, not just through a continuously existing physical body but also through a somewhat discontinuous (because interrupted by dreamless sleep) stream of consciousness. The faculty of autobiographical memory allows us to understand ourselves as beings whose existence extends over time.

“Autobiographical memory is one of the most complex, most studied, and most characteristically human of cognitive capacities,” says the philosopher Jenann Ismael. That capacity emerges in each of us later in life than we might have guessed. Not until the end of our preschool years do we knit up the past and present into something resembling a continuously existing self. And even at that point in our development, we do not get the chronology right. Eight-year-olds, asked to judge which of two events from more than a few months back came earlier, perform no better than chance. Only in adolescence do we come to string together our autobiographical memories into a coherent life chronicle.

Splendor of the Procession of General Grant from America, by Toyohara Kunichika, 1879. The Metropolitan Museum of Art, Gift of Lincoln Kirstein, 1962. A print commemorating former U.S. president Ulysses S. Grant’s 1879 visit to Japan.

Now we seem to be getting somewhere in this Socrates/pig business. A pig is—like us—conscious. And it may—also like us—have self-awareness. The pig may be said to possess a synchronic self, a self in the here and now. But a human (by adolescence) has the potential to develop something more: a full-blown diachronic self, a self that extends over time. And our distinctive human capacity for autobiographical memory is crucial to existing in this fuller, transtemporal sense.

That, one might say, is why it is better to be a human than a pig. And it is also why humans spend so much of their lives dissatisfied. Putting together a diachronic self is work. It is a task imposed on us by our capacity for autobiographical memory. This distinctively human form of memory supplies the raw materials for fashioning a self. But it is up to each of us to do the shaping and sculpting. “The final end of every rational being is the building of the self,” declares Roger Scruton—a job that his fellow contemporary philosophers refer to as self-constitution.

But who says we have to do this job? Why can’t we shirk it, living happy-go-luckily, like a pig, in the here and now?

Well, some people do claim to shirk it—even, as we shall see, some philosophers. But most people seem to take the task of self-constitution seriously. They peg away at it even at great cost to their happiness. How do we know this? Listen to what people say when they talk about what it means to live well. People say they want to be happy. They want their needs filled; they want to be free from suffering; they want their life to be enjoyable from moment to moment, to have “positive hedonic tone.” In this they are not dissimilar to pigs.

People are trapped in history, and history is trapped in them.

—James Baldwin, 1953

But people also say they want their lives to be meaningful. And judging from the way they talk about meaning, this seems to be distinct from happiness. Indeed, meaning and happiness can be in conflict—so the psychological evidence tells us. In surveys conducted by the social psychologist Roy F. Baumeister, people rate the activities that fill their lives against these two goals. And it turns out that activities that increase meaning can reduce happiness—and vice versa. The most striking example is the activity of raising children, which reliably diminishes measured happiness, both from moment to moment and on the whole. Then why do people do it? This has been called the “parenthood paradox.” And its resolution is simple: people have children because doing so gives meaning to their lives.

 

So happiness and meaning are our two masters. They are distinct wishes, since we associate them with different activities. And neither dominates the other: we sometimes sacrifice happiness for meaning, and we sometimes sacrifice meaning for happiness. We might put this conclusion into an equation:

Well-being = happiness + meaning

Exactly what do people have in mind when they talk about meaning? A familiar thought is that meaningfulness has to do with a feeling of being connected to “something larger.” But Baumeister’s research suggests that it has equally to do with “expressing oneself and thinking integratively about past and future.” Just thinking about the past and future tends to raise people’s sense of meaningfulness—while lowering their happiness.

Which brings us back to memory and the self. Let’s say that autobiographical memory furnishes the raw materials for self-constitution. How do we make those raw materials into a meaningful self? The fashionable answer these days, among both psychologists and philosophers, is narrative. To be a self, the thinking goes, is to live a life that is structured like a story—a story of which you are both author and protagonist.

The idea that “living well” means “living narratively” is prefigured in Nietzsche, who wrote that “we want to be the poets of our lives.” It can be traced back to the earliest times in which humans started worrying not just about eating and reproducing but about living a life that matters—a life like that of Achilles, which, though perhaps short and full of destructive emotions, was songworthy. Contemporary champions of the narrative self include the neurologist Oliver Sacks (“Each of us constructs and lives a ‘narrative’ ”), the psychologist Jerome Bruner (“In the end we become the autobiographical narratives by which we ‘tell about’ our lives”), and the philosophers Daniel Dennett (for whom the self is the “center of narrative gravity”) and Alasdair MacIntyre (“The unity of a human life is the unity of a narrative quest”).

And when does this storytelling start? Around the same time that our capacity for autobiographical memory does, in the preschool years. That is when our parents and other caregivers start encouraging us to narrate and evaluate the events in our lives. Developmental psychologists have studied how the style of maternal reminiscences affects a child’s autobiographical-memory skills. The more elaborative the mother’s style, the more detailed and coherent her child’s personal narratives tend to be. Katherine Nelson, one of the leaders in this research, has described how such early autobiographical narratives foster in each of us “a new subjective level of conscious awareness, with a sense of a specific past and awareness of a possible future, as well as with new insight into the consciousness of other people.”

Orpheus and Eurydice (detail), by Auguste Rodin. The Metropolitan Museum of Art, Gift of Thomas F. Ryan, 1910.

Some philosophers—among them MacIntyre, Paul Ricoeur, and Charles Taylor—have insisted that if a narrative is to endow a human life with meaning, it must take the form of a quest for the good. But what makes such a quest an interesting story? There had better be some trouble in it, because that’s what drives a drama. If adversity doesn’t figure prominently in your autobiographical memories, your life narrative will be a bit insipid, and your sense of meaningfulness accordingly impaired.

The claim that big troubles are essential ingredients of a good narrative, and hence of a good life, is called by psychologists the “adversity hypothesis.” If true, this hypothesis “has profound implications for how we should live our lives,” observes the psychologist Jonathan Haidt: “It means that we should take more chances and suffer more defeats.” It also means, Haidt adds, that we should expose our children to the same.

But adversity can be taken too far. We don’t want our lives to assume the form of tragedy (easy as it is to weave our autobiographical memories into such a grim narrative, especially when lying awake at three am). Meaning, after all, is not everything; happiness counts, too.

Perhaps that is why in the United States, where the pursuit of happiness is written into the national DNA, the “redemptive self” is such a popular narrative. Here the arc of the story is from trouble to triumph: rags to riches, slavery to freedom, sickness to health, sin to salvation. “Americans deeply value stories of personal redemption,” the psychologist Dan P. McAdams has written:

Sometimes these stories suggest religious meanings, but more often they adopt images and ideas from secular life. In popular fiction, Hollywood movies, television shows from reality TV to The Oprah Winfrey Show, and in many other venues, American protagonists continue to distinguish themselves as rugged and resilient individualists who delight in their nonconformity and who continue to grow and develop, especially in response to failure and setbacks. Indeed, these kinds of redemptive narratives have always held a privileged status in American society, going back to the spiritual autobiographies written by the New England Puritans in the seventeenth century.

This American attitude may be contrasted with, for example, that of England, where a narrative of existential failure is perfectly respectable (the poet Philip Larkin being a case in point).

 

And what of your own case? Let’s say you’ve done the hard narrative work, the work of self-constitution. You’ve put in the “conscious effort” it takes to “extract an intelligible trajectory out of your past,” one that will “make your life as a whole a meaningful unit” (the quoted phrases are from Jenann Ismael). Perhaps your narrative takes the popular American form of redemption, in which the blunders and setbacks and sufferings you’ve undergone are invested with positive meaning in the drama of resilient triumph. You’ve traveled the hard road to the good life. And on the whole, you’re content with the identity you’ve reflectively forged. Are you now a bit like a happy Socrates? Or could it be that you’re more of a satisfied fool?

That would depend on just how good your self-story is. So, at least, we are told by the narrativists. “The only criteria for success or failure in a human life as a whole are the criteria of success or failure in a narrative or to-be-narrated quest,” declares Alasdair MacIntyre. But what makes for narrative “success”? What distinguishes a meaningfully fashioned self from a contemptible or trite one?

There is no greater sorrow than to recall a happy time in the midst of wretchedness.

—Dante Alighieri, 1321

For one thing, the narrative put together out of your autobiographical memories had better not be crazy. It should meet what has been called the “reality constraint.” If you are too selective in the memories you admit into it—or if you are a fabulist—the self you fashion will not fit well into the social world of other selves. Nor will it be reflected in your obituary.

Beyond that minimal constraint, however, lies a radical division of opinion. On one side are those who believe the most important thing about the self you fashion is that it be original and new—not something copied from a preexisting template, not an imitation of someone else. Let’s call these people Nietzscheans. On the other are those who insist that the “good” you are questing after in your self-fashioning story must be truly good—not just something you happen to love for subjective reasons. Let’s call these people Platonists.

The staunchest of Nietzscheans is Nietzsche himself—whose imperative, as voiced by Zarathustra, is “to re-create all ‘it was’ into ‘thus I willed it.’ ” To will one’s own individuality, Nietzscheans insist, means becoming something new. And to succeed in this existential task, it is not enough to look back in memory over the contingencies of your life. You must also invent fresh metaphors to unify these contingencies into a narrative whole, one worthy of affirmation. You must be what the late Harold Bloom called a “strong poet” of your own life. Otherwise, your self-fashioning enterprise will issue not in something new in the world but in a copy or replica—a possibility that might (to continue with the Bloomian jargon) leave you touched by the “anxiety of influence.” (Nietzsche himself may have betrayed an anxiety of influence in his repeated railings against Socrates, the philosophical paradigm of a radically new type of human.)

The Nietzschean recipe for fashioning a self from the contingencies of memory is (one might complain) unhelpfully vague. It is also a bit precious—witness the example of Michel Foucault, a latter-day Nietzschean who liked to talk about creating one’s self “as a work of art.” Or it might sound a little banal—“Be yourself!”

And is originality enough? What if what is original are your crimes, your cruelty? Hitler, in his final moments in the bunker, might well have looked back affirmatively on his youthful decision to abandon the career of a mediocre painter in favor of ushering in the Thousand-Year Reich. His self-narrative was original enough—arguably the handiwork of a strong poet. Did it thereby give rise to a meaningfully fashioned self? Some philosophers have been willing to grasp this nettle. “The fact that his life was so dreadfully immoral might really have had no deleterious effect whatever on the value to him of living that life,” writes the philosopher Harry G. Frankfurt. “It is possible, I am sorry to reveal, that immoral lives may be good to live.” Of course, one could argue that cruelty always makes for a banal and derivative self-narrative, with Mein Kampf as Exhibit A (although Humbert Humbert would make for a trickier test case). Or one might look for a more stringent criterion for a life-unifying narrative than the Nietzschean one of sheer originality.

So consider the Platonist alternative. Here the emphasis is not on newness but on goodness. To create a meaningful self, the story you tell about your life must be organized around objective value. For Plato, it must be an ascent of desire toward an eternal form of the Good, one we dimly recall from our preexistence. (This is the story, for example, that Wordsworth tells about himself in “Ode: Intimations of Immortality” and The Prelude.) For a contemporary philosopher like Susan Wolf, the story of a meaningful life must be one of passionate engagement in projects of objective worth. “It is not enough…that one is occupied with doing things that one loves,” Wolf writes. “The things one loves doing must be good in some independent way.” If you are deceived about the worth of what you care about—if you are devoted to a false god, a schlock artistic ideal, a wicked ideology, or a scoundrel of a romantic partner—then your life might feel meaningful, but it will not be so.

The Platonists want to put an external check on the narrative by which you shape your life into a self. Your story must not only be internally cohesive; it must also link up with a reality of value that lies beyond you. But how far beyond? Should you frame your self-defining good according to the values of your tribe? The values of the creed you were born into? The values on which the most enlightened portion of humanity appears to be converging? Or could all of these values be illusory, spurious? Plato had no doubt about the existence of a transcendent and timeless Good, one that might objectively ground the meaningfulness of our personal narratives. But even contemporary philosophers who invoke objective goodness admit they can give no satisfying account of what it is or how we may come to know it. As Wolf concedes, this remains “an unsolved problem in philosophy.”

So when it comes to self-creation, both the Platonists and the Nietzscheans leave us in a conceptual lurch. Neither furnishes a workable criterion of success at this task—even though both assure us the task is crucial to living well and fully.

 

But what if they are wrong in this shared assumption? At the outset, we supposed that what makes Socrates different from a pig is that Socrates has autobiographical memories, and that these memories furnish raw materials for constructing a self that endures over time, from birth to death. This construction is hard work, we are told, and worrying about it may detract from one’s piggish happiness in the moment. But it is necessary.

Yet there are dissentient voices that insist we can live perfectly good and meaningful lives without it. They take their inspiration from figures like Petrarch, Montaigne, and Proust, all of whom have vividly depicted a more fragmentary kind of existence over time, in which a single human life comprises something like a succession of selves. The most emphatic of these voices today is the English philosopher Galen Strawson, who proudly declares himself to be an “episodic.”

Asadullah (9), by Ambreen Butt, 2019. Watercolor and collage of text on tea-stained paper, 29 x 21 inches. © Ambreen Butt, courtesy the artist and Gallery Wendi Norris, San Francisco.

What makes “episodics” different? It’s not that they are deficient in their faculty of memory. Rather, it’s that they don’t use their autobiographical memory to “time travel.” They remember past experiences, but not as their experiences. I may be able to summon a memory of my first day of school and of the feelings it occasioned. But if I am an episodic, I don’t identify my present “remembering self” with the past “experiencing self”; I don’t see that little boy in the frame as me (even though I concede it’s the same human animal). I feel no need to “unify” the two selves over time by constructing a narrative.

Episodic types have no special interest in the past, except insofar as it has shaped their present character. Accordingly, they do not fear the loss of memory as long as their moral character remains intact. As the proto-episodic Earl of Shaftesbury wrote, “What matter for memory? What have I to do with that part? If, whilst I am, I am as I should be, what do I care more?…the now; the now. Mind this: in this is all.”

Is the episodic life worth living? Strawson assures us, from the perspective of his own existence as a series of disjointed selves, that it is: “Some may still think that the episodic life must be deprived in some way. But truly happy-go-lucky, see-what-comes-along lives are among the best there are, vivid, blessed, profound.” And it may be that Mr. Episodic is even more in tune with the muse of memory than his friend Mr. Diachronic. For the former has no reason to massage, revise, augment, extrude, fabulate, and otherwise distort his autobiographical recollections for the sake of forcing them into an interesting story, the way Mr. Diachronic does. He may have a more ephemeral self, but he has superior self-understanding.

There are many such ironies, and many options open to us for choosing a mode of life somewhere between Socrates and Brother Pig. And regardless of which you choose, you’ll owe a debt to Mnemosyne. So let the narrativist and the episodic, the Nietzschean and the Platonist, the philosopher and the poet, put aside their differences and raise their voices in unison to say, Speak, memory.
