Immanuel Kant wrote, in his Critique of Practical Reason: “Two things fill the mind with ever new and increasing admiration and awe, the more often and more steadily we reflect upon them: the starry heavens above me and the moral law within me.”
Nobel Prize-winning biologist Francis Crick put forward what he called the Astonishing Hypothesis: “You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules.”
As Pinker puts it: “The supposedly immaterial soul, we now know, can be bisected with a knife, altered by chemicals, started or stopped by electricity, and extinguished by a sharp blow or by insufficient oxygen.”
What is new is that we can now observe the brain at work. You can put someone in a brain scanner, for instance, and tell from the parts of the brain that are active whether they are thinking about their favorite song or the layout of their apartment or a mathematical problem. We may not be too far from the point where we can look at the brain of a sleeping person and know what they are dreaming about.
One can eat brain—I’ve had it with cream sauce (not human brain, mind you—you shouldn’t eat human brain; you can get this terrible disease, kuru, which is much like mad cow disease, and it’s one reason not to be a cannibal).
Emerson M. Pugh wrote, “If the human brain were so simple that we could understand it, we would be so simple that we couldn’t.”
Emo Phillips says, “I used to think that the brain was the most wonderful organ in my body. Then I realized who was telling me this.”
It is certain that there may be extraordinary mental activity with an extremely small absolute mass of nervous matter: thus the wonderfully diversified instincts, mental powers, and affections of ants are notorious, yet their cerebral ganglia are not so large as the quarter of a small pin’s head. Under this point of view, the brain of an ant is one of the most marvellous atoms of matter in the world, perhaps more so than the brain of a man.
If you show someone a movie and later ask, “Did you see the children getting on the school bus?” the person is more likely to remember, later on, a school bus in the scene, even if there wasn’t one.27 Indeed, repeated questioning can lead to the creation of false memories.
Plato, for instance, talked of a trinity—a “spirit” that lives in the chest and is involved in righteous anger, the “appetite” located in the stomach and related to desires, and “reason,” in the head (at last!), which oversees the other two.
But everything changes the brain. Reading this sentence just changed your brain, because you’re thinking about it and thinking takes place in the brain. Indeed, reading this sentence creates long-lasting changes in your brain, because you’re going to remember a bit of it tomorrow (I promise you), and this means that the structure of your brain has been modified by this experience.
When it comes to moral issues, the relevant question isn’t whether something can reason or speak; it is whether it can suffer.
It turns out that if you imagine walking around your house, it causes a spike of activity in the parahippocampal gyrus in the temporal lobe, while if you imagine playing tennis, this will activate the premotor cortex. In a fascinating study, researchers told one such patient—known as Patient 23—that they were going to ask him questions and he could signal “yes” by imagining playing tennis or “no” by imagining walking about his house.
We see the limits of consciousness in other modalities. One clever method in cognitive psychology is to have people wear headphones and listen to two separate speeches, one in the right ear, one in the left. It would be neat if people could attend to both—imagine following two podcasts simultaneously—but we can’t.
The more of the details of our daily life we can hand over to the effortless custody of automatism, the more our higher powers of mind will be set free for their own proper work.
The “spotlight effect”: we overestimate the extent to which others notice us, in both negative and positive ways.
“A whip is a great way to get someone to be here now. They can’t look away from it, and they can’t think of anything else.”
Lack of self-awareness can be the trick to survival.
And to be fair, a lot of Freud’s ideas really are strange. He insisted on the importance of penis envy (the trauma resulting from a girl’s discovery of her lack of a penis), castration anxiety (a boy’s concern that he will lose his), and the “primal scene” (a term that he coined for children witnessing their parents having sex). The effects on a boy if the mother dies and the father raises him? Homosexuality, due to “exaggerated castration anxiety.” A young girl in a therapy session who pulls the hem of her skirt over her exposed ankle? Tendencies toward exhibitionism. A young man who smooths the crease of his trousers in place before lying down in his first session? Freud writes that this man “reveals himself as an erstwhile coprophiliac of the highest refinement”—where a coprophiliac is someone with an erotic fascination with feces!
Beneath apparent rationality, Freud had discerned dark impulses and contradictory yearnings that coalesced into predictable patterns he called complexes. He had demonstrated that, in the culture and in the lives of individuals, hidden symbols abound; our customs and behaviours simultaneously hide and reveal sexual and aggressive drives incompatible with the requirements of civilized society. Freud’s theories seemed to update ancient philosophies, casting our lives as tragic dramas of a distinctively modern sort. It was as if, before Freud, we had never known ourselves.
The first process is the id, which is present at birth. This is the animal part of the self. The id wants to eat, and drink, and excrete, and get sensual pleasure. It works in accord with what Freud called the pleasure principle—it wants immediate gratification.
The world’s failure to give you what you want leads to a second system called the ego. This is where consciousness emerges—your ego is you. With the emergence of the ego, there is some understanding of reality; the ego enables you to either pragmatically satisfy your desires or to suppress them. Here we see the reality principle at work.
Later in development, the trinity is complete—the superego emerges. This is the part of the mind that has internalized a moral code, first from the parents, and then from society more generally. An enraged baby might want to hit his father in the face and, being nothing but id, will do just that. Later on, possessing an ego, the toddler might reason that this is going to have a bad outcome—father’s anger—and so holds back. But much later on, possessing a superego, a child might decide to restrain herself just because it’s wrong. At a certain point, one’s desires are inhibited not merely by fear of consequences but by some sort of moral code.
For Freud, then, the ego serves two masters. It’s stuck between raging animal desires on the one hand, the id, and a conscience on the other, the superego. Now, it’s tempting to think, as I’m framing it this way, that the id is dumb and animalistic and the superego is advanced and civilized, corresponding to the picture of a person (ego) with a devil over one shoulder (id) and an angel over the other (superego). But this isn’t quite right. A lot of the prohibitions set up in the superego are grounded in the prejudices and beliefs of society, and may not reflect a clearheaded moral understanding. It’s possible that you might believe, intellectually, that some act, perhaps something sexual, that you engage in is fine—nobody is harmed—but your superego may scream at you that it is disgusting and wrong. And this can limit your happiness and flourishing.
The first stage, which lasts throughout the first year of life, is the oral stage, where the mouth is associated with pleasure. For Freud, weaning a child incorrectly could lead to oral fixation in adulthood. In the literal sense, that could mean eating too much or chewing gum or smoking. In a more metaphorical sense, the person could be dependent or needy; oral fixation relates to problems revolving around trust and envy.
The next is the anal stage, running from roughly the first year to about age three. Now it’s the anus that is associated with pleasure. The key challenge involves toilet training. Adults who struggled through this stage of development might be compulsive, clean, and stingy, because they are (to put it metaphorically) unwilling to part with their feces. This conception has ended up in language, so we might say, “Oh, he’s so anal,” to refer to someone who is obsessively concerned with getting things exactly right.
Then there’s the phallic stage, between around age three and five, where the focus of pleasure shifts to the genitals. For Freud, there is an important drama that occurs at this point, which he called the Oedipus complex. This is based on the play Oedipus Rex, written by Sophocles in the fifth century BC, about a man who unwittingly kills his father and marries his mother, Jocasta. Freud’s idea is that an analogous event happens for every boy. In the phallic stage, he is focusing on his penis, and he seeks an external object of affection. Who is the woman he loves and who loves him back? Mom. He wants to sleep with Mom. But what about his father? Three’s a crowd, and so Freud claimed that the child comes to hate his father and wants him out of the way. Murder Dad, marry Mom. Freud also held that the child believes his thoughts are public. He worries that his father will discover his plans and retaliate—by castrating him. This is terrifying. The child gives up the murder plot and, instead, allies himself with his father. Success at this stage? According to Freud: Identification as a man, heterosexual desires later in life. Failure? Well, it could be homosexuality; or alternatively, a sort of hypermasculinity, with too much focus on power and authority.
The Electra complex is a term coined not by Freud, but by his famous follower, Carl Jung. (The term comes from another Greek myth, in which Electra competes with her mother for the affections of her father.) Jung’s story is complex, involving penis envy, a desire to create a son who would have the missing penis, a shift from the clitoris to the vagina as an expression of mature sexuality, and much else.
After all the turmoil of the phallic stage, there is a respite. This is the latency stage, where sexuality is repressed. The child identifies most with the same-sex parent, and focuses on hobbies and school and friendship. At puberty, sexual feelings reemerge, and healthy adults find pleasure in sexual relationships as well as other pursuits. We are at the genital phase. Those unfortunate enough to have had difficulties with earlier stages, such as with breastfeeding or toilet training, will, according to Freud, struggle with associated psychological problems throughout their adult lives.
Even for a reasonably healthy adult, all sorts of challenges remain. One of them is that the id is generating various desires, many of which are forbidden by the superego. It’s not merely that you can’t act upon them; it’s that you shouldn’t even be thinking about them. And so they get repressed. But some of this forbidden material makes it out, in jokes, in slips of the tongue—the Freudian slips we talked about earlier—and in dreams.
Freud has a lot to say about dreams. One of his best-known books was The Interpretation of Dreams, where he argued that they represent taboo wishes.7 These are then disguised as they make their way to consciousness. Accordingly, Freud distinguished the manifest dream—the dream you remember—from the latent dream—the one that really happened.
Sublimation: taking desires that are unacceptable and directing them to more valuable activities. For instance, somebody who has strong sexual desires might devote a lot of energy to his or her work or studies.
Rationalization: taking desires that are unacceptable and reconstruing them in a more acceptable way. A father who enjoys physically punishing his children might think of his violent acts as being for their own good.
Displacement: redirecting shameful thoughts to more appropriate targets. If a boy hates his father and wishes him dead, this defense mechanism might refocus his aggression, making him violently competitive with other boys, for instance.
Projection: taking one’s own shameful thoughts and attributing them to someone else. Imagine a woman who wants to have sex with other women, but who was raised so that her superego tells her that this desire is unacceptable. She might become unconscious of it—and come to believe that other women are sexually drawn to her.
Reaction formation: replacing shameful thoughts and fantasies with their opposites. Consider romantic comedies in which couples who are in vicious disagreement later fall into each other’s arms—their apparent dislike masked their true attraction. Or consider the cliché that men who have the most negative feelings toward gay men are themselves wrestling with homosexual desires (a claim, by the way, that there’s little evidence for8).
Modern approaches such as cognitive behavioral therapy deal directly with the everyday difficulties that patients face; Freud, by contrast, saw these problems as mere symptoms. The goal of psychoanalysis was insight—to bring to the patient’s consciousness the core issue, often a repressed trauma early in life. Through insight, Freud believed, patients can ultimately be freed from the problems that brought them to therapy in the first place.
Freud spoke of transference, a phenomenon where one projects one’s desires and feelings from one person onto another. In treatment, this can mean that the patient may start to think of the therapist in terms of a significant individual in his or her life, treating the therapist as a father or mother, or perhaps as a romantic partner. Transference can be useful, and the patient can be urged to explore this reaction, but it has its perils, and therapists are trained to be wary of this happening in the other direction—countertransference, where a therapist might begin to have inappropriate feelings toward the patient.
The antimentalism of behaviorism is often seen as a reaction to the excesses of Freud, who posited all sorts of mental constructs—the id, ego, superego, defense mechanisms, and much more—that are revealed in often mysterious ways. In response to this, the behaviorists said, “We want to do real science, and so we need to get away from this Freudian mumbo jumbo—and while we’re at it, we’ll also get rid of those unscientific things that everyone else talks about, such as desires and goals and memories and emotions.”
The process of classical conditioning can be broken into three stages:
STAGE 1: BEFORE ANYTHING HAPPENS. The animal starts with some sort of natural, or unconditioned, response to some stimulus in the world, either innate or learned in the past. In Pavlov’s case, it was the food in the mouth (the unconditioned stimulus) and the salivation in response (the unconditioned response). That’s what the animal brings to the study before anyone messes with it.
STAGE 2: CONDITIONING. Then the experimenter adds a neutral stimulus, something that doesn’t evoke any response, such as a bell. (Pavlov himself actually never used a bell, but it’s what most people associate with his studies, and it makes for a fine example.) This neutral stimulus (bell) and the unconditioned stimulus (food) are then presented together. Because of the presence of the unconditioned stimulus, the animal will provide the unconditioned response (salivation). But over time, the animal will come to also associate the neutral stimulus (bell) with the unconditioned stimulus (food) and so . . .
STAGE 3: AFTER CONDITIONING. . . . the bell will change from a neutral stimulus to a conditioned stimulus and give rise to salivation, which, in this case, will be the conditioned response.
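The pairing process described in Stage 2 can be made concrete with a simple learning rule. Here is a minimal computational sketch (my own illustration, not something from the original discussion) based on the Rescorla-Wagner model, a standard formalization of classical conditioning; the learning-rate and asymptote values below are arbitrary assumptions chosen for readability.

```python
# A minimal sketch of the Rescorla-Wagner model of classical conditioning.
# The association strength V between the bell and the food grows on every trial
# in which they are paired; the conditioned response (salivating to the bell
# alone) is assumed to track V. The parameter values are illustrative, not measured.

def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Return the bell-food association strength after each pairing trial.

    alpha: learning rate (salience of the bell)
    lam:   maximum association the food (unconditioned stimulus) can support
    """
    v = 0.0          # Stage 1: the bell is neutral, so the association starts at zero
    history = []
    for _ in range(trials):
        v += alpha * (lam - v)   # learning is driven by the gap between lam and v
        history.append(v)
    return history

if __name__ == "__main__":
    # Stage 2: pair bell and food for ten trials; Stage 3: the bell alone now evokes salivation.
    for trial, strength in enumerate(rescorla_wagner(10), start=1):
        print(f"trial {trial:2d}: association = {strength:.2f}")
```

Nothing deeper is intended here than the point in the text: repeated pairing alone is enough to turn a neutral stimulus into a conditioned one.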
Law of Effect: Responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation.
The behaviorists’ word for reward is reinforcement, and there are two types—positive reinforcement, giving the animal something it wants, and negative reinforcement, releasing it from something aversive.
Jean-Jacques Rousseau made this point in even harsher terms, saying that if a child was born in an adult body, “such a child-man would be a perfect idiot, an automaton, a statue without motion and almost without feeling; he would see and hear nothing, he would recognize no one.”3 This view—that knowledge comes through exposure to the environment—is known as empiricism.
For Locke, the mind is a blank slate: “Let us then suppose the mind to be . . . white paper; void of any characters; without any ideas; how comes it to be furnished? To this I answer in one word: experience.”
The opposing view is nativism, which proposes that much of our knowledge and capacities are part of our natural endowment.
Early philosophers such as Plato explained the existence of innate ideas as the result of souls recollecting knowledge learned in past lives; modern-day nativists think of this as the product of our evolutionary history, encoded in our genes.
Even mental systems like color vision, which we are tempted to see as pretty much hardwired and built in, won’t operate without some experience—kittens raised in darkness go blind. There are some horrific cases where children are isolated from social stimulation by cruel or deranged parents, and this lack of the normal environment does terrible damage to both body and soul.
One nice example of how innate machinery is sensitive to the environment comes from the sleep-wake cycle. We have evolved a twenty-four-hour circadian rhythm in response to a long-standing fact about the environment—the duration of the Earth’s rotation. It’s hardwired. But if you take away the environmental cues of light and dark by keeping someone in darkness, their internal clock goes out of sync by a few hours. As the neuroscientist David Eagleman puts it, “This exposes the brain’s simple solution: build a non-exact clock and then calibrate it to the sun’s cycle. With this elegant trick, there is no need to genetically code a perfectly wound clock. The world does the winding.”
Piaget’s ultimate interest was not child development. This was just a means to an end. Rather, he was interested in the development of knowledge in the human species—genetic epistemology, as he called it, where “genetic” refers to origins and “epistemology” refers to knowledge.
Piaget proposed that mental life includes complex cognitive structures, which he called schemas, and he posited two psychological mechanisms that lead to the transformation of these schemas and the creation of new ones. The first, assimilation, is the process of using already-existing schemas to deal with new situations. Take a baby who has a simple schema—it knows how to suck on its mother’s breast. Assimilation is when the baby sucks on a rattle or sucks on its own toes. But to do so successfully, the baby must modify the behavior in certain ways, and this is accommodation, a process through which existing schemas are changed or new schemas are created to fit the new information and new experience.
The newborn begins at the sensorimotor stage. For about the first two years of life, the baby is a purely sensory creature. At the start of this stage, it perceives and manipulates, but doesn’t reason. There’s no sense of time, no differentiation between itself and other people, and critically, no object permanence. This last claim has long fascinated psychologists—because it’s so audacious. When a ball rolls behind the dresser, of course you know it’s still there. What could be more obvious? But Piaget’s claim was that before about six months of age, babies don’t understand that objects exist independently of their actions or perceptions of them. Out of sight, out of mind—literally. This might be why they are so amused by the game of peekaboo. You cover up your face, and then reveal it, and then babies crack up or gasp, and it’s because when you covered up your face, they thought you were gone.
Piaget’s second stage, the preoperational stage, runs from about age two until about age seven. The baby is now a child and starts to reason. Children at this age can think, they can differentiate themselves from others, they have a rudimentary understanding of time, and they understand that objects continue to exist when out of sight. But they have certain interesting limitations. One of these is what Piaget called egocentrism. He didn’t mean this in the same way I do when I say that one of my colleagues is egocentric because he spends all his time boasting about his accomplishments and never notices when I get a new haircut. Piaget meant that children literally can’t understand that the world is seen and understood differently by others; they can’t take other people’s perspectives. One classic demonstration of this is the Three Mountains task, which can be done with three- and four-year-olds.
Another limitation, according to Piaget, is that children in this stage fail to appreciate that certain operations on the world change some properties but not others—that some properties are conserved.
There are reasons, though, to believe that the universality of language really is because language is part of our nature. This was Charles Darwin’s view: “Man has an instinctive tendency to speak, as we see in the babble of our young children; while no child has an instinctive tendency to brew, bake, or write.”
There are also specific genes implicated in cases of language disorder in children.4 All of this supports the notion that language is part of our nature, in much the same way that the inborn communication systems of other creatures such as birds and bees are part of theirs.
Our larynx has been modified over the course of evolution to express the sounds of speech. In most mammals, the larynx is high in the throat, which means that they can eat and breathe at the same time. Ever wonder how a baby can breastfeed for a long stretch without ever coming up for air? Me neither, but it’s a good question and the answer is that human babies begin, like the adults of other primate species, with the larynx high up. It lowers as we age. This leads to an increased risk of choking on food, but it allows the developing child to produce the music of language. This suggests that the benefits of vocal communication outweigh the cost of possible death by choking, which is a good argument that the power of speech evolved by natural selection, through the adaptive advantages that it gave us.
Phonology is the aspect of language that directly connects to its physical realization. We can produce many sounds, but only a small subset is used in any given language. We call these phonemes.
Sounds make words, and the aspect of language that concerns words is known as morphology.
There is little doubt that brains are, at least in part, association machines. This insight was central to the ideas of the British Empiricists, and later, to the behaviorists. So much of learning can be seen as noticing and recording associations—the relationships between different parts of experience. You hear a commercial jingle and think of the product; Pavlov’s dog salivates at the sound of the bell; Skinner’s rat darts toward the side of the maze associated with a delicious treat. Certain modern computational approaches, sometimes described as “connectionism”17 or “deep learning,”18 build on this insight, seeing the brain as a powerful statistical learning machine, adept at discovering patterns in the environment. Some would go so far as to say that, in essence, this is all the brain does.
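To see what an association machine looks like in its simplest possible form, here is a minimal sketch (my own illustration, not drawn from the text) of Hebbian-style learning, in which the link between two features is strengthened every time they occur together; the features and the learning rate are made-up examples.

```python
# A toy Hebbian associator: features that occur together get wired together.
# Each co-occurrence of two features in an experience strengthens the weight
# between them, so frequently paired features (bell and food, jingle and product)
# end up with the strongest links.

import itertools

def hebbian_weights(episodes, features, rate=0.1):
    """episodes: list of sets of features that co-occurred in a single experience."""
    weights = {pair: 0.0 for pair in itertools.combinations(sorted(features), 2)}
    for episode in episodes:
        for pair in itertools.combinations(sorted(episode), 2):
            weights[pair] += rate  # strengthen the link between co-active features
    return weights

if __name__ == "__main__":
    experience = [{"bell", "food"}] * 8 + [{"bell"}] * 2 + [{"light", "food"}]
    weights = hebbian_weights(experience, {"bell", "food", "light"})
    for pair, w in sorted(weights.items(), key=lambda kv: -kv[1]):
        print(pair, round(w, 2))
    # ("bell", "food") comes out strongest, because those two co-occurred most often.
```

This is, of course, a caricature; connectionist and deep learning models are vastly more sophisticated, but they build on the same idea of extracting statistical regularities from experience.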
Language learning starts in the womb. Newborns suck more on a pacifier to hear their mother’s voice than to hear the voice of a stranger, because it was their mother’s voice they were most accustomed to hearing before birth.
Babies start off with the capacity to distinguish the sounds of all languages. A child born in America to English-speaking parents can tell apart phonemes that are used in Hindi, a capacity that English-speaking adults lack.28 Similarly, a Japanese baby can hear the difference between “la” and “ra” (distinct phonemes in English, which is why “lamp” and “ramp” are different words)—but a Japanese adult cannot. This is one of the few ways in which babies have greater mental powers than adults.
Children learning English know some of the rules of syntax soon after their first birthday. This has been explored through the same “preferential looking” methods that we just described for word learning. In one study, thirteen-to-fifteen-month-olds were shown two videos, one of a woman kissing a set of keys while holding a ball and the other of the same woman kissing a ball while holding a set of keys.33 Then they would hear a sentence like “She’s kissing the keys!” You can see the cleverness of this study; to figure out which sentence this corresponds to, it’s not enough to know what the words mean—both videos contain the woman, keys, and kissing. You need to have some appreciation of the syntax. The infants looked longer at the matching event, where she was kissing the keys, as opposed to the one where she was kissing the ball and there were keys, but where the scene and the sentence didn’t match.
There is a certain window of time where one is best at learning language, sometimes called a “critical period” or “sensitive period.”38 One of the most common questions I get from students is how they can better learn second languages. My unhelpful answer is that they should go back in time and learn them when they were children.
Among the Tsimané, a society in the Bolivian Amazon, there is a relative indifference to children; mothers carry around their babies but don’t interact with them much; the babies don’t even get names until their first birthday. (This might be because of a high mortality rate; perhaps mothers don’t want to get too attached to their children.) Adults speak to these children about one tenth the amount that American mothers speak to their children, and they certainly don’t train them to speak through reinforcement and punishment. And yet these children, like children everywhere, do acquire language.
Here’s the linguist Noam Chomsky presenting a nativist perspective: It is a curious fact about the intellectual history of the past few centuries that physical and mental development have been approached in quite different ways. No one would take seriously the proposal that the human organism learns through experience to have arms rather than wings, or that the basic structure of particular organs results from accidental experience. Rather, it is taken for granted that the physical structure of the organism is genetically determined, though of course variation along such dimensions as size, rate of development, and so forth will depend in part on external factors. . . . Human cognitive systems, when seriously investigated, prove to be no less marvelous and intricate than the physical structures that develop in the life of the organism. Why, then, should we not study the acquisition of a cognitive structure such as language more or less as we study some complex bodily organ?
More controversially, even abstract thought seems to exist without language. The data here come from a somewhat unusual source—autobiographical accounts of people who lived without language. The most famous case is that of Helen Keller, who became deaf and blind at nineteen months of age, and learned no language until the age of six, when she was taught a tactile language by a remarkable teacher, Anne Sullivan. She learned to read and write and went on to attend college and write several books, including an autobiography, The Story of My Life.52 This autobiography is sometimes cited as showing the limitations of thought without language, as Keller described herself “at sea in a dense fog.” But then again, before learning the tactile language, she was able to develop her own makeshift communication system—for instance, shivering to ask for ice cream. She knew she was different from other people, and that they (somehow) communicated with their mouths. Her own failure to communicate frustrated her. She was able to carry out practical jokes, such as locking her mother in a pantry and laughing as she pounded to get out. Nobody could seriously see her as a mindless automaton.
The connection between language and thought means that the study of language can tell us interesting things about how we think. As one example, languages often use the same words to talk about time and about space, as in the English prepositional system—the word “in” can be used for “in a minute” and “in a box,” “on” can be used for “on Tuesday” and “on a mat,” and so on. This linguistic conflation between time and space likely reflects a deep connection between the two in our minds.
As a final example, linguists have chronicled the metaphorical patterns one finds in many languages, such as the notion that time is akin to money, as in these sentences: You’re wasting my time. This gadget will save you hours. I don’t have the time to give you. How do you spend your time these days? That flat tire cost me an hour.55 This is not universal—some societies have no money—so here again the patterns of language can be seen as expressing certain culturally specific patterns of thought. Noam Chomsky was right, then, when he called language “a mirror of the mind.”56 Anyone interested in mental life can make a lot of progress by looking closely into that mirror.
When we talk, our language is crafted for other people, configured to make them think certain thoughts, often in ways that transcend the literal meanings of words and sentences. This brings us to the study of pragmatics, which explores the part of language that allows us to say things without literally saying them. Someone tells you, “He went to Harvard but he’s a real nice person.” And, though it’s never explicit, the “but” conveys how they think of most graduates of this fine Ivy League university. Consider two other examples. A couple in the kitchen:
“We’re out of garbage bags.”
“I’ve been busy.”
And two professors at a conference:
“What do you think of his research?”
“Well, people say he’s got a great personality.”
In both cases, if you take the sentence pairs literally, the second has nothing to do with the first. But we understand them, because there is more to language than the direct meanings of sentences. “I’ve been busy” responds to the speaker’s implication that the listener should have picked up garbage bags; “Well, people say he’s got a great personality” expresses—by dint of what’s left unsaid—that the listener doesn’t think highly of the person’s research.
“Language is not merely a reproducing instrument for voicing ideas but rather it is itself the shaper of ideas, the program and guide for the individual’s mental activity.”
Perhaps the binary singular/plural distinction of English—it’s either one dog or many dogs—makes us think in terms of one versus many, while speakers of a language that offers more alternatives, such as a special plural marker for exactly two things, might have a more nuanced appreciation. In English, many verbs of motion incorporate the sort of movement, as in walk, jog, hop, amble, creep, spring, run, and so on. Other languages don’t work that way; they just have a verb and add more words to specify the manner of motion, using phrases that are the equivalent of “moving in a hopping manner.” Perhaps this means that English speakers think more about manner of motion. Perhaps, more generally, the different ways in which languages talk about color, time, causality, and hypotheticals have implications for how their speakers make sense of these fundamental notions.
Russian has different words for lighter blues (“goluboy”) and darker blues (“siniy”); English mostly uses just one—“blue.” One effect of this is that colors might be remembered more accurately in Russian, since we often remember scenes in verbal descriptions, and the descriptions in Russian are more fine-grained.62 By the same token, compare a wine expert, who remembers a glass as “Merlot with black raisin and entrancing floral notes” and me, a wine moron, who remembers it as “red, from a pretty bottle”; the expert’s description will enable them to later recall the flavor in a way that I can’t. It’s not just memory, though. If two colors are described by different words in Russian but one word in English, Russian speakers are a fraction of a second faster at telling them apart.
We saw in the last chapter that even babies can add and subtract: Put one ball behind the screen, add another, then they expect two, not one or three.66 But their abilities stop around there. Babies don’t know, for instance, that 8 plus 8 equals 16. It might be that such understanding must be rooted in a symbolic system, one that is not innate—a symbolic system like natural language.
Consider why young children have such problems fully appreciating the minds of other people—so-called theory of mind. Again, some say that the missing ingredient is proficiency in language. There turns out to be a strong correlation between language development and various theory-of-mind skills—the better children are at language, the better they are at reasoning about other minds.68 And deaf children who have not acquired a sign language show a considerable delay in understanding false beliefs.
The philosopher Daniel Dennett once wrote: “Perhaps the kind of mind you get when you add language to it is so different from the kind of mind you can have without language that calling them both minds is a mistake.”
There is a real world out there. We experience it because it contacts our sensory organs in the form of light, sound waves, and pressure on our skin. This contact makes neurons fire, and the experience that arises from this firing is known as sensation—experiences of light and sound and touch. Your brain chugs away and processes this information, combining sensation with expectations about how the world works, and out of this arises a rich experience of the world around us, and we call this process perception.
Finally, when you shut your eyes or cover your ears, some of the information remains. It is encoded in your physical brain. And we call this memory. Later, by being reminded or through conscious effort, you can recover some of these memories, and the world of the past can be resurrected.
And so: There is a world. You perceive it. You attend to it. You are conscious of it. You remember it.
Perception, attention, and memory, the behaviorist will tell you, are made-up constructs that have no place in our mature science of psychology. We should stick to stimulus and response.
George Berkeley claimed that nothing exists but minds—the minds of humans and of God: Esse est percipi, he wrote—things exist only when they are perceived.2 And there are more than a few scholars right now who carry on the skeptical tradition of doubting that there is a world out there independent of our perception of it.
What is reality? I like the answer of the novelist Philip K. Dick: It is whatever, when you stop believing in it, doesn’t go away.
Anaïs Nin: “We don’t see the world as it is, we see it as we are.”
As Jerry Fodor once put it, “One has practically no access to the acoustics of utterances in languages that one speaks.8 (You all know what Swedish and Chinese sound like; what does English sound like?)”
Stanislas Dehaene writes, “We never see the world as our retina sees it. In fact, it would be a pretty horrible sight: a highly distorted set of light and dark pixels, blown up toward the center of the retina, masked by blood vessels, with a massive hole at the location of the ‘blind spot’ where cables leave for the brain; the image would constantly blur and change as our gaze moved around.”9 No matter how much you try, you can’t experience this. Rather, you get “a three-dimensional scene, corrected for retinal defects, mended at the blind spot, stabilized for our eye and head movements, and massively reinterpreted based on our previous experience of similar visual scenes.”
There is a science of how sensations relate to the stimuli that produce them, called psychophysics.
In Double Down, a memoir of gambling addiction, Frederick and Steven Barthelme write, “At the table, losing our money, we were all smiles, as if it were nothing. In fact, it felt like nothing. Money isn’t money in a casino. At home you might drive across town to save a buck on a box of Tide, but at the table you tip a cocktail waitress five dollars for bringing a few Cokes. You do both these things on the same day.”
Few experiences seem more different from one another than those of separate senses—the smell of a rose versus the feeling of a kiss, looking at a rainbow versus listening to a toddler giggle. But all these experiences come into the brain in the same way, as neurons firing. As a team of psychologists points out, “The signals don’t have some secret handshake to say they belong to the vision or hearing club.”
What would happen if you put the optic nerve from the eye into the parietal lobe and the neurons from the ears into the occipital lobe? Would you feel light and see sound? What if you switched other neurons; could you feel the aroma of chicken vindaloo or taste a rainbow? This is not entirely fantasy; some people experience synesthesia, where letters and numbers have sensory experiences associated with them. In an extreme version, one famous neurological patient known as S. had such powerful synesthesia that he didn’t read the newspaper while eating breakfast because the flavors he perceived while reading the words would ruin the taste of his meal.
Successful perception involves the coordination of two sorts of information. The first is the input to the visual system: the neuronal firing based on light hitting the retina (sometimes called “bottom-up” information). The second is your assumptions about the world: some of these might be part of the visual system itself, wired into the visual cortex; some might come from your memories and expectations (“top-down” information).
During a bitter online debate about vaccination during the spring of 2021, someone posted this on Twitter as something that made them laugh: When the time comes, I 100% support mandatory vacations for everyone. If anyone refuses they should be FORCED. At first I simply didn’t get the joke, and I read it over and over (was there a repeated “the” somewhere?). Finally, someone had to tell me to look closely at the word after “mandatory” and then I realized that it was “vacations”—not the word I expected and therefore saw: “vaccinations.”
Some philosophers, like John Locke, argue that our memories are in large part what we are—if you woke up with my memories and I woke up with yours, I would be you and you would be me.22 I’m skeptical of this theory myself, but there’s something right here—a complete loss of all of one’s memories would be something close to an obliteration of the self.
There is autobiographical memory, the memory of personal experiences. This is what Bourne is said to have lost, and what we typically talk about when we talk about “losing our memory.” But Bourne retains what’s called semantic memory—he is aware that Paris is the capital of France, that dogs usually have tails, and so on. He retains procedural memory—memory about how to do things. He walks, reads maps, drives a car. Woken on a bench by two policemen, Bourne quickly knocks them unconscious, but he is startled that he could do this; he knows how to fight (he has procedural memory) but doesn’t know he knows (he lacks autobiographical memory).
depth of processing: the deeper you think about something, the more sense you try to make of it, the easier it is to remember. There is a classic study that illustrates this.30 You show people a series of words one by one and have them answer questions about them. (They are not told to remember the words.) One group is asked whether the words are written in capital letters. A second group is asked whether the words rhyme with “weight.” And the third is asked whether they would fit into the sentence “He met a ____ in the street.” Then they get a surprise memory test. It turns out that what they were asked to do with the words influenced their subsequent memory of them. The subjects who remembered the words best were the ones who were asked to think about whether each could appropriately appear in a sentence. More generally, the more deeply you think about something—focusing on meaning rather than superficial aspects, like capital letters—the better you will be at remembering it.
Another technique is to make your experience vivid, make it stand out, make it interesting. If you want to know the function of the hippocampus, you might try to remember: The hippocampus is involved in the memory of spatial environments. But you would do a lot better with this: The hippocampus helps you find your way around campus.
After a memory is formed, processes called consolidation embed it into the brain.32 Sleep, and particularly dream states, seems to assist in the consolidation of memory.
Retrieval hews to the compatibility principle: Memories come back more easily in the same context in which they were acquired. The compatibility principle works not just for sameness of the physical environment, but also for sameness of the psychological state. If you are mildly buzzed when you study, you will remember the information better if you are mildly buzzed while tested34—though, I’ll be quick to add, being drunk while you study causes other problems. If you are sad, you do tend to be better at remembering experiences you had when you were sad; if you are happy, happy experiences are quicker to come to mind.
The discovery that someone with anterograde amnesia can nonetheless form certain sorts of memories predates Molaison.38 In the early 1900s, a Swiss neurologist named Édouard Claparède did a cruel, albeit clever, little experiment with an amnesic patient of his. He hid a pin in his palm as he shook her hand; when he met her again, later that same day, she had no conscious memory of the pinprick, or even of meeting the doctor, but when he extended his hand, she pulled her own hand back, though she could not explain why.
Many of our memories, even those we are confident about, are false. Jean Piaget provides a nice illustration of this: One of my first memories would date, if it were true, from my second year. I can still see, most clearly, the following scene, in which I believed until I was about fifteen. I was sitting in my pram . . . when a man tried to kidnap me. I was held in by the strap fastened round me while my nurse bravely tried to stand between me and the thief. She received various scratches, and I can still vaguely see those on her face. . . . When I was about fifteen, my parents received a letter from my former nurse saying that she had been converted to the Salvation Army. She wanted to confess her past faults, and in particular to return the watch she had been given as a reward on this occasion. She had made up the whole story, faking the scratches. I therefore must have heard, as a child, this story, which my parents believed, and projected it into the past in the form of a visual memory. . . . Many real memories are doubtless of the same order.
When we discussed perception, we saw that our experience of the world reflects a balance between what impinges on our senses and what we expect to perceive. The same sort of thing happens for memory. So, just as psychologists can create visual illusions, they can also create situations crafted to generate false memories. In one study, subjects were asked to remember a string of words, presented at a rate of about one word per second.43 bed, rest, awake, tired, dream, wake, snooze, blanket, doze, slumber, snore, nap, peace, yawn, drowsy When later asked to recall the words, people will often remember the word “sleep,” even though it was never spoken. All the words are sleep-related, and so it’s such a reasonable word to include on the list. Memory, like perception, is sensitive to plausibility.
There are studies in which psychologists tell people stories about somebody who has a meal in a restaurant, and later ask their subjects what they remember about the story. People add plausible details. Someone might remember that they’ve been told the diner paid the bill even if it wasn’t in the story, just because this is what people do in restaurants.44 Just as with the similar phenomenon in perception, this sort of “filling in” is a rational way for the mind to work.
Another example of this is from a classic paper published in 1989, called “Becoming Famous Overnight.”45 The first paragraph summarizes the findings, and it’s so well written (at a level that I have very rarely seen in a scientific journal) that I’m going to quote it here: Is Sebastian Weisdorf famous? To our knowledge he is not, but we have found a way to make him famous. Subjects read a list of names, including Sebastian Weisdorf, that they were told were nonfamous. Immediately after reading that list, people could respond with certainty that Sebastian Weisdorf was not famous because they could easily recollect that his name was among those they had read. However, when there was a 24-hr delay between reading the list of nonfamous names and making fame judgments, the name Sebastian Weisdorf and other nonfamous names from the list were more likely to be mistakenly judged as famous than they would have been had they not been read earlier. The names became famous overnight. We see this sort of misattribution all the time in the real world. I once told a story to friends about a funny but somewhat stressful experience I had a few years ago, and later on, my wife gently reminded me that, while the details were correct, it happened to her, not to me.
Some of Elizabeth Loftus’s research explores how leading questions can influence memory. In one study, undergraduates watched a movie where a car hit a pedestrian. Some of them were asked, “How fast was the car traveling when it passed the yield sign?” Later on, these subjects were more likely to remember a yield sign in the scene, even though it was really a stop sign. Given that the question assumed the existence of the yield sign, subjects obligingly updated their memory.46 People are also more likely to later remember a broken headlight when they had been previously asked, “Did you see the broken headlight?” (which presumes that there was one) than when asked, “Did you see a broken headlight?” Similarly, asking, “Did you see the children get into the school bus?” makes subjects more likely to remember seeing a school bus.
But we often don’t focus enough on them; we often suffer from what’s called base-rate neglect. Here is a rather vivid example. You are tested for a terrible disease. The test never misses the disease; if you have it, you will test positive. But 5 percent of the time, the test will say that you have it when you don’t. (In other words, there are 5 percent false positives.) You test positive. How much should you worry?8 Many people will say A LOT: I mean, holy cow, there is a 95 percent chance of having the disease. But no, there isn’t.
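To see why, here is a worked calculation (my own illustration, not from the original text) using Bayes’ rule; the base rate of one in a thousand is an assumed figure, chosen only to make the arithmetic concrete.

```python
# Why a positive result on this test is less alarming than it feels.
# The 1-in-1,000 base rate is an assumption for illustration; the perfect hit rate
# and the 5% false-positive rate come from the scenario described above.

base_rate = 1 / 1000        # assumed: 1 in 1,000 people actually have the disease
hit_rate = 1.0              # the test never misses the disease
false_positive_rate = 0.05  # 5% of healthy people nonetheless test positive

# Bayes' rule: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = hit_rate * base_rate + false_positive_rate * (1 - base_rate)
p_disease_given_positive = (hit_rate * base_rate) / p_positive

print(f"Chance of having the disease, given a positive test: {p_disease_given_positive:.1%}")
```

Under that assumed base rate, the answer is about 2 percent, nowhere near 95 percent: almost all of the positive results come from the much larger pool of healthy people.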
Positive framing: “Saves two hundred lives” (Treatment A) versus “a 33 percent chance of saving all six hundred people, 66 percent probability of saving no lives” (Treatment B) Negative framing: “Four hundred people will die” (Treatment A) versus “a 33 percent chance that no people will die, 66 percent probability that all six hundred will die” (Treatment B) These are the same dilemmas just described in different ways, and so it would be rational to make the same choice for both. But Treatment A was chosen by 72 percent of the people who got the positive framing and by only 22 percent of the people who got the negative framing.
Framing effects show up all over the place. It’s better to describe condoms as 95 percent effective than as failing 5 percent of the time. If a conference is going to charge people differently depending on when they sign up, it’s nicer to describe this as a discount for early registration than as a penalty for late registration. If you wanted to get people to use some service you were offering, and it takes forty-nine minutes on average, best to say “We get it done in less than fifty minutes!”
Imagine that you had to rule on a custody case, where only one of the parents can get custody of the children.13 Here is the information about the parents: Parent A is average in every way—income, health, working hours—and has a reasonably good rapport with the child and a stable social life. Parent B has an above-average income, is very close to the child, has an extremely active social life, travels a lot for work, and has minor health problems. Who should get the children? Well, I don’t know, but I do know that the specific framing of the question shouldn’t matter—and it does. If you ask who should be awarded custody, people are more likely to say B . . . and if you ask who should be denied custody, people are also more likely to say B! The explanation for this is that when asked about awarding custody, you notice special factors in B’s favor (income, closeness to child), while there’s nothing especially good about Parent A. But when asked about denying custody, you look for negative considerations, and you also find them in Parent B (social life, travel, health problems), while there’s nothing especially bad about Parent A. Plainly, something has gone wrong here—B cannot be both the best choice to award custody and the best choice to deny custody—and framing effects are to blame.
Looking at false claims about COVID in 2020, one team of scholars found that people get more focused on truth if you simply give them some sort of reminder, direct or indirect, to focus on accuracy.23 For instance, if you get people to rate the accuracy of a neutral headline (having to do with the discovery of a new star, for instance), it seems to bring out System 2—afterward, people are about three times more discerning when deciding what sort of information to share on social media. The conclusion here is threefold, then. First, we can be irrational in important ways. Second, this irrationality is more likely in unnatural conditions, of the sort that our mind hasn’t evolved to deal with. And third, even in such cases, we have the potential to do better.
Consider intuitions about a good legal system. Here are two questions about the same policy, framed in different ways. If you get sued and you win the case, should the person who sued you pay your legal costs? If you sue someone and you lose the case, should you pay his costs? Eighty-five percent say yes to the first, 44 percent say yes to the second.32
From the standpoint of evolution, we love our children because of the amoral forces of natural selection, for reasons that are, in some metaphorical sense, selfish. But this surely doesn’t mean that when parents take care of their offspring, they are driven by selfish motivations. To paraphrase James, not one person in a billion, when rocking their baby to sleep, ever thinks of utility. A mother or father is usually motivated by love.
Another famous case study, explored by the neuroscientist Antonio Damasio, is of a man named Elliot, who had a brain tumor in the frontal lobes. The tumor was removed, but the damage had been done. Elliot remained an intelligent man, but, echoing the famous phrase used to describe Phineas Gage, Damasio writes: “Elliot was no longer Elliot. . . . He needed prompting to get started in the morning and go to work. Once at work he was unable to manage his time properly; he could not be trusted with a schedule.” Damasio blames these failings on Elliot’s relative loss of emotions: “The cold-bloodedness of Elliot’s reasoning prevented him from assigning different values to different options, and made his decision-making landscape hopelessly flat.”
But we don’t start off disgust sensitive. As Freud put it in Civilization and Its Discontents, “The excreta arouse no disgust in children. They seem valuable to them as being a part of their own body which has come away from it.”35 If left unattended, young children will touch and even eat all manner of disgusting things. In one of the coolest studies in all of developmental psychology, Rozin and his colleagues did an experiment in which they offered children under the age of two something that was described as dog feces (“realistically crafted from peanut butter and odorous cheese”). Most of them ate it.
A further twist is that human females can have, and enjoy, sexual intercourse anytime during the menstrual cycle. One theory of why this unusual trait evolved is that it facilitates pair bonding. If human females can mate all the time, and if it is unpredictable when this mating will lead to a child, it behooves males to stick around to ensure that any child they end up raising contains their genes.
We also find kindness early in life. I wrote a book about the emergence of morality in babies and children,64 and there’s so much to say here, but the upshot is that even the youngest children care about others, and often try, in their own limited ways, to make things better. Some experiments explore this by getting adults to act as if they are in pain—an experimenter might pretend to get her finger caught in a clipboard—and then seeing how children respond. Often they try to soothe the adults, hoping to make their pain go away. Other studies find that toddlers will help adults who are struggling to pick up an object that is out of reach or struggling to open a door. The toddlers do so without any prompting from the adults, not even eye contact, and will do so at a price, walking away from an enjoyable box of toys to offer assistance.
There’s a rich irony here. It turns out that the best explanation of our finest instincts—love, gratitude, friendship, and esteem, as Smith listed them—entails that these can only persist if we also have feelings such as anger and resentment. What could be seen as the worst parts of human nature turn out to be essential for the creation of a good, loving, and cooperative society. If the appetite for revenge had been stripped from our ancestors long ago, we would never have arrived at where we are today.
Just as other motivations can go in directions that evolution could never have anticipated—consider a child’s curiosity about other planets, or a reader’s sadness about the fate of Anna Karenina—moral emotions such as compassion, guilt, and righteous anger can, in concert with our rationality, lead us toward decisions and actions that take us far away from the project of reproductive success.
Richard Alexander, an evolutionary biologist known for his work on the origins of morality, describes an argument he had with his mentor. Alexander was trying to make a case for pure moral motivations, and he described how he went out of his way to avoid stepping on a line of ants. Isn’t that truly altruistic? And his mentor responded: “It might have been, until you bragged about it.”
Here’s one classic demonstration, from 1959.7 Subjects were assigned to a very boring task—turning pegs in a pegboard for an hour. Then they were given either a dollar or twenty dollars (a lot of money at the time) to go into another room and lie to another person, to say that the task was a lot of fun. Afterward, the subjects were asked how enjoyable the task really was. It turns out that those who got paid a small amount of money rated the task as more enjoyable. The explanation for this comes from cognitive dissonance. It makes sense to lie for a large sum of money. But it’s uncomfortable to realize that you lied for a dollar, and one way to reduce this discomfort is to convince yourself that the task wasn’t so bad, and so you weren’t lying after all.
Some cognitive dissonance studies involve choice. Ask people to choose between two things that are valued roughly equally, and later on they will tend to like the chosen one more than when they started and the unchosen one less.8 This effect occurs even when the choice is blind—where you don’t know what you’re choosing.9 But it doesn’t happen if someone else makes the choice for you, suggesting that the shift in preferences plays a psychologically palliative role—it makes you feel better about your own decisions.
We also judge people based on what they do, and here a specific bias emerges. We’ve just seen that we tend to be soft on ourselves, blaming our failures on factors beyond our control. We are not so gentle with others. For them, we blame their character, not the situation. If someone is unkind to you, you might assume that they are a rude person, not that they are just having a bad day. This is sometimes known as the fundamental attribution error.
Sometimes people get it wrong, though. I once ate at a dining hall at the University of Chicago, and they had signs complaining about how many students were stealing cutlery and asking them to stop. This was exactly the wrong tactic: I had never thought of slipping the fork and knife into my coat pocket, but after reading the sign, the thought did come to mind. After all, this seems to be what people do at the University of Chicago. Much better for them would be a sign saying something like, “95 percent of students don’t steal; be like them and respect your university.”
The discovery is social priming. Consider some truly striking findings. (Some of these might seem a bit crazy, but I’ll explain the theory behind them in a bit.) Sitting at a wobbly workstation or standing on one foot makes people think their romantic relationships are less likely to last.31 College students who fill out a questionnaire about their political opinions when next to a dispenser of hand sanitizer become, at least for a moment, more politically conservative.32 Exposure to noxious smells makes people feel less warmly toward gay men.33 If you are holding a résumé on a heavy clipboard, you will think better of the applicant.34 If you are sitting on a soft-cushioned chair, you will be more flexible when negotiating.35 Standing in an assertive and expansive way increases your testosterone and decreases your cortisol, making you more confident and more assertive.36 Thinking of money makes you less caring about other people.37 Voting in a school makes you more approving of educational policies.38 Holding a cold object makes you feel more lonely.39 Seeing words related to the elderly makes you walk slower.40 You are more likely to want to wash your hands if you are feeling guilty.41 Being surrounded by trash makes you more racist.42 And so on and so on.
Another researcher makes the connection to behaviorism more explicit: “As Skinner argued so pointedly, the more we know about the situational causes of psychological phenomena, the less need we have for postulating internal conscious mediating processes to explain these phenomena.”
Maybe drinking sauerkraut juice makes an experimental subject more likely to endorse extreme right-wing policies47 (a real finding—check the notes), but you don’t want to conclude that the major factor in the rise of fascist movements is too much sauerkraut juice consumption.
Numerous laboratory studies find that we automatically encode three pieces of information when we meet a new person, and Gessen is right about two of them—age and gender. The third is race or ethnicity.
Age, gender, race. These three considerations trump the rest; they are what interests us. They are also what lingers in memory. In “memory confusion” studies, experimenters give subjects a series of pictures of people who are depicted as uttering different sentences.3 If you have enough of these people-sentence pairs, subjects make mistakes—they misattribute who said what. And these mistakes tend to hew to how we categorize people. If there is a sentence said by a white female eight-year-old and you get the speaker wrong, you’re more likely to misattribute it to another white girl than to someone else; presumably because, beyond the specifics of the person, you will have encoded the speaker as “white girl.”
Children rely on a similar stereotype. In one study, a stranger would approach three-to-six-year-old children and try to lure them from school, saying, “You are so adorable. I really like you. I have a gift that I want to give to you!” and “Let’s go together and get it, and I will bring you back here after a while.” The depressing finding here was that, despite how often children have been told not to fall for this, about half of them left with the stranger. But another finding is that they were much more likely to leave with a woman than with a man.
The biases may not have anything to do with animus or dislike. When you associate the elderly with negativity, for instance, it shouldn’t be taken as showing any personal bad feelings toward the elderly. As an illustration of this point, consider an experiment in which people were taught about novel groups, called Noffians and Fasties.38 Some people were told that the Noffians were oppressed and the Fasties were privileged, and others were told the opposite. Then they were given an IAT. As predicted, subjects had negative associations with Noffians in the condition where they were told that they were oppressed. But plainly they themselves didn’t have bad feelings about this imaginary group; they were just responding to what they were told.
The writer Ian Parker, in his discussion of Milgram, summarizes the view of prominent social psychologists: “The Obedience Experiments point us towards a great social psychological truth, perhaps the great truth, which is this: people tend to do things because of where they are, not who they are.”
Gustave Flaubert advised artists, “Be regular and orderly in your life, so that you may be violent and original in your work.”
These differences get larger when you look at specific subscales. For instance, men are, on average, more assertive than women (one form of extraversion), while women are, on average, more sociable and friendly (another form of extraversion). There are universal differences between men and women in traits such as aggression, risk-taking, dominance, cooperation, and nurturance.
You’ve probably heard people say things like, “IQ tests don’t predict anything important.” That’s absolutely true—so long as you don’t think grades, jobs, money, health, or longevity are important. The fact is that intelligence test scores are highly correlated with just about every outcome that human beings care about.
In our society right now (and we’ll see this has an important qualification), if you had to give a child one test to predict his or her fate in life, you couldn’t go wrong with an IQ test.
It’s not surprising, then, that in the United States, the heritability of intelligence for rich children is higher than the heritability of intelligence for poor children.60 While moving children from one reasonably adequate environment to another doesn’t seem to make a difference in their intelligence, moving them from an impoverished environment to a better one leads to an IQ bump of over ten points, a powerful effect.61
We’ve talked about Turkheimer’s First Law—“All human behavioral traits are heritable”—and then added various qualifications. Now here are the Second and Third Laws of Behavior Genetics:64

Second Law: The effect of being raised in the same family is smaller than the effect of the genes.

Third Law: A substantial portion of the variation in complex human behavioral traits is not accounted for by the effects of genes or families.

What arises from all of this is that, if the environment is good enough (always an important qualification), it’s genes and nonshared environment that have the most effect on how people are different. The research suggests that what doesn’t seem to matter very much, at least for the traits of personality and intelligence, is family environment. To put this in a more radical way, once the moment of conception is over, and the parental genes have been fused into the zygote, then for certain important aspects of how children turn out, parents just don’t matter that much.65

You might be thinking that this can’t be right—there must be an effect of shared environment. After all, there are a million studies—as well as common sense—showing that there’s a high correlation between parent and child for everything. Parents who read a lot tend to have children who read a lot. Religious parents are more likely to have religious children, extroverts are more likely to have extroverted children, and so on. These are obvious and much replicated findings. The problem is, they are consistent with explanations other than parenting effects. As one example among many, children who are hit by their parents tend to be more violent.66 The immediate conclusion that many draw from this is that parental hitting has a bad effect on children. But it’s equally consistent here that the genes that make parents likely to hit (those related to low impulse control, say) are shared by adult and child. (There is also a third possibility. Such correlations might be due to the child influencing the parent, not the other way around. Perhaps it’s not that reading to children makes them interested in books, but rather that if parents have a child who is interested in books, they are more likely to read to him or her. Developmental psychologists call this a child effect.)
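To make the variance-decomposition idea behind these laws concrete, here is a minimal sketch in Python using Falconer’s classic textbook approximation, which estimates the shares of genes, shared family environment, and nonshared environment from identical- and fraternal-twin correlations. The correlations plugged in below are hypothetical, chosen only for illustration; this is not the method or the data behind the studies cited in the text.

# Illustrative only: Falconer's approximation for splitting trait variance into
# genes (A), shared family environment (C), and nonshared environment (E),
# using correlations between identical (MZ) and fraternal (DZ) twins.
# The input correlations are made up for the example.

def ace_decomposition(r_mz, r_dz):
    """Estimate A, C, E variance shares from MZ and DZ twin correlations."""
    a = 2 * (r_mz - r_dz)   # heritability: genes
    c = r_mz - a            # shared (family) environment
    e = 1 - r_mz            # nonshared environment (plus measurement error)
    return a, c, e

# Hypothetical correlations for a personality trait:
a, c, e = ace_decomposition(r_mz=0.50, r_dz=0.25)
print(f"genes: {a:.2f}, shared environment: {c:.2f}, nonshared environment: {e:.2f}")
# -> genes: 0.50, shared environment: 0.00, nonshared environment: 0.50
# A pattern like this is what the Second and Third Laws describe: little effect of
# shared family environment, substantial effects of genes and of everything else.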
The same problem of disentangling genes and environment arises when we tell stories about our own lives. I am neurotic because Mom never loved me—or maybe I’m neurotic because she showered me with too much love for my own good. I am a good father to my children because Dad was such a good father to me . . . or I’m a lousy father because I decided that I could never live up to his example. See how easy storytelling is? We love personal narratives, and the best stories involve other people, usually the people who raised us. The research, however, suggests that such stories are often false, or at least unsupported.
Fourth Law: A typical human behavioral trait is associated with very many genetic variants, each of which accounts for a very small percentage of the behavioral variability.
But Johann Hari, in his book Lost Connections, objects that this doesn’t go far enough. Once you admit that depression is rational when someone you love dies, why stop there? Why is death the only event that can happen in life where depression is a reasonable response? Why not if your husband has left you after thirty years of marriage? Why not if you are trapped for the next thirty years in a meaningless job you hate? Why not if you have ended up homeless and you are living under a bridge? If depression is a reasonable response to one set of circumstances, could there be other circumstances where it is also reasonable?8 Hari goes on: “What if depression is, in fact, a form of grief—for our own lives not being as they should? What if it is a form of grief for the connections we have lost, yet still need?” And he has a corresponding skepticism about biological treatments (although he is careful not to entirely reject them). For him, they focus on the wrong problem. Don’t blame the mind, blame the world.
Some argue that we should see autism as reflecting a certain style of processing information, one that is different from that of neurotypical individuals, but not inferior.9 The difficulties that many individuals suffer should be understood as reflecting the intolerance that a society of mostly neurotypical individuals has toward those who think in different ways.
Schizophrenia is not the most common mental illness. Only about 1 percent of the world’s population suffers from it. But it might well be the most terrible one. The World Health Organization classifies it as one of the ten worst diseases on Earth in terms of its economic impact.12 The life expectancy for someone with schizophrenia is more than fifteen years less than normal, due to suicide, illnesses, and accidents,13 and over a third of people in mental hospitals are there due to diagnoses of schizophrenia. The onset of schizophrenia occurs somewhere between the late teens and the midthirties, a bit later in women than men. It is more common in men.14 To be diagnosed, two or more of the following symptoms must be present for a significant amount of time:15

Hallucinations: You experience things that don’t really happen. Typically, these are auditory—you hear voices. These might be from God, from the devil, or from sinister people berating you or demanding you do terrible things.

Delusions: A delusion is an irrational belief, one that is very hard to change. Maybe the government is tracking you, or you are Jesus Christ, or aliens are reading your mind, and so on.

Disorganized speech: This refers to odd incoherent language. In extreme cases, those who suffer from schizophrenia might produce gibberish, sometimes called “word salad.”

Disorganized or catatonic behavior: This refers to both odd and inappropriate actions, such as inappropriate giggling, wearing strange clothing, and either a decrease in movement, sometimes becoming frozen in place, or sometimes the opposite—excessive and purposeless movement.
Still, in support of the early environment account, there is a strong relationship between schizophrenia and certain life stressors—children born into poor families are more likely to develop schizophrenia as adults.22 It’s possible as well that the trigger isn’t the sort of usual trauma or bad environment that we tend to think about. It might be a difficult birth, maternal malnutrition, or a viral infection.23 In this regard, there is a similarity with autism, which is also associated with problems with birth,24 infections,25 and more likelihood of being conceived in the winter months.26 Finally, schizophrenia emerges in early adulthood, but precursors emerge in childhood. In a clever study, psychologists looked at home movies taken of adults with schizophrenia when they were five years old or younger, where they appeared with siblings who didn’t develop schizophrenia later in life.27 Observers who didn’t know who was who could guess, better than chance, who in the movies would grow up to suffer from schizophrenia—as children, they behaved more oddly and had fewer positive and more negative facial expressions.
There’s a story about James Joyce talking to the psychologist Carl Jung about Joyce’s daughter, Lucia, who suffered from schizophrenia. Jung was describing the unusual mental processes characteristic of Lucia’s condition and Joyce observed that the same sort of strange associations occurred in his own writing. He said, “Doctor Jung, have you noticed that my daughter seems to be submerged in the same waters as me?” And Jung answered, “Yes, but where you swim, she drowns.”
Other disorders are more familiar. Have you ever been sad? Really sad? Didn’t want to get out of bed, didn’t want to eat, life had no joy for you? Multiply this many times and you may have some inkling of what major depression feels like.29 Here is a description from the writer Andrew Solomon: At about this time, night terrors began. My book was coming out in the United States, and a friend threw a party on October 11. . . . I was too lackluster to invite many people, was too tired to stand up much during the party, and sweated horribly all night. The event lives in my mind in ghostly outlines and washed-out colors. When I got home, terror seized me. I lay in bed, not sleeping and hugging my pillow for comfort. Two weeks later—the day before my thirty-first birthday—I left the house once, to buy groceries; petrified for no reason, I suddenly lost bowel control and soiled myself. I ran home, shaking, and went to bed, but I did not sleep, and could not get up the following day. I wanted to call people to cancel birthday plans, but I couldn’t. I lay very still and thought about speaking, trying to figure out how I moved my tongue, but there were no sounds. I had forgotten how to talk. Then I began to cry without tears. I was on my back. I wanted to turn over, but couldn’t remember how to do that, either. I guessed that perhaps I’d had a stroke. At about three that afternoon, I managed to get up and go to the bathroom. I returned to bed shivering. Fortunately, my father, who lived uptown, called about then. “Cancel tonight,” I said, struggling with the strange words. “What’s wrong?” he kept asking, but I didn’t know.
Depression is common, with a prevalence of about 15 percent in men and 26 percent in women.31 Like schizophrenia, it is heritable. If your biological relatives have it, you are more likely to have it. But, also like schizophrenia, it’s not perfectly heritable, far from it, and so environment must play a role. (In fact, I am going to stop saying this—all the mental disorders we will discuss are partially heritable.)
One theory of depression is that it’s due to low levels of neurotransmitters such as serotonin. In support of this, medications believed to increase the level of serotonin in the brain, including popular ones sold under the names Prozac, Zoloft, and Paxil, often have a beneficial effect. But, again just as with schizophrenia, it can’t be as simple as this. The drugs influence neurotransmitters right away, but they take weeks to take effect. And the evidence that serotonin is implicated in both the cause of depression and its treatment is surprisingly weak, as one commentator said: “To put it bluntly, there is no decisive evidence that low mood is caused by low serotonin levels.”33 The drugs often work, but we don’t know why they work. A quite different theory is that depression is related to a more general lack of plasticity in the brain, where one loses the capacity to restructure appropriately in response to the environment—though there is hardly a consensus here.34 At a more cognitive level, depression is associated with (though perhaps not caused by—the usual problems with cause and effect apply) certain patterns of thought. These include a tendency to ruminate on one’s problems.35 The Ruminative Response Scale asks people how often they do the following when they feel sad, blue, or depressed:

I think “Why do I react this way?”

I think about how hard it is to concentrate.

I think “I won’t be able to do my job if I don’t snap out of this.”

Responses are related to the likelihood of getting depressed, though not the duration of the depression. The psychologist Susan Nolen-Hoeksema proposed that this may be why depression is more common in women; women ruminate more.
Therapists know some surprising things about mental illness. One psychiatrist, Randolph Nesse, writes: Recognizing common patterns can make even ordinary clinicians seem like mind readers. Asking a patient who reports lacking hope, energy, and interest, “Does your food taste like cardboard, and do you awaken at four a.m.?” is likely to elicit “Yes, both! How did you know?” Patients who report excessive hand washing are astounded when you guess correctly, “Do you ever drive around the block to see if you might have hit someone?” If a student has weight loss and a fear of obesity, she will likely be astonished when asked, “You get all A’s, right?” Clinicians recognize these clusters of symptoms as syndromes: major depression, obsessive-compulsive disorder, and anorexia nervosa. After seeing thousands of patients, expert clinicians recognize different syndromes as readily as botanists recognize different species of plants.
Therapists also often know how to make people better. Certain specific therapies do well for anxiety disorders, for instance. Nesse writes: “Patients with panic disorder get better so reliably that treating them would be boring if it were not for the satisfaction of watching them return to living full lives.”
The data so far suggest that most disorders are continuous. Just as introversion is a matter of degree (and so we are better off, at least as scientists, talking about the extent of introversion rather than about a certain special sort of person who is “an introvert”), so too are many of the disorders we’ve been discussing. One recent meta-analysis on the topic concludes by saying that the data from multiple studies “support the conclusion that the great majority of psychological differences between people is latently continuous, and that psychopathology is no exception.”72 This continuous view, should it be right, has implications for some of the more foundational issues we began with, such as neurodiversity and the very definition of mental illness. When thinking about disorders like depression, anxiety, addiction, autism spectrum disorder, and schizophrenia, we can no longer simply view them as problems to be solved. Rather, they are the names we give to certain extremes of human variation. What counts as extreme enough to warrant treatment isn’t just a psychological question. It is a moral and political one.
Call this remembered happiness.9 The happiness levels you get from these kinds of measures are correlated—but they are not identical. Someone can judge their life as happy even if their day-to-day experiences aren’t so enjoyable, or, conversely, have overall positive experiences but not be so pleased by their life as a whole.10 There is quite a bit of debate as to which of these two measures is the one to shoot for, with pluralists like me saying that they are both worthy goals.11 *
The big finding is that the very happy have rich social relationships. They spend a lot of time with family and friends. Almost all of them—94 percent—said that they have someone that they can count on to help when they are in trouble. Almost all of them—98 percent—said that they were treated with respect on the previous day. Relative to other individuals, the happy elite make more money, are in better physical shape, have fewer health problems, exercise more, smoke less, feel more well rested, and have less stress. They have more hours of free time, more freedom to choose their activities, and are more likely to say that they learned something new on the previous day. Now, importantly, some people who are moderately happy and even some who are unhappy are also treated with respect, spend time with family and friends, make a good living, and so on. Such considerations are not sufficient for being very happy, then. But they do appear to be close to necessary; without them, you are very unlikely to be one of the happiest people in the world.
Take having friends. Very happy people have friends. So perhaps friends make us happy, a reasonable claim. But perhaps being happy leads to having friends, which also makes sense. There’s something appealing about good cheer, at least in proper doses. It’s harder for someone who is sullen and withdrawn to be popular. Or perhaps happiness and friendship have no direct relationship, but some third factor independently influences them both. Perhaps some people are gregarious and agreeable and extroverted. Such traits might make them more likely to be happy, and might also help them have more friends, but their happiness and their popularity might themselves have no direct connection.
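To see how a third factor can manufacture a correlation all by itself, here is a small simulation sketch in Python. A hypothetical trait (labeled sociability here purely for illustration) independently boosts both happiness and number of friends, and the two end up correlated even though neither causes the other. The effect sizes are invented, not estimates from any real study.

# A minimal simulation of the "third factor" possibility described above: one
# hypothetical trait independently raises both happiness and friendships, so the
# two correlate with no direct causal link between them. Numbers are invented.

import random

def simulate_person():
    sociability = random.gauss(0, 1)                     # the confounding trait
    happiness = 0.6 * sociability + random.gauss(0, 1)   # caused by sociability
    friends = 0.6 * sociability + random.gauss(0, 1)     # also caused by sociability
    return happiness, friends

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

people = [simulate_person() for _ in range(10_000)]
happiness, friends = zip(*people)
print(round(correlation(happiness, friends), 2))
# Typically around 0.26: the two measures are correlated, yet in this toy world
# neither one has any influence on the other.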
One of the first things people ask about happiness is “Does being kind make you happier?”—and here the question of causality looms large. Consider first one form of kindness—volunteering. In study after study, research finds that volunteering is associated with greater life satisfaction and lower rates of depression. This relationship holds for both poor countries and wealthy countries.13 Good—so if you want to be happy, volunteer, right? It’s unclear. Several experiments have randomly assigned people to volunteer or not to volunteer and found no difference in mood, self-esteem, and so on.14 But a recent longitudinal study, looking at changes in happiness before and after volunteering, does find a substantial positive effect.15 It’s not yet known, then, whether happy people volunteer more (for whatever reason), whether volunteering makes you happy, or both.
Also, there are several studies that illustrate what the psychologist Daniel Gilbert has described as failures of “affective forecasting.”19 We are often poor at predicting our happiness. Most relevant to the issue of stability, we tend to think life events matter more than they will. We fail to appreciate that we are fairly resilient to bad experiences and, unfortunately, fairly unaffected by good ones as well. In one study, college students got to provide their university with a ranked list of which dormitories they wanted to live in and were asked to predict how happy they would be if they were assigned their top choice or their bottom choice. They predicted a substantial effect, but once they were in their dorms, there was no difference at all.20 In another study, sports fans were asked how they would feel after a pivotal soccer championship game. It was made clear to them that they were being asked about their overall feelings, not their feelings when they were thinking about the game, but still, fans of the winning team overestimated how happy they would be, and fans of the losing team overestimated how sad they would be.21 After the 2000 election, when George Bush was elected president, Bush supporters were not as happy as they thought they would be, and Gore supporters were not as unhappy as they thought they would be.22 Indeed, even for more severe events, such as the dissolution of a romantic relationship, people tend to overestimate how much their happiness will be affected.
nothing in life matters quite as much as you think it does when you are thinking of it.
Finally, we often don’t appreciate how good we are at telling stories and reinterpreting information in ways that preserve our good feelings about ourselves, our group, and the views we most cherish. Gilbert and his colleagues nicely put it as follows (you know who Freud is; Leon Festinger developed the cognitive dissonance work described in the earlier chapter on social psychology): Psychologists from Freud to Festinger have described the artful methods by which the human mind ignores, augments, transforms, and rearranges information in its unending battle against the affective consequences of negative events. Some of these methods are quite simple (e.g., dismissing as a rule all remarks that begin with “You drooling imbecile”), and some are more complicated (e.g., finding four good reasons why one didn’t really want to win the lottery in the first place); taken in sum, however, they seem to constitute a psychological immune system that serves to protect the individual from an overdose of gloom. As Vaillant noted: “Defense mechanisms are for the mind what the immune system is for the body.”
The factors associated with happy countries are what we would expect. Happy countries are high in average income—there is a robust connection between the GDP of a country and how happy it is. They have high life expectancy and strong social support. The citizens of these societies perceive high levels of freedom, trust, and generosity. Progressive taxation and a strong welfare state predict happiness. So does some degree of economic competition (communist countries are unhappy countries).30 Is it fair to call this an effect of the environment? Perhaps it’s not actually living in the country that influences one’s happiness; Finns, say, might be happy because of Finnish genes, and they would be just as happy if they moved to Moldova. But this isn’t so. Several studies show that, although there is some influence of country of origin, immigrants tend to be roughly as happy as native-born citizens of the country they live in.31 It’s the country, not the genes.
Now, the problems that we discussed earlier about correlation and causation weigh heavily here. Maybe the sort of person who is rich is different in some ways from the rest of us and would have been happy even if all their money was taken away. This can potentially be studied in a simple experiment: Give people a lot of money and see what happens—does this make them happier in the long run? There is just one small practical problem, though. Not many psychologists can afford to give a million dollars each to a large number of people to see what happens.
there is no decline in happiness with age. Rather, happiness maps onto age in what appears to be a U-shaped curve.40 Eighteen-year-olds are relatively happy, then average happiness gradually drops until the early fifties, and then it rises again, so that the eighties are, on average, the happiest time of people’s lives.
We saw that the happiest people in the world tended to have good close relationships. Take marriage. My nana would have told you that to have a fulfilled life, you need a partner. Every pot needs a cover, she would loudly remind my (then) unmarried sister at family gatherings, which I found very funny, though my sister did not. Now Nana was phrasing things a bit too strongly—there are plenty of happy people who have no partners—but she was right on average; married people do tend to be happier.43 But why? Here we are presented with our usual trio of options when thinking about the relationship between two factors. A can cause B; B can cause A; or a third factor, C, can cause both A and B, but A and B might not be otherwise related. It might be that marriage makes you happy (plausible enough), or that being happy makes you more likely to attract people who want to marry you (also plausible), or that there is some third factor—wealth, say, or a certain personality type—that leads to both marriage and happiness (also plausible). One way to pull apart these possibilities is to look at timing. If your happiness goes up once you get married, it suggests that marriage does play a causal role. And this does seem to happen, for both heterosexual and homosexual couples, at least for the first few years of marriage.44 But then your happiness drops back to where you started.45 Careful studies looking at long-term consequences of marriage find subtle effects that vary due to all sorts of factors, such as whether you are a man or a woman, how old you are, and what country you’re from.46 The question of whether marriage makes you happy does not have an easy answer.
As the psychologist Dan Gilbert puts it, “The only symptom of empty nest syndrome is increased smiling.”
Consistent with this, one study looked at millions of people from 160 countries and found that, overall, parents are less happy than nonparents—lower life satisfaction and more daily stress.56 But they also experience more daily joy. So perhaps we remember the highs? I had a Zoom call with a journalist during the pandemic and he mentioned that he had a four-year-old and two-year-old twins. He said that parenthood was like heroin—the lows are horrific, but the highs are wondrous. The writer Zadie Smith described it in similar terms: having a child as “a strange admixture of terror, pain, and delight.”57 I think this remember-the-peak explanation is plausible enough, but there is another consideration to throw in the mix. Let’s oversimplify here and assume that there really is a drop in pleasure and happiness when you have kids. But now I want to return to the pluralism introduced at the beginning of this chapter and suggest that it might still be worthwhile. When I say that raising my sons is a source of immense satisfaction for me, I don’t mean that being a father upped my overall pleasure. I’m talking about something deeper, having to do with satisfaction, purpose, and meaning.
Steven Pinker:58 We can see happiness as the output of an ancient biological feedback system that tracks our progress in pursuing auspicious signs of fitness in a natural environment. We are happier, in general, when we are healthy, comfortable, safe, provisioned, socially connected, sexual, and loved. The function of happiness is to goad us into seeking the keys to fitness: when we are unhappy, we scramble for things that would improve our lot; when we are happy, we cherish the status quo. From this perspective, happiness evolved as an indicator that our lives are going well. This explains why it’s impossible to simply choose to be happy. Imagine buying a car where the gas gauge was rigged so that it would always read FULL. This would be very pleasing in the short term—you would never worry about running out of fuel. But it’s a bad idea for the long term because the tank would soon empty and the car would stop running. Similarly, if I’m starving and could just choose to ignore my hunger, or even to feel as if I just had a satisfying meal, it would be nice in the short term, but I’d lose the motivation to get food and then I wouldn’t be around for much longer. If I choose to always be happy, I would miss out on the information provided by being unhappy, the information that life is going poorly and that something needs to be done. Unhappiness—like a gas gauge telling you that you are approaching empty, like an aching empty stomach—is a gift.
Daniel Kahneman and his colleagues find that neither of these claims is true.66 The length of an experience has little effect on how we remember it.67 In reality, two enjoyable weeks of vacation contain more pleasure than one enjoyable week of vacation. Indeed, assuming that the pleasure stays constant and you don’t get bored, they contain twice as much pleasure. But we remember them as identical. That was a fun vacation. Or consider our memories of negative experiences. Imagine being on a cramped middle seat of the plane, with no in-flight entertainment, nothing to read, and your neighbor at the window constantly getting up to pee. Which is worse, doing this for four hours or eight hours? This is not a hard question, I hope, but when you think back on the experience, you will pretty much remember the four-hour experience and the eight-hour experience as equally negative. What an awful flight. It turns out that when we think about past events, we tend to focus on two things—the peak (the most intense moment) and the ending. One study provides a dramatic illustration of this. Experimenters exposed subjects to differing levels of pain—by having them immerse their hands in freezing water—for different periods of time.68 Here were the trials:

Trial A: 60 seconds of moderate pain.

Trial B: 60 seconds of exactly the same moderate pain; then, for 30 more seconds, the temperature is raised a bit—still painful, but less so.

Which did subjects prefer? You might think A because, duh, less pain. But, no, they preferred B because of the better ending. In another study,69 Kahneman and his colleagues went outside of the lab. They tested volunteers who were undergoing a colonoscopy procedure (this was done at a time when these procedures were substantially more unpleasant), and, for half of the people, artificially prolonged the procedure by leaving the scope inside them for an extra three minutes, not moving, which was uncomfortable but not painful. These people with the additional discomfort rated their experience as overall less unpleasant, just because it ended on a less painful note.
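As a rough illustration of this peak-end pattern, here is a toy calculation in Python. The moment-by-moment pain ratings are invented stand-ins for the cold-water trials, and the “average of the peak and the final moment” rule is a simplification of the pattern Kahneman describes, not his exact model.

# Toy illustration of the peak-end pattern: remembered unpleasantness tracks the
# worst moment and the last moment, largely ignoring duration. Ratings (0-10)
# are invented stand-ins for the cold-water trials, not the study's actual data.

def peak_end_memory(moment_ratings):
    """Crude peak-end summary: average of the most intense moment and the last one."""
    return (max(moment_ratings) + moment_ratings[-1]) / 2

def total_pain(moment_ratings):
    """What a duration-sensitive summary would look like: sum over all moments."""
    return sum(moment_ratings)

trial_a = [6] * 60              # 60 seconds of moderate pain
trial_b = [6] * 60 + [3] * 30   # the same 60 seconds, plus 30 milder seconds

print(total_pain(trial_a), total_pain(trial_b))            # 360 vs. 450: B contains more total pain
print(peak_end_memory(trial_a), peak_end_memory(trial_b))  # 6.0 vs. 4.5: B is remembered as better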
I am grateful to the authors of three excellent books that had a big influence on me as I worked on the manuscript: The Idea of the Brain: The Past and Future of Neuroscience by Matthew Cobb; The Genetic Lottery: Why DNA Matters for Social Equality by Kathryn Paige Harden; and Freud: Inventor of the Modern Mind by Peter D. Kramer. I am grateful as well to other scholars whose work I went back to over and over again, including Susan Carey, Frank Keil, and Elizabeth Spelke (on the topic of child development), Edward Diener (happiness), Julia Galef (rationality), Daniel Gilbert (social psychology), Joseph Henrich (culture), Steven Pinker (language, rationality), Stuart Ritchie (intelligence), and Scott Alexander (perception, clinical psychology). I know that some readers will want to know where to go to learn more about psychology—this paragraph is my answer.