The Moral Brain

David Pacchioli
October 02, 2006

Through most of human history, our morality—the capacity to perceive interests beyond our own, and to act fairly, caringly, even selflessly—has been touted as what defines us, separates us from the beasts. From Plato on, this moral dimension has been ascribed to our powers of reason, seated in the brain and holding tight rein on the passions below.

The rise of cognitive psychology in the 1960s, with its emphasis on brain as computer, tended to reinforce this rationalist view. A parallel movement, however, arose in part as a reaction to the cognitive approach: the theory of multiple intelligences.

"Multiple intelligences is the idea that the measured IQ is really a very narrow view of human capability," says Paul Eslinger, professor of neurology at Penn State's Hershey Medical Center. "The human brain and potential is not just what we're able to think but what we're able to feel, and how we're able to integrate these two streams of experience and knowledge."

Of particular interest to brain researchers were the so-called moral emotions, distinct from such basic emotions as happiness or fear, Eslinger says, in that they are tied to the welfare of others, and of society at large. Empathy, guilt, gratitude, and disgust are examples of emotions that are intrinsically moral. Eslinger calls them "social" emotions, and in fact their social aspect is now thought to have been the driving factor in the evolution of the human brain.

"Drives such as reciprocal interaction, dominance, protection of family—certain basic processes that underlie not only survival of oneself but survival of a greater unit, of progeny—these exist in non-human primates, in a variety of species," Eslinger explains. "But in humans these drives have undergone a tremendous elaboration and are embedded within complex social contexts. We think it goes along with the front part of the brain, the prefrontal cortex, being vastly enlarged."

This enlargement, in turn, "has provided the basis for all of our interpersonal abilities," he says. "For language, for social groupings, for long-term relationships, even for things such as altruism. In working through a lot of the literature, from psychology and animal behavior as well as the clinical, it appears that we humans have these specialized processing modules that provide the basis for moral behavior—in other words, that there is a biological basis to morality."

A spike through the head

Phineas Gage's skull and life mask (Warren Anatomical Museum, Francis A. Countway Library of Medicine)

Until recently, the only way for neurologists to test this theory was to carve up the brains of those deemed morally deficient—dead criminals, mostly—and compare them to "normal" brains, similarly defunct. That or study the behavior of patients living with a rare variety of brain damage.

The most famous case of the latter remains that of Phineas Gage, a railroad laborer in Vermont who, one day in the fall of 1848, suffered a horrific on-the-job injury. Gage, the foreman of a crew laying track outside the town of Cavendish, was tamping black powder into a hole drilled in rock when he apparently struck a spark. In a flash of explosion, the tamping iron, a three-and-a-half-foot-long bar an inch in diameter, blew up and back, through his left cheek and clean out the top of his head, landing some 30 yards behind him. "It essentially severed the front third of his brain," Eslinger says. "The surgeon who came to the scene described that he could insert a finger into either opening of the wound and touch one fingertip to the other. It was just this clean hole."

Amazingly, Gage survived, and was in fact strong enough to resume work in less than a year. His basic mental faculties—motor skills, memory, speech—were essentially intact. What had changed, irrevocably, was his personality. Where before the accident, Gage had been regarded as an excellent foreman, thoughtful, shrewd with money, and well-spoken, afterward he was described as "fitful, irreverent, and grossly profane," acting with little regard for others. His friends said he was "no longer Gage."

Over the past 20 years, Eslinger and others have looked into a number of cases involving trauma or stroke and resulting in similar changes in personality: "acquired sociopathy," in today's clinical terms. "One of the key changes seems to be a loss of ability to share and understand the experience of others," Eslinger says. In most of these cases, as in Gage's, the damaged area was the prefrontal cortex.

Brains at work

MRI scanner at Penn State Hershey Medical Center (photo: James Collins)

To Eslinger such clues were tantalizing, but there was only so much to be inferred from clinical symptoms. "It's like something going out on your car, and just by looking at that part trying to figure out how the rest of the car works," he says, shrugging. Now, however, using functional magnetic resonance imaging, or fMRI, he can literally see the brain at work.

Like its now well-established cousin structural MRI, fMRI takes advantage of the inherently magnetic properties of the hydrogen protons in water molecules within the brain. The high-field MRI machine first exposes the brain to a powerful magnetic field that causes those protons to line up. Then it uses electromagnetic waves to knock the protons out of alignment. Finally, it stops the waves and measures the time it takes for the protons to relax back into line. Data collected about the relaxation time at each brain location (or voxel) are then translated into a high-resolution image of living neural tissue.

"When specific brain areas become more active, they demand more glucose and oxygen, so the blood flow increases," Eslinger explains. "We can usually see a three to five percent change over a period of just a few seconds. We can look at the anatomic distribution of the activity, and the volume of it. We can determine if change happens very quickly or it's something that takes time to develop."
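The "three to five percent" figure Eslinger cites is a simple comparison of a voxel's average signal during a task against its resting baseline. The sketch below illustrates that arithmetic only; the signal values, block labels, and function name are invented for illustration and are not the lab's actual analysis pipeline.

```python
# Toy illustration of the "three to five percent" signal change
# Eslinger describes: compare a voxel's mean signal during a task
# block against its resting baseline. All numbers are invented.

def percent_signal_change(baseline, task):
    """Percent change of mean task signal over mean baseline signal."""
    base = sum(baseline) / len(baseline)
    active = sum(task) / len(task)
    return 100.0 * (active - base) / base

# Hypothetical fMRI values for one voxel (arbitrary scanner units)
rest = [1000, 1002, 998, 1001, 999]        # resting baseline block
stimulus = [1040, 1038, 1042, 1041, 1039]  # stimulus-viewing block

change = percent_signal_change(rest, stimulus)
print(f"{change:.1f}% signal increase")  # prints "4.0% signal increase"
```

Real analyses model the full time course of the response rather than two block averages, but the percent-change comparison is the core of the measurement.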

Eslinger had already used fMRI in a variety of projects at Hershey's Center for Nuclear Magnetic Resonance Research in collaboration with MRI physicist Qing Yang (see sidebar) when he was contacted in 2000 by Jorge Moll and Ricardo Oliveira-Souza, two Brazilian neuroscientists interested in the neurobiological bases of antisocial behavior. Moll and Oliveira-Souza had read Eslinger's brain-injury work, in which, as Eslinger now says, "we were beginning to identify that the effects of these rare injuries were real and measurable, that some sociopathies weren't a matter of not trying, or being poorly educated, or growing up in an abusive environment. They had physiological causes, just as a weakness in the arm or a loss of speech did."

Moll and Oliveira-Souza had hopes of impacting public policy on crime and punishment in Brazil by better understanding the roots of violence and aggression, Eslinger remembers. "They were specifically interested in what we call 'snakes in suits,' the well-cultured sociopaths who exist in all societies. If you do structural scans of their brains, they look the same as anyone else. If you test them on moral inventories, they can score just as well, if not even better. They know the rules, they know what's right and wrong, but the difference is that such knowledge doesn't guide their behavior.

"How is it that these people can be habitual killers, rapists, swindlers, and all sorts of things, and yet look so normal? Is there something else going on in the brain?"

The emotional response

At the Hospital Barra D'Or in Rio de Janeiro, Eslinger and his new collaborators tested a group of normal adults by asking them to view images of emotionally charged scenes with and without "moral" content while lying inside an fMRI magnet. Pictures of physical assaults, war carnage, and abandoned children were included in the moral category; the non-moral images depicted body lesions and dangerous animals.

As the researchers predicted, certain brain regions (the amygdala, thalamus, and upper midbrain) were consistently activated by both types of emotional stimulus. But some areas, including the orbital prefrontal cortex (OFC), located just above the eye sockets, and the superior temporal sulcus, at the furrow between the frontal and temporal lobes, fired specifically in response to moral content. Moreover, "We found that this activation was very fast," Eslinger reports. "People detected that something was wrong very quickly. It's almost as if the moral content is embedded within the perception. You don't have to stop to think." When they later showed the same pictures to a group of diagnosed sociopaths, he adds, their reactions differed, and these "cortico-limbic circuits" didn't consistently activate.

Paul Eslinger (photo: James Collins)

To Eslinger, these findings suggested that "snakes in suits" may have specific neural deficits that preclude social emotional responses. They also confirmed that the "automatic" processing of moral emotions, mediated in the OFC (the region where Phineas Gage sustained damage), is a separate task from moral reasoning, which is a slower, deliberative process that takes place elsewhere in the brain, where "snakes" have no damage.

Other imaging experiments have tended to support this separation. At Princeton, psychologist Joshua Greene and his colleagues applied fMRI to the trolley problem, a classic ethical dilemma in which a subject is asked to imagine observing an impending catastrophe involving a runaway train. If the observer does nothing, the train will kill five people tied to the track. By pulling a switch, the subject can divert the train, saving the five at the cost of killing a single person tied to the second track.

"If you change the scenario so that the subject has to push that lone person onto the track to save the others," Eslinger notes, "this dramatically switches activation into the emotional parts of the brain, in particular the OFC. The cognitive aspects—weighing of cost vs. benefit—are about the same, but the direct role generates more of an emotional response."

Meanwhile, Moll, Oliveira-Souza, and Eslinger have worked at trying to pinpoint specific moral emotions. Compassion and guilt, they have found, light up differently in the brain. So does disgust, a kind of built-in protection against antisocial behavior which University of Virginia social psychologist Jonathan Haidt calls "the emotion of civilization."

To brain researchers, disgust is interesting because it has "multiple domains," i.e., it can be both a basic emotion and a moral one. Pure disgust, Eslinger explains, is a visceral, primordial response. "It's when you want to eject something because it smells or tastes bad—sour milk or rancid meat." Moral disgust, or indignation, is something more abstract: the reaction to a person whose behavior we judge as wrong. "Yet the impulse triggered is similar," Eslinger says. "Break off contact, withdraw from the offending source."

Moll and Eslinger asked subjects in the magnet to read statements describing disgusting scenarios of both types. (Pure disgust: "A man died after eating a living rat." Indignation: "A newspaper reports a mother is beating her child in a supermarket parking lot.") As expected, the scans showed some distinct differences in brain activity. But they also showed "remarkable" overlap, Eslinger says. "When we presented the more abstract social scenarios eliciting disgust, we found that several of the areas that activated were those also associated with visceral disgust from smell and taste. So part of the comprehension and intensity of the social and cognitive concepts may be built on top of this real primitive reflex."

Bones was right

To Eslinger, these findings, even though preliminary, confirm what he has long suspected: that everyday morality is essentially a gut response. "Our behavior when we're driving a car, when we get back too much change from a cashier—these are acculturated behaviors we acquire developmentally, and they become incorporated into our daily routines until we don't even think about them," he says. "Waiting in line instead of cutting in front is not something you weigh the pros and cons of. It's just something you do."

More complicated moral decisions—what we sometimes call moral dilemmas—also involve the emotional circuits in the brain, Eslinger contends, but in this case they work in concert with rational processes. As he, Moll, and Oliveira-Souza write: "This integrative perspective contrasts with the commonly held view that 'rational' cognitive mechanisms control or compete with emotional ones."

Eslinger's interest now is in using fMRI to understand how moral ability takes shape in the developing brain. "We're asking children and teens questions like 'Is it okay to cut in line? To take a candy bar from the store?'," he reports. "What areas of the brain are being activated when they have to make that kind of judgment? That's really completely unknown at this point. And I suspect that it probably shifts quite a bit developmentally.

"A lot of philosophical teaching emphasizes that it's man's reasoning, control of the wild instinctual things, that makes man different from other animals," Eslinger adds at last. "And what we've come around to say is that's probably not true. It's as much emotion that influences our decision-making and capabilities as cognition, and really it's the integration of those two realms of knowledge and experience. The importance of emotional intelligence is increasingly recognized alongside measured IQ.

"Remember Star Trek? Spock's reasoning versus Bones's gut sense of right and wrong? From the Vulcan standpoint, human emotion was a weakness, but I think the series suggested that it's really a human strength, a vital resource that helps us get through life's difficulties and be resilient.

"I think Star Trek got it right."

Paul N. Eslinger, Ph.D., is professor of neurology in the Penn State College of Medicine. Qing X. Yang, Ph.D., is associate professor of radiology and neurosurgery at the Center for NMR Research in the department of radiology in the College of Medicine. The Center provides MRI and fMRI support to investigators at the College of Medicine and at University Park. Visit the Center online.
