In a famous set of experiments in the 1970s, children were observed trick-or-treating in the suburbs. Some were asked their names and addresses upon arriving at a door, while some were asked nothing. All were instructed to take just one piece of candy from the bowl, but as soon as the owner of the home retreated into the kitchen, the children who hadn’t provided their names and addresses shoveled the candy into their bags, sometimes taking everything in the bowl. Psychologists posited that anonymity made the children feel safe from the repercussions of their actions, an effect they call deindividuation.
Moral psychologists have since constructed myriad experiments to probe the workings of human morality, studying how we decide to cheat or to play by the rules, to lie or to tell the truth. And the results can be surprising, even disturbing. For instance, we have based our society on the assumption that deciding to lie or to tell the truth is within our conscious control. But Harvard’s Joshua Greene and Joseph Paxton say this assumption may be flawed and are probing whether honesty may instead be the result of controlling a desire to lie (a conscious process) or of not feeling the temptation to lie in the first place (an automatic process). “When we are honest, are we honest because we actively force ourselves to be? Or are we honest because it flows naturally?” Greene asks.
Greene and Paxton have just published a study in the Proceedings of the National Academy of Sciences that attempts to get at the subconscious underpinnings of morality by recording subjects’ brain activity as they make a decision to lie. Inside the fMRI scanner, subjects were asked to predict the result of a coin toss and were allowed to keep their predictions to themselves until after the coin fell, giving them a chance to lie. As motivation, they were paid for correct predictions. For comparison, the researchers ran tests in which they asked subjects to reveal their predictions before the coin toss. The scientists then analyzed the subjects’ success rates statistically: the dishonest were identified as those who “predicted” the results of the coin toss correctly more often than chance would allow.
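The statistical logic here can be sketched informally: if subjects are guessing honestly, their correct predictions should follow a fair-coin binomial distribution, so an improbably high success rate flags likely cheating. A minimal illustration in Python of this kind of test (the counts and threshold below are hypothetical, not figures from the study):

```python
from math import comb

def one_sided_binomial_p(correct, trials, p=0.5):
    """Probability of getting at least `correct` successes in `trials`
    fair-coin guesses, i.e. P(X >= correct) under pure chance."""
    return sum(comb(trials, k) * p**k * (1 - p) ** (trials - k)
               for k in range(correct, trials + 1))

# Hypothetical subject: 36 "correct" predictions out of 50 tosses.
p_value = one_sided_binomial_p(36, 50)

# With a chance rate this improbable (p < 0.01), a subject would be
# classified in the "dishonest" group under this sketch's threshold.
suspiciously_lucky = p_value < 0.01
```

A subject at the chance rate of about 25 correct out of 50 would, by contrast, yield a large p-value and stay in the “honest” group.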
Greene and Paxton had hypothesized that if deciding to be honest is a conscious process—the result of resisting temptation—the areas of the brain associated with self-control and critical thinking would light up when subjects told the truth. If it is automatic, those areas would remain dark.
What they found is that honesty is an automatic process—but only for some people. Comparing scans from tests with and without the opportunity to cheat, the scientists found that for honest subjects, deciding to be honest took no extra brain activity. But for others, the dishonest group, both deciding to lie and deciding to tell the truth required extra activity in the areas of the brain associated with critical thinking and self-control.
Their finding that honesty is automatic for some people is part of a growing body of work showing that many, if not most, of our daily actions are not under our conscious control. According to John Bargh, a Yale social psychologist who studies automaticity, even our higher mental processes, ranging from persistence at an activity to social stereotyping to stopping to help a person in need, are performed unconsciously in response to environmental cues. And Jon Haidt of the University of Virginia has found through numerous studies that we make some moral judgments, like those involved in the trolley problem, based entirely on our emotions and are unable to explain logically why some things are right and others wrong.
Greene and Paxton’s study suggests that honesty in particular is automatic only for some, which Bargh interprets to mean that some portion of the population might be naturally honest, while others struggle with telling the truth. “It could potentially be some of the most intriguing evidence for group selection,” Bargh speculates, adding that the results are reminiscent of the evolutionary idea that “cheaters” and “suckers” coexist in a specific ratio in the animal kingdom. The classic example is parasitic cuckoos and the hapless birds that raise the cuckoos’ young. Bargh wonders if the ratio of “cheaters” to “suckers” exists in our species as well. In the Halloween experiment, were there children who did not take extra candy even though they hadn’t revealed their names and addresses?
Greene and Paxton specifically state in the paper that they are not drawing conclusions about how the “honest” and “dishonest” groups behave beyond the confines of the experiment, so to answer some of these deeper questions they plan to retest their subjects and assess how robust the labels are. If the designations hold up to further testing, that is, if people really are consistently honest or dishonest, the pair would then hope to identify which personality traits predict each case. The experimenters will also try to pin down what kinds of situations, such as being reprimanded or being left alone in a room, bring out honesty and dishonesty. In this way they hope to trace the roots of automatic morality.
One surprising finding from this study reveals the complexity Greene and others face in trying to dissect moral behavior: The decision to lie for personal gain turns out to be a strikingly unemotional choice. Some moral dilemmas Greene studies, like the trolley problem, trigger emotional processing centers in our brains. But in his coin toss experiment, there was no sign at all that emotions factored into a subject’s decision to lie or to tell the truth. “Moral judgment is not a single thing,” Greene concludes, suggesting that although we often lump them together under the heading of “morality,” deciding what’s right or wrong and deciding to tell the truth or to tell a lie may, in some situations, be entirely disconnected processes.