Truth or Lies

This is a discussion on Truth or Lies within the General Psychology forums, part of the Topics of Interest category; ...

  1. #1

    Truth or Lies

    In a famous set of experiments in the 1970s, children were observed trick-or-treating in the suburbs. Some were asked their names and addresses upon arriving at a door, while some were asked nothing. All were instructed to take just one piece of candy from the bowl, but as soon as the owner of the home retreated into the kitchen, the children who hadn’t provided their names and addresses shoveled the candy into their bags, sometimes taking everything in the bowl. Psychologists posited that anonymity made the children feel safe from the repercussions of their actions, an effect they call deindividuation.

    Moral psychologists have since constructed myriad experiments to probe the workings of human morality, studying how we decide to cheat or to play by the rules, to lie or to tell the truth. And the results can be surprising, even disturbing. For instance, we have based our society on the assumption that deciding to lie or to tell the truth is within our conscious control. But Harvard’s Joshua Greene and Joseph Paxton say this assumption may be flawed and are probing whether honesty may instead be the result of controlling a desire to lie (a conscious process) or of not feeling the temptation to lie in the first place (an automatic process). “When we are honest, are we honest because we actively force ourselves to be? Or are we honest because it flows naturally?” Greene asks.

    Greene and Paxton have just published a study in the Proceedings of the National Academy of Sciences that attempts to get at the subconscious underpinnings of morality by recording subjects’ brain activity as they decide whether to lie. Inside the fMRI scanner, subjects were asked to predict the result of a coin toss and were allowed to keep their predictions to themselves until after the coin fell, giving them a chance to lie. As motivation, they were paid for correct predictions. For comparison, the researchers also ran trials in which subjects had to reveal their predictions before the toss. The scientists then identified the dishonest statistically: those who reported guessing the results of the coin toss correctly more often than chance would allow.
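    The statistical criterion above can be made concrete with a binomial test: if a subject is genuinely guessing fair coin tosses, correct predictions follow a Binomial(n, 0.5) distribution, so a reported accuracy far above 50% is improbable by luck alone. The sketch below is an illustration of that idea, not the paper’s actual analysis; the trial count and accuracy figures are hypothetical.

    ```python
    from math import comb

    def p_value_at_least(k: int, n: int, p: float = 0.5) -> float:
        """P(X >= k) for X ~ Binomial(n, p): the chance of reporting
        at least k correct predictions out of n by luck alone."""
        return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

    # Hypothetical subject: reports 14 correct out of 20 self-scored tosses.
    # A small p-value flags accuracy that chance alone is unlikely to produce.
    p = p_value_at_least(14, 20)
    print(f"P(>= 14/20 by chance) = {p:.4f}")
    ```

    A researcher would pick a significance threshold (say, 0.05) and label subjects whose reported accuracy falls below it as likely cheaters; the choice of threshold trades off falsely accusing lucky honest subjects against missing modest cheaters.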

    Greene and Paxton had hypothesized that if deciding to be honest is a conscious process—the result of resisting temptation—the areas of the brain associated with self-control and critical thinking would light up when subjects told the truth. If it is automatic, those areas would remain dark.

    What they found is that honesty is an automatic process—but only for some people. Comparing scans from tests with and without the opportunity to cheat, the scientists found that for honest subjects, deciding to be honest took no extra brain activity. But for others, the dishonest group, both deciding to lie and deciding to tell the truth required extra activity in the areas of the brain associated with critical thinking and self-control.

    Their finding that honesty is automatic for some people is part of a growing body of work showing that many, if not most, of our daily actions are not under our conscious control. According to John Bargh, a Yale social psychologist who studies automaticity, even our higher mental processes, ranging from persistence at an activity to social stereotyping to stopping to help a person in need, are performed unconsciously in response to environmental cues. And Jon Haidt of the University of Virginia has found through numerous studies that we make some moral judgments, like those involved in the trolley problem, based entirely on our emotions and are unable to explain logically why some things are right and others wrong.

    Greene and Paxton’s study suggests that honesty in particular is automatic only for some, which Bargh interprets to mean that some portion of the population might be naturally honest, while others struggle with telling the truth. “It could potentially be some of the most intriguing evidence for group selection,” Bargh speculates, adding that the results are reminiscent of the evolutionary idea that “cheaters” and “suckers” coexist in a specific ratio in the animal kingdom. The classic example is parasitic cuckoos and the hapless birds that raise the cuckoos’ young. Bargh wonders if the ratio of “cheaters” to “suckers” exists in our species as well. In the Halloween experiment, were there children who did not take extra candy even though they hadn’t revealed their names and addresses?

    Greene and Paxton specifically state in the paper that they are not drawing conclusions about how the “honest” and “dishonest” group behave beyond the confines of the experiment, so to answer some of these deeper questions they plan to submit subjects to a retest and assess the robustness of the labels. If these designations hold up to further testing—if people really are consistently honest or dishonest—the pair would then hope to identify what individual personality traits might predict each case. Then the experimenters will also try to track down what kind of situations—like being reprimanded or being alone in the room—bring out honesty and dishonesty. They hope to thus search out the roots of automatic morality.

    One surprising finding from this study reveals the complexity Greene and others face in trying to dissect moral behavior: The decision to lie for personal gain turns out to be a strikingly unemotional choice. Some moral dilemmas Greene studies, like the trolley problem, trigger emotional processing centers in our brains. In his coin toss experiment, there was no sign at all that emotions factored into a subject’s decision to lie or to tell the truth. “Moral judgment is not a single thing,” Greene concludes, suggesting that although we often lump them together under the heading of “morality,” deciding what’s right or wrong and deciding to tell the truth or to tell a lie may, in some situations, be entirely disconnected processes.

  2. #2

    Anonymity certainly decreases my tendency to control the impulse to be more honest than is socially or ethically appropriate, even about feelings I shouldn't have. My natural impulse is to be excessively open, and under normal conditions I hold back. Having recently been tested in such a scenario, I found that the temptation to tell everything I felt was, under supposedly anonymous conditions, enough to override my desire to avoid the potential negative consequences of my authentic expression on others. I can speak with certainty. I suspect that it isn't just lying that gets blocked in everyday behavior, but also socially destructive forms of honesty, such as telling someone the truth about a bad haircut, or that we find them frustrating to argue with. That might be why it is often easier to discuss our deepest secrets anonymously with strangers on an online forum than to do so in the real world where our choices have a more noticeable impact on those around us.

  3. #3

    can I please get the link Marino?

  4. #4

    Quote Originally Posted by slowriot View Post
    can I please get the link Marino?
    Truth or Lies SEEDMAGAZINE.COM

  5. #5

    Jon Haidt of the University of Virginia has found through numerous studies that we make some moral judgments, like those involved in the trolley problem, based entirely on our emotions and are unable to explain logically why some things are right and others wrong.
    This is confusing. OK, I mean, it's confusing me. In the context of MBTI, judgements are consciously made by one of two available functions: the objective 'T' function or the subjective 'F' function. In the MBTI system, great pains are taken to make explicit the fact that F judgement is NOT emotion. Now we have this research which says some decisions are made 'emotionally'. Are we to extend this to assume that decision making can be 'emotional', OR T (objective) based, OR F (subjective) based?

    I have enough problems as it is with the MBTI, which claims to be a system operating only in the domain of the *conscious* mind. Psychologists and the like have an awful lot to say about behaviour and motivations originating in or driven by the subconscious mind. We are told by many sources that it isn't possible to consciously manage subconscious urges/motivations, so, surely, MBTI type and its total reliance on conscious behaviour must sometimes be rendered totally irrelevant by what's going on in the subconscious. Sheesh.

    As for honesty ... it was dishonest of Schindler to lie to those Nazis and save all those Jewish people. Was it RIGHT though? I think there is a definition problem between "honest" as in factually correct, and "honest" as in 'true to a set of personal morals'.


© 2014 PersonalityCafe