
    It's My Party And I'll Believe What I Want To


    We have all heard and asked this question a million times or more: "How can these people still believe this stuff when all the facts prove otherwise?" The left asks this of the right, the religious of the atheists, and so on and so forth. Well, there just might be a perfectly logical reason for this: what David McRaney calls the Backfire Effect.

    The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.  
    The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.  
    Wired, The New York Times, Backyard Poultry Magazine – they all do it. Sometimes, they screw up and get the facts wrong. In ink or in electrons, a reputable news source takes the time to say “my bad.” If you are in the news business and want to maintain your reputation for accuracy, you publish corrections. For most topics this works just fine, but what most news organizations don’t realize is a correction can further push readers away from the facts if the issue at hand is close to the heart. In fact, those pithy blurbs hidden on a deep page in every newspaper point to one of the most powerful forces shaping the way you think, feel and decide – a behavior keeping you from accepting the truth.

    In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way which would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, researchers then handed over a true article which corrected the first. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause though is how conservatives felt about the correction.

    After reading that there were no WMDs, they reported being even more certain than before there actually were WMDs and their original beliefs were correct.

    Because of this, engaging in online battles with people in an attempt to prove that your particular point of view is the correct one may in fact be a profound waste of time.

    The last time you got into, or sat on the sidelines of, an argument online with someone who thought they knew all there was to know about health care reform, gun control, gay marriage, climate change, sex education, the drug war, Joss Whedon or whether or not 0.9999 repeated to infinity was equal to one – how did it go? Did you teach the other party a valuable lesson? Did they thank you for edifying them on the intricacies of the issue after cursing their heretofore ignorance, doffing their virtual hat as they parted from the keyboard a better person? No, probably not. Most online battles follow a similar pattern, each side launching attacks and pulling evidence from deep inside the web to back up their positions until, out of frustration, one party resorts to an all-out ad hominem nuclear strike. If you are lucky, the comment thread will get derailed in time for you to keep your dignity, or a neighboring commenter will help initiate a text-based dogpile on your opponent.

    There may actually be a very good reason for this: our tendency to hold onto a belief or piece of information when it is challenged could be a self-preservation technique.

    Have you ever noticed the peculiar tendency you have to let praise pass through you, but feel crushed by criticism? A thousand positive remarks can slip by unnoticed, but one “you suck” can linger in your head for days. One hypothesis as to why this and the backfire effect happens is that you spend much more time considering information you disagree with than you do information you accept. Information which lines up with what you already believe passes through the mind like a vapor, but when you come across something which threatens your beliefs, something which conflicts with your preconceived notions of how the world works, you seize up and take notice. Some psychologists speculate there is an evolutionary explanation. Your ancestors paid more attention and spent more time thinking about negative stimuli than positive because bad things required a response. Those who failed to address negative stimuli failed to keep breathing.

    In 1992, Peter Ditto and David Lopez conducted a study in which subjects dipped little strips of paper into cups filled with saliva. The paper wasn’t special, but the psychologists told half the subjects the strips would turn green if he or she had a terrible pancreatic disorder and told the other half it would turn green if they were free and clear. For both groups, they said the reaction would take about 20 seconds. The people who were told the strip would turn green if they were safe tended to wait much longer to see the results, far past the time they were told it would take. When it didn’t change colors, 52 percent retested themselves. The other group, the ones for whom a green strip would be very bad news, tended to wait the 20 seconds and move on. Only 18 percent retested.

    When you read a negative comment, when someone shits on what you love, when your beliefs are challenged, you pore over the data, picking it apart, searching for weakness. The cognitive dissonance locks up the gears of your mind until you deal with it. In the process you form more neural connections, build new memories and put out effort – once you finally move on, your original convictions are stronger than ever.

    This may also explain the phenomenon known as true-believer syndrome, in which people hang onto their beliefs even in the presence of conflicting data.

    True-believer syndrome is an expression coined by M. Lamar Keene to describe an apparent cognitive disorder characterized by believing in the reality of paranormal or supernatural events after one has been presented overwhelming evidence that the event was fraudulently staged. Keene is a reformed phony psychic who exposed religious racketeering – to little effect, apparently. Phony faith healers, psychics, channelers, televangelist miracle workers, etc., are as abundant as ever. Keene believes that "the true-believer syndrome is the greatest thing phony mediums have going for them" because "no amount of logic can shatter a faith consciously based on a lie." That those suffering from true-believer syndrome are consciously lying to themselves hardly seems likely, however.

    It's like being in a very scary situation where you are about to jump right out of your skin. If someone says "You look marvelous," you ignore it or brush it off. But if someone says "What's that?!" you want to jump right up and grab the ceiling. Ever since the dot-com bubble burst, it has been just one scary thing right after another. And it's not just the right, either. The left has its own version, with the extreme environmentalists and health freaks all convinced that they will perish as well. All this lack of security has us on edge, looking for the next bad thing and something to blame it on. Me as well. I know when I am very anxious, I will pay more attention to the negatives than the positives in my life.

    So maybe when the right keeps yelling "We want our country back," what they are really saying is "We want our security back." That Peaceful Easy Feeling they had before reality intruded.

    Comments

    Yes, this is exactly how it goes:

    Most online battles follow a similar pattern, each side launching attacks and pulling evidence from deep inside the web to back up their positions until, out of frustration, one party resorts to an all-out ad hominem nuclear strike. If you are lucky, the comment thread will get derailed in time for you to keep your dignity, or a neighboring commenter will help initiate a text-based dogpile on your opponent.

    and there goes the debate and in comes the argument.  lol.  McRaney is right that we almost never change anyone's mind.  Nor can they change ours.  It is, indeed, a waste of time.  But I think a part of it is that we don't know what else to do.  We're powerless to do anything about so much of what's happening to us and to our country. All we can think to do sometimes is howl.

    I read your piece thinking back on how many times I've been accused of being a Democrat just because I've always been one.  That may be.  My primary argument is always that I think of the Democrats as my family and no matter how many jerks there are in my family, I don't give up on them. 

    It may be that I'm a Democrat because I can't see myself as anything else.  No imagination, I know.  But it may be that I'm a Democrat because I'm proud of who we've been as a party and I have faith that we'll climb back up there again.  Whatever it is, I'm more afraid of the true-believers than I am of my party.  That's something, I guess.

     


    "They maybe sons-a-bitches. But their ar sun-s-a-bitches." - FDR


    But the point is that our emotions – our lizard brain – control our thinking a lot more than any of us care to admit.


    Yeah, that makes sense, but now my thought is, how do we employ this knowledge to get the Right to feel secure?  Or are you saying that because of this little quirk of human nature there is nothing we can do, we're all operating on blind instinct and we might as well throw up our hands and stop trying to convince the other side of the correctness of our positions, because they just can't ever grok the truth...  Where does knowing this bring us?  To a new determination or utter defeatism?


    That's the thing. But knowing the problem is necessary to finding the solution. Now we need a solution. There may be no solution. Then again, there may be. It will take some thinking to be sure.