By Steven G. Mehta

Recently, I was at a soccer tournament and a goalkeeper saved a goal.  At the same time, one of the parents jokingly stated that the goalkeeper had saved the world.  How so, I asked?  The parent then reminded me that a religious organization had recently professed that the world was going to end on that exact date and time.  At that same moment, the goalkeeper had saved the goal, and as a result saved the soccer team from total destruction.  The next day, the religious organization claimed that it had miscalculated and that the new date was really the end of the world.  That world-saving goalkeeper got me thinking about why people are so convinced of their positions and why they don’t change their minds even in the face of overwhelming evidence.  That process brought me to the concept of the Backfire Effect.

The Backfire Effect is as follows:  when your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.

This happens all the time in mediation.  Many times the parties are so unwilling to consider the other side’s viewpoint that no amount of evidence you present will convince them to change their opinion.  How can this be?  David McRaney discusses this principle in his new book.  McRaney explains it as follows:

In 2006, Brendan Nyhan and Jason Reifler at The University of Michigan and Georgia State University created fake newspaper articles about polarizing political issues. The articles were written in a way which would confirm a widespread misconception about certain ideas in American politics. As soon as a person read a fake article, researchers then handed over a true article which corrected the first. For instance, one article suggested the United States found weapons of mass destruction in Iraq. The next said the U.S. never found them, which was the truth. Those opposed to the war or who had strong liberal leanings tended to disagree with the original article and accept the second. Those who supported the war and leaned more toward the conservative camp tended to agree with the first article and strongly disagree with the second. These reactions shouldn’t surprise you. What should give you pause though is how conservatives felt about the correction. After reading that there were no WMDs, they reported being even more certain than before there actually were WMDs and their original beliefs were correct.

They repeated the experiment with other wedge issues like stem cell research and tax reform, and once again, they found corrections tended to increase the strength of the participants’ misconceptions if those corrections contradicted their ideologies. People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired.

According to McRaney, once you have placed a topic in your belief system, your brain then tries to defend you from altering those beliefs.  This allows you to stick to your beliefs.  In other words, a belief is hard to develop, but once it is in your system, it is hard to get it out of your system.

In mediation, the same principle holds true.  Many times a person is so invested in his or her opinion about the merits of the case that the person will never change his or her mind based on the evidence.  The more evidence you present to substantiate the facts, the more convinced the person becomes of his or her position.

So just to keep track of the title, we have connected the End of the Worlders and mediation.  How in the world will Austin Powers, the International Man of Mystery, connect to this crazy backfire concept?  Have a look at the following clip:

Here, the evidence continues to mount that Austin owned a certain item.  As more evidence is presented, he becomes even more vehement in his opinion that the item in question is not his.  At the end, he never agrees that the item was his.  His belief system (of being the debonair secret agent) prevented him from conceding that he owned an item that was contrary to that image.

So now that we have identified this problem, how do we deal with it?

First, you cannot expect to convince this person on a rational level about this topic.  You have to try other ways of persuading that are consistent with his or her beliefs.   Many times, I have told parties that I understand their position.  I also get them to recognize that there are other people in the world who don’t agree with their position.  I also get them to understand that those people could be on the jury, and that no matter how strong the party’s belief is, he or she will never be on the jury that decides this case.

Second, many people have the misconception that people are rational decision makers.  The reality is that all of us are irrational all the time.  We make impulse purchases.  We make emotional decisions.   The decision to support a team, the decision to marry our mate, the decision to buy one brand over another: all of these decisions are emotional.  When faced with the backfire effect, you can try to persuade the person on an emotional level rather than an intellectual level.  Appeal to other interests such as prestige, fear, failure, success, perception, bias, and ego.  Using those interests rather than strictly logical arguments will likely yield better results in persuading those caught in the backfire effect.

So for the End of the Worlders, perhaps you might use their faith as the tool to assist in persuading; for Austin Powers, perhaps his ego; and for the mediating parties, their fear of the unknown.