Have you ever met someone who changed their mind when you presented facts contrary to their beliefs? I never have. Worse, people tend to strengthen their beliefs and defend them all the more fiercely when faced with damning evidence against them. The explanation lies in the fact that our worldview feels threatened by factual evidence that contradicts it.
Creationists, for example, challenge evidence for evolution such as fossils and genetics because they worry that secular forces are encroaching on religious faith. Anti-vaccination activists are suspicious of pharmaceutical companies and believe that money corrupts medicine. This leads them to assert, for example, a cause-and-effect relationship between vaccines and autism, despite the inconvenient truth that the only study claiming such a link has been retracted and its lead author accused of fraud.
9/11 conspiracy theorists focus on minute details, such as the melting point of the steel in the World Trade Center towers that led to their collapse, because they believe the US government is lying and conducting “false flag” operations to create a new world order.
Climate deniers study tree growth rings, ice cores, and greenhouse gas concentrations because they are passionate about freedom, especially the freedom of industries to conduct their business without being constrained by restrictive government regulations. Those obsessed with Barack Obama desperately dissected his birth certificate in search of fraud because they believed that the first African-American president of the United States was a socialist bent on destroying the country.
In all these examples, the followers’ deep-seated worldviews are perceived as being threatened by rationalists, who become “the enemy to be defeated.” This grip of belief over evidence can be explained by two factors: cognitive dissonance and the backfire effect. In a classic 1956 book, When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a UFO-worshiping cult after the expected extraterrestrial mothership failed to arrive at the announced time.
Instead of admitting their mistake, “members of the group sought frantically to convince the world of their beliefs,” making “a series of increasingly desperate attempts to erase the dissonance between their belief and reality by issuing new predictions after the initial prophecy failed, hoping that one would eventually prove right.” Festinger called this state cognitive dissonance: the uncomfortable tension that arises when two conflicting ideas are held simultaneously.
In their 2007 book Mistakes Were Made (But Not by Me), social psychologists Carol Tavris and Elliot Aronson (a former student of Festinger) document thousands of experiments demonstrating how people distort and select facts to fit their preexisting beliefs and reduce their cognitive dissonance. Their metaphor of the “pyramid of choice” illustrates how two individuals with nearly identical positions, standing side by side at the top of the pyramid, can quickly diverge and end up at the base on opposite sides, holding opposite opinions, once each has had to fight to defend a position.
In a series of experiments, Brendan Nyhan of Dartmouth College and Jason Reifler of the University of Exeter identified a second, related factor, which they called the “backfire effect”: correcting factual errors tied to a person’s beliefs is not only ineffective but actually reinforces the mistaken beliefs, because the correction “threatens their worldview or self-concept.”
In one experiment, for example, subjects received fictitious press articles that confirmed widespread misconceptions, such as the presence of weapons of mass destruction in Iraq. The participants were then given an article showing that no weapons of mass destruction had ever been found. Liberal subjects who had opposed the war accepted the new article and rejected the earlier ones, while conservatives who had supported the war did the opposite.
Worse, the conservatives said they were even more convinced of the existence of weapons of mass destruction after reading the article showing there were none, on the grounds that it only proved Saddam Hussein had hidden or destroyed them. In fact, Nyhan and Reifler noted that among many conservatives, “the belief that Iraq possessed weapons of mass destruction immediately before the US invasion persisted long after the Bush administration itself had admitted that this was not the case.”
If factual corrections only make matters worse, what can we do to convince people that their beliefs are wrong? In my experience, the following approach helps:
- Put your emotions aside.
- Discuss, do not attack (no ad hominem arguments, no reductio ad Hitlerum).
- Listen carefully and try to restate your interlocutor’s position accurately.
- Show respect.
- Acknowledge that you understand why someone might hold that opinion.
- Try to show how changing one’s view of the facts does not necessarily mean changing one’s worldview.
These strategies do not always succeed in changing people’s minds, but at a time when untruth has become so common in public debate, they may at least help reduce unnecessary division.