Logic-Tight Compartments: How our modular brains lead us to deny and distort evidence

Michael Shermer
michaelshermer.com
January 1, 2013
IF YOU HAVE PONDERED how intelligent and educated people can, in the face of overwhelming contradictory evidence, believe that evolution is a myth, that global warming is a hoax, that vaccines cause autism and asthma, that 9/11 was orchestrated by the Bush administration, conjecture no more. The explanation is in what I call logic-tight compartments: modules in the brain analogous to watertight compartments in a ship.
The concept of compartmentalized brain functions acting either in concert or in conflict has been a core idea of evolutionary psychology since the early 1990s. According to University of Pennsylvania evolutionary psychologist Robert Kurzban in Why Everyone (Else) Is a Hypocrite (Princeton University Press, 2010), the brain evolved as a modular, multitasking problem-solving organ—a Swiss Army knife of practical tools in the old metaphor or an app-loaded iPhone in Kurzban’s upgrade. There is no unified “self” that generates internally consistent and seamlessly coherent beliefs devoid of conflict. Instead we are a collection of distinct but interacting modules often at odds with one another. The module that leads us to crave sweet and fatty foods in the short term is in conflict with the module that monitors our body image and health in the long term. The module for cooperation is in conflict with the one for competition, as are the modules for altruism and avarice or the modules for truth telling and lying.
Compartmentalization is also at work when new scientific theories conflict with older and more naive beliefs. In the 2012 paper “Scientific Knowledge Suppresses but Does Not Supplant Earlier Intuitions” in the journal Cognition, Occidental College psychologists Andrew Shtulman and Joshua Valcarcel found that subjects more quickly verified the validity of scientific statements when those statements agreed with their prior naive beliefs. Contradictory scientific statements were processed more slowly and less accurately, suggesting that “naive theories survive the acquisition of a mutually incompatible scientific theory, coexisting with that theory for many years to follow.”
Cognitive dissonance may also be at work in the compartmentalization of beliefs. In the 2010 article “When in Doubt, Shout!” in Psychological Science, Northwestern University researchers David Gal and Derek Rucker found that when subjects’ closely held beliefs were shaken, they “engaged in more advocacy of their beliefs … than did people whose confidence was not undermined.” Further, they concluded that enthusiastic evangelists of a belief may in fact be “boiling over with doubt,” and thus their persistent proselytizing may be a signal that the belief warrants skepticism.
In addition, our logic-tight compartments are influenced by our moral emotions, which lead us to bend and distort data and evidence through a process called motivated reasoning. The module housing our religious preferences, for example, motivates believers to seek and find facts that support, say, a biblical model of a young earth, in which case the overwhelming evidence of an old earth must be denied. The module containing our political predilections, if they are, say, of a conservative bent, may motivate pro-capitalists to believe that any attempt to curtail industrial pollution by way of the threat of global warming must be a liberal hoax.
What can be done to break down the walls separating our logic-tight compartments? In the 2012 paper “Misinformation and Its Correction: Continued Influence and Successful Debiasing” in Psychological Science in the Public Interest, University of Western Australia psychologist Stephan Lewandowsky and his colleagues suggest these strategies: “Consider what gaps in people’s mental event models are created by debunking and fill them using an alternative explanation…. To avoid making people more familiar with misinformation…, emphasize the facts you wish to communicate rather than the myth. Provide an explicit warning before mentioning a myth, to ensure that people are cognitively on guard and less likely to be influenced by the misinformation…. Consider whether your content may be threatening to the worldview and values of your audience. If so, you risk a worldview backfire effect.”
Debunking by itself is not enough. We must replace bad bunk with sound science.