Drop Your Intellectual Defenses

By Arnold Kling

When your views are challenged by a discordant observation or a person with a different opinion, should you treat this as an opportunity to reconsider or as a threat to fight off? Julia Galef argues for the former.

Galef favors what she terms the scout mindset, which means adjusting your outlook to take new information into account. She contrasts this with what she calls the soldier mindset, which means ignoring or dismissing new information in order to keep your current outlook intact.

According to Galef, the intellectual scout uses reasoning to try to map reality. The scout welcomes contrary information as helping to correct this map. The soldier uses reasoning to defend an existing map of reality. The soldier fights contrary information as if to stave off defeat.

Scout mindset has a number of advantages. One makes better predictions and decisions by seeking the truth. One is actually more persuasive to others, because people value honest assessment rather than overconfidence.

This raises the question of why the soldier mindset evolved in the first place. Galef lists several psychological factors that make it appealing.

First, challenges to our worldview make us uncomfortable. Dismissing such challenges relieves the discomfort, at least for a while.

We are inclined to tune our beliefs in order to protect our self-esteem. For example, if we have trouble learning a foreign language, it is easier to insist that knowing foreign languages is unimportant than to undertake the effort needed to attain that skill.

When we make a decision, considering alternatives may create anguish. Closing our minds to those alternatives may allow us to feel better about our choice, at least for a while.

Being firm in our beliefs can help us to get others to comply with our wishes. But note that this creates a risk: when others defer to our soldier mindset, they may do so reluctantly, without sharing our conviction. Galef cites a study in which

  • … law students who are randomly assigned to one side of a moot court case become confident, after reading the case materials, that their side is morally and legally in the right. But that confidence doesn’t help them persuade the judge… [they] are significantly less likely to win the case—perhaps because they fail to consider and prepare for rebuttals to their arguments. (p. 27)

Galef points out that one’s beliefs can serve as a sort of fashion statement.

  • Psychologists call it impression management, and evolutionary psychologists call it signaling: When considering a claim, we implicitly ask ourselves, “What kind of person would believe a claim like this, and is that how I want other people to see me?” (p. 23)

A related motive for holding some beliefs is to fit in better with one’s social group. This can be a particularly powerful motive when a group is strict about excommunicating heretics.


One of Galef’s interesting themes is that we self-deceive about our mindset. The more that we are convinced of our own objectivity, the less likely that we are operating in scout mindset. In fact, one key to remaining in scout mindset is the willingness and ability to recognize one’s own inclination to fall back on soldier mindset. As she puts it,

  • But the biggest sign of scout mindset may be this: Can you point to occasions in which you were in soldier mindset? (p. 57)

In particular, having high intelligence and a good education is no assurance that one has scout mindset. On the contrary, it makes one better able to operate using soldier mindset and to hang on to incorrect views.

Galef believes that one acquires scout mindset by cultivating certain habits. These include making a point of telling other people when they have helped you to change your mind, genuinely welcoming feedback, and subjecting your own beliefs to rigorous examination.

Galef advocates using thought experiments as a way of escaping from soldier mindset. For example, in deciding whether to continue or quit a project, she proposes an “outsider test,” in which you imagine what another person would do if they were suddenly dropped into your situation. This thought experiment could relieve you of the baggage of your previous actions that got you into the predicament. The outsider test may also make it easier to avoid throwing good money after bad or wasting time continuing to pursue a graduate degree that no longer seems as worthwhile as when you started.

Another interesting thought experiment is to ask whether your opinion would change if an influential person were to change their mind. For example, suppose that during a meeting the boss advocates a particular project. Before you decide whether or not you agree, imagine what your thinking would be if the boss were to oppose the project.

In acquainting yourself with diverse viewpoints, it pays to choose carefully who represents them. If you pay attention only to the worst people on the other side, this will serve to close your mind rather than open it.

  • To give yourself the best chance of learning from disagreement, you should be listening to people who make it easier to be open to their arguments, not harder. People you like or respect, even if you don’t agree with them. People with whom you have some common ground—intellectual premises, or a core value that you share—even though you disagree with them on other issues. People whom you consider reasonable, who acknowledge nuance and areas of uncertainty, and who argue in good faith. (p. 171)

Galef suggests that one good habit is to cultivate friends who model the scout mindset.

  • One of the biggest things you can do to change your thinking is to change the people you surround yourself with. We humans are social creatures, and our identities are shaped by our social circles, almost without our noticing. (p. 219)

For more on these topics, see the EconTalk episode Julia Galef on the Scout Mindset and “Tribal Psychology and Political Behavior,” by Arnold Kling, Library of Economics and Liberty, August 6, 2018. See also the EconTalk episode L.A. Paul on Vampires, Life Choices, and Transformation.

Raising the level of analysis from the individual to the group suggests to me another reason that the soldier mindset survives. At an individual level, self-skepticism may be a useful characteristic. But at a group level, rewarding loyalty and stifling dissent may have survival value, at least up to a point. A society where “anything goes” could lose out to a society that demands sacrifice and a strong community-oriented ethic from its members.

From an individual perspective, treating challenges to one’s beliefs as an opportunity rather than a threat might be a good strategy. But from a group perspective, it may pay off to be less truth-seeking and more conformity-demanding. I suspect that this tension between what is best for the individual and what promotes group survival may be at the heart of why the soldier mindset is difficult to leave behind.
