Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

The New Yorker / February 27, 2017 / Elizabeth Kolbert

(Edited for length)

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs…”

Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research – or even occasionally picked up a copy of Psychology Today – knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?

In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at this question.

Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.

Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.

If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias… The fact that both we and it [confirmation bias] survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.

Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. [Their] new book [is] “The Knowledge Illusion: Why We Never Think Alone”…

Sloman and Fernbach see [the] effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do… So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.

“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.

“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem… “This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe…

If we – or our friends or the pundits on CNN – spent less time pontificating and more time trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”

In “Denying the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves. Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly…

The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component… “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them… But here they encounter the very problems they have enumerated. Providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. “The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”

The original article can be found in its entirety here: http://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds?mbid=social_facebook