political parties based on their followers' cultural values or moral systems. For now, though, let's survey the consequences that divisions like these have for how we understand science and facts.
In one of Kahan's studies, members of the different groups were asked to imagine that a close friend has come to them and said that he or she is trying to make up his or her mind about the risks on three contested issues: whether global warming is caused by human beings, whether nuclear waste can be safely stored deep underground, and whether letting people carry guns deters violent crime or worsens it. The experiment continued:
The friend tells you that he or she is planning to read a book about the issue but before taking the time to do so would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.
Then study subjects were shown alleged book excerpts by fake "experts" on these issues, along with phony pictures of the authors and fictitious resumes. All the authors were depicted as legitimate experts and members of the National Academy of Sciences. They differed in only one respect: their view of the risk in question.
The results were stark: When the fake scientist's position stated that global warming is real and caused by humans, only 23 percent of hierarchical-individualists agreed the person was a "trustworthy and knowledgeable expert." Yet 88 percent of egalitarian-communitarians accepted the same scientist's alleged expertise. (Similar divides, although not always as sharp, were observed on the other issues.)
In other words, people were rejecting the scientific source because its conclusion ran contrary to their deeply held views about the world. None of the groups were "anti-science" or "anti-expert," at least not in their own minds. It's just that science and expertise were whatever they wanted them to be: whatever made them feel that their convictions had been bolstered.
When they deny global warming, then, conservatives think the best minds are actually on their side. They think they're the champions of truth and reality, and they're deeply attached to this view. That is why head-on attempts to persuade them otherwise usually fail. Indeed, factual counterarguments sometimes even trigger what has been termed a backfire effect: Those with strongly held but clearly incorrect beliefs not only fail to change their minds, but hold their wrong views more tenaciously after being shown contradictory evidence or a refutation.
To show this, let's move from global warming to a question that, from the perspective of the political mind, is very similar: whether Saddam Hussein's Iraq possessed hidden weapons of mass destruction prior to the U.S. invasion in 2003. When political scientists Brendan Nyhan of Dartmouth and Jason Reifler of Georgia State showed subjects fake newspaper articles in which this incorrect claim was first suggested (in a real-life 2004 quotation from President Bush) and then refuted (with a discussion of the actual findings of the 2004 Duelfer report, which found no evidence of concerted nuclear, chemical, or biological weapons efforts in pre-invasion Iraq), they found that conservatives who read the refutation were more likely to believe the claim than before.
The same thing happened in another experiment, when conservatives were primed with a ridiculous (and also real) statement by Bush concerning his tax cuts: "the tax relief stimulated economic vitality and growth and it has helped increase revenues to the Treasury." The article then went on to inform study subjects that the tax cuts had not actually increased government revenue. Once again, following the factual correction, conservatives believed Bush's false claim more strongly.
Seeking to be evenhanded, the researchers then tested how liberals responded when shown, in a