FURTHER to my recent post on why people don’t accept evidence, it turns out that an editorial [1] and an opinion piece [2] in this week’s Nature, the latter unfortunately behind a pay-wall, actually focus on just this issue. The editorial states:
“Empirical evidence shows that people tend to react to reports on issues such as climate change according to their personal values (see page 296). Those who favour individualism over egalitarianism are more likely to reject evidence of climate change and calls to restrict emissions. And the messenger matters perhaps just as much as the message. People have more trust in experts — and scientists — when they sense that the speaker shares their values.”
So people tend to accept the evidence that supports their personal proclivities, and indeed to interpret evidence in a manner that does so; thus they persist in cherished beliefs and views even when confronted with contradictory evidence. This is something most of us probably appreciate already. Dan Kahan, in his opinion piece, points out:
“People endorse whichever position reinforces their connection to others with whom they share important commitments. As a result, public debate about science is strikingly polarized. The same groups who disagree on ‘cultural issues’ — abortion, same-sex marriage and school prayer — also disagree on whether climate change is real and on whether underground disposal of nuclear waste is safe.”
Another factor that weighs heavily in the public perception, and acceptance, of facts is the messenger. Because most people are ill-equipped to evaluate the raw data from scientific studies, they rely on the position of credible experts; and it seems that the experts laypersons see as credible are those perceived to share their values.
Research into the mental processes involved in such public perception is, Dan tells us, being conducted by Donald Braman at George Washington University Law School in Washington DC, Geoffrey Cohen at Stanford University in Palo Alto, California, John Gastil at the University of Washington in Seattle, Paul Slovic at the University of Oregon in Eugene and Dan Kahan, the Elizabeth K. Dollard professor of law at Yale Law School. These processes are collectively referred to as ‘cultural cognition’.
So what is cultural cognition? Kahan describes it as, ‘the influence of group values (ones relating to equality and authority, individualism and community) on risk perceptions and related beliefs.’ I would imagine that peer-pressure represents one example within a spectrum of influences in cultural cognition.
Two techniques are put forward as means of mitigating public polarization over scientific evidence:
1. Ensure that facts are presented in a manner that ‘affirms, rather than threatens people’s values.’ This in essence means that science communication should not just be targeted according to the level of scientific literacy of the target audience, but that it also be targeted in a manner that presents the information in the most favourable way to respective ‘groups’, whether they be individualistic or egalitarian. Dan Kahan:
“…people with individualistic values resist scientific evidence that climate change is a serious threat because they have come to assume that industry-constraining carbon-emission limits are the main solution. They would probably look at the evidence more favourably, however, if made aware that the possible responses to climate change include nuclear power and geoengineering, enterprises that to them symbolize human resourcefulness. Similarly, people with an egalitarian outlook are less likely to reflexively dismiss evidence of the safety of nanotechnology if they are made aware of the part that nanotechnology might play in environmental protection, and not just its usefulness in the manufacture of consumer goods.”
It is hard to envisage how such targeted differentiation in communicating scientific information would work in practice; however, one would imagine that the medium in which science is communicated, whether a particular newspaper, magazine, society, TV channel, or a political party’s policy research documents, will tend to have a majority demographic reflecting a particular group to whom the information can be tailored.
This sounds a little contrived, but Dan Kahan says:
‘Unlike commercial advertising, however, the goal of these techniques is not to induce public acceptance of any particular conclusion, but rather to create an environment for the public’s open-minded, unbiased consideration of the best available scientific information.’
2. A second technique is to ensure the message is conveyed, and vouched for, by a diverse set of experts with different backgrounds and different values. The editorial points out that:
‘…scientists should be careful not to disparage those on the other side of a debate: a respectful tone makes it easier for people to change their minds if they share something in common with that other side.’
Kahan’s opinion is that, as straightforward as these recommendations might seem, science communicators routinely flout them. There is a tendency to think that putting good, sound scientific information into the public space is sufficient to win over opinion and crowd out competing views. However well intentioned, this is at best a naive approach; people make decisions using many more factors than facts alone. As Kahan puts it, ‘[when the] truth carries implications that threaten people’s cultural values, then holding their heads underwater is likely to harden their resistance and increase their willingness to support alternative arguments, no matter how lacking in evidence.’
1. Editorial (2010) Climate of suspicion. Nature 463: 269. http://dx.doi.org/10.1038/463269a.
2. Kahan, D. (2010) Fixing the communications failure. Nature 463: 296-297. http://dx.doi.org/10.1038/463296a.