
Fixing the Communications Failure (Cultural Cognition)


Dan Clore

Mar 9, 2010, 7:08:12 PM
News & Views for Anarchists & Activists:
http://groups.yahoo.com/group/smygo

http://www.nature.com/nature/journal/v463/n7279/full/463296a.html
Opinion
Nature 463, 296-297 (21 January 2010)
Fixing the communications failure
by Dan Kahan

Abstract

People's grasp of scientific debates can improve if communicators build
on the fact that cultural values influence what and whom we believe,
says Dan Kahan.

In a famous 1950s psychology experiment, researchers showed students
from two Ivy League colleges a film of an American football game between
their schools in which officials made a series of controversial
decisions against one side. Asked to make their own assessments,
students who attended the offending team's college reported seeing half
as many illegal plays as did students from the opposing institution.
Group ties, the researchers concluded, had unconsciously motivated
students from both colleges to view the tape in a manner that favoured
their own school [1].

Since then, a growing body of work has suggested that ordinary citizens
react to scientific evidence on societal risks in much the same way.
People endorse whichever position reinforces their connection to others
with whom they share important commitments. As a result, public debate
about science is strikingly polarized. The same groups who disagree on
'cultural issues' – abortion, same-sex marriage and school prayer – also
disagree on whether climate change is real and on whether underground
disposal of nuclear waste is safe.

The ability of democratic societies to protect the welfare of their
citizens depends on finding a way to counteract this culture war over
empirical data. Unfortunately, prevailing theories of science
communication do not help much. Many experts attribute political
controversy over risk issues to the complexity of the underlying
science, or the imperfect dissemination of information. If that were the
problem, we would expect beliefs about issues such as environmental
risk, public health and crime control to be distributed randomly or
according to levels of education, not by moral outlook. Various
cognitive biases – excessive attention to vivid dangers, for example, or
self-reinforcing patterns of social interaction – distort people's
perception of risk, but they, too, do not explain why people who
subscribe to competing moral outlooks react differently to scientific data.

A process that does account for this distinctive form of polarization is
'cultural cognition'. Cultural cognition refers to the influence of
group values – ones relating to equality and authority, individualism
and community – on risk perceptions and related beliefs [2,3]. In ongoing
research, Donald Braman at George Washington University Law School in
Washington DC, Geoffrey Cohen at Stanford University in Palo Alto,
California, John Gastil at the University of Washington in Seattle, Paul
Slovic at the University of Oregon in Eugene and I study the mental
processes behind cultural cognition.

For example, people find it disconcerting to believe that behaviour that
they find noble is nevertheless detrimental to society, and behaviour
that they find base is beneficial to it. Because accepting such a claim
could drive a wedge between them and their peers, they have a strong
emotional predisposition to reject it.

Picking sides

Our research suggests that this form of 'protective cognition' is a
major cause of political conflict over the credibility of scientific
data on climate change and other environmental risks. People with
individualistic values, who prize personal initiative, and those with
hierarchical values, who respect authority, tend to dismiss evidence of
environmental risks, because the widespread acceptance of such evidence
would lead to restrictions on commerce and industry, activities they
admire. By contrast, people who subscribe to more egalitarian and
communitarian values are suspicious of commerce and industry, which they
see as sources of unjust disparity. They are thus more inclined to
believe that such activities pose unacceptable risks and should be
restricted. Such differences, we have found, explain disagreements in
environmental-risk perceptions more completely than differences in
gender, race, income, education level, political ideology, personality
type or any other individual characteristic [4].

Cultural cognition also causes people to interpret new evidence in a
biased way that reinforces their predispositions. As a result, groups
with opposing values often become more polarized, not less, when exposed
to scientifically sound information.

In one study, we examined how this process can influence people's
perceptions of the risks of nanotechnology. We found that relative to
counterparts in a control group, people who were supplied with neutral,
balanced information immediately splintered into highly polarized
factions consistent with their cultural predispositions towards more
familiar environmental risks, such as nuclear power and genetically
modified foods [5].

Of course, because most people aren't in a position to evaluate
technical data for themselves, they tend to follow the lead of credible
experts. But cultural cognition operates here too: the experts whom
laypersons see as credible, we have found, are ones whom they perceive
to share their values. This was the conclusion of a study we carried out
of Americans' attitudes towards human-papillomavirus (HPV) vaccination
for schoolgirls. This common sexually transmitted virus is the leading
cause of cervical cancer. The US government's Centers for Disease
Control and Prevention (CDC) recommended in 2006 that the vaccine be
routinely administered to girls aged 11 or 12 – before they are likely
to become exposed to the virus. That proposal has languished amid
intense political controversy, with critics claiming that the vaccine
causes harmful side effects and will increase unsafe sex among teens.

To test how expert opinion affects this debate, we constructed arguments
for and against mandatory vaccination and matched them with fictional
male experts, whose appearance (besuited and grey-haired, for example,
or denim-shirted and bearded) and publication titles were designed to
make them look as if they had distinct cultural perspectives. When the
expert who was perceived as hierarchical and individualistic criticized
the CDC recommendation, people who shared those values and who were
already predisposed to see the vaccine as risky became even more
intensely opposed to it. Likewise, when the expert perceived as
egalitarian and communitarian defended the vaccine as safe, people with
egalitarian values became even more supportive of it. Yet when we
inverted the expert-argument pairings, attributing support for mandatory
vaccination to the hierarchical expert and opposition to the egalitarian
one, people shifted their positions and polarization disappeared [6].
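
To make the logic of that crossed design concrete, here is a minimal toy
model in Python (an illustrative sketch of this editor's, not the study's
actual analysis; the group labels, priors and unit-sized shifts are all
invented): each group starts from its cultural predisposition and moves
toward whatever position is argued by the expert it perceives as sharing
its values.

    # Toy model of the crossed expert-argument design described above.
    # NOT the study's model or data: priors and shift sizes are invented.

    PRIOR = {
        "hierarchical-individualist": -1.0,  # predisposed to see the vaccine as risky
        "egalitarian-communitarian": +1.0,   # predisposed to see it as safe
    }

    def posterior(subject, pairing):
        """Attitude after exposure: the subject shifts one unit toward the
        position argued by the expert perceived as sharing their values."""
        position = dict(pairing)[subject]  # the culturally aligned expert's stance
        return PRIOR[subject] + (1.0 if position == "pro" else -1.0)

    for label, pairing in [
        ("Expected pairing", [("hierarchical-individualist", "anti"),
                              ("egalitarian-communitarian", "pro")]),
        ("Inverted pairing", [("hierarchical-individualist", "pro"),
                              ("egalitarian-communitarian", "anti")]),
    ]:
        views = {s: posterior(s, pairing) for s in PRIOR}
        gap = abs(views["hierarchical-individualist"]
                  - views["egalitarian-communitarian"])
        print(f"{label}: {views} gap = {gap}")

Under these made-up numbers the expected pairing widens the gap between
the groups from 2 to 4 units, while the inverted pairing closes it to
zero – the same qualitative pattern the experiment found.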

Rooting for the same team

Taken together, these dynamics help to explain the peculiar cultural
polarization on scientific issues in the United States and beyond. Like
fans at a sporting contest, people deal with evidence selectively to
promote their emotional interest in their group. On issues ranging from
climate change to gun control, from synthetic biology to
counter-terrorism, they take their cue about what they should feel, and
hence believe, from the cheers and boos of the home crowd.

But unlike sports fans watching a game, citizens who hold opposing
cultural outlooks are in fact rooting for the same outcome: the health,
safety and economic well-being of their society. Are there remedies for
the tendency of cultural cognition to interfere with their ability to
reach agreement on what science tells them about how to attain that goal?

Research on how to control cultural cognition is less advanced than
research on the mechanisms behind it. Nevertheless, two techniques of
science communication may help.

One method, examined in depth by Geoffrey Cohen, is to present
information in a manner that affirms rather than threatens people's
values [7]. As my colleagues and I have shown, people tend to resist
scientific evidence that could lead to restrictions on activities valued
by their group. If, on the other hand, they are presented with
information in a way that upholds their commitments, they react more
open-mindedly [8].

For instance, people with individualistic values resist scientific
evidence that climate change is a serious threat because they have come
to assume that industry-constraining carbon-emission limits are the main
solution. They would probably look at the evidence more favourably,
however, if made aware that the possible responses to climate change
include nuclear power and geoengineering, enterprises that to them
symbolize human resourcefulness. Similarly, people with egalitarian
outlooks are less likely to reflexively dismiss evidence of the safety
of nanotechnology if they are made aware of the part that nanotechnology
might play in environmental protection, and not just its usefulness in
the manufacture of consumer goods.

The second technique for mitigating public conflict over scientific
evidence is to make sure that sound information is vouched for by a
diverse set of experts. In our HPV-vaccine experiment, polarization was
also substantially reduced when people encountered advocates with
diverse values on both sides of the issue. People feel that it is safe
to consider evidence with an open mind when they know that a
knowledgeable member of their cultural community accepts it. Thus,
giving a platform to a spokesperson likely to be recognized as a typical
traditional parent with a hierarchical world view might help to dispel
any association between mandatory HPV vaccination and the condoning of
permissive sexual practices.

It would not be a gross simplification to say that science needs better
marketing. Unlike commercial advertising, however, the goal of these
techniques is not to induce public acceptance of any particular
conclusion, but rather to create an environment for the public's
open-minded, unbiased consideration of the best available scientific
information.

As straightforward as these recommendations might seem, however, science
communicators routinely flout them. The prevailing approach is still
simply to flood the public with as much sound data as possible on the
assumption that the truth is bound, eventually, to drown out its
competitors. If, however, the truth carries implications that threaten
people's cultural values, then holding their heads underwater is likely
to harden their resistance and increase their willingness to support
alternative arguments, no matter how lacking in evidence. This reaction
is substantially reinforced when, as often happens, the message is put
across by public communicators who are unmistakably associated with
particular cultural outlooks or styles – the more so if such advocates
indulge in partisan rhetoric, ridiculing opponents as corrupt or devoid
of reason. This approach encourages citizens to experience scientific
debates as contests between warring cultural factions – and to pick
sides accordingly.

We need to learn more about how to present information in forms that are
agreeable to culturally diverse groups, and how to structure debate so
that it avoids cultural polarization. If we want democratic
policy-making to be backed by the best available science, we need a
theory of risk communication that takes full account of the effects of
culture on our decision-making.

References

1. Hastorf, A. H. & Cantril, H. J. Abnorm. Soc. Psychol. 49, 129–134
(1954).
2. Douglas, M. & Wildavsky, A. B. Risk and Culture: An Essay on the
Selection of Technical and Environmental Dangers (Univ. California
Press, 1982).
3. DiMaggio, P. Annu. Rev. Sociol. 23, 263–287 (1997).
4. Kahan, D. M., Braman, D., Gastil, J., Slovic, P. & Mertz, C. K. J.
Empir. Legal Stud. 4, 465–505 (2007).
5. Kahan, D. M., Braman, D., Slovic, P., Gastil, J. & Cohen, G. Nature
Nanotechnol. 4, 87–91 (2009).
6. Kahan, D. M., Braman, D., Cohen, G. L., Slovic, P. & Gastil, J. Law
Human Behav. (in the press).
7. Cohen, G. L., Aronson, J. & Steele, C. M. Pers. Soc. Psychol. Bull.
26, 1151–1164 (2000).
8. Cohen, G. L. et al. J. Pers. Soc. Psychol. 93, 415–430 (2007).

Dan Kahan is the Elizabeth K. Dollard Professor of Law at Yale Law
School, New Haven, Connecticut 06511, USA.
Email: dan....@yale.edu.

--
Dan Clore

New book: _Weird Words: A Lovecraftian Lexicon_:
http://tinyurl.com/yd3bxkw
My collected fiction, _The Unspeakable and Others_:
http://www.amazon.com/gp/product/B0035LTS0O
Lord Weÿrdgliffe & Necronomicon Page:
http://tinyurl.com/292yz9
News & Views for Anarchists & Activists:
http://groups.yahoo.com/group/smygo

Strange pleasures are known to him who flaunts the
immarcescible purple of poetry before the color-blind.
-- Clark Ashton Smith, "Epigrams and Apothegms"

Rev 11D Meow!

Mar 9, 2010, 7:34:28 PM
On Tue, 09 Mar 2010 16:08:12 -0800, Dan Clore
<cl...@columbia-center.org> wrote:

> News & Views for Anarchists & Activists:

http://sadtrombone.com

Zapanaz

May 3, 2010, 2:58:48 PM
On Tue, 09 Mar 2010 16:08:12 -0800, Dan Clore
<cl...@columbia-center.org> wrote:

>People's grasp of scientific debates can improve if communicators build
>on the fact that cultural values influence what and whom we believe,
>says Dan Kahan.

That's sure something I care a lot about. My friends call me the KING
of building on the fact that cultural values influence what and whom
we believe
--
Zapanaz
International Satanic Conspiracy
Customer Support Specialist
http://joecosby.com/
Lay down all thoughts, surrender to the void
It is shining
It is shining
- John Lennon

:: Currently listening to Buzzy, 1947, by Charlie Parker, from "Now's the Time"
