Apr 11, 2013

On a four-point scale, from one (strongly disagree) to four (strongly agree), please rate the following statements: “The Apollo moon landings never happened and were staged in a Hollywood film studio”; “Princess Diana’s death was not an accident but rather an organized assassination by members of the British Royal Family who disliked her”; “The Coca-Cola Company intentionally changed to an inferior formula with the intent of driving up demand for their classic product, later reintroducing it for their financial gain”; and “Carbon dioxide emissions resulting from human activities cause climate change.”

Questions like those formed the core of one of the most intriguing studies I have seen in a long time, just published in Psychological Science, that investigated the dynamics of science doubters. The Australian psychologist Stephan Lewandowsky and two collaborators surveyed over a thousand visitors to online climate blogs (all relatively positive toward science), and asked them questions about free-market ideology and their views on climate science. The investigators also probed for their “conspiracist ideation” by asking questions like the ones above about faked Apollo moon landings and the assassination of Princess Diana. Some subjects were eliminated because they appeared to have lied about their age (it is doubtful that anyone under five completed the survey, for instance), and as a precaution against ballot-box stuffing, the experimenters also eliminated answers where more than one response came from a single I.P. address.

In principle, you could imagine that people’s answers to these questions might be logically independent. One could be a conspiracy theorist about Coca-Cola without having any particular views about climate change, or vice versa. And indeed, some subjects really did believe in climate change even as they doubted the intentions of the sugar-water company from Atlanta, and vice versa.

But, over all, the trends were clear. The more people believed in free-market ideology, the less they believed in climate science; the more they accepted science in general, the more they accepted the conclusions of climate science; and the more likely they were to be conspiracy theorists, the less likely they were to believe in climate science.

These results fit in with a larger literature on what has come to be known as “motivated reasoning.” Other things being equal, people tend to believe what they want to believe, and to disbelieve new information that might challenge them. The classic study on this came in the nineteen-sixties, shortly after the first Surgeon General’s report on smoking and lung cancer, which suggested that smoking appeared to cause lung cancer. A careful survey revealed that (surprise!) smokers were less persuaded than nonsmokers were. Nonsmokers believed what the Surgeon General had to say. Smokers heaped on the counterarguments: “many smokers live a long time” (true, but this ignores the statistical evidence), “lots of things are hazardous” (a red herring), “smoking is better than being a nervous wreck,” and so forth, piling red herrings on top of unsupported assumptions. Other research has shown a polarization effect: bring a bunch of climate change doubters into a room together, and they will leave the room even more skeptical than before, more confident and more extreme in their views.

There may be some evolutionary advantage to having minds that reason in this way, bobbing and weaving and often avoiding the truth, but elsewhere, in my book “Kluge: The Haphazard Evolution of the Human Mind,” I have speculated that it is more bug than feature—a neural glitch in how our memories are retrieved (mainly by finding matches to retrieval queries, which leads to confirmation bias, rather than through more systematic searches that might turn up disconfirming evidence and challenge one’s beliefs). A parallel phenomenon can contaminate our ability to listen to others: we tend to dismiss whatever challenges our beliefs, while accepting confirming evidence. Cass Sunstein, of “Nudge” fame, has an interesting new technical paper on this.

Given that we live in a country in which the theory of evolution—one of the most powerful theories in all of science—is routinely dismissed, and one in which climate-change experts have struggled for years to persuade the public that there is a clear and present danger despite reams of data supporting them, serious investigations into the logic of crowds in real-world situations may represent an important step forward in understanding how to reason with less-than-reasonable masses.