<div dir="ltr">Wow fantastic article! There seems to be an easy fix: base policy on actual science. Why? Science has developed ways of weeding out bias, not completely perhaps but as much as possible. Individuals and political interest groups do not. Joe<br>
</div><div class="gmail_extra"><br><br><div class="gmail_quote">On Wed, Jun 19, 2013 at 8:51 AM, Art Deco <span dir="ltr"><<a href="mailto:art.deco.studios@gmail.com" target="_blank">art.deco.studios@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Forward this to the Christ Church Cult and the millions of other religious crackpots. Don't count on the con artists among them to help out though. Religious belief, impenetrable to rational though, is big, big business.<br>
<br>w. <br></div><div class="gmail_extra"><br><br><div class="gmail_quote"><div><div class="h5">On Wed, Jun 19, 2013 at 11:32 AM, Ron Force <span dir="ltr"><<a href="mailto:rforce2003@yahoo.com" target="_blank">rforce2003@yahoo.com</a>></span> wrote:<br>
</div></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div class="h5"><div><div style="font-size:12pt;font-family:times new roman,new york,times,serif"><h1>
The Science of Why We Don’t Believe Science</h1><h2 style="font-style:italic">How our brains fool us on climate, creationism, and the vaccine-autism link.</h2>
<u></u><div style="font-family:ff-tisa-web-pro,Georgia,Cambria,'Times New Roman',Times,serif;font-size:22px;line-height:1.45;max-width:700px;margin:0px auto;color:rgb(51,51,50)"><div style="margin-top:25px"><div style="outline:0px;word-wrap:break-word">
<div name="4046" style="margin-bottom:31px"><strong><em>By </em></strong><a href="https://twitter.com/chriscmooney" style="color:rgb(51,51,50)" target="_blank"><strong><em>Chris Mooney</em></strong></a></div><div name="6447" style="margin-bottom:31px">
<strong>“A MAN WITH A CONVICTION </strong>is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger, in a passage that might have been
referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a <a href="http://www.powells.com/biblio/61-9781617202803-1" style="color:rgb(51,51,50)" target="_blank">famous case study</a> in psychology.</div>
<div name="3d56" style="margin-bottom:31px">Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.</div>
<div name="1494" style="margin-bottom:31px">Through her, the aliens had given the precise date of an Earth-rending cataclysm:
December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.</div>
<div name="9520" style="margin-bottom:31px">Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?</div>
<div name="2680" style="margin-bottom:31px">At first, the group struggled for an explanation. But then rationalization set in. A new message
arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!</div>
<div name="b54a" style="margin-bottom:31px">From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.</div>
<div name="867b" style="margin-bottom:31px"><strong>IN THE ANNALS OF DENIAL,</strong> it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while
Martin’s space cult might lie at the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “<a href="http://www.ncbi.nlm.nih.gov/pubmed/2270237" style="color:rgb(51,51,50)" target="_blank">motivated reasoning</a>” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and <a href="http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf" style="color:rgb(51,51,50)" target="_blank">religion of the president</a> (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the
facts.</div><div name="f49e" style="margin-bottom:31px">The theory of motivated reasoning builds on a <a href="https://motherjones.com/files/descartes.pdf" style="color:rgb(51,51,50)" target="_blank">key insight of modern neuroscience</a> (PDF): Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist <a href="http://www-personal.umich.edu/~lupia/" style="color:rgb(51,51,50)" target="_blank">Arthur Lupia</a> of the University of Michigan. We push threatening information away;
we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.</div><blockquote name="f94b" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">
We apply fight-or-flight reflexes not only to predators, but to data itself.</blockquote><div name="eea3" style="margin-bottom:31px">Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist <a href="http://www.stonybrook.edu/polsci/ctaber/" style="color:rgb(51,51,50)" target="_blank">Charles Taber</a> of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of
memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”</div>
<div name="d4c3" style="margin-bottom:31px">In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist <a href="http://people.virginia.edu/~jdh6n/" style="color:rgb(51,51,50)" target="_blank">Jonathan Haidt</a>: We may think we’re being scientists, but<a href="https://motherjones.com/files/emotional_dog_and_rational_tail.pdf" style="color:rgb(51,51,50)" target="_blank">we’re actually being lawyers</a> (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster
our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.</div><div name="1aac" style="margin-bottom:31px">That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we
should.</div><blockquote name="2993" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">Scientific evidence is highly susceptible to misinterpretation. Giving ideologues scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.</blockquote>
<div name="6b85" style="margin-bottom:31px"><strong>MODERN SCIENCE</strong> <strong>ORIGINATED</strong> from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best
ideas prevail.</div><div name="760b" style="margin-bottom:31px">Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.</div>
<div name="bfe8" style="margin-bottom:31px">Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In <a href="http://synapse.princeton.edu/~sam/lord_ross_lepper79_JPSP_biased-assimilation-and-attitude-polarization.pdf" style="color:rgb(51,51,50)" target="_blank">a classic 1979
experiment</a> (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”</div>
<div name="41d1" style="margin-bottom:31px">Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the <a href="http://psp.sagepub.com/content/23/6/636.abstract" style="color:rgb(51,51,50)" target="_blank">accuracy of gay stereotypes</a>, and much else. Even when study subjects are explicitly
instructed to be unbiased and even-handed about the evidence, they often fail.</div><div name="58fd" style="margin-bottom:31px">And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor <a href="http://www.law.yale.edu/faculty/DKahan.htm" style="color:rgb(51,51,50)" target="_blank">Dan Kahan</a> and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.</div>
<div name="7752" style="margin-bottom:31px">In <a href="https://motherjones.com/files/kahan_paper_cultural_cognition_of_scientific_consesus.pdf" style="color:rgb(51,51,50)" target="_blank">Kahan’s research</a> (PDF), individuals are classified, based
on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book
excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In <a href="http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers" style="color:rgb(51,51,50)" target="_blank">another study</a> (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were
opposed.)</div><blockquote name="a9a8" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.</blockquote>
<div name="de12" style="margin-bottom:31px">In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchal individualist finds it difficult to believe that the things he prizes (<a href="http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers" style="color:rgb(51,51,50)" target="_blank">commerce, industry, a man’s freedom to
possess a gun to defend his family</a>) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” <a href="http://seagrant.oregonstate.edu/blogs/communicatingclimate/transcripts/Episode_10b_Dan_Kahan.html" style="color:rgb(51,51,50)" target="_blank">says Kahan</a>.</div>
<div name="a4ab" style="margin-bottom:31px">And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where
people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.</div><div name="2eb6" style="margin-bottom:31px">Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler <a href="http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf" style="color:rgb(51,51,50)" target="_blank">showed subjects fake newspaper articles</a> (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban”
embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)</div><div name="2e67" style="margin-bottom:31px">Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist <a href="http://www.sociology.northwestern.edu/faculty/prasad/home.html" style="color:rgb(51,51,50)" target="_blank">Monica Prasad</a> and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up <a href="http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf" style="color:rgb(51,51,50)" target="_blank">a study</a> (PDF) in which they discussed the topic with some of these Republicans in person. They would cite the
findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”</div><blockquote name="21c3" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">
One study showed that not even Bush’s own words could change the minds of Bush voters who believed there was an Iraq-Al Qaeda link.</blockquote><div name="bc61" style="margin-bottom:31px">As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being
unmovable:</div><div name="5f42" style="margin-bottom:31px"><strong>Interviewer: </strong>[T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?</div>
<div name="512b" style="margin-bottom:31px"><strong>Respondent: </strong>Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.</div>
<div name="4ae3" style="margin-bottom:31px">The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site<a href="http://www.factcheck.org/" style="color:rgb(51,51,50)" target="_blank">FactCheck.org</a>, a team at Ohio State <a href="http://www.comm.ohio-state.edu/kgarrett/FactcheckMosqueRumors.pdf" style="color:rgb(51,51,50)" target="_blank">presented subjects</a> (PDF) with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.</div>
<div name="6b2a" style="margin-bottom:31px">A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,’” explains Stanford social psychologist <a href="http://communication.stanford.edu/faculty/krosnick/" style="color:rgb(51,51,50)" target="_blank">Jon Krosnick</a>. Indeed, there’s a sense in which
science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”</div>
<div name="1878" style="margin-bottom:31px">This may help explain a curious pattern Nyhan and his colleagues found when they <a href="http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf" style="color:rgb(51,51,50)" target="_blank">tried to test the fallacy</a> (PDF) that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using
“social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.</div><blockquote name="eebc" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">
A predictor of whether you accept the science of global warming? Whether you’re a Republican or a Democrat.</blockquote><div name="5dc8" style="margin-bottom:31px">Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, <a href="http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf" style="color:rgb(51,51,50)" target="_blank">a follow-up
study</a> (PDF) showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.</div><div name="1dc7" style="margin-bottom:31px">
Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “<a href="http://en.wikipedia.org/wiki/Narrowcasting" style="color:rgb(51,51,50)" target="_blank">narrowcast</a>” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”</div>
<div name="2bcd" style="margin-bottom:31px"><strong>IF YOU WANTED TO SHOW </strong>how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.</div>
<div name="8da1" style="margin-bottom:31px">So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In <a href="http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming" style="color:rgb(51,51,50)" target="_blank">a 2008 Pew survey</a>, for instance, only 19 percent of college-educated Republicans agreed that the planet is
warming due to human actions, versus 31 percent of non-college-educated Republicans. In other words, higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.</div>
<div name="5f72" style="margin-bottom:31px">Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example,
abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.</div>
<div name="6786" style="margin-bottom:31px">That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.</div>
<div name="37e8" style="margin-bottom:31px">Climategate had a substantial impact on
public opinion, according to <a href="http://environment.yale.edu/profile/leiserowitz/" style="color:rgb(51,51,50)" target="_blank">Anthony Leiserowitz</a>, director of the <a href="http://environment.yale.edu/climate/" style="color:rgb(51,51,50)" target="_blank">Yale Project on Climate Change Communication</a>. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”</div>
<blockquote name="badd" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">Is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism.</blockquote>
<div name="4876" style="margin-bottom:31px">So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (<a href="http://www.huffingtonpost.com/robert-f-kennedy-jr-and-david-kirby/vaccine-court-autism-deba_b_169673.html" style="color:rgb(51,51,50)" target="_blank">Robert F. Kennedy Jr</a>.) and numerous Hollywood celebrities (most notably <a href="http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html" style="color:rgb(51,51,50)" target="_blank">Jenny McCarthy</a> and Jim Carrey).
The <em>Huffington Post</em> gives a very large megaphone to denialists. And <a href="http://sethmnookin.com/" style="color:rgb(51,51,50)" target="_blank">Seth Mnookin</a>, author of the new book <a href="http://www.powells.com/biblio/1-9781439158647-0" style="color:rgb(51,51,50)" target="_blank"><em>The Panic Virus</em></a>, notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.</div>
<div name="10b3" style="margin-bottom:31px">Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates <a href="http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on/article_print" style="color:rgb(51,51,50)" target="_blank">has been undermined</a> by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending
agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.</div><div name="cec2" style="margin-bottom:31px">Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after <a href="http://www.thelancet.com/journals/lancet/article/PIIS0140673697110960/fulltext" style="color:rgb(51,51,50)" target="_blank">his 1998 <em>Lancet </em>paper</a>—which originated the current vaccine scare—was retracted and he subsequently <a href="http://www.gmc-uk.org/Wakefield_SPM_and_SANCTION.pdf_32595267.pdf" style="color:rgb(51,51,50)" target="_blank">lost his license</a> (PDF) to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts
further doubt on anti-vaccine views.</div><div name="c832" style="margin-bottom:31px">It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?</div>
<div name="9aa1" style="margin-bottom:31px">There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.</div>
<div name="b4fd" style="margin-bottom:31px">Some researchers have suggested that there are psychological differences between the left and the
right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.</div>
<blockquote name="8c02" style="font-style:italic;border:0px;padding:0px;line-height:1.4;text-align:center;font-size:32px">We all have blinders in some situations. The question then becomes: What can be done to counteract human nature?</blockquote>
<div name="3453" style="margin-bottom:31px">This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have
differed through history? After all, the canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.</div>
<div name="8b9f" style="margin-bottom:31px">The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?</div>
<div name="3931" style="margin-bottom:31px"><strong>GIVEN THE POWER OF</strong> our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.</div>
<div name="3d8b" style="margin-bottom:31px">This theory is gaining traction in part because of Kahan’s work at Yale. In <a href="http://www.scribd.com/doc/3446682/The-Second-National-Risk-and-Culture-Study-Making-Sense-of-and-Making-Progress-In-The-American-Culture-War-of-Fact" style="color:rgb(51,51,50)" target="_blank">one study</a>, he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry
worldview.</div><div name="2eba" style="margin-bottom:31px">You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.</div>
<div name="7fbe" style="margin-bottom:31px"><em>This </em><a href="http://www.motherjones.com/politics/2011/03/denial-science-chris-mooney" style="color:rgb(51,51,50)" target="_blank"><em>story</em></a><em> first appeared in </em><a href="http://www.motherjones.com/" style="color:rgb(51,51,50)" target="_blank"><em>Mother
Jones</em></a><em> magazine.</em></div></div></div></div></div></div><br></div></div></blockquote></div><span class="HOEnZb"><font color="#888888"><br><br clear="all"><br>-- <br>Art Deco (Wayne A. Fox)<br>
<a href="mailto:art.deco.studios@gmail.com" target="_blank">art.deco.studios@gmail.com</a><br>
<br>
</font></span></div>
<br>=======================================================<br>
List services made available by First Step Internet,<br>
serving the communities of the Palouse since 1994.<br>
<a href="http://www.fsr.net" target="_blank">http://www.fsr.net</a><br>
mailto:<a href="mailto:Vision2020@moscow.com">Vision2020@moscow.com</a><br>
=======================================================<br></blockquote></div><br></div>