[Vision2020] The Science of Why We Don't Believe Science

Joe Campbell philosopher.joe at gmail.com
Wed Jun 19 13:30:35 PDT 2013


I don't know much about the GMO debate. What are the main reasons for
rejecting, say, genetically modified foods? Anyone want to provide a short
list of reasons?


On Wed, Jun 19, 2013 at 1:04 PM, lfalen <lfalen at turbonet.com> wrote:

> Good article. Another area, in addition to the vaccine-autism link, where
> some of the left is anti-science is GMOs, or what they call "Frankenfoods."
> Roger
>
>
>
>
> -----Original Message-----
> Subject: [Vision2020] The Science of Why We Don't Believe Science
> From: "Ron Force"
> To: "Moscow Vision2020"
> Date: 06/19/13 17:33:28
>
>  The Science of Why We Don't Believe Science
>  How our brains fool us on climate, creationism, and the vaccine-autism link.
>
>  *By Chris Mooney* <https://twitter.com/chriscmooney>
>  *"A MAN WITH A CONVICTION* is a hard man to change. Tell him you
> disagree and he turns away. Show him facts or figures and he questions your
> sources. Appeal to logic and he fails to see your point." So wrote the
> celebrated Stanford University psychologist Leon Festinger, in a passage
> that might have been referring to climate change denial - the persistent
> rejection, on the part of so many Americans today, of what we know about
> global warming and its human causes. But it was too early for that - this
> was the 1950s - and Festinger was actually describing a famous case study
> <http://www.powells.com/biblio/61-9781617202803-1> in psychology.
> Festinger and several of his colleagues had infiltrated the Seekers, a
> small Chicago-area cult whose members thought they were communicating with
> aliens - including one, "Sananda," who they believed was the astral
> incarnation of Jesus Christ. The group was led by Dorothy Martin, a
> Dianetics devotee who transcribed the interstellar messages through
> automatic writing.
> Through her, the aliens had given the precise date of an Earth-rending
> cataclysm: December 21, 1954. Some of Martin's followers quit their jobs
> and sold their property, expecting to be rescued by a flying saucer when
> the continent split asunder and a new sea swallowed much of the United
> States. The disciples even went so far as to remove brassieres and rip
> zippers out of their trousers - the metal, they believed, would pose a danger
> on the spacecraft.
> Festinger and his team were with the cult when the prophecy failed. First,
> the "boys upstairs" (as the aliens were sometimes called) did not show up
> and rescue the Seekers. Then December 21 arrived without incident. It was
> the moment Festinger had been waiting for: How would people so emotionally
> invested in a belief system react, now that it had been soundly refuted?
> At first, the group struggled for an explanation. But then rationalization
> set in. A new message arrived, announcing that they'd all been spared at
> the last minute. Festinger summarized the extraterrestrials' new
> pronouncement: "The little group, sitting all night long, had spread so
> much light that God had saved the world from destruction." Their
> willingness to believe in the prophecy had saved Earth from the prophecy!
> From that day forward, the Seekers, previously shy of the press and
> indifferent toward evangelizing, began to proselytize. "Their sense of
> urgency was enormous," wrote Festinger. The devastation of all they had
> believed had made them even more certain of their beliefs.
>  *IN THE ANNALS OF DENIAL,* it doesn't get much more extreme than the
> Seekers. They lost their jobs, the press mocked them, and there were
> efforts to keep them away from impressionable young minds. But while
> Martin's space cult might lie at the far end of the spectrum of human
> self-delusion, there's plenty to go around. And since Festinger's day, an
> array of new discoveries in psychology and neuroscience has further
> demonstrated how our preexisting beliefs, far more than any new facts, can
> skew our thoughts and even color what we consider our most dispassionate
> and logical conclusions. This tendency toward so-called "motivated
> reasoning <http://www.ncbi.nlm.nih.gov/pubmed/2270237>" helps explain why
> we find groups so polarized over matters where the evidence is so
> unequivocal: climate change, vaccines, "death panels," the birthplace and religion
> of the president <http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf> (PDF),
> and much else. It would seem that expecting people to be convinced by the
> facts flies in the face of, you know, the facts.
> The theory of motivated reasoning builds on a key insight of modern
> neuroscience <https://motherjones.com/files/descartes.pdf> (PDF):
> Reasoning is actually suffused with emotion (or what researchers often call
> "affect"). Not only are the two inseparable, but our positive or negative
> feelings about people, things, and ideas arise much more rapidly than our
> conscious thoughts, in a matter of milliseconds - fast enough to detect with
> an EEG device, but long before we're aware of it. That shouldn't be
> surprising: Evolution required us to react very quickly to stimuli in our
> environment. It's a "basic human survival skill," explains political
> scientist Arthur Lupia <http://www-personal.umich.edu/~lupia/> of the
> University of Michigan. We push threatening information away; we pull
> friendly information close. We apply fight-or-flight reflexes not only to
> predators, but to data itself.
>
> We apply fight-or-flight reflexes not only to predators, but to data
> itself.
>
> Consider a person who has heard about a scientific discovery that deeply
> challenges her belief in divine creation - a new hominid, say, that confirms
> our evolutionary origins. What happens next, explains political scientist
> Charles Taber <http://www.stonybrook.edu/polsci/ctaber/> of Stony Brook
> University, is a subconscious negative response to the new information - and
> that response, in turn, guides the type of memories and associations formed
> in the conscious mind. "They retrieve thoughts that are consistent with
> their previous beliefs," says Taber, "and that will lead them to build an
> argument and challenge what they're hearing."
> In other words, when we think we're reasoning, we may instead be
> rationalizing. Or to use an analogy offered by University of Virginia
> psychologist Jonathan Haidt <http://people.virginia.edu/~jdh6n/>: We may
> think we're being scientists, but we're actually being lawyers
> <https://motherjones.com/files/emotional_dog_and_rational_tail.pdf> (PDF).
> Our "reasoning" is a means to a predetermined end - winning our "case" - and
> is shot through with biases. They include "confirmation bias," in which we
> give greater heed to evidence and arguments that bolster our beliefs, and
> "disconfirmation bias," in which we expend disproportionate energy trying
> to debunk or refute views and arguments that we find uncongenial.
> That's a lot of jargon, but we all understand these mechanisms when it
> comes to interpersonal relationships. If I don't want to believe that my
> spouse is being unfaithful, or that my child is a bully, I can go to great
> lengths to explain away behavior that seems obvious to everybody
> else - everybody who isn't too emotionally invested to accept it, anyway.
> That's not to suggest that we aren't also motivated to perceive the world
> accurately - we are. Or that we never change our minds - we do. It's just
> that we have other important goals besides accuracy - including identity
> affirmation and protecting one's sense of self - and often those make us
> highly resistant to changing our beliefs when the facts say we should.
>
> Scientific evidence is highly susceptible to misinterpretation. Giving
> ideologues scientific data that's relevant to their beliefs is like
> unleashing them in the motivated-reasoning equivalent of a candy store.
>
>  *MODERN SCIENCE ORIGINATED* from an attempt to weed out such
> subjective lapses - what that great 17th-century theorist of the scientific
> method, Francis Bacon, dubbed the "idols of the mind." Even if individual
> researchers are prone to falling in love with their own theories, the
> broader processes of peer review and institutionalized skepticism are
> designed to ensure that, eventually, the best ideas prevail.
> Our individual responses to the conclusions that science reaches, however,
> are quite another matter. Ironically, in part because researchers employ so
> much nuance and strive to disclose all remaining sources of uncertainty,
> scientific evidence is highly susceptible to selective reading and
> misinterpretation. Giving ideologues or partisans scientific data that's
> relevant to their beliefs is like unleashing them in the
> motivated-reasoning equivalent of a candy store.
> Sure enough, a large number of psychological studies have shown that
> people respond to scientific or technical evidence in ways that justify
> their preexisting beliefs. In a classic 1979 experiment
> <http://synapse.princeton.edu/~sam/lord_ross_lepper79_JPSP_biased-assimilation-and-attitude-polarization.pdf> (PDF),
> pro- and anti-death penalty advocates were exposed to descriptions of two
> fake scientific studies: one supporting and one undermining the notion that
> capital punishment deters violent crime and, in particular, murder. They
> were also shown detailed methodological critiques of the fake studies - and
> in a scientific sense, neither study was stronger than the other. Yet in
> each case, advocates more heavily criticized the study whose conclusions
> disagreed with their own, while describing the study that was more
> ideologically congenial as more "convincing."
> Since then, similar results have been found for how people respond to
> "evidence" about affirmative action, gun control, the accuracy of gay
> stereotypes <http://psp.sagepub.com/content/23/6/636.abstract>, and much
> else. Even when study subjects are explicitly instructed to be unbiased and
> even-handed about the evidence, they often fail.
> And it's not just that people twist or selectively read scientific
> evidence to support their preexisting views. According to research by Yale
> Law School professor Dan Kahan <http://www.law.yale.edu/faculty/DKahan.htm> and
> his colleagues, people's deep-seated views about morality, and about the
> way society should be ordered, strongly predict whom they consider to be a
> legitimate scientific expert in the first place - and thus where they
> consider "scientific consensus" to lie on contested issues.
> In Kahan's research
> <https://motherjones.com/files/kahan_paper_cultural_cognition_of_scientific_consesus.pdf> (PDF),
> individuals are classified, based on their cultural values, as either
> "individualists" or "communitarians," and as either "hierarchical" or
> "egalitarian" in outlook. (Somewhat oversimplifying, you can think of
> hierarchical individualists as akin to conservative Republicans, and
> egalitarian communitarians as liberal Democrats.) In one study, subjects in
> the different groups were asked to help a close friend determine the risks
> associated with climate change, sequestering nuclear waste, or concealed
> carry laws: "The friend tells you that he or she is planning to read a book
> about the issue but would like to get your opinion on whether the author
> seems like a knowledgeable and trustworthy expert." A subject was then
> presented with the résumé of a fake expert "depicted as a member of the
> National Academy of Sciences who had earned a Ph.D. in a pertinent field
> from one elite university and who was now on the faculty of another." The
> subject was then shown a book excerpt by that "expert," in which the risk
> of the issue at hand was portrayed as high or low, well-founded or
> speculative. The results were stark: When the scientist's position stated
> that global warming is real and human-caused, for instance, only 23 percent
> of hierarchical individualists agreed the person was a "trustworthy and
> knowledgeable expert." Yet 88 percent of egalitarian communitarians
> accepted the same scientist's expertise. Similar divides were observed on
> whether nuclear waste can be safely stored underground and whether letting
> people carry guns deters crime. (The alliances did not always hold. In
> another study
> <http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers> (PDF),
> hierarchs and communitarians were in favor of laws that would compel the
> mentally ill to accept treatment, whereas individualists and egalitarians
> were opposed.)
>
> Head-on attempts to persuade can sometimes trigger a backfire effect,
> where people not only fail to change their minds when confronted with the
> facts - they may hold their wrong views more tenaciously than ever.
>
> In other words, people rejected the validity of a scientific source
> because its conclusion contradicted their deeply held views - and thus the
> relative risks inherent in each scenario. A hierarchical individualist finds
> it difficult to believe that the things he prizes (commerce, industry, a
> man's freedom to possess a gun to defend his family
> <http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1095&context=fss_papers>
> (PDF)) could lead to outcomes deleterious to society. Egalitarian
> communitarians, by contrast, tend to think that the free market causes harm,
> that patriarchal families mess up kids, and that people can't handle their
> guns.
> The study subjects weren't "anti-science"-not in their own minds, anyway.
> It's just that "science" was whatever they wanted it to be. "We've come to
> a misadventure, a bad situation where diverse citizens, who rely on diverse
> systems of cultural certification, are in conflict," says Kahan
> <http://seagrant.oregonstate.edu/blogs/communicatingclimate/transcripts/Episode_10b_Dan_Kahan.html>.
> And that undercuts the standard notion that the way to persuade people is
> via evidence and argument. In fact, head-on attempts to persuade can
> sometimes trigger a backfire effect, where people not only fail to change
> their minds when confronted with the facts - they may hold their wrong views
> more tenaciously than ever.
> Take, for instance, the question of whether Saddam Hussein possessed
> hidden weapons of mass destruction just before the US invasion of Iraq in
> 2003. When political scientists Brendan Nyhan and Jason Reifler showed
> subjects fake newspaper articles
> <http://www-personal.umich.edu/~bnyhan/nyhan-reifler.pdf> (PDF)
> in which this was first suggested (in a 2004 quote from President Bush) and
> then refuted (with the findings of the Bush-commissioned Iraq Survey Group
> report, which found no evidence of active WMD programs in pre-invasion
> Iraq), they found that conservatives were more likely than before to
> believe the claim. (The researchers also tested how liberals responded when
> shown that Bush did not actually "ban" embryonic stem-cell research.
> Liberals weren't particularly amenable to persuasion, either, but no
> backfire effect was observed.)
> Another study gives some inkling of what may be going through people's
> minds when they resist persuasion. Northwestern University sociologist Monica
> Prasad <http://www.sociology.northwestern.edu/faculty/prasad/home.html> and
> her colleagues wanted to test whether they could dislodge the notion that
> Saddam Hussein and Al Qaeda were secretly collaborating among those most
> likely to believe it - Republican partisans from highly GOP-friendly
> counties. So the researchers set up a study
> <http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf> (PDF)
> in which they discussed the topic with some of these Republicans in person.
> They would cite the findings of the 9/11 Commission, as well as a statement
> in which George W. Bush himself denied his administration had "said the
> 9/11 attacks were orchestrated between Saddam and Al Qaeda."
>
> One study showed that not even Bush's own words could change the minds of
> Bush voters who believed there was an Iraq-Al Qaeda link.
>
> As it turned out, not even Bush's own words could change the minds of
> these Bush voters - just 1 of the 49 partisans who originally believed the
> Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting
> the correction in a variety of ways, either by coming up with
> counterarguments or by simply being unmovable:
>  *Interviewer:* [T]he September 11 Commission found no link between
> Saddam and 9/11, and this is what President Bush said. Do you have any
> comments on either of those?
>  *Respondent:* Well, I bet they say that the Commission didn't have any
> proof of it but I guess we still can have our opinions and feel that way
> even though they say that.
> The same types of responses are already being documented on divisive
> topics facing the current administration. Take the "Ground Zero mosque."
> Using information from the political myth-busting site FactCheck.org
> <http://www.factcheck.org/>, a team at Ohio State presented subjects
> <http://www.comm.ohio-state.edu/kgarrett/FactcheckMosqueRumors.pdf> (PDF)
> with a detailed rebuttal to the claim that "Feisal Abdul Rauf, the Imam
> backing the proposed Islamic cultural center and mosque, is a
> terrorist-sympathizer." Yet among those who were aware of the rumor and
> believed it, fewer than a third changed their minds.
> A key question - and one that's difficult to answer - is how "irrational" all
> this is. On the one hand, it doesn't make sense to discard an entire belief
> system, built up over a lifetime, because of some new snippet of
> information. "It is quite possible to say, 'I reached this
> pro-capital-punishment decision based on real information that I arrived at
> over my life,'" explains Stanford social psychologist Jon Krosnick<http://communication.stanford.edu/faculty/krosnick/>.
> Indeed, there's a sense in which science denial could be considered keenly
> "rational." In certain conservative communities, explains Yale's Kahan,
> "People who say, 'I think there's something to climate change,' that's
> going to mark them out as a certain kind of person, and their life is going
> to go less well."
> This may help explain a curious pattern Nyhan and his colleagues found
> when they tried to test the fallacy
> <http://www-personal.umich.edu/~bnyhan/obama-muslim.pdf> (PDF)
> that President Obama is a Muslim. When a nonwhite researcher was
> administering their study, research subjects were amenable to changing
> their minds about the president's religion and updating incorrect views.
> But when only white researchers were present, GOP survey subjects in
> particular were more likely to believe the Obama Muslim myth than before.
> The subjects were using "social desirabililty" to tailor their beliefs (or
> stated beliefs, anyway) to whoever was listening.
>
> A predictor of whether you accept the science of global warming? Whether
> you're a Republican or a Democrat.
>
> Which leads us to the media. When people grow polarized over a body of
> evidence, or a resolvable matter of fact, the cause may be some form of
> biased reasoning, but they could also be receiving skewed information to
> begin with - or a complicated combination of both. In the Ground Zero mosque
> case, for instance, a follow-up study
> <http://www.comm.ohio-state.edu/kgarrett/MediaMosqueRumors.pdf> (PDF)
> showed that survey respondents who watched Fox News were more likely to
> believe the Rauf rumor and three related ones - and they believed them more
> strongly than non-Fox watchers.
> Okay, so people gravitate toward information that confirms what they
> believe, and they select sources that deliver it. Same as it ever was,
> right? Maybe, but the problem is arguably growing more acute, given the way
> we now consume information - through the Facebook links of friends, or tweets
> that lack nuance or context, or "narrowcast"
> <http://en.wikipedia.org/wiki/Narrowcasting>
> and often highly ideological media that have relatively small, like-minded
> audiences. Those basic human survival skills of ours, says Michigan's
> Arthur Lupia, are "not well-adapted to our information age."
>  *IF YOU WANTED TO SHOW* how and why fact is ditched in favor of
> motivated reasoning, you could find no better test case than climate
> change. After all, it's an issue where you have highly technical
> information on one hand and very strong beliefs on the other. And sure
> enough, one key predictor of whether you accept the science of global
> warming is whether you're a Republican or a Democrat. The two groups have
> been growing more divided in their views about the topic, even as the
> science becomes more unequivocal.
> So perhaps it should come as no surprise that more education doesn't budge
> Republican views. On the contrary: In a 2008 Pew survey
> <http://people-press.org/report/417/a-deeper-partisan-divide-over-global-warming>,
> for instance, only 19 percent of college-educated Republicans agreed that
> the planet is warming due to human actions, versus 31 percent of
> non-college-educated Republicans. In other words, a higher education
> correlated with an increased likelihood of denying the science on the
> issue. Meanwhile, among Democrats and independents, more education
> correlated with greater acceptance of the science.
> Other studies have shown a similar effect: Republicans who think they
> understand the global warming issue best are least concerned about it; and
> among Republicans and those with higher levels of distrust of science in
> general, learning more about the issue doesn't increase one's concern about
> it. What's going on here? Well, according to Charles Taber and Milton Lodge
> of Stony Brook, one insidious aspect of motivated reasoning is that
> political sophisticates are prone to be more biased than those who know
> less about the issues. "People who have a dislike of some policy - for
> example, abortion - if they're unsophisticated they can just reject it out of
> hand," says Lodge. "But if they're sophisticated, they can go one step
> further and start coming up with counterarguments." These individuals are
> just as emotionally driven and biased as the rest of us, but they're able
> to generate more and better reasons to explain why they're right - and so
> their minds become harder to change.
> That may be why the selectively quoted emails of Climategate were so
> quickly and easily seized upon by partisans as evidence of scandal.
> Cherry-picking is precisely the sort of behavior you would expect motivated
> reasoners to engage in to bolster their views - and whatever you may think
> about Climategate, the emails were a rich trove of new information upon
> which to impose one's ideology.
> Climategate had a substantial impact on public opinion, according to
> Anthony Leiserowitz <http://environment.yale.edu/profile/leiserowitz/>,
> director of the Yale Project on Climate Change Communication
> <http://environment.yale.edu/climate/>.
> It contributed to an overall drop in public concern about climate change
> and a significant loss of trust in scientists. But - as we should expect by
> now - these declines were concentrated among particular groups of Americans:
> Republicans, conservatives, and those with "individualistic" values.
> Liberals and those with "egalitarian" values didn't lose much trust in
> climate science or scientists at all. "In some ways, Climategate was like a
> Rorschach test," Leiserowitz says, "with different groups interpreting
> ambiguous facts in very different ways."
>
> Is there a case study of science denial that largely occupies the
> political left? Yes: the claim that childhood vaccines are causing an
> epidemic of autism.
>
> So is there a case study of science denial that largely occupies the
> political left? Yes: the claim that childhood vaccines are causing an
> epidemic of autism. Its most famous proponents are an environmentalist
> (Robert F. Kennedy Jr.
> <http://www.huffingtonpost.com/robert-f-kennedy-jr-and-david-kirby/vaccine-court-autism-deba_b_169673.html>)
> and numerous Hollywood celebrities (most notably Jenny McCarthy
> <http://www.huffingtonpost.com/jenny-mccarthy/vaccine-autism-debate_b_806857.html>
> and Jim Carrey). The *Huffington Post* gives a very large megaphone to
> denialists. And Seth Mnookin <http://sethmnookin.com/>, author of the new
> book *The Panic Virus* <http://www.powells.com/biblio/1-9781439158647-0>,
> notes that if you want to find vaccine deniers, all you need to do is go
> hang out at Whole Foods.
> Vaccine denial has all the hallmarks of a belief system that's not
> amenable to refutation. Over the past decade, the assertion that childhood
> vaccines are driving autism rates has been undermined
> <http://discovermagazine.com/2009/jun/06-why-does-vaccine-autism-controversy-live-on/article_print>
> by multiple epidemiological studies - as well as the simple fact that autism
> rates continue to rise, even though the alleged offending agent in vaccines
> (a mercury-based preservative called thimerosal) has long since been
> removed.
> Yet the true believers persist - critiquing each new study that challenges
> their views, and even rallying to the defense of vaccine-autism researcher
> Andrew Wakefield, after his 1998 *Lancet* paper
> <http://www.thelancet.com/journals/lancet/article/PIIS0140673697110960/fulltext>,
> which originated the current vaccine scare, was retracted and he
> subsequently lost his license
> <http://www.gmc-uk.org/Wakefield_SPM_and_SANCTION.pdf_32595267.pdf> (PDF)
> to practice medicine. But then, why should we be surprised? Vaccine deniers
> created their own partisan media, such as the website Age of Autism, that
> instantly blast out critiques and counterarguments whenever any new
> development casts further doubt on anti-vaccine views.
> It all raises the question: Do left and right differ in any meaningful way
> when it comes to biases in processing information, or are we all equally
> susceptible?
> There are some clear differences. Science denial today is considerably
> more prominent on the political right - once you survey climate and related
> environmental issues, anti-evolutionism, attacks on reproductive health
> science by the Christian right, and stem-cell and biomedical matters. More
> tellingly, anti-vaccine positions are virtually nonexistent among
> Democratic officeholders today - whereas anti-climate-science views are
> becoming monolithic among Republican elected officials.
> Some researchers have suggested that there are psychological differences
> between the left and the right that might impact responses to new
> information - that conservatives are more rigid and authoritarian, and
> liberals more tolerant of ambiguity. Psychologist John Jost of New York
> University has further argued that conservatives are "system justifiers":
> They engage in motivated reasoning to defend the status quo.
>
> We all have blinders in some situations. The question then becomes: What
> can be done to counteract human nature?
>
> This is a contested area, however, because as soon as one tries to
> psychoanalyze inherent political differences, a battery of counterarguments
> emerges: What about dogmatic and militant communists? What about how the
> parties have differed through history? After all, the most canonical case
> of ideologically driven science denial is probably the rejection of
> genetics in the Soviet Union, where researchers disagreeing with the
> anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed,
> and genetics itself was denounced as a "bourgeois" science and officially
> banned.
> The upshot: All we can currently bank on is the fact that we all have
> blinders in some situations. The question then becomes: What can be done to
> counteract human nature itself?
>  *GIVEN THE POWER OF* our prior beliefs to skew how we respond to new
> information, one thing is becoming clear: If you want someone to accept new
> evidence, make sure to present it to them in a context that doesn't trigger
> a defensive, emotional reaction.
> This theory is gaining traction in part because of Kahan's work at Yale.
> In one study
> <http://www.scribd.com/doc/3446682/The-Second-National-Risk-and-Culture-Study-Making-Sense-of-and-Making-Progress-In-The-American-Culture-War-of-Fact>,
> he and his colleagues packaged the basic science of climate change into
> fake newspaper articles bearing two very different headlines - "Scientific
> Panel Recommends Anti-Pollution Solution to Global Warming" and "Scientific
> Panel Recommends Nuclear Solution to Global Warming" - and then tested how
> citizens with different values responded. Sure enough, the latter framing
> made hierarchical individualists much more open to accepting the fact that
> humans are causing global warming. Kahan infers that the effect occurred
> because the science had been written into an alternative narrative that
> appealed to their pro-industry worldview.
> You can follow the logic to its conclusion: Conservatives are more likely
> to embrace climate science if it comes to them via a business or religious
> leader, who can set the issue in the context of different values than those
> from which environmentalists or scientists often argue. Doing so is,
> effectively, to signal a détente in what Kahan has called a "culture war of
> fact." In other words, paradoxically, you don't lead with the facts in
> order to convince. You lead with the values - so as to give the facts a
> fighting chance.
>  This story <http://www.motherjones.com/politics/2011/03/denial-science-chris-mooney>
> first appeared in *Mother Jones* <http://www.motherjones.com/> magazine.
>
>
> =======================================================
>  List services made available by First Step Internet,
>  serving the communities of the Palouse since 1994.
>                http://www.fsr.net
>           mailto:Vision2020 at moscow.com
> =======================================================
>

