Miami and Bangladesh are going to drown because Fox News tells us that environmentalists and egghead scientists are Communists who don't love Jesus or Brad Paisley.
I've always thought that it's possible, using reasonable arguments, to persuade people of the truth, and none of Nyhan's research makes me think otherwise.
2. Have you read the comments to that New Yorker article? In a periodical that deliberately targets a "sophisticated" readership? We are all doomed.
I've thought that even if you can't change people's minds, with sufficient ridicule you might hope to make them keep quiet about their stupid beliefs, and thus maybe limit the spread of same. The rise of contrarianism (among elites) and proud embrace of ignorance (among non-elites) doom this strategy.
It took me way too long to get 2.
4 is a good point. It's what's behind my "proceed immediately to personally insulting libertarian fucktards" discourse strategy.
I will say that IIRC Nyhan's "spinsanity" site was incredibly bad. Maybe he's actually become a smart professor who does useful things.
7: That's my recollection as well: crippled by a precious insistence on "both sides do it."
The problem with this line of research from a "how are my political opponents wrong" (and they are, believe me) perspective is that it pretty strongly implies that all of us have things that we believe in contravention of fact. So asking the question "how do we convince people who are wrong" is sort of superseded by the queasy "what am I wrong about?"
At this point are you really uncomfortable with the idea that misinformation is politically lopsided?
The article invokes stereotype threat like it's uncontroversially true. But it is controversial at best, right? Should that make me trust the rest of what it says less?
The 'why people who disagree with me are wrong' genre is pushed out by the supply side and lends itself to vanity publishing.
The demand is for 'show me something I didn't know' or (confirmation bias) 'give me factoids to support my beliefs.'
Also on topic: internet myth business model explainer.
It seems to me the logical conclusion is to put the D0ve "Real Beauty" folks in charge of a pro-vaccination campaign. "You're pretty. Vaccinate your children."
10: no! I'm uncomfortable with the idea that I'm almost certainly wrong about obvious things and don't know it.
"as" s/b "and".
Anyhow I'm not going to stop assuming I'm mostly right about things, and the question of how to talk people who aren't me out of believing stupid things is important. But it seems to me that taking this research as something that generally applies to other people could lead me astray, if by some chance it turns out that I am actually wrong about something.
14 -- And that's what you're wrong about. Are you making a genuine attempt at an evidence-based life?
I was thinking about what process I'd want people who are wrong to follow, something like: (1) ask myself what beliefs I hold that people who ideologically disagree with me would say are simple denials of reality; (2) for each such belief, check what reputable authorities say; (3) if they disagree with me, figure out whether I have a good reason for discounting them. That seems as if it should straighten out antivaxxers and climate change deniers.
When I apply it to my own beliefs, all I come up with in step 1 is (a) various rightwing Econ 101 bullshit and (b) general human equality, and on both of those, when I get to step 2, reputable authority is with me. So either I'm right about everything or my system doesn't work.
Re: 14
Pessimistic meta-induction, innit. Pretty much all of the smartest, best-informed humans in history were wrong about a huge amount. We are likely to be the same.
http://plato.stanford.edu/entries/scientific-realism/#PesInd
16: of course not. And, yeah, fair, it doesn't actually bother me as a general rule that my cognition works the same way as everybody else's.
All I'm saying is the framing "how do we fix the fact that people on the left are swayed by facts and people on the right are not?" (which might be a little strawmannish) is misleading to the point of uselessness and is the wrong question to derive from work on false belief.
What you're not wrong about -- and what a bunch of people are wrong about -- is willingness to change what you believe about reality based on facts. There's no 'both sides are wrong' when it comes to self-awareness: some people are, and some are not. And saying 'both sides do it' is as inaccurate in this regard as when talking about climate change denial.
There are people who consider themselves 'left' who are swayed by all manner of woo, so I'm not willing to say it's just a left/right thing. It's a matter of how great a role faith plays, and how strong the distrust is of education qua education.
People always seem to follow these arguments through to the conclusion that reason has never convinced anyone of anything, which is nihilistic and self-refuting.
Although actually reading the link shows that this doesn't apply here.
17: I think the problem there is defining "reputable authority."
That is, if you're wrong about something you're likely also wrong about who counts as a reputable authority on the topic.
I'd be more convinced that scientific reasoning was helpful for these kinds of things if it didn't seem like 60% of doctors and engineers are total idiot nutters outside of their immediate specialty areas.
And maybe I should add "the world's most brilliant physicists" to 27, based on Essear's reporting here.
I've long believed that changing people's minds is usually impossible and that trying is pointless, so I'm happy to see empirical support for that conclusion.
Well doesn't this make everything seem pointless.
I don't have fond memories of Spinsanity so that does color my take here.
That said, I'm glad someone is researching this, and irritated that they seem to pick primarily flashpoint, politically partisan issues.
When I think about issues I have changed my mind on (IVF, gay marriage, how to do policy advocacy, medical marijuana to name a few) I don't think it was facts or emotions so much as sustained, repeated engagement with many people, often over many years.
Ten years ago, I truly didn't understand why civil unions weren't just as good as marriage. It took me a long time to get it. It would be silly to say that I changed my belief just because it wasn't very strongly held, and equally silly to say that I changed it for partisan reasons (especially since I have spent only four months of my life aligned with a political party).
Tl;dr: Understanding why people believe untrue things is very important, as is understanding why people change their minds, but I don't think this research approach is advancing our understanding of either very much.
"it doesn't actually bother me as a general rule that my cognition works the same way as everybody else's" Mouseover!