Why science is so hard to believe
Joyce Park stashed this in Science
One of the best explanations of science denial on both left AND right that I've read. On the climate-denier side, it doesn't matter whether you believe in global warming or not: you won't admit it, because you KNOW FOR A FACT that it will lead to taxes and regulations you don't like, AND there are basically no personal or immediate costs to your denial. The anti-vaccinators are actually worse: Big Pharma is already heavily regulated by the government on vaccine safety and prices, AND their own children are the ones most likely to suffer the personal and immediate costs of their beliefs. But in the end it comes down to "tribe not truth": for non-scientists, belief in various scientific "facts" is more a statement of the kind of person they think they are than something they would or could defend with a coherent alternate epistemology.
It's so weird to live in an era where science is incompatible with political beliefs in both parties.
And yet, I don't get the impression that a third party that embraces science would do well.
"tribe not truth" is a great name for many things.
Like a rock band?
or a PW stash? website? app? novel?
This article reminds me of an article you stashed 3 years ago on how conservatives lost their faith in science.
great call back!
Yes, and that article is worth re-reading. It stands up to time.
The scientific method leads us to truths that are less than self-evident, often mind-blowing and sometimes hard to swallow.
In the early 17th century, when Galileo claimed that the Earth spins on its axis and orbits the sun, he wasn’t just rejecting church doctrine. He was asking people to believe something that defied common sense — because it sure looks like the sun’s going around the Earth, and you can’t feel the Earth spinning. Galileo was put on trial and forced to recant. Two centuries later, Charles Darwin escaped that fate. But his idea that all life on Earth evolved from a primordial ancestor and that we humans are distant cousins of apes, whales and even deep-sea mollusks is still a big ask for a lot of people.
Even when we intellectually accept these precepts of science, we subconsciously cling to our intuitions — what researchers call our naive beliefs. A study by Andrew Shtulman of Occidental College showed that even students with an advanced science education had a hitch in their mental gait when asked to affirm or deny that humans are descended from sea animals and that the Earth goes around the sun. Both truths are counterintuitive. The students, even those who correctly marked “true,” were slower to answer those questions than questions about whether humans are descended from tree-dwelling creatures (also true but easier to grasp) and whether the moon goes around the Earth (also true but intuitive).
Shtulman’s research indicates that as we become scientifically literate, we repress our naive beliefs but never eliminate them entirely. They nest in our brains, chirping at us as we try to make sense of the world.
Most of us do that by relying on personal experience and anecdotes, on stories rather than statistics. We might get a prostate-specific antigen test, even though it’s no longer generally recommended, because it caught a close friend’s cancer — and we pay less attention to statistical evidence, painstakingly compiled through multiple studies, showing that the test rarely saves lives but triggers many unnecessary surgeries. Or we hear about a cluster of cancer cases in a town with a hazardous-waste dump, and we assume that pollution caused the cancers. Of course, just because two things happened together doesn’t mean one caused the other, and just because events are clustered doesn’t mean they’re not random. Yet we have trouble digesting randomness; our brains crave pattern and meaning.
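The claim that clustered events can still be random is easy to demonstrate with a quick simulation (a hypothetical sketch, not from the article): scatter cases uniformly at random across towns, and some towns will still end up with far more than their share, producing apparent "clusters" with no underlying cause.

```python
import random

# Drop 1000 simulated "cancer cases" uniformly at random across 100 towns.
# Every town is equally likely, so the average is exactly 10 cases per town,
# yet the counts come out uneven: some towns get roughly double the average,
# others far below it. Pure chance produces apparent clusters.
random.seed(42)  # fixed seed so the run is reproducible

counts = [0] * 100
for _ in range(1000):
    counts[random.randrange(100)] += 1

average = sum(counts) / len(counts)
print("average cases per town:", average)  # 10.0 by construction
print("most cases in one town:", max(counts))
print("fewest cases in one town:", min(counts))
```

A naive observer looking only at the town with the most cases would see a suspicious hot spot, even though the process that generated it was uniform randomness.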
Even for scientists, the scientific method is a hard discipline. They, too, are vulnerable to confirmation bias — the tendency to look for and see only evidence that confirms what they already believe. But unlike the rest of us, they submit their ideas to formal peer review before publishing them. Once the results are published, if they’re important enough, other scientists will try to reproduce them — and, being congenitally skeptical and competitive, will be very happy to announce that they don’t hold up. Scientific results are always provisional, susceptible to being overturned by some future experiment or observation. Scientists rarely proclaim an absolute truth or an absolute certainty. Uncertainty is inevitable at the frontiers of knowledge.
That provisional quality of science is another thing a lot of people have trouble with. To some climate-change skeptics, for example, the fact that a few scientists in the 1970s were worried (quite reasonably, it seemed at the time) about the possibility of a coming ice age is enough to discredit what is now the consensus of the world’s scientists: The planet’s surface temperature has risen by about 1.5 degrees Fahrenheit in the past 130 years, and human actions, including the burning of fossil fuels, are extremely likely to have been the dominant cause since the mid-20th century.
In the climate debate, the consequences of doubt are likely to be global and enduring. Climate-change skeptics in the United States have achieved their fundamental goal of halting legislative action to combat global warming. They haven’t had to win the debate on the merits; they’ve merely had to fog the room enough to keep laws governing greenhouse gas emissions from being enacted.
Some environmental activists want scientists to emerge from their ivory towers and get more involved in the policy battles. Any scientist going that route needs to do so carefully, says Liz Neeley. “That line between science communication and advocacy is very hard to step back from,” she says. In the debate over climate change, the central allegation of the skeptics is that the science saying it’s real and a serious threat is politically tinged, driven by environmental activism and not hard data. That’s not true, and it slanders honest scientists. But the claim becomes more likely to be seen as plausible if scientists go beyond their professional expertise and begin advocating specific policies.
It’s their very detachment, what you might call the cold-bloodedness of science, that makes science the killer app. It’s the way science tells us the truth rather than what we’d like the truth to be. Scientists can be as dogmatic as anyone else — but their dogma is always wilting in the hot glare of new research. In science it’s not a sin to change your mind when the evidence demands it. For some people, the tribe is more important than the truth; for the best scientists, the truth is more important than the tribe.
There's a really interesting Harry Potter fan fiction book at http://hpmor.com/ , Harry Potter and the Methods of Rationality, which assumes an alternative universe where Harry is a child science genius. It's really well written. Harry is thrust into a world of magic where wizards don't understand the scientific method at all. In a fun way, the book explores many of the social challenges of the scientific method: how people pick a conclusion before they collect data, ignore results that are inconsistent with their worldview, and, when the data doesn't match their expectations, get angry or attempt to discredit the result.
Fascinating way to illustrate the point. Thanks for the pointer, Dylan!