Science Education and Denialism
Just today I read this really interesting article on science education in the Boston Review:
The article is very good; it argues that the “top-down, technocratic view of scientific communication”, based on the idea that the public simply knows too little science, is bound to fail. This approach is popular, though: “It offers a relatively simple origin story for anti-science attitudes and points to a relatively easy solution”. I agree. The authors also point out that “social trust in science must be earned and cultivated, and this process depends as much on power as on knowledge”. The general public in many cases has reason to mistrust science, both due to its commercialization and due to its nasty history, like the episodes of involuntary experimentation on humans. If someone says that big pharma is full of greedy sonofabitches, it’s hard to tell them that they are wrong. Which, of course, is not an immunological argument against vaccine efficacy. It’s not that the distrust is unfounded; it’s that it is directed at the wrong target, in my opinion.
Science denialism occurs in so many shades these days (vaccines, moon landing, evolution …), and the topic of science education is obviously closely tied to this phenomenon. I agree with the article in BR that a top-down, paternalistic approach to science communication does not work. I regularly write about environmental problems, including climate change, for the Austrian intellectual weekly Die Furche. I have had email correspondence with readers who were initially “climate sceptics”, and I think I have at least softened their stances in some cases. Unsurprisingly, telling someone that I am smart and they are dumb is not a strategy that works, and I didn’t employ it. Some of the folks I communicated with were educated, just not in the natural sciences. Basic respect and politeness, and assuming the other party argues in good faith, go a long way.
So I think the BR article is right that the style of science communication is often misguided, but I think there is more to the phenomenon of science denialism. Depending on what is denied, it’s a combination of a lack of knowledge and a lack of trust. Whether the lack of trust is motivated by observations and experiences (just misdirected) or not is not really relevant for explaining why some aspects of science are denied and others are not. And different versions of science denialism might have a very different weighting of lack of knowledge versus lack of trust.
I had written a blog post about this aspect of science denialism three years ago, thought about it, modified it a bit, pitched it to a few popular science / science communication online magazines (and hence didn’t publish it on the blog), didn’t get a positive response, and then sort of forgot about it. The above article reminded me of the post and the theory outlined in it. I still think it’s spot on. Here it is:
Why is there no science denialism directed at Gödel’s theorem or the Schrödinger equation? I think this article and the graph in it explain it.
More science education will not necessarily move folks’ perception of scientific insights over the “Science denialism Maginot line”. It will in the case of science denialism that works as group stupidity, like the belief that the Earth is flat. But when the main driving force behind a type of science denialism is discomfort with the conclusions of science, more knowledge will do little.