Structuring society to counteract science denialism

The essay below is my entry to the OeAW’s (Österreichische Akademie der Wissenschaften/ Austrian Academy of Sciences) 2022 Preisfrage. The topic was “Fact or fiction: How to deal with scientific scepticism?”

—–

The problem of scientific scepticism, or, more accurately, science denialism, is a major one that has serious implications, particularly in modern society where so much is driven by scientific advances, and it is heartening that the Austrian Academy of Sciences has chosen it as a topic for this year’s Preisfrage. Before addressing the topic, I think it is necessary to clarify exactly what I do and don’t mean by certain terms to ensure that we all enter this discussion from the same starting point.

First, I will say that scepticism—questioning established knowledge—is good. Indeed, scepticism is a core principle of science itself. In science, all our knowledge is provisional, should be treated with scepticism and is accepted only to the extent that the current evidence supports it. However, the scientific scepticism that is of interest to the academy goes beyond this: people who lack the deep knowledge and training required to assess the evidence doubt the scientific consensus in a manner that is disproportionate to the actual uncertainty of the conclusions. For this reason, I will not use the term scientific scepticism, which describes a virtue that all scientists should share, and will instead use science denialism, which better captures that the problem is not scepticism itself but an unjustified denial of the scientific consensus.

Second, I will extend some charity to the science deniers. While I, too, am regularly frustrated and angered by science denialism, we need to recognise two important points: everyone holds false beliefs, and no one wants to believe a lie. Even when we do our best, we all make mistakes, as none of us has perfect knowledge. Some of our beliefs, even those about which there is a scientific consensus today, are wrong or incomplete. This has been true throughout history and it would be sheer arrogance to claim that we have finally achieved some sort of perfect truth that will stand for eternity. While we try our hardest, we are imperfect and cannot expect perfection from others. Additionally, I believe that the majority of science deniers care about the truth. They sincerely hold that what they believe is true, and it matters to them that it is true. We may not reach the same conclusions, but whether or not we accept the scientific consensus, we are all searching for the truth.

In discussions about science denialism, it is extremely important to draw a distinction between facts and opinions; I fear that in many cases differences in values are mistaken for science denialism. While there is obviously interplay between facts and values, they are not the same. If someone opposes vaccination for covid-19 because they don’t accept that the vaccine is safe or don’t accept that the virus is real, that is science denialism, because the position contradicts the scientific consensus. However, if someone opposes mandatory covid-19 vaccination on the grounds of personal autonomy, that is not science denialism: there is no disagreement on the facts, but there is a conflict of values. The effect of vaccination on the spread of a disease is a scientific question, but how we should act on that information is a matter of philosophy. We must be careful not to conflate the two by treating how we think we should react to scientific information, based on our values, as itself a scientific question. To quote a basic philosophical aphorism, one cannot derive an ought from an is.

Finally, I need to distinguish between two different types of science denialism that we might encounter: science denialism as an expression of a person’s sincere belief, and science denialism as a form of propaganda. The latter is weaponised disinformation, spread by a person who knows that the information is false but shares it nonetheless for political gain or other malevolent purposes. One should immediately recognise that this contradicts my characterisation of people as caring about the truth, and so it is beyond the scope of what I will discuss in this essay. While there is overlap between these two groups in practice (both may share false stories), their motivations and intentions are very different: one sincerely believes what they share is true, while the other knows that what they are saying is false. Deliberately spreading disinformation is a huge problem, one that I hope other essays in this series will address, but I am assuming that those who do so deliberately are a minority of those who hold science denialist positions. In fact, given that they know they are spreading falsehoods, they do not actually hold science denialist views.

To reiterate, in this essay I wish to discuss the issue of science denialism, where people reject the scientific consensus, with the understanding that such people value truth and think that their beliefs are true. I am limiting science denialism to the objective facts; if and how someone chooses to act in response to those facts is a question of values and is not the purview of science. While I acknowledge the importance of anti-science propaganda in forming science denialist views, my focus will be on preventing people from falling for such propaganda or releasing them from its effects.

There is a need for greater trust in scientists and scientific institutions

Why, when given a choice between listening to the consensus views of doctors and scientists, all of whom are highly educated and work in the relevant domain, would people instead choose to listen to Alex Jones, a man with no scientific qualifications, no research experience and a history of promoting paranoid conspiracy theories, including the idea that the government is putting chemicals in tap water to make frogs gay? Why would anyone believe Alex “gay frogs” Jones?

Simply put, they trust him. Despite his complete lack of knowledge and competence, they trust him more than they trust the doctors and scientists. This is a problem because we need people to trust scientists. Not absolute trust—scepticism is an important and valuable part of science—but there should be a high base level of trust and recognition that we all want the truth and scientists are doing their best to find it. Trust is one of the most important aspects of science communication [1], so I want to spend some time on what needs to change to improve trust in science and scientists.

One potential reason there is a lack of trust is the presumed distinction between the general public and scientists, the latter commonly said to exist in ivory towers. From time to time, there are headlines that state that 81% of Americans can’t name a single living scientist [2] or that 52% of Canadians [3] and 25% of Europeans [4] are unable to name a single female scientist. In contrast, most people are able to recognise and name many more celebrities, whether actors, sportsmen or musicians. These celebrities have dedicated public relations specialists but are also far more visible. We watch their films or games, listen to their music and see them make guest appearances on talk shows. Despite their work being essential for our lives and health, scientists are seldom visible.

I think one way to boost public trust in science is for scientists to be more visible. For all the (mostly undeserved, in my opinion) hate that The Big Bang Theory received, it did a good job of helping people relate to scientists, and it featured actual scientists and other STEM personalities: Stephen Hawking, Neil deGrasse Tyson, George Smoot, Bill Nye, Elon Musk, Bill Gates, Mike Massimino, Steve Wozniak and Kip Thorne all appeared on the show. Obviously, that particular sort of engagement is not going to work all the time, and not everyone is suited for such work, but I think there is a need for celebrity scientists to put a face to science.

If scientists are not visible, there is no opportunity to build trust. If you are scared during a pandemic or confused about an issue, will you listen to a stranger or a person with whom you are familiar? Rationally, we should listen to the one who has knowledge and expertise about the issue in question, but people do not necessarily behave rationally. At least some scientists need to be regularly in the public eye and not just hosting documentaries or writing popular science books. While those are great, they only reach the people who are already interested in science. Scientists are people too and many are quick-witted with a great sense of humour. Let’s see them alongside comedians and other celebrities on panel shows like Taskmaster or QI. Let’s see them featured on late-night talk shows, not only when they have big findings but perhaps also when they get big grants. Give people an opportunity to know what scientists are trying to do and why we care about it. Make scientists a visible and familiar part of our society.

But merely seeing scientists in the media is not enough to build trust. What signals about science do the actions of government send to the public? It’s all very well for a government to say that scientists are valued, and indeed most governments pay lip service to science, but actions speak louder than words. Although they are still working towards a degree, doctoral candidates are junior scientists who conduct novel research and advance our knowledge of the world. But how are they treated?

I left South Africa to do my PhD in Austria. Part of my motivation was financial: in South Africa I would have had to pay high fees to do a PhD and, even if I had secured a scholarship, it would not have covered my living expenses. As part of the Vienna BioCenter PhD Programme, I was paid a salary as a researcher, which was actually more than I have earned in either of the two postdoctoral positions I have held since! Not everyone is so lucky, and PhD students do not struggle only in poor countries. A survey of 178 institutions in the US found that only 2% of them paid PhD students a living wage [5]! Just a little further north, in Canada, the situation is not much better: despite the cost of living increasing every year, government support for graduate students has not increased in almost 20 years [6]! Can we really expect the public to trust scientists when governments, despite what they say, demonstrate that they don’t think scientists are worth supporting?

So many important aspects of society, including health, rest on a bedrock of science where scientific voices should be heard. Not only should scientists be involved, but the public wants to hear from scientists who work in the relevant fields [7]. For this reason, it was heartening that the South African government responded to the coronavirus pandemic by recruiting many leading scientists to a Ministerial Advisory Committee (MAC) that could provide expert advice. However, it wasn’t long before the government reformed the MAC and lashed out at scientists who criticised its lockdown regulations, which included a ban on selling open-toed shoes, as unscientific [8]. This was not a once-off; at various points during the pandemic, government rules seemed to disregard the advice of the MAC [9,10]. In the UK, a similar story has unfolded, with one high-ranking politician recently claiming that scientists were too empowered during the coronavirus pandemic [11]. When a government assembles a group of scientists and then fails to follow their advice while formulating regulations, it undermines those scientists and the broader scientific enterprise. It sets the bad precedent that science and scientists should be valued only insofar as they confirm your preconceptions and can be ignored otherwise. This cherry-picking is common among science denialists.

Banning misinformation is not a solution

Continuing with the example of South Africa’s response to covid-19: early in the pandemic, the government criminalised the spread of misinformation about the coronavirus [12], leading news sites like Daily Maverick to close the comments section on any article related to covid-19. Such a response should raise serious questions about how it impacts free speech and the ability of citizens to have a voice in how their own country is run. Despite the very real risks entailed in restricting what people can say, banning misinformation is a very common approach from both governments and tech companies. I can understand the appeal: if the problem is bad information circulating in the public sphere, banning that information should solve the problem. However, I think this is a misguided approach that will ultimately do more harm than good. To quote the English philosopher John Stuart Mill [13]:

The peculiar evil of silencing the expression of an opinion is, that it is robbing the human race; posterity as well as the existing generation; those who dissent from the opinion, still more than those who hold it. If the opinion is right, they are deprived of the opportunity of exchanging error for truth: if wrong, they lose, what is almost as great a benefit, the clearer perception and livelier impression of truth, produced by its collision with error.

At this point, I refer back to my opening remarks: I am specifically addressing people sharing their genuinely held, if false, beliefs, not people saying things that they know are false for some ulterior motive. As there is no space here for a detailed discussion of the importance of free speech, I will limit myself to a few key points.

First, it is extremely difficult to define which misinformation should be suppressed and which should not. None of us has perfect knowledge, and these decisions will presumably be made by those in power, further entrenching the dominant narrative. We do not need to look hard to find cases where those in power suppress the truth by deeming it misinformation, e.g. the Russian government and its invasion of Ukraine. Even contradicting the scientific consensus is not a definite marker of misinformation, as all scientific knowledge is provisional and we cannot guarantee that new results will not cast doubt on present-day scientific “truths.” There is, of course, a difference between negative scientific truths, where there is no evidence to support a position, and positive scientific truths, where there is evidence supporting a position. As evidence accumulates, it becomes less and less likely that positive scientific truths will be overturned.

Second, we should note that hearing science denialist voices, or differing opinions, is not in itself a problem. Do we worry that Richard Dawkins, or any other evolutionary biologist, will become a creationist after debating creationists? If we support a particular political party, do we fear that hearing the opposition will cause us to switch allegiance? I think not. We are fully capable of being exposed to disagreement without being convinced. In cases where the other side makes strong arguments, we should reconsider our position. The fear that we must ban people from expressing science denialist, or otherwise divergent, views is a paternalistic one that treats other people as ignorant and gullible. If that fear is justified, then the bigger problem is not misinformation itself but the fact that a large swathe of the population is unable to resist it. The correct course of action would be to address those deficiencies.

Third, restricting what people can say also restricts compassion and teaches people to defer to authority. Take the example of someone who believes that vaccines contain poison or that aeroplanes are releasing chemtrails; both beliefs are complete nonsense that might be restricted as misinformation. Now, let’s put ourselves in that person’s position. Do they believe that what they are saying is false? No, they are no less convinced that their belief is true than we are that ours is true. In this case, the evidence is on our side and we are right, but in other cases what we believe strongly may also be false. If we believe that someone is in danger because of poison or chemicals, how should we behave? I contend that, if we believe that others are in danger, the right thing to do is to inform them and try to help. A ban on misinformation, however well-intentioned, also bans people from acting compassionately on their beliefs.

While there are risks to allowing misinformation, there are ways to mitigate them. A greater danger is to ask people to set aside their own judgement and blindly follow an authority figure. During the years of apartheid in South Africa, citizens were told by the government that some people were superior or inferior based on the colour of their skin, that races should be kept separate, and that they should not share doorways, drinking fountains, benches and beaches. The appropriate response was not to defer one’s own moral judgement to that of the government; it was to stand up for one’s own beliefs and act on them. We should not set up regulations that require people to blindly trust a fallible authority for guidance; we should provide each person with the knowledge and skills necessary to make such judgements for themselves.

Banning misinformation is appealing, but it does nothing to address the root cause of the problem. Why are people vulnerable to misinformation? How do we change the minds of those who have fallen prey to it? If we cannot answer these questions, a ban on misinformation will accomplish nothing. People are exposed to large amounts of incorrect information on a daily basis, and the misinformation is constantly changing and hard to define. What we need to do is teach media literacy [14,15] to ensure that people are capable of identifying misinformation on their own in an ever-changing information landscape, without resorting to paternalistic interventions. It is neither practical nor, given the negative side effects, desirable to ban misinformation; we need to develop a culture that searches for and respects evidence.

Evidence should be emphasised throughout society

The strength of the scientific method over previous attempts to understand the world is that science is supported by evidence. There is a philosophical problem with teaching scientific conclusions as “facts” because scientific conclusions can change over time as new evidence comes to light. It is essential to teach scientific conclusions along with the underlying evidence, as the interpretation of that evidence can change. We also need to draw a distinction between denying the evidence and denying a conclusion; everyone should accept the evidence, but different people can interpret the same evidence differently. Indeed, the philosopher Kevin Dorst contends that ambiguous evidence is a key part of understanding how rational people can become polarised [16]. As science accumulates more, and better-quality, evidence, ambiguities begin to resolve and the scientific community converges on a consensus.

A problem which contributes to science denialism is that not everyone accepts the importance of basing their world view on evidence. Instead of evidence, people may choose their beliefs according to what they were taught as children, what helps them fit into their community or what makes them feel good. Sometimes a belief without evidence seems innocuous, but the mindset it encourages is one which is vulnerable to science denialism. This is a problem with beliefs which are not supported by evidence, or are even contradicted by evidence, such as horoscopes, homeopathy or religion. One of the most concerning recent developments is the hostility of the New Zealand government towards science, in seeking to elevate the traditional mātauranga Māori system to equivalence with science [17]. Among the evidence-free claims that this includes are that rain is caused by the tears of a goddess [18] and that singing to trees makes them grow better [19]. It is hard to see how such a rejection of evidence and scientific consensus by a national government could fail to have disastrous effects on the future spread of science denialism in New Zealand.

I fear that sometimes scientists themselves may be, unintentionally, increasing science denialism. It is not hard to find examples of scientists and scientific institutions talking about “democratising” science. George Orwell identified “democracy” as a meaningless word that is open to interpretation and mostly used because anything democratic must be good [20]. While I believe “democratising science” is meant to signal bringing more people into science (while also benefiting from the good connotations of democracy), there is a danger that people may additionally envisage the key feature of democracies: voting. Scientific truth is determined solely by the evidence that supports a position; there is no voting on scientific truths, nor does the number of scientists who agree with a position have any effect on whether it is valid. The position which has evidence behind it is the one which must be accepted. At its core, science is intrinsically undemocratic. When scientific organisations want to emphasise their openness to new people, they should not use the term “democratic” but rather a word such as “inclusive”, which does not come with unintended, and potentially harmful, baggage.

One approach to combat science denialism might be to emphasise that the scientific approach is not unique to scientists but merely a formalisation of how we all already approach problems. I am in agreement with Jerry Coyne’s idea of “science broadly construed”, which holds that many non-scientists use the same approach as scientists for daily tasks [21]. As an example, if your bedside lamp is not working, you might check that it is plugged into the wall, then try the bulb or plug something else into the same outlet; all of this is the scientific method. You make an observation that the light does not come on when you flick the switch. You hypothesise that the lamp is unplugged. After verifying that it is plugged in, you discard the first hypothesis for a second: that the bulb is broken. At each step you make observations, develop hypotheses and test them. This is “science broadly construed.” We can all understand that process and, because it is so familiar, it should be easy for non-scientists to understand that, although the questions are more complicated, what scientists do professionally is not really all that different in principle.
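To make the parallel concrete, here is a minimal sketch, in Python, of that observe-hypothesise-test loop. The check functions and their outcomes are hypothetical stand-ins for the real-world tests described above, not anything from the essay’s sources.

```python
# A minimal sketch of "science broadly construed": the everyday
# observe-hypothesise-test loop from the lamp example.
# The check functions and their return values are hypothetical.

def lamp_is_plugged_in() -> bool:
    # Test of hypothesis 1: look behind the bedside table.
    return True  # suppose we observe the plug firmly in the socket


def bulb_lights_in_another_lamp() -> bool:
    # Test of hypothesis 2: move the bulb to a lamp known to work.
    return False  # suppose the bulb stays dark there too


# Each hypothesis is paired with a test that, if passed, rules it out.
hypotheses = [
    ("the lamp is unplugged", lamp_is_plugged_in),
    ("the bulb is broken", bulb_lights_in_another_lamp),
    ("the wall outlet is dead", lambda: True),  # e.g. plug in a phone charger
]

for claim, test in hypotheses:
    if test():
        print(f"Ruled out: {claim}")
    else:
        print(f"The evidence currently supports: {claim}")
        break
```

The loop is exactly the informal reasoning in the paragraph above; professional science simply applies it to harder questions, with more careful controls and record-keeping.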

It was in November 1982 that the episode The Challenge of the BBC series Yes, Minister first aired. Part of the episode revolves around a plan to bring in pre-determined failure standards for government projects over a certain budget, i.e. government proposals would be treated like scientific experiments, with a measurable outcome to determine whether each was a success or failure. In the episode, the civil servants are appalled by the suggestion and the plan is stopped but, as is so often the case with Yes, Minister, there is more than a little truth among the jokes. Forty years later, there was an opinion piece calling on politicians to behave more like scientists and abandon failed ideas [22].

Evidence should not be limited to the narrow scientific realm; it should be fully integrated into our society. I am not claiming that we currently ignore evidence all the time, but that its use should be more widespread, both in justifying policies and in testing them. Politicians should not be free to spout whatever nonsense they want; they should have to justify their statements with reference to solid, supporting evidence. When a law is proposed, the proposal should include not only the law itself but also the justification for the law, its intended outcome and a time period after which it will be reviewed. It should be possible to evaluate all of those criteria against some sort of evidence, so that if the justification for the law is no longer valid, the law is repealed. If, at the time of review, the law has not had the intended effects, then it should be withdrawn or modified and later reviewed again.
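As an illustration of what treating a law like an experiment might look like, here is a hypothetical sketch; the field names and the example policy are my own inventions for this essay, not a real proposal or an existing system.

```python
# A hypothetical sketch of a law recorded like an experiment, as proposed
# above: justification, intended outcome and a review date are stored
# alongside the law and checked against evidence when the review is due.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class Policy:
    name: str
    justification: str      # the evidence cited when the law was proposed
    intended_outcome: str   # the measurable effect the law is meant to have
    review_date: date       # when the law must be re-evaluated
    outcome_achieved: Optional[bool] = None  # filled in at review time

    def review(self, today: date, evidence_supports_outcome: bool) -> str:
        """Decide what happens to the law once its review date has passed."""
        if today < self.review_date:
            return "keep (review not yet due)"
        self.outcome_achieved = evidence_supports_outcome
        if evidence_supports_outcome:
            return "keep, and schedule the next review"
        return "withdraw or modify, then review again"


# Example usage with made-up values:
speed_law = Policy(
    name="Residential speed-limit reduction",
    justification="Trial data suggesting lower speeds reduce accidents",
    intended_outcome="Fewer accidents on residential streets",
    review_date=date(2025, 1, 1),
)
print(speed_law.review(today=date(2025, 6, 1), evidence_supports_outcome=False))
# -> withdraw or modify, then review again
```

The point is not the code itself but the structure it enforces: a law cannot be recorded without stating the evidence behind it, the outcome it is meant to achieve and the date on which it will be judged.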

I’m happy to say that there are people attempting to make evidence play a bigger role in society; I’ll highlight two examples from the United Kingdom. The charity Sense about Science operates in the UK to increase public trust in science and spread an evidence-based mindset. Of particular relevance is their Ask for Evidence campaign [23], which teaches people how to ask for and evaluate evidence from various sources. The British physician and academic Ben Goldacre has also (co-)written reports on the use of randomised controlled trials and evidence for developing public policy [24] and for improving education [25]. These resources are extremely valuable and should be widely read; although they were written for the UK government, their messages are broadly applicable. If we can reform society so that we inform all our decisions with solid evidence, that should go a long way towards inoculating people against science denialism.

There is a need for basic scientific knowledge

Among all the other interventions that are needed, scientific education remains important. It is now recognised that the deficit model of science education, which states that science denialism is due to ignorance of the facts, is an incomplete explanation. Indeed, surveys have found that creationists have just as good an understanding of evolution as the members of the public who accept it; the difference is that they do not believe the scientific theory is correct. This should not be surprising: as scientists, we may know all the arguments of creationists and flat-earthers without being convinced by them. But, while scientific knowledge is not necessarily going to cure science denialism, knowing how the world works can protect people from falling for future science denialism.

Ignorance leaves people susceptible to fear and manipulation. There is a long-standing prank [26] wherein people are convinced to sign a petition against dihydrogen monoxide after being told that it kills a certain number of people a year, that it is found in tumours, and other scary-sounding facts. The joke is that dihydrogen monoxide is just the chemical name for water. Missing that basic bit of knowledge leaves people open to manipulation. The less scientific knowledge a person has, the less capable they are of effectively evaluating any particular claim and the more susceptible they are to fear and manipulation. The end result is not always a harmless joke petition.

As the Voyageurs Wolf Project recently shared on Twitter [27], many people have a scientifically inaccurate view of the danger wolves pose to humans; they fear that wolves will decimate their normal prey populations and subsequently begin to hunt humans. While the belief is false, the fear is real and can lead people to oppose wolf re-introductions or to support killing wolves. This is problematic on its own, but in the context of widespread ecological collapse and the need to restore wilderness, it can have far larger consequences. While building trust in science through the methods discussed above is important, we should not neglect making sure that every person leaves school with a basic, broad understanding of how the world works.

While scientific knowledge will not prevent all possible cases of science denialism, it will prevent the most egregious. For example, at the beginning of the covid-19 pandemic, a myth circulated that linked covid-19 and 5G radio towers [28]. Aside from adding to the amount of misinformation in the discourse, it also led some people to burn down 5G transmission towers. However, to start believing that rumour, one needs to have no understanding of either viruses or radio waves. If one understands that a virus is a physical, if tiny, object made of, in the case of the coronavirus, RNA surrounded by lipids and proteins, while 5G radio waves are a form of electromagnetic radiation with a particular wavelength, then it is immediately obvious that there is no connection. A virus cannot be transmitted the same way that radio waves are, nor is there any link between the two. Someone is only susceptible to such a patently false rumour if they have not received a basic scientific education at school.
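To give a sense of the scales involved, a rough calculation helps; the frequencies below are typical 5G bands, chosen purely for illustration. The wavelength of a radio wave follows from its frequency, while a coronavirus particle is roughly 100 nm across:

```latex
% Wavelength of a radio wave: \lambda = c / f
\lambda_{3.5\,\mathrm{GHz}} \approx \frac{3 \times 10^{8}\,\mathrm{m\,s^{-1}}}{3.5 \times 10^{9}\,\mathrm{Hz}} \approx 8.6\,\mathrm{cm},
\qquad
\lambda_{28\,\mathrm{GHz}} \approx 1.1\,\mathrm{cm},
\qquad
d_{\mathrm{virus}} \approx 100\,\mathrm{nm} = 10^{-7}\,\mathrm{m}
```

Whatever the band, the radio wave is an oscillating electromagnetic field carrying energy, not matter; it has no mechanism for transporting or creating a particle of RNA, lipid and protein.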

When writing about her side gig of discussing physics with the public, the German physicist Sabine Hossenfelder notes that many of the people she talks to have decent ideas but lack the deep knowledge of physics needed to participate in current research [29]. Indeed, sometimes their half-knowledge leads them to misinterpret things in ways that would be obvious to a trained physicist. These are people who want to play a part in the scientific world and think that they are seeing something everyone else is missing. What is going to happen when those incorrect conclusions, built on half-knowledge, lead them to feel unfairly excluded? I fear those people will switch from wanting to be part of science to being science denialists, promoting unscientific views and warning people of a conspiracy within academia to hide the truth. Education is important.

However, it’s also worth noting that knowledge is not a panacea for science denialism. People who are overconfident in their knowledge, but who objectively know less, are more likely to reject the scientific consensus [30]. This may have implications for how we teach people. Cases where people have a level of confidence appropriate to their objective knowledge but still reject the scientific consensus tend to involve highly politicised topics, which warns us against fuelling the extreme polarisation that characterises the American political landscape. However, there are also incredibly knowledgeable scientists who hold unscientific beliefs (see Nobel disease [31]). This should act as a reminder that everyone is fallible and that expertise in one domain does not necessarily translate into expertise in another; we should not blindly trust a title or position but rather the evidence that supports a statement.

Conclusion

Science denialism is a widespread and urgent problem, but it is too large for any one group to solve. It requires action from multiple parties, including scientists, scientific institutions, the media, governments, influencers and the public themselves. All have a role to play in structuring the sort of society in which we wish to live. In this essay I have touched on several areas regarding science denialism where I think something can be done. To summarise these points:

  • There is a need to build trust between the scientific community and the general public. Scientists need to be visible and involved in society outside the traditional confines of academia. The media should interact with scientists at different points in the scientific process, not just when there’s a fancy finding, and, ideally, also bring them into non-scientific areas of the public consciousness. Finally, governments must demonstrate their commitment to science through funding and policy decisions.
  • Banning misinformation would be more detrimental to society than allowing it. People must be taught to evaluate statements for bias, understand how they fit with our existing knowledge and judge the evidence behind them.
  • Evidence should be emphasised as much as possible. Government policies must be based on, and judged by, evidence. We should not discuss scientific conclusions without also discussing the evidence that led to those conclusions.
  • Everyone should have a basic scientific education so that they have a broad understanding of both how the world works and how science works to expand our knowledge.

Although I believe that these points will help, they are far from a comprehensive solution. There are other aspects which I have not discussed, and a problem as complex as science denialism will require multiple solutions, executed in parallel, to address all its facets. For example, although I feel its conception of anti-science beliefs is too broad, Philipp-Muller et al. have written a good paper with many interesting avenues to explore [32].

When it comes to addressing a complex topic like this, no one person can, nor should they have to, do everything on their own. I hope that we all, both as members of the scientific community and as a part of broader society, can find one or two points which align with our capabilities and interests and play our part in creating a society which benefits all its members; a society founded on evidence and reason.

Acknowledgements

I would like to thank Avril Knott-Craig and Sarah Piché-Choquette for offering their constructive criticism on the original draft of this essay.

References

1. Palazzo A. Science and trust: why most sci comm gets it wrong. In: Biological Information [Internet]. 19 Feb 2022 [cited 27 Aug 2022]. Available: https://www.palazzolab.com/biological-information/2022/2/19/science-communication-establishing-trust

2. Reed B. 81 percent of Americans can’t name a single living scientist. Raw Story. Jan 2018. Available: https://www.rawstory.com/2018/01/81-percent-of-americans-cant-name-a-single-living-scientist/. Accessed 27 Aug 2022.

3. Chung E. Half of Canadians can’t name a woman scientist or engineer, poll finds. CBC. 8 Mar 2019. Available: https://www.cbc.ca/news/science/women-scientists-1.5048491. Accessed 27 Aug 2022.

4. Green C. Could you name more than one female scientist? Independent. 17 May 2014. Available: https://www.independent.co.uk/news/science/could-you-name-more-than-one-female-scientist-9391307.html. Accessed 27 Aug 2022.

5. Woolston C. PhD students face cash crisis with wages that don’t cover living costs. Nature. 2022;605: 775–777. doi:10.1038/d41586-022-01392-w

6. Tri-Agency Funding. In: Support Our Science [Internet]. 28 Jul 2022 [cited 27 Aug 2022]. Available: https://www.supportourscience.ca/post/tri-agency-funding

7. Oreskes N. The Public Wants Scientists to Be More Involved in Policy Debates. Scientific American. Sep 2022. Available: https://www.scientificamerican.com/article/the-public-wants-scientists-to-be-more-involved-in-policy-debates/. Accessed 27 Aug 2022.

8. Karrim A, Evans S. EXCLUSIVE | Unscientific and nonsensical: Top scientist slams government’s lockdown strategy. News24. 16 May 2020. Available: https://www.news24.com/news24/SouthAfrica/News/unscientific-and-nonsensical-top-scientific-adviser-slams-governments-lockdown-strategy-20200516. Accessed 27 Aug 2022.

9. Stone J. Further Evidence SA Government Ignored Expert COVID-19 Advice. 2 Oceans Vibe. 25 Mar 2022. Available: https://www.2oceansvibe.com/2022/03/25/further-evidence-sa-government-ignored-expert-covid-19-advice/. Accessed 27 Aug 2022.

10. Savides M. Government “ignored expert advice against exceeding 70% taxi capacity.” Times Live. 27 Aug 2020. Available: https://www.timeslive.co.za/news/south-africa/2020-08-27-government-ignored-expert-advice-against-exceeding-70-taxi-capacity/. Accessed 27 Aug 2022.

11. Forrest A. Scientists accuse Rishi Sunak of ‘Trump’ tactics with attack on Covid lockdown experts. Independent. 25 Aug 2022. Available: https://www.independent.co.uk/independentpremium/uk-news/rishi-sunak-covid-lockdown-trump-b2153559.html. Accessed 27 Aug 2022.

12. South Africa enacts regulations criminalizing ‘disinformation’ on coronavirus outbreak. In: Committee to Protect Journalists [Internet]. 19 Mar 2020 [cited 27 Aug 2022]. Available: https://cpj.org/2020/03/south-africa-enacts-regulations-criminalizing-disi/

13. Mill JS. On Liberty. Fourth Edition. London: Longmans, Green, Reader and Dyer; 1859. Available: https://en.wikisource.org/wiki/On_Liberty

14. Wasserman H, Madrid-Morales D. Untangling the web – giving children the right tools to fight fake news. Daily Maverick. 25 Jul 2022. Available: https://www.dailymaverick.co.za/article/2022-07-25-untangling-the-web-giving-children-the-right-tools-to-fight-fake-news/. Accessed 28 Aug 2022.

15. Potterton M. Banishing the BS – how to equip kids with critical thinking skills in an age of fake news. Daily Maverick. 26 Aug 2022. Available: https://www.dailymaverick.co.za/opinionista/2022-08-26-banishing-the-bs-how-to-equip-kids-with-critical-thinking-skills-in-an-age-of-fake-news/. Accessed 28 Aug 2022.

16. Dorst K. How to Polarize Rational People. In: Stranger Apologies [Internet]. 9 Dec 2020 [cited 27 Aug 2022]. Available: https://www.kevindorst.com/stranger_apologies/how-to-polarize-rational-people

17. Dunlop M. University academics’ claim mātauranga Māori “not science” sparks controversy. Radio New Zealand. 28 Jul 2021. Available: https://www.rnz.co.nz/news/te-manu-korihi/447898/university-academics-claim-matauranga-maori-not-science-sparks-controversy. Accessed 28 Aug 2022.

18. Coyne J. “Ways of knowing”: New Zealand pushes to have “indigenous knowledge” (mythology) taught on parity with modern science in science class. In: Why Evolution Is True [Internet]. 3 Dec 2021 [cited 28 Aug 2022]. Available: https://whyevolutionistrue.com/2021/12/03/ways-of-knowing-new-zealand-pushes-to-have-indigenous-knowledge-mythology-taught-on-parity-with-modern-science-in-science-class/

19. Coyne J. More “ways of knowing”: New Zealand government reports that singing traditional Māori songs to saplings helps them grow. In: Why Evolution Is True [Internet]. 23 Aug 2022 [cited 28 Aug 2022]. Available: https://whyevolutionistrue.com/2022/08/23/more-ways-of-knowing-new-zealand-government-reports-that-traditional-maori-talking-and-singing-helps-plants-grow/

20. Orwell G. Politics and the English Language. Horizon. Apr 1946. Available: https://www.orwell.ru/library/essays/politics/english/e_polit/. Accessed 28 Aug 2022.

21. Coyne JA. Faith versus fact: why science and religion are incompatible. New York: Viking; 2015.

22. Lagardien I. Dear politicians, please follow scientists’ lead and abandon failed ideas and theories. Daily Maverick. 27 Apr 2022. Available: https://www.dailymaverick.co.za/opinionista/2022-04-27-dear-politicians-please-follow-scientists-lead-and-abandon-failed-ideas-and-theories/. Accessed 27 Aug 2022.

23. Ask For Evidence. [cited 27 Aug 2022]. Available: https://askforevidence.org/index

24. Haynes L, Service O, Goldacre B, Torgerson D. Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials. 2012 Jun. Available: https://www.gov.uk/government/publications/test-learn-adapt-developing-public-policy-with-randomised-controlled-trials

25. Goldacre B. Building evidence into education. 2013 Mar. Available: https://www.gov.uk/government/news/building-evidence-into-education

26. Dihydrogen monoxide parody. In: Wikipedia [Internet]. [cited 27 Aug 2022]. Available: https://en.wikipedia.org/wiki/Dihydrogen_monoxide_parody

27. Voyageurs Wolf Project. In: Twitter [Internet]. 8 Aug 2022 [cited 27 Aug 2022]. Available: https://twitter.com/VoyaWolfProject/status/1556642788409753600

28. Pattillo A. 5G & Covid-19: The origin, explanation, and reason why scientists are concerned. 4 Aug 2020. Available: https://www.inverse.com/culture/5g-coronavirus-conspiracy-theory-explained. Accessed 27 Aug 2022.

29. Hossenfelder S. What I learned as a hired consultant to autodidact physicists. Aeon. 11 Aug 2016. Available: https://aeon.co/ideas/what-i-learned-as-a-hired-consultant-for-autodidact-physicists. Accessed 27 Aug 2022.

30. Light N, Fernbach PM, Rabb N, Geana MV, Sloman SA. Knowledge overconfidence is associated with anti-consensus views on controversial scientific issues. Sci Adv. 2022;8: eabo0038. doi:10.1126/sciadv.abo0038

31. Nobel disease. In: Wikipedia [Internet]. [cited 27 Aug 2022]. Available: https://en.wikipedia.org/wiki/Nobel_disease

32. Philipp-Muller A, Lee SWS, Petty RE. Why are people antiscience, and what can we do about it? Proc Natl Acad Sci USA. 2022;119: e2120755119. doi:10.1073/pnas.2120755119
