PHILIP M. FERNBACH[...]
STEVEN A. SLOMAN https://orcid.org/0000-0001-8223-3788
20 Jul 2022
Vol 8, Issue 29
Abstract
Public attitudes that are in opposition to scientific consensus can be disastrous and include rejection of vaccines and opposition to climate change mitigation policies. Five studies examine the interrelationships between opposition to expert consensus on controversial scientific issues, how much people actually know about these issues, and how much they think they know. Across seven critical issues that enjoy substantial scientific consensus, as well as attitudes toward COVID-19 vaccines and mitigation measures like mask wearing and social distancing, results indicate that those with the highest levels of opposition have the lowest levels of objective knowledge but the highest levels of subjective knowledge. Implications for scientists, policymakers, and science communicators are discussed.
INTRODUCTION
Uncertainty is inherent to science. A constant striving toward a better understanding of the world requires a willingness to amend or abandon previous truths, and disagreements among scientists abound. Sometimes, however, evidence is so consistent, overwhelming, or clear that a scientific consensus forms. Despite consensus by scientific communities on a handful of critical issues, many in the public maintain anti-consensus views. For example, there are sizable gaps in agreement between scientists and laypeople on whether genetically modified (GM) foods are safe to eat, climate change is due to human activity, humans have evolved over time, more nuclear power is necessary, and childhood vaccines should be mandatory (1). The coronavirus disease 2019 (COVID-19) pandemic also continues, fueled in part by contagion among the unvaccinated (2), while social movements against vaccination policies are emerging worldwide. The consequences of these anti-consensus views are dire, including property destruction, malnutrition, disease, financial hardship, and death (3–6).
Opposition to the scientific consensus has often been attributed to nonexperts’ lack of knowledge, an idea referred to as the “deficit model” (7, 8). According to this view, people lack specific scientific knowledge, allowing attitudes from lay theories, rumors, or uninformed peers to predominate. If only people knew the facts, the deficit model posits, then they would be able to arrive at beliefs more consistent with the science. Proponents of the deficit model attempt to change attitudes through educational interventions and cite survey evidence that typically finds a moderate relation between science literacy and pro-consensus views (9–11). However, education-based interventions to bring the public in line with the scientific consensus have shown little efficacy, casting doubt on the value of the deficit model (12–14). This has led to a broadening of psychological theories that emphasize factors beyond individual knowledge. One such theory, “cultural cognition,” posits that people’s beliefs are shaped more by their cultural values or affiliations, which lead them to selectively take in and interpret information in a way that conforms to their worldviews (15–17). Evidence in support of the cultural cognition model is compelling, but other findings suggest that knowledge is still relevant. Higher levels of education, science literacy, and numeracy have been found to be associated with more polarization between groups on controversial and scientific topics (18–21). Some have suggested that better reasoning ability makes it easier for individuals to deduce their way to the conclusions they already value [(19) but see (22)]. Others have found that scientific knowledge and ideology contribute separately to attitudes (23, 24).
Recently, evidence has emerged suggesting a potentially important revision to models of the relationship between knowledge and anti-science attitudes: Those with the most extreme anti-consensus views may be the least likely to apprehend the gaps in their knowledge. In a series of studies on opposition to GM foods, Fernbach et al. (25) found that individuals most opposed were the least knowledgeable about science and genetics but rated their understanding of the technology the highest in the sample. A similar pattern emerged for gene therapy, although not for climate change denial. Related findings have been reported for opponents of vaccination claiming to know more than doctors about autism (26) and for anti-establishment voters in a Dutch referendum reporting knowing more about the issues than they really do (27). Those with the most strongly held anti-consensus views may be not only the least knowledgeable but also the most overconfident about how much they know (28, 29).
These findings suggest that knowledge may be related to pro-science attitudes but that subjective knowledge—individuals’ assessments of their own knowledge—may track anti-science attitudes. This is a concern if high subjective knowledge is an impediment to individuals’ openness to new information (30). Mismatches between what individuals actually know (“objective knowledge”) and subjective knowledge are not uncommon (31). People tend to be bad at evaluating how much they know, thinking they understand even simple objects much better than they actually do (32). This is why self-reported understanding decreases after people try to generate mechanistic explanations, and why novices are poorer judges of their talents than experts (33, 34). Here, we explore such knowledge miscalibration as it relates to degree of disagreement with scientific consensus, finding that increasing opposition to the consensus is associated with higher levels of knowledge confidence for several scientific issues but lower levels of actual knowledge. These relationships are correlational, and they should not be interpreted as support for any one theory or model of anti-scientific attitudes. Attitudes like these are most likely driven by a complex interaction of factors, including objective and self-perceived knowledge, as well as community influences. We speculate on some of these mechanisms in the general discussion.
The current research makes four primary contributions. First, we test the generality of the relation between extremity of anti-consensus beliefs and scientific knowledge overconfidence (the difference between subjective and objective knowledge). Although related effects have been demonstrated across a handful of contexts and with different operationalizations of the constructs, there has been no test with a unitary methodology across a range of issues. In studies 1 to 3, we examine seven controversial issues on which there is a substantial scientific consensus: climate change, GM foods, vaccination, nuclear power, homeopathic medicine, evolution, and the Big Bang theory. In studies 4 and 5, we examine attitudes concerning COVID-19. Second, we provide evidence that subjective knowledge of science is meaningfully associated with behavior. When the uninformed claim they understand an issue, it is not just cheap talk, and they are not imagining a set of “alternative facts.” We show that they are willing to bet on their ability to perform well on a test of their knowledge (study 3).
Third, if the effect does not generalize to all issues, do the data give any indication why? In discussing why GM foods showed the pattern but climate change did not, Fernbach et al. (25) suggested that a potentially important difference between the issues is degree of political polarization, with climate change attitudes much more polarized by political affiliation than attitudes on GM foods. Political polarization refers to the degree to which people from different ideological groups (e.g., conservatives versus liberals) differ in their positions on an issue. When an issue is highly polarized, there may be less room for individual knowledge to influence attitudes because they are instead driven more by community influence. In studies 1 and 2, we test whether the predicted effects are attenuated for issues that are more politically polarized. Likewise, because several issues that we examine have come into conflict with religious thinking, and because religion can itself be a polarizing factor for attitudes and beliefs (21), we also test for an attenuation for issues more associated with religiosity.
Last, given the life-altering nature of the COVID-19 pandemic, do these relationships shed light on the psychology of those opposed to expert recommendations and policies aimed at reducing the infection rate? The COVID-19 pandemic, caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), is the largest spread of a respiratory disease that the world has seen in over 100 years. Although the knowledge gained and shared by the scientific community about the virus gradually increased, public health professionals prescribed traditional, time-tested, and general epidemiological measures to try to mitigate its spread. Thus, while a scientific consensus on the specifics of SARS-CoV-2 viral transmission emerged slowly, consensus on how to mitigate viral contagion was well established even at the beginning of the pandemic. Nonetheless, there are notable gaps between scientists’ recommendations and the public’s willingness to act in accordance with them (35–37). Here, we examine the relations among objective knowledge, subjective knowledge, and opposition to COVID-mitigating behaviors and policies in two studies, one focused on openness to being vaccinated (study 4) and the other focused on attitudes toward mitigation behaviors such as mask wearing and social distancing (study 5).
Studies 1 and 2: Anti-consensus views across seven scientific issues
The purpose of studies 1 and 2 was to test the generalizability of relations between participants’ opposition to scientific consensus and their objective and subjective knowledge, both within and across seven scientific issues, in a large preregistered study (combined N = 3249).