Misinformation about COVID-19: Psychological Insights

While the precise conceptualization of the term misinformation remains a subject of debate, the current entry defines misinformation as any type of information that is misleading or false, regardless of intent. The COVID-19 pandemic has seen the rapid and widespread sharing of misinformation on a global scale, which has had detrimental effects on containment efforts and public health. This entry offers psychological insights into what makes people susceptible to believing and sharing misinformation and into how this understanding can inform interventions aimed at tackling the issue.

  • misinformation
  • fake news
  • social media
  • COVID-19
  • social cognition
  • public health

1. Introduction

December 2019 saw the emergence of SARS-CoV-2, a novel virus causing the coronavirus disease (COVID-19), which spread aggressively and rapidly across the globe. By 11 March 2020, the World Health Organization (WHO) had declared the outbreak a pandemic, and by 18 September 2021, over 226 million cases of COVID-19 and 4.7 million deaths had been reported worldwide [1]. This global crisis was paralleled by the widespread sharing of both scientific and non-scientific information surrounding COVID-19 across multiple media channels. For the first time in history, social media and technology were being used on a huge scale by public health authorities and other institutions to keep people informed, safe, and connected. Social media and technology played an essential role in the response to the pandemic, for instance, through the implementation and promotion of public health measures, the tracking and mapping of symptoms, and the prediction of outbreaks in real time. At the same time, however, this same technology also facilitated the overabundant spread of information from uninformed sources, not all of which was accurate and reliable. The global scale of the pandemic amplified this spread as people urgently sought out and shared information in an effort to protect themselves, their families, and their communities against the virus [2].
On 15 February 2020, T.A. Ghebreyesus, the Director-General of WHO, announced the concern that the omnipresence and overabundance of often conflicting and inaccurate information posed a significant challenge for public health and was jeopardizing the response to the pandemic [3]. WHO declared that the world was facing what it termed an infodemic: “an overabundance of information, some accurate and some not, that makes it hard for people to find trustworthy sources and reliable guidance when they need it” [4] (p. 2). The COVID-19 infodemic saw the spread of information concerning the origin and cause of the virus and disease, the transmission of the virus and symptoms of the disease, available prophylactics, treatments, and cures, and the impact and efficacy of interventions by public health authorities or other institutions [4]. Amongst this information were fake news, misinformation, disinformation, and conspiracy theories, which caused many to mistrust reliable sources of information and develop a distorted risk perception of the virus [5][6]. As a result, people were less likely to adopt preventative public health behaviors, which had an adverse effect on the implementation and efficacy of containment strategies [5][7].
The management of the infodemic was soon publicly recognized by WHO as a crucial part of the response to COVID-19 [3]. On 29 June 2020, WHO held its first global infodemiology conference [8], which led to the publication of the WHO Public Health Research Agenda for Managing Infodemics [9]. In this publication, WHO identified a need for research in the field of psychology to identify the factors that make people more likely to share or believe inaccurate information [9]. Understanding these factors can inform and enhance the development of innovative and creative interventions aimed at infodemic management.
This entry begins with a description of what constitutes fake news, misinformation, and disinformation, explores cases from the COVID-19 infodemic, and considers the effect these had on the societal response to the pandemic. It then explores the main psychological factors found to play a role in the believability of misinformation and in sharing behavior. The entry ends with a description of interventions aimed at addressing misinformation.

2. Fake News, Misinformation, Disinformation, and COVID-19

The sharing of fake news is not something new. A classic example dates back to 1835, when a series of fabricated articles reporting the discovery of life and civilization on the moon was published by The Sun newspaper in New York [10]. The mid-1890s saw a surge in fake news, when two major newspapers, W.R. Hearst’s New York Journal and J. Pulitzer’s New York World, competed for readers by prioritizing sensationalism over accuracy [11]. The promotion of fake news facilitated its circulation on a mass scale, a strategy soon adopted by other newspapers in their attempts to gain popularity; this practice came to be known as yellow journalism [11]. Another instance began in 1933, when the Nazi government founded the Reichsministerium für Volksaufklärung und Propaganda (RMVP) to enforce Nazi ideology through the spread of carefully choreographed propaganda [12]. A similar strategy was employed by the Soviet Union, which established a unit specializing in the manufacture and dissemination of disinformation in an attempt to influence political attitudes and public opinion [13]. The sharing of fake news has also played an influential role in public health issues over the years. For instance, in 1918, fake news surrounding the emergence of the H1N1 Influenza A virus led to its being coined the Spanish flu, despite there being no evidence that it originated in Spain [14]. This had significant detrimental economic and psychosocial consequences due to stigmatization [14][15].
Although the sharing of fake news is nothing new, the proliferation and democratization of social media have provided a principal conduit, allowing it to spread more rapidly than ever before. With this has come accelerated growth in public and scientific interest, with the term since being referred to as a global buzzword [16][17].

2.1. Defining Fake News, Misinformation, and Disinformation

No consensus definition of fake news currently exists [18], although various definitions have been proposed. Fake news has been referred to as news which “aesthetically resembles actual legitimate mainstream news content but that is fabricated or extremely inaccurate” [19] (p. 389) and as “false information masquerading as verifiable truth” [20] (p. 735). Based on a review of the literature, Tandoc et al. [17] proposed a typology of fake news, defining it as news satire, news parody, fabrication, manipulation, propaganda, and advertising. A similar conceptualization was proposed by Waszak et al. [21], who added to this the idea that fake news is often irrelevant. Shu et al. [18] characterized fake news as having the intent to deceive and a verifiable lack of authenticity.
The terms misinformation and disinformation are widely used in research on fake news [16]. However, the way in which these terms are used varies. Some have used the terms to distinguish false information that is spread intentionally from that which is spread unintentionally, for example, referring to misinformation as the “inadvertent sharing of false information” and disinformation as the “deliberate creation and sharing of information known to be false” [22]. More specifically, misinformation has been referred to as the publishing of “wrong information without meaning to be wrong or having a political purpose in communicating false information”, and disinformation as “manipulating and misleading people intentionally to achieve political ends” [23] (p. 24). At the same time, some have used the terms interchangeably [24][25], and others have used misinformation to mean all kinds of misleading information and disinformation to mean only that which is intentionally misleading [26][27]. In line with this, and for the purposes of this entry, the term misinformation is used to encompass all types of misleading or false information, regardless of intent.

2.2. The Role of Technology and Social Media

Advances in technology have redefined the way in which information is published, spread, and accessed [23][28]. Hardware devices, such as smartphones and tablets, are becoming more and more affordable, removing financial barriers and allowing easier access to a variety of tools [29]. A notebook in 2001, for instance, was priced at $2200, whereas a similar device in 2020 was priced at just $350 [29]. As a result, access to technology is rapidly and continually increasing, in what has come to be known as the democratization of technology. At the same time, technology is becoming increasingly user-friendly, offering a variety of tools with which good-quality content can easily be created. Content of higher quality is more likely to be perceived as coming from a reliable source, and thus is more likely to be believed [30]. Using technology, such content can be instantaneously uploaded and shared online, for instance through one of the many popular, easily accessible, free-of-charge social media platforms. The democratization of technology therefore plays a pivotal role in the exchange and spread of misinformation.
Social media platforms, in particular, have provided an efficient, user-friendly, highly accessible tool that allows for the high-speed, cross-platform publishing and sharing of information, without vetting and at no cost [28][31]. Social media has recently been referred to as a “powerful source for fake news dissemination” [18] (p. 23). A survey carried out by the Pew Research Center found that just over half (53%) of U.S. adults get their news from social media, with Facebook, YouTube, Twitter, and Instagram being the most popular platforms of choice [32]. In addition to changing the way that news is spread, technology has also impacted the way news looks, with tweets (short written messages shared on Twitter, originally limited to 140 characters) being considered significant news [17]. Another critical factor in the spreading of misinformation is that most social media posts are accompanied by popularity ratings (e.g., likes, shares, and comments). The more popular a post appears to be, the more attention it receives, and the more likely it is to be liked, shared, and commented on, regardless of how accurate the information is [33]. Indeed, misinformation has been found to outperform accurate information when it comes to engagement and popularity [34]. The repeated sharing of posts on and between social media platforms poses a further challenge, as it makes it more difficult for users to determine the proximate source of the information [34]. The result is an online environment with an infinitely and rapidly flowing stream of posts, some carrying accurate information and others inaccurate, varying in appearance and level of detail, with a wide range of popularity ratings and often indistinguishable sources. This makes it very challenging for people to distinguish accurate from inaccurate information.

2.3. Misinformation during the COVID-19 Pandemic

The COVID-19 pandemic provided the ideal conditions for misinformation surrounding the virus to thrive: high fear, low trust, and low confidence [35]. As a result, social media platforms were inundated with shared misinformation about the virus [36][37][38]. For instance, over 25% of the most viewed YouTube videos relating to COVID-19 (with over 62 million views worldwide) were found to contain misinformation [39]. Other research found that 46% of the U.K. population [40] and 48% of the U.S. population [41] reported being exposed to misinformation surrounding COVID-19, with 66% of those exposed reporting repeated daily exposure [42].
Misinformation about COVID-19 ranged from conspiracy theories that the virus was bioengineered in a lab in Wuhan, China [43], to the promotion of fake cures such as adding pepper to meals, drinking or injecting oneself with bleach, and gargling lemon and salt water [44], to claims that the symptoms of COVID-19 were exacerbated by the 5G cellular network [45]. Another prevailing narrative claimed the virus was being used by B. Gates to enforce a global vaccination program and surveillance regime [35]. One of the most widespread examples of COVID-19-related misinformation was Plandemic, a conspiracy film that promoted anti-scientific health advice, such as avoiding masks on the grounds that they activate the virus [46]. Even political leaders contributed to the spreading of fake news, despite having access to official information. For instance, both U.S. President D. Trump and Brazilian President J. Bolsonaro actively promoted hydroxychloroquine as an effective treatment against the virus despite the lack of scientific evidence on the efficacy of the drug [47]. This illustrates how the veracity of misinformation is often difficult to determine, as it is not always blatant. For instance, even though there was no conclusive evidence of its efficacy, hydroxychloroquine was at the time being studied as a potential treatment for COVID-19 [48].

2.4. Effects of Misinformation on the Societal Response to COVID-19

The detrimental effect of misinformation surrounding the COVID-19 pandemic has been made evident by the societal response on the behavioral level [5]. Numerous events and behaviors demonstrated the extent of the believability of misinformation, with uninformed opinion and conspiracy theories often being falsely equated with scientific evidence [20]. Research has shown that believing misinformation interferes with perceptions of the seriousness of COVID-19, leading to an underestimation of the risk posed by the virus [6]. As risk perception has been found to be significantly associated with the adoption of preventative health behaviors [7], this might partially explain why so many failed to adhere to the recommended guidelines for the containment of the virus (e.g., frequent hand washing and social distancing) [49][50] and were hesitant to receive the vaccine [50][51]. An additional explanatory factor might be the increased propensity to mistrust information from expert sources that is associated with believing misinformation [50][52], leading to the adoption of avoidance behaviors over health-protective behaviors [53]. Misinformation surrounding COVID-19 has also encouraged many to try dangerous treatments with severe health consequences. For instance, the myth that COVID-19 can be cured by drinking highly concentrated alcohol led many to follow this advice, resulting in over 5800 hospitalizations, 60 cases of blindness, and 800 deaths worldwide [54].
Besides influencing health behaviors, misinformation has also led to the stigmatization of various groups of people, such as Chinese people, who faced discrimination due to biased and misleading media coverage [55]. “China Kids Stay at Home” [56] and “China Is the Real Sick Man of Asia” [57] were amongst the headlines promoted by influential news companies worldwide. As a result, Chinese people faced racial discrimination, unequal treatment, and social isolation, with negative psychological consequences including stress and anxiety [58]. This discrimination was not limited to Chinese people but extended to Asian people more generally. For instance, between 19 March 2020 and 15 April 2020, 1497 instances of COVID-19-related discrimination against Asian-Americans were reported to the Asian Pacific Policy and Planning Council in the U.S. [59]. These included hate crimes such as the attempted murder by stabbing of a Burmese-American father and his four-year-old and two-year-old children, because the perpetrator mistakenly assumed the family was Chinese and therefore spreading the virus [60]. Other acts of violence due to misinformation included the destruction of telecommunication masts and the verbal and physical abuse of telecommunication workers in the U.S., Europe, and Australasia due to the 5G conspiracy theory (cf., Jolley and Paterson [61]), as well as mob attacks [5].

3. Psychological Factors Affecting the Susceptibility to Misinformation

It should not be overlooked that many people were not susceptible to believing misinformation and instead adhered to expert-recommended guidelines and adopted appropriate health-preventative behaviors [7]. This raises the question of what makes some people more susceptible to believing misinformation than others. Identifying such factors is important for informing interventions that address misinformation. This section uses psychological theory to explore what makes people susceptible to believing misinformation.

3.1. Emotionality

The rapid spread of the highly contagious SARS-CoV-2 and its associated high mortality rates meant that feelings of uncertainty and threat were rife during the COVID-19 pandemic [38]. This was exacerbated by extreme measures aimed at containing the virus, such as social isolation and quarantine, which left many experiencing significant feelings of anger, confusion, anxiety, and stress [62]. Prolonged elevated stress-related emotions are well known to activate symptoms of depression, and this proved to be the case for many [63][64]. A systematic review and meta-analysis of research on the prevalence of stress-related emotions during the COVID-19 pandemic concluded that the pandemic had a significant detrimental effect on public mental health [64]. In one study, for instance, over one-third (34%) of Chinese people reported experiencing moderate to severe levels of stress- or anxiety-related symptoms in response to COVID-19 [65]. The stress-related emotional response to the pandemic was deemed so significant that it led to the conceptualization of COVID stress syndrome [66].
The emotional response to the COVID-19 pandemic is a crucial factor to consider for understanding the believability and spread of misinformation [38][67]. People tend to be strongly motivated to maintain a sense of control and understanding over their lives [68], and when this sense is under threat, it results in heightened feelings of anxiety [69]. In an attempt to reduce this anxiety and regain their sense of control, people “compensate with strategies that lead to greater acceptance of misconceptions” [70] (p. 3). These strategies include sense-making mechanisms, whereby information is obtained from various sources in order to make sense of a complex and unfamiliar situation, as well as having someone or something to blame and project feelings of anxiety towards [67].
Misinformation, and especially conspiracy theories, provide a narrative that allows people both to make sense of a situation and to place the blame somewhere, by explaining an event as the result of an influential individual's or organization's secret attempts to achieve a sinister goal [67]. They offer an appealing route to making sense of a situation, and thus to regaining a sense of control, in a way that is often clearer and simpler than that offered by accurate information. For instance, believing that China manufactured SARS-CoV-2 in a laboratory as a bioweapon offers an intelligible explanation of the origin of the virus, as well as somewhere to place the blame. The accurate account, on the other hand, which proposes that SARS-CoV-2 could have originated from the transmission of the virus from an animal to a human (a random, uncontrollable event), potentially enhances anxiety and the sense of lacking control. It has been repeatedly shown that feelings of anxiety and a lack of control foster openness to information and are a significant driving factor underlying a higher propensity to believe misinformation and conspiracy theories (see, e.g., Douglas et al. [71]).

3.2. Motivated Reasoning

On 19 April 2020, just over a month into the COVID-19 pandemic, an American protestor shouted, “This is a free country. Land of the free. Go to China if you want communism” at a nurse who was counter-protesting the lifting of quarantine measures [72]. This was one of many incidents that demonstrated how the pandemic led to the polarization of discourse and revealed deep-rooted epistemological and political positions [20]. This was largely fueled by the actions of governors and politicians, many of whom had opposing views on COVID-19 [73]. Various governors in the U.S., for instance, discounted the recommendations of health officials, failing to fully implement social distancing measures and openly discouraging the use of face masks [73]. In addition, news coverage surrounding COVID-19 was highly polarized and politicized, with politicians appearing more frequently in the news than scientists [74]. The politicization of COVID-19 was found to have had a detrimental effect on efforts to contain the virus, mainly through the spread of misinformation [75].
There is a growing body of research that links political ideology to the societal response to the pandemic [73][76][77], with political differences having been found to be the most significant factor predicting policy preferences and the adoption of health behaviors [77]. Political ideology has also been found to underlie susceptibility to believing and sharing misinformation [73], making it a crucial factor to consider in understanding the COVID-19 infodemic. The link between political ideology and the believability of misinformation can be better understood using the psychological theory of motivated reasoning or identity-protective cognition.
Motivated reasoning posits that information processing is directed so that it protects, and is non-threatening to, an individual's existing beliefs or identity [78]. When an individual is faced with information that conflicts with either of these, they are likely to experience cognitive dissonance, a state of mental discomfort caused by conflicting attitudes or beliefs [79]. When in cognitive dissonance, people engage in thought processes that serve to minimize this discomfort. Building on this, the theory of motivated reasoning proposes that when faced with multiple sources of polarized information, people are more likely to believe information that reinforces their pre-existing beliefs (confirmation bias) and to reject information that undermines those beliefs (disconfirmation bias) [19][73][80]. Therefore, pre-existing ideological and partisan attitudes and beliefs might prevent people from fact-checking information and lead to higher levels of engagement with ideologically concordant information [81]. These higher levels of engagement feed the curation algorithms of social media platforms, which present users with content that maintains their interest and maximizes engagement. As a result, people become enclosed in a filter bubble, in which they are more likely to be exposed to information that confirms their pre-existing attitudes (selective exposure) and less likely to be exposed to diverse content [81][82]. In this way, the interaction between user engagement and algorithmic content curation contributes to the spread of misinformation and thus presents a significant challenge in addressing the issue.
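To make this feedback loop concrete, the sketch below (purely illustrative; no platform discloses its actual ranking logic, and the scoring weights, class names, and affinity values here are invented assumptions) shows how an engagement-weighted curation rule can produce a filter bubble: popularity ratings and inferred topic affinity determine what is shown, while accuracy never enters the calculation.

```python
# Hypothetical sketch of engagement-weighted feed curation; weights and
# affinity values are invented for illustration, not any real platform's.
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post, topic_affinity: dict) -> float:
    # Popularity ratings drive the score; veracity is never consulted.
    popularity = post.likes + 2.0 * post.shares + 1.5 * post.comments
    return popularity * topic_affinity.get(post.topic, 0.1)

# Affinity inferred from a user's past clicks and shares (selective exposure).
affinity = {"lab-leak conspiracy": 0.9, "public health guidance": 0.2}

feed = [
    Post("lab-leak conspiracy", likes=900, shares=400, comments=300),
    Post("public health guidance", likes=1200, shares=150, comments=100),
]
feed.sort(key=lambda p: engagement_score(p, affinity), reverse=True)
print([p.topic for p in feed])  # concordant content ranks first: a filter bubble
```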
Although there is much evidence that political ideology is associated with believability (cf., Pennycook and Rand [19]), it is essential to consider that this effect has been found to be smaller than that of the accuracy of the information [83]. Therefore, information that is accurate but politically discordant is more likely to be believed than misinformation that is politically concordant [19]. Since accuracy is a more significant predictor of susceptibility to misinformation than political concordance, this raises the question of why misinformation is ever believed at all. One explanation is offered by Pennycook and Rand [84], who suggest that the issue lies in whether or not people are able to accurately determine the integrity of information, and therefore that susceptibility to misinformation might actually be better explained by a lack of reasoning than by motivated reasoning.

3.3. Cognitive Reasoning

Dual-process theory characterizes human cognition as having two distinct thinking styles: intuitive or autonomous thinking (system 1 processing) and analytic, rational, or reflective thinking (system 2 processing) [85][86]. The distinction between the two is demonstrated through performance on conflict tasks: an intuitive but incorrect response results from intuitive thinking, which is speedy, effortless, and requires minimal working memory resources, whereas the correct response requires analytic thinking, which is effortful, deliberate, and demands more working memory resources. However, humans tend to instinctively avoid resource-demanding processes whenever possible [87]; according to Pennycook and Rand [84], “humans are cognitive misers, in that resource-demanding cognitive processes are typically avoided” (p. 2). This is problematic when it comes to discerning truth from falsehood, given that analytic reasoning supports sound judgment [88].
There is a growing body of research supporting the association between analytic reasoning processes and skepticism about epistemically ambiguous information [86]. For instance, a greater tendency to engage in analytic thinking is linked to the detection of pseudo-profound bullshit [86] as well as to the rejection of conspiracy theories [89], including those related to COVID-19 [90]. However, more recent research highlights the critical role of prior knowledge in susceptibility to misinformation. It has been shown, for example, that the association between analytic reasoning and the rejection of misinformation is significantly stronger when the information is more obviously inaccurate [84], which suggests that an individual's prior knowledge is an important factor underlying susceptibility to misinformation. This is supported by the finding that scientific reasoning is a stronger predictor of COVID-19 conspiracy theory beliefs than analytic thinking [90].
Scientific reasoning refers to having scientific knowledge and applying its “methods or principles of scientific inquiry to reasoning and problem-solving situations” [91] (p. 173). To better understand the role of scientific reasoning in the believability of misinformation, it is important to note the distinction between a belief (an attitude based on realistic, factual evidence) and an epistemically suspect belief (a belief that is not supported by factual evidence and that conflicts with current knowledge) [90]. For example, the belief that methanol can cure COVID-19 conflicts with the factual evidence that methanol is toxic for human consumption. People with better scientific reasoning skills tend to hold the deep-rooted belief that scientific knowledge provides the most accurate conceptualization of the world [90]. As a result, they also tend to hold beliefs that are supported by scientific evidence and therefore hold fewer epistemically suspect beliefs [92]. It seems, then, that having pre-existing scientific knowledge, and actually stopping to apply this knowledge in an analytic reasoning process, makes people less susceptible to believing COVID-19 misinformation.

3.4. Heuristics

Cognitive psychology proposes heuristics as thought processes underlying intuitive thinking that ease the cognitive load of making judgments [93]. The word heuristic originates from the Ancient Greek word εὑρίσκω, meaning to find [93]. It refers to the process whereby an individual makes a decision based on a general rule of thumb, with very little cognitive reasoning involved [93]. Heuristics offer useful shortcuts for making a quick judgment call; however, the judgment is not guaranteed to be optimal or rational [93]. Simon [94] describes heuristics as satisficing: offering solutions that are good enough for the situation at hand but that could be optimized. Research has shown that heuristics are used extensively in decision making in a variety of contexts (cf., Horne et al. [93]).
A recent study assessed the use of heuristics in judging the veracity of COVID-19-related information [93]. Participants were shown a variety of news headlines relating to COVID-19 and were asked whether or not they believed the information and why. The researchers found that heuristics were used extensively in making judgments and that these fell into three broad categories: self-cognitive heuristics, content heuristics, and source heuristics. Self-cognitive heuristics are based on how far the information aligns with an individual's beliefs, previous experiences, or pre-existing knowledge; the greater the alignment, the more likely the individual is to accept the information as accurate. It was found that people made judgments by comparing the news to beliefs such as “vaccinations are proven safe and everyone should get vaccinated”, or to previous experiences such as “I use the oil and it works”. Content heuristics consider supporting evidence, bias, accuracy, coherence, and writing style. An example of a content heuristic is “they use derogatory terms such as libtard which is an obvious sign that the article is biased”. Source heuristics concern whether the source of the information is perceived as accurate and include judgments such as “the Chicago Sun-Times is a reputable paper”.
The most commonly used heuristics were belief alignment heuristics (29%), followed by knowledge alignment heuristics (22%), with just 6% of judgments based on perceived accuracy. These findings provide further support for the argument that a lack of cognitive reasoning, and therefore a failure to accurately judge the veracity of information, underlies susceptibility to believing misinformation [84].

4. Social Media Sharing

The simple click or tap of a like or share button is all it takes for the instantaneous dissemination of information. Advances in social media technology have granted users the ability to share information across multiple platforms simultaneously. A survey carried out by the Pew Research Center found that around three-quarters (73%) of U.S. adults are multiplatform users [95]. Of Facebook users, for example, 91% also use Instagram, 90% use Twitter, 90% use LinkedIn, 89% use Pinterest, 89% use Snapchat, 85% use WhatsApp, and 81% use YouTube [95]. This creates an expansive online network allowing for the seamless sharing of information, with every single share significantly expanding its reach. If, for example, an individual with 1000 Twitter followers shared a post, which in turn was shared by just 10% of those followers (each to their own network of 1000 followers), the post would effortlessly have reached 100,000 people [96]. This illustrates how conducive social media is to the rapid spreading of information, and how sharing exacerbated the spread of misinformation that led to the COVID-19 infodemic [2]. Exploring social media sharing behavior can therefore enhance our understanding of the spread of misinformation during the COVID-19 pandemic.
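As a rough illustration of this arithmetic, the following sketch (a deliberately simplified model that assumes uniform follower counts and non-overlapping networks, both of which real networks violate) reproduces the example above:

```python
# Simplified reach model: every account has the same follower count and
# follower networks do not overlap (real networks violate both assumptions).
def estimated_reach(followers: int, reshare_rate: float, hops: int) -> int:
    """Cumulative accounts exposed after a given number of sharing hops."""
    reached = 0
    sharers = 1  # the original poster
    for _ in range(hops):
        reached += sharers * followers
        sharers = int(sharers * followers * reshare_rate)
    return reached

# The example from the text: 1000 followers and a 10% reshare rate give
# 1000 directly reached plus 100 resharers x 1000 followers = 101,000 exposed.
print(estimated_reach(followers=1000, reshare_rate=0.10, hops=2))
```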

Understanding Sharing Behavior

Shared information is often mistakenly assumed to have been shared because the individual believed it [19]. However, recent research has shown that this is not necessarily the case and that information is shared for a variety of reasons [19]. In one study, for instance, participants were presented with false claims about COVID-19 and asked whether or not they would share them on social media [97]. The perceived accuracy of a statement did not play a significant role in the intention to share it, with intentions to share being 91% higher than judgments of accuracy. These findings demonstrate that people are willing to share COVID-19-related information without being certain of its accuracy. Pennycook and Rand [19] identify three possible explanations for this disconnect between sharing intention and accuracy judgment: the preference-based account, the confusion-based account, and the inattention-based account.
Consistent with the theory of motivated reasoning [85], the preference-based account proposes that people prioritize their political identity or moral viewpoints over accuracy and truth. From this perspective, people share misinformation, even when they know it is inaccurate, for reasons driven by their political or moral ideology. These reasons might include virtue signaling [98], social dominance orientation [99], furthering a political agenda [77], or simply finding the information interesting [30]. However, according to Pennycook et al. [83], just 16% of shared misinformation is driven by preference-based motives.
According to the confusion-based account, people mistakenly but genuinely believe the misinformation they share to be accurate. This perspective is supported by the findings of Pennycook et al. [83], which showed that only one-third (33%) of shared misinformation was believed, with the remaining two-thirds (67%) shared as a result of confusion. Based on these findings, a significant amount of the information shared online can be explained by confusion.
The inattention-based account suggests that while people generally prefer to share only accurate information, they fail to do so due to distractions in the online social media environment. Social media environments have become an attention economy, where posts compete for user attention and provide distractions that hinder analytic thinking processes [64]. Much content is generated and posted with the aim of capturing as much attention as possible. This is often done through the posting and sharing of ideologically extreme posts in the hope of achieving high popularity ratings (likes, comments, and shares). Popularity ratings have been repeatedly shown to attract more attention, regardless of information veracity [33][96]. Such an environment is therefore likely to make people who engage in intuitive rather than analytic reasoning [100] especially susceptible to believing and sharing misinformation. Indeed, analytic reasoning, besides being associated with a higher tendency to reject misinformation [89], is also associated with sharing behavior based on more accurate judgments of information veracity [97] and with the sharing of more reliable information [101]. A lack of analytic thinking therefore seems to be a source of misjudgments when it comes to sharing information on social media.

5. Interventions for Addressing Misinformation

Current interventions for addressing misinformation fall into four broad categories: algorithmic, corrective, legislative, and psychological [102]. Algorithmic approaches use machine learning, network analysis, and natural language processing to detect misinformation [18]. A ranking algorithm then downranks any information classified as problematic, making it less likely that users will see it. Although these approaches have been implemented by social media companies including Google and Facebook, they have not been entirely effective, for two main reasons. Firstly, it is not always easy to ascertain the veracity of information; the truth is not always black and white. Algorithmic approaches therefore run the risk of false positives and unjustified censorship, which is what happened at Facebook in 2017 [103]. Secondly, misinformation evolves rapidly, as the COVID-19 infodemic has proven. For algorithmic approaches to remain effective, they would need to evolve at the same pace, which is difficult given that even classifiers trained to detect misinformation were unequipped for novel claims surrounding COVID-19 [19].
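As a toy illustration of this category (a sketch only, assuming scikit-learn and a handful of invented training examples; production systems are vastly larger and more sophisticated), a supervised text classifier can be paired with score-based downranking as follows. The sketch also hints at both failure modes discussed above: a false positive silently buries legitimate content, and a claim unlike anything in the training data yields a near-uninformed score.

```python
# Toy sketch of algorithmic detection plus downranking, assuming scikit-learn.
# The training texts and labels are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

train_texts = [
    "miracle cure that doctors do not want you to know",   # misinformation
    "drinking bleach kills the virus inside the body",     # misinformation
    "health agency updates its guidance on face masks",    # legitimate
    "peer-reviewed trial results published in a journal",  # legitimate
]
train_labels = [1, 1, 0, 0]  # 1 = misinformation in this toy corpus

vectorizer = TfidfVectorizer()
classifier = LogisticRegression().fit(
    vectorizer.fit_transform(train_texts), train_labels
)

def downrank(posts, base_scores):
    """Scale each post's ranking score by its estimated probability of being
    accurate; any misclassification silently suppresses legitimate content."""
    p_misinfo = classifier.predict_proba(vectorizer.transform(posts))[:, 1]
    return [score * (1.0 - p) for score, p in zip(base_scores, p_misinfo)]

# A novel claim shares no vocabulary with the training data, so the model's
# output is close to chance, mirroring the COVID-19 staleness problem above.
print(downrank(["5G towers spread the new virus"], [1.0]))
```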
Corrective approaches attempt to debunk misinformation through fact-checking and correction [104]. Fact-checking initiatives such as PolitiFact [105] and Snopes [106] check and debunk major headlines, publishing the results on their websites. The evidence on the efficacy of fact-checking approaches is mixed, with some studies supporting their effectiveness in addressing misinformation and others suggesting they could actually increase belief in misinformation [107][108]. Since it is impossible to fact-check every story, stories that have not been checked may be mistakenly assumed to have been verified and therefore regarded as accurate. Moreover, fact-checking and debunking approaches were unable to handle the surge of information during the COVID-19 pandemic; they are simply not scalable [109].
Some countries have adopted a legislative approach to tackling misinformation, introducing new regulation and legislation. For instance, France introduced the Fake News Law, which placed restrictions on the information media companies are allowed to publish [110]. A similar initiative was implemented in the U.K., where a specialist unit was set up to counter false claims about the COVID-19 pandemic [111]. The concern with such initiatives, however, arises from granting a single organization the power to decide what counts as accurate and what does not [102]. EUvsDisinfo (a European Union-funded group dedicated to tackling misinformation), for instance, was subject to heavy criticism, including from Dutch politicians, for infringing on freedom of speech, and it was subsequently proposed that the initiative be scrapped altogether [112].

5.1. Psychologically-Informed Interventions for Addressing Misinformation

The shortcomings of the approaches described above have seen scientists turn to psychology, education, and the behavioral sciences in search of more effective interventions for addressing the sharing and spread of misinformation [102]. Two promising psychologically informed approaches, namely inoculation (or prebunking) and accuracy prompts, are detailed below.

5.1.1. Inoculation

Misinformation has been described as something which “spreads through networks much like a real virus ‘infecting its host’ and rapidly transmitting falsehoods from one mind to another” [5] (p. 3). The non-psychological interventions described above all attempt to correct misinformation after the damage has already been done and face various difficulties in doing so. Researchers have now shifted their focus to more proactive prebunking (i.e., preemptive debunking) or inoculation against misinformation [5][19][102][109].
This approach is based on inoculation theory, which draws an analogy from immunology [113]. Inoculation theory posits that, in the same way that vaccines work through exposure to a weakened version of a virus, preemptive exposure to weakened examples of misinformation might make people more immune and less susceptible to believing it [113]. In what van der Linden et al. [5] term a persuasion inoculation, individuals are presented with misinformation that has been weakened by the addition of two elements [5][102][113]. The first is a forewarning that the individual is about to be exposed to counter-attitudinal content (the affective basis), which is thought to elicit feelings of threat and trigger the protection of pre-existing beliefs. The second is a preemptive refutation of counterarguments (the cognitive basis), which essentially teaches and informs the user by modelling the counterarguing process. The information is weakened to the point where it does not actually persuade the person but is enough to trigger protective responses such as enhanced analytical thinking [113]. Following this experience, the individual develops mental antibodies to misinformation and will likely employ these when exposed to similar challenges in the real world, thus reducing their susceptibility to misinformation. A meta-analysis of research on inoculation theory concluded that it is indeed effective at protecting attitudes from persuasion [114].
The Bad News Game [115] is an award-winning online browser game that puts inoculation theory into practice. It uses a simulated social media environment in which the user plays the role of a misinformation creator and learns about the spreading of misinformation in an engaging way (cf., van der Linden & Roozenbeek [102] for a detailed description). Similar to this is Go Viral! [116], a practical application of inoculation theory developed by WHO in collaboration with the U.K. government, specifically aimed at inoculating people against COVID-19 misinformation. This game focuses on building resistance to three techniques used on social media to manipulate people: fearmongering, conspiracy theories, and the use of fake experts. Research has shown that these games significantly improve the ability to identify and resist misinformation [102][117].
One limitation of such approaches is identified by Pennycook and Rand [84], who note that they are opt-in: people have to voluntarily choose to engage with the inoculation technique. The problem with this is that people who are low on cognitive reflection and most susceptible to misinformation (and therefore most in need of inoculation) are also less likely to participate in such activities. Shorter forms of inoculation (e.g., presenting digital media literacy tips) have proven effective in helping people to determine news veracity [118], and these may be more scalable and have greater reach.

5.1.2. Accuracy Prompts

As noted above, the inattention-based account of misinformation sharing on social media posits that people generally only want to share information that is accurate, and that one of the main reasons underlying the spread of misinformation is the failure to accurately determine its veracity prior to sharing [83][97]. Research has shown that by shifting their attention towards accuracy, people become better able to distinguish misinformation from accurate information [97]. Accuracy prompts encourage people to do just this by having them rate the accuracy of information prior to making a judgment about sharing it. This approach is appealing because it does not rely on software to identify and distinguish between accurate information and misinformation (as is the case with algorithmic approaches [102]). In addition, accuracy prompts are easily scalable (unlike fact-checking, for instance, which is time consuming and does not cover all information [109]). Accuracy prompts therefore provide a promising approach to tackling misinformation in a way that preserves user autonomy and encourages people to exercise their desire to avoid sharing misinformation [83].
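A minimal sketch of how such a prompt might sit in a sharing flow is shown below (purely illustrative; no real platform API is implied, and the prompt wording is an invented assumption). Note that the share is never blocked: the prompt only redirects attention.

```python
# Illustrative accuracy-prompt flow; not a real platform API.
def share_with_accuracy_prompt(headline: str, ask=input) -> bool:
    """Ask for an accuracy rating first, then let the user decide to share."""
    rating = ask(f"How accurate is this headline, from 1 (not at all) "
                 f"to 5 (very)? {headline!r} ")
    print(f"You rated this headline {rating}/5 for accuracy.")
    # The decision remains entirely the user's; the prompt only shifts focus
    # toward veracity before the sharing judgment is made.
    return ask("Share it? (y/n) ").strip().lower() == "y"

# Interactive example:
# share_with_accuracy_prompt("Pepper in meals prevents COVID-19")
```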

6. Concluding Remarks

The COVID-19 infodemic has offered a stark warning of the sheer scale and detrimental effects of misinformation, highlighting the importance of understanding its spread as well as its consequences for public health and everyday life. The evidence presented in this entry reveals a variety of factors underlying the failure to discern misinformation from truth, including values and beliefs, political ideology, and scientific knowledge. Overall, however, it seems that susceptibility to misinformation is mainly due to people failing to actually stop and think about the information they are exposed to. In the chaotic online environment created by social media, intuitive reasoning trumps analytic reasoning, resulting in endless impulsive clicks and taps, mindless shares, and superficially alluring popularity ratings. In light of these findings, interventions aimed at tackling misinformation are shifting their focus from remedial to preventative approaches, with creative initiatives that encourage people to think more deeply before they act. Although there is still some way to go in fully understanding the spread of misinformation, the field of psychology is proving to offer valuable insights and promising avenues for future research.

References

  1. World Health Organization. WHO Coronavirus (COVID-19) Dashboard. Available online: https://covid19.who.int (accessed on 18 September 2021).
  2. Tangcharoensathien, V.; Calleja, N.; Nguyen, T.; Purnat, T.; D’Agostino, M.; Garcia-Saiso, S.; Landry, M.; Rashidian, A.; Hamilton, C.; AbdAllah, A.; et al. Framework for Managing the COVID-19 Infodemic: Methods and Results of an Online, Crowdsourced WHO Technical Consultation. J. Med. Internet Res. 2020, 22, e19659.
  3. World Health Organization. Munich Security Conference. Available online: https://www.who.int/dg/speeches/detail/munich-security-conference (accessed on 18 September 2021).
  4. World Health Organization. Coronavirus Disease 2019 (COVID-19) Situation Report-86. Available online: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200415-sitrep-86-covid-19.pdf?sfvrsn=c615ea20_6 (accessed on 18 September 2021).
  5. Van der Linden, S.; Roozenbeek, J.; Compton, J. Inoculating Against Fake News About COVID-19. Front. Psychol. 2020, 11.
  6. Krause, N.M.; Freiling, I.; Beets, B.; Brossard, D. Fact-Checking as Risk Communication: The Multi-Layered Risk of Misinformation in Times of COVID-19. J. Risk Res. 2020, 23, 1052–1059.
  7. Dryhurst, S.; Schneider, C.R.; Kerr, J.; Freeman, A.L.J.; Recchia, G.; van der Bles, A.M.; Spiegelhalter, D.; van der Linden, S. Risk Perceptions of COVID-19 around the World. J. Risk Res. 2020, 23, 994–1006.
  8. World Health Organization. 1st WHO Infodemiology Conference: How Infodemics Affect the World & How They Can Be Managed. Available online: https://www.who.int/docs/default-source/epi-win/infodemic-management/infodemiology-scientific-conference-booklet.pdf?sfvrsn=179de76a_4 (accessed on 18 September 2021).
  9. World Health Organization. WHO Public Health Research Agenda For Managing Infodemics. Available online: https://www.who.int/publications/i/item/9789240019508 (accessed on 18 September 2021).
  10. Gunn, J. Alternate Worlds: The Illustrated History of Science Fiction; A & W Visual Library: Englewood Cliffs, NJ, USA, 1975; ISBN 978-0-89104-049-1.
  11. Kaplan, R.L. Yellow Journalism. In The International Encyclopedia of Communication; Donsbach, W., Ed.; John Wiley & Sons, Ltd.: Chichester, UK, 2008; ISBN 978-1-4051-8640-7.
  12. Evans, R.J. The Third Reich in Power; Penguin Books: New York, NY, USA, 2006; ISBN 978-0-14-303790-3.
  13. Kux, D. Soviet Active Measures and Disinformation: Overview and Assessment. US Army War Coll. Q. Parameters 1985, 15, 17.
  14. Hoppe, T. “Spanish Flu”: When Infectious Disease Names Blur Origins and Stigmatize Those Infected. Am. J. Public Health 2018, 108, 1462–1464.
  15. Basco, S.; Domènech, J.; Rosés, J.R. The Redistributive Effects of Pandemics: Evidence on the Spanish Flu. World Dev. 2021, 141, 105389.
  16. Farkas, J.; Schou, J. Fake News as a Floating Signifier: Hegemony, Antagonism and the Politics of Falsehood. Javn. Public 2018, 25, 298–314.
  17. Tandoc, E.C.; Lim, Z.W.; Ling, R. Defining “Fake News”: A Typology of Scholarly Definitions. Digit. J. 2018, 6, 137–153.
  18. Shu, K.; Sliva, A.; Wang, S.; Tang, J.; Liu, H. Fake News Detection on Social Media: A Data Mining Perspective. SIGKDD Explor. Newsl. 2017, 19, 22–36.
  19. Pennycook, G.; Rand, D.G. The Psychology of Fake News. Trends Cogn. Sci. 2021, 25, 388–402.
  20. Hartley, K.; Vu, M.K. Fighting Fake News in the COVID-19 Era: Policy Insights from an Equilibrium Model. Policy Sci. 2020, 53, 735–758.
  21. Waszak, P.M.; Kasprzycka-Waszak, W.; Kubanek, A. The Spread of Medical Fake News in Social Media—The Pilot Quantitative Study. Health Policy Technol. 2018, 7, 115–118.
  22. Wardle, C. Information Disorder: The Essential Glossary; Shorenstein Center on Media, Politics, and Public Policy, Harvard Kennedy School: Cambridge, MA, USA, 2018.
  23. Benkler, Y.; Faris, R.; Roberts, H. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics; Oxford University Press: Oxford, UK, 2018; ISBN 0-19-092364-4.
  24. Floridi, L. Brave.Net.World: The Internet as a Disinformation Superhighway? Electron. Libr. 1996, 14, 509–514.
  25. Skinner, S.; Martin, B. Racist Disinformation on the World Wide Web: Initial Implications for the LIS Community. Aust. Libr. J. 2000, 49, 259–269.
  26. Keshavarz, H. How Credible Is Information on the Web: Reflections on Misinformation and Disinformation. Infopreneurship J. 2014, 1, 1–17.
  27. Karlova, N.A.; Fisher, K.E. A Social Diffusion Model of Misinformation and Disinformation for Understanding Human Information Behaviour. Inf. Res. 2013, 18, 573.
  28. Jewitt, R.; Dahlberg, L. The Trouble With Twittering: Integrating Social Media into Mainstream News. Int. J. Media Cult. Politics 2009, 5, 233–246.
  29. Karlovitz, T.J. The Democratization of Technology—And Its Limitation. In Managing Customer Experiences in an Omnichannel World: Melody of Online and Offline Environments in the Customer Journey; Dirsehan, T., Ed.; Emerald Publishing Limited: Bingley, UK, 2020; pp. 13–25. ISBN 978-1-80043-389-2.
  30. Altay, S.; de Araujo, E.; Mercier, H. “If This Account Is True, It Is Most Enormously Wonderful”: Interestingness-If-True and the Sharing of True and False News. Digit. J. 2021, 1–22.
  31. Vosoughi, S.; Roy, D.; Aral, S. The Spread of True and False News Online. Science 2018, 359, 1146–1151.
  32. Pew Research Center. News Use Across Social Media Platforms in 2020; Pew Research Center: Washington, DC, USA, 2020.
  33. Thorson, E. Changing Patterns Of News Consumption and Participation: News Recommendation Engines. Inf. Commun. Soc. 2008, 11, 473–489.
  34. Kang, H.; Bae, K.; Zhang, S.; Sundar, S.S. Source Cues in Online News: Is the Proximate Source More Powerful than Distal Sources? J. Mass Commun. Q. 2011, 88, 719–736.
  35. Shahsavari, S.; Holur, P.; Wang, T.; Tangherlini, T.R.; Roychowdhury, V. Conspiracy in the Time of Corona: Automatic Detection of Emerging COVID-19 Conspiracy Theories in Social Media and the News. J. Comput. Soc. Sci. 2020, 3, 279–317.
  36. Garfin, D.R.; Silver, R.C.; Holman, E.A. The Novel Coronavirus (COVID-2019) Outbreak: Amplification of Public Health Consequences by Media Exposure. Health Psychol. 2020, 39, 355–357.
  37. Gallotti, R.; Valle, F.; Castaldo, N.; Sacco, P.; De Domenico, M. Assessing the Risks of ‘Infodemics’ in Response to COVID-19 Epidemics. Nat. Hum. Behav. 2020, 4, 1285–1293.
  38. De Coninck, D.; Frissen, T.; Matthijs, K.; d’Haenens, L.; Lits, G.; Champagne-Poirier, O.; Carignan, M.-E.; David, M.D.; Pignard-Cheynel, N.; Salerno, S.; et al. Beliefs in Conspiracy Theories and Misinformation About COVID-19: Comparative Perspectives on the Role of Anxiety, Depression and Exposure to and Trust in Information Sources. Front. Psychol. 2021, 12, 646394.
  39. Li, H.O.-Y.; Bailey, A.; Huynh, D.; Chan, J. YouTube as a Source of Information on COVID-19: A Pandemic of Misinformation? BMJ Glob. Health 2020, 5, e002604.
  40. Ofcom. Half of UK Adults Exposed to False Claims about Coronavirus. Available online: https://www.ofcom.org.uk/about-ofcom/latest/features-and-news/half-of-uk-adults-exposed-to-false-claims-about-coronavirus (accessed on 19 September 2021).
  41. Pew Research Center. Americans Immersed in COVID-19 News; Most Think Media Are Doing Fairly Well Covering It. Available online: https://www.journalism.org/2020/03/18/americans-immersed-in-covid-19-news-most-think-media-are-doing-fairly-well-covering-it/ (accessed on 19 September 2021).
  42. Pennycook, G.; Cannon, T.D.; Rand, D.G. Prior Exposure Increases Perceived Accuracy of Fake News. J. Exp. Psychol. Gen. 2018, 147, 1865–1880.
  43. Andersen, K.G.; Rambaut, A.; Lipkin, W.I.; Holmes, E.C.; Garry, R.F. The Proximal Origin of SARS-CoV-2. Nat. Med. 2020, 26, 450–452.
  44. World Health Organization. Coronavirus Disease (COVID-19) Advice for the Public: Mythbusters. Available online: https://www.who.int/emergencies/diseases/novel-coronavirus-2019/advice-for-public/myth-busters (accessed on 19 September 2021).
  45. Ahmed, W.; Vidal-Alaball, J.; Downing, J.; López Seguí, F. COVID-19 and the 5G Conspiracy Theory: Social Network Analysis of Twitter Data. J. Med. Internet Res. 2020, 22, e19458.
  46. Cook, J.; Van Der Linden, S.; Lewandowsky, S.; Ecker, U.K. Coronavirus, ‘Plandemic’ and the Seven Traits of Conspiratorial Thinking. Available online: https://theconversation.com/coronavirus-plandemic-and-the-seven-traits-of-conspiratorial-thinking-138483 (accessed on 19 September 2020).
  47. Owens, B. Excitement around Hydroxychloroquine for Treating COVID-19 Causes Challenges for Rheumatology. Lancet Rheumatol. 2020, 2, e257.
  48. Meyerowitz, E.A.; Vannier, A.G.L.; Friesen, M.G.N.; Schoenfeld, S.; Gelfand, J.A.; Callahan, M.V.; Kim, A.Y.; Reeves, P.M.; Poznansky, M.C. Rethinking the Role of Hydroxychloroquine in the Treatment of COVID-19. FASEB J. 2020, 34, 6027–6037.
  49. Stanley, M.L.; Barr, N.; Peters, K.; Seli, P. Analytic-Thinking Predicts Hoax Beliefs and Helping Behaviors in Response to the COVID-19 Pandemic. Think. Reason. 2021, 27, 464–477.
  50. Freeman, D.; Waite, F.; Rosebrock, L.; Petit, A.; Causier, C.; East, A.; Jenner, L.; Teale, A.-L.; Carr, L.; Mulhall, S.; et al. Coronavirus Conspiracy Beliefs, Mistrust, and Compliance with Government Guidelines in England. Psychol. Med. 2020, 1–13.
  51. Imhoff, R.; Lamberty, P. A Bioweapon or a Hoax? The Link between Distinct Conspiracy Beliefs About the Coronavirus Disease (COVID-19) Outbreak and Pandemic Behavior. Soc. Psychol. Personal. Sci. 2020, 11, 1110–1118.
  52. Uscinski, J.E.; Enders, A.M.; Klofstad, C.; Seelig, M.; Funchion, J.; Everett, C.; Wuchty, S.; Premaratne, K.; Murthi, M. Why Do People Believe COVID-19 Conspiracy Theories? HKS Misinf. Rev. 2020.
  53. Lep, Ž.; Babnik, K.; Hacin Beyazoglu, K. Emotional Responses and Self-Protective Behavior Within Days of the COVID-19 Outbreak: The Promoting Role of Information Credibility. Front. Psychol. 2020, 11, 1846.
  54. Islam, M.S.; Sarkar, T.; Khan, S.H.; Mostofa Kamal, A.-H.; Hasan, S.M.M.; Kabir, A.; Yeasmin, D.; Islam, M.A.; Amin Chowdhury, K.I.; Anwar, K.S.; et al. COVID-19–Related Infodemic and Its Impact on Public Health: A Global Social Media Analysis. Am. J. Trop. Med. Hyg. 2020, 103, 1621–1629.
  55. Wen, J.; Aston, J.; Liu, X.; Ying, T. Effects of Misleading Media Coverage on Public Health Crisis: A Case of the 2019 Novel Coronavirus Outbreak in China. Anatolia 2020, 31, 331–336.
  56. Armstrong, C.; Hildebrandt, C. China Kids Stay Home. Available online: https://www.pressreader.com/australia/the-daily-telegraph-sydney/20200129/281479278389840 (accessed on 23 September 2021).
  57. Mead, W.R. China Is the Real Sick Man of Asia. Available online: https://www.wsj.com/articles/china-is-the-real-sick-man-of-asia-11580773677 (accessed on 23 September 2021).
  58. Zheng, Y.; Goh, E.; Wen, J. The Effects of Misleading Media Reports about COVID-19 on Chinese Tourists’ Mental Health: A Perspective Article. Anatolia 2020, 31, 337–340.
  59. Jeung, R.; Nham, K. Incidents of Coronavirus-Related Discrimination. Available online: http://www.asianpacificpolicyandplanningcouncil.org/wp-content/uploads/CA_Report_6_30_20.pdf (accessed on 23 September 2021).
  60. Tessler, H.; Choi, M.; Kao, G. The Anxiety of Being Asian American: Hate Crimes and Negative Biases During the COVID-19 Pandemic. Am. J. Crim. Justice 2020, 45, 636–646.
  61. Jolley, D.; Paterson, J.L. Pylons Ablaze: Examining the Role of 5G COVID-19 Conspiracy Beliefs and Support for Violence. Br. J. Soc. Psychol. 2020, 59, 628–640.
  62. Brooks, S.K.; Webster, R.K.; Smith, L.E.; Woodland, L.; Wessely, S.; Greenberg, N.; Rubin, G.J. The Psychological Impact of Quarantine and How to Reduce It: Rapid Review of the Evidence. Lancet 2020, 395, 912–920.
  63. Barzilay, R.; Moore, T.M.; Greenberg, D.M.; DiDomenico, G.E.; Brown, L.A.; White, L.K.; Gur, R.C.; Gur, R.E. Resilience, COVID-19-Related Stress, Anxiety and Depression during the Pandemic in a Large Population Enriched for Healthcare Providers. Transl. Psychiatry 2020, 10, 291.
  64. Salari, N.; Hosseinian-Far, A.; Jalali, R.; Vaisi-Raygani, A.; Rasoulpoor, S.; Mohammadi, M.; Rasoulpoor, S.; Khaledi-Paveh, B. Prevalence of Stress, Anxiety, Depression among the General Population during the COVID-19 Pandemic: A Systematic Review and Meta-Analysis. Glob. Health 2020, 16, 57.
  65. Qiu, J.; Shen, B.; Zhao, M.; Wang, Z.; Xie, B.; Xu, Y. A Nationwide Survey of Psychological Distress among Chinese People in the COVID-19 Epidemic: Implications and Policy Recommendations. Gen. Psychiatry 2020, 33, e100213.
  66. Taylor, S.; Landry, C.A.; Paluszek, M.M.; Fergus, T.A.; McKay, D.; Asmundson, G.J.G. COVID Stress Syndrome: Concept, Structure, and Correlates. Depress. Anxiety 2020, 37, 706–714.
  67. Šrol, J.; Ballová Mikušková, E.; Čavojová, V. When We Are Worried, What Are We Thinking? Anxiety, Lack of Control, and Conspiracy Beliefs amidst the COVID-19 Pandemic. Appl. Cognit. Psychol. 2021, 35, 720–729.
  68. Landau, M.J.; Kay, A.C.; Whitson, J.A. Compensatory Control and the Appeal of a Structured World. Psychol. Bull. 2015, 141, 694–722.
  69. Kay, A.C.; Whitson, J.A.; Gaucher, D.; Galinsky, A.D. Compensatory Control: Achieving Order Through the Mind, Our Institutions, and the Heavens. Curr. Dir. Psychol. Sci. 2009, 18, 264–268.
  70. Nyhan, B. Misinformation and Fact-Checking: Research Findings from Social Science. Available online: https://www.dartmouth.edu/~nyhan/Misinformation_and_Fact-checking.pdf (accessed on 28 September 2021).
  71. Douglas, K.M.; Uscinski, J.E.; Sutton, R.M.; Cichocka, A.; Nefes, T.; Ang, C.S.; Deravi, F. Understanding Conspiracy Theories. Political Psychol. 2019, 40, 3–35.
  72. Armus, T.; Hassan, J. ‘Go to China If You Want Communism’: Anti-Quarantine Protester Clashes with People in Scrubs. Available online: https://www.washingtonpost.com/nation/2020/04/20/go-china-if-you-want-communism-anti-quarantine-protester-clashes-with-people-scrubs/ (accessed on 29 September 2021).
  73. Sylvester, S.M. COVID-19 and Motivated Reasoning: The Influence of Knowledge on COVID-Related Policy and Health Behavior. Soc. Sci. Q. 2021, 12989.
  74. Hart, P.S.; Chinn, S.; Soroka, S. Politicization and Polarization in COVID-19 News Coverage. Sci. Commun. 2020, 42, 679–697.
  75. Motta, M.; Stecula, D.; Farhart, C. How Right-Leaning Media Coverage of COVID-19 Facilitated the Spread of Misinformation in the Early Stages of the Pandemic in the U.S. Can. J. Political Sci. 2020, 53, 335–342.
  76. Harvey, J.N.; Lawson, V.L. The Importance of Health Belief Models in Determining Self-Care Behaviour in Diabetes. Diabet. Med. 2009, 26, 5–13.
  77. Gadarian, S.K.; Goodman, S.W.; Pepinsky, T.B. Partisanship, Health Behavior, and Policy Attitudes in the Early Stages of the COVID-19 Pandemic. PLoS ONE 2021, 16, e0249596.
  78. Kunda, Z. The Case for Motivated Reasoning. Psychol. Bull. 1990, 108, 480–498.
  79. Festinger, L. A Theory of Cognitive Dissonance; Stanford University Press: Redwood City, CA, USA, 1957; p. 291.
  80. Freiling, I.; Krause, N.M.; Scheufele, D.A.; Brossard, D. Believing and Sharing Misinformation, Fact-Checks, and Accurate Information on Social Media: The Role of Anxiety during COVID-19. New Media Soc. 2021, 146144482110114.
  81. Lazer, D.M.J.; Baum, M.A.; Benkler, Y.; Berinsky, A.J.; Greenhill, K.M.; Menczer, F.; Metzger, M.J.; Nyhan, B.; Pennycook, G.; Rothschild, D.; et al. The Science of Fake News. Science 2018, 359, 1094–1096.
  82. Spohr, D. Fake News and Ideological Polarization: Filter Bubbles and Selective Exposure on Social Media. Bus. Inf. Rev. 2017, 34, 150–160.
  83. Pennycook, G.; Epstein, Z.; Mosleh, M.; Arechar, A.A.; Eckles, D.; Rand, D.G. Shifting Attention to Accuracy Can Reduce Misinformation Online. Nature 2021, 592, 590–595.
  84. Pennycook, G.; Rand, D.G. Lazy, Not Biased: Susceptibility to Partisan Fake News Is Better Explained by Lack of Reasoning than by Motivated Reasoning. Cognition 2019, 188, 39–50.
  85. Kahneman, D. Thinking, Fast and Slow, 1st pbk. ed.; Farrar, Straus and Giroux: New York, NY, USA, 2013; ISBN 978-0-374-53355-7.
  86. Pennycook, G.; Fugelsang, J.A.; Koehler, D.J. Everyday Consequences of Analytic Thinking. Curr. Dir. Psychol. Sci. 2015, 24, 425–432.
  87. Fiske, S.T.; Taylor, S.E. Social Cognition: From Brains to Culture, 2nd ed.; SAGE: Los Angeles, CA, USA, 2013; ISBN 978-1-4462-5814-9.
  88. Kohlberg, L. Stage and sequence: The cognitive-developmental approach to socialization. In Handbook of Socialization Theory and Research; Goslin, D.A., Ed.; Rand McNally: Chicago, IL, USA, 1969; pp. 347–480.
  89. Swami, V.; Voracek, M.; Stieger, S.; Tran, U.S.; Furnham, A. Analytic Thinking Reduces Belief in Conspiracy Theories. Cognition 2014, 133, 572–585.
  90. Čavojová, V.; Šrol, J.; Ballová Mikušková, E. How Scientific Reasoning Correlates with Health-Related Beliefs and Behaviors during the COVID-19 Pandemic? J. Health Psychol. 2020, 135910532096226.
  91. Zimmerman, C. The Development of Scientific Thinking Skills in Elementary and Middle School. Dev. Rev. 2007, 27, 172–223.
  92. Čavojová, V.; Šrol, J.; Jurkovič, M. Why Should We Try to Think like Scientists? Scientific Reasoning and Susceptibility to Epistemically Suspect Beliefs and Cognitive Biases. Appl. Cognit. Psychol. 2020, 34, 85–95.
  93. Horne, B.D.; Nevo, D.; Adali, S.; Manikonda, L.; Arrington, C. Tailoring Heuristics and Timing AI Interventions for Supporting News Veracity Assessments. Comput. Hum. Behav. Rep. 2020, 2, 100043.
  94. Simon, H.A. Rational Choice and the Structure of the Environment. Psychol. Rev. 1956, 63, 129–138.
  95. Smith, A.; Anderson, M. Social Media Use in 2018. Available online: https://www.pewresearch.org/internet/2018/03/01/social-media-use-in-2018/ (accessed on 1 October 2021).
  96. Brady, W.J.; Crockett, M.J.; Van Bavel, J.J. The MAD Model of Moral Contagion: The Role of Motivation, Attention, and Design in the Spread of Moralized Content Online. Perspect. Psychol. Sci. 2020, 15, 978–1010.
  97. Pennycook, G.; McPhetres, J.; Zhang, Y.; Lu, J.G.; Rand, D.G. Fighting COVID-19 Misinformation on Social Media: Experimental Evidence for a Scalable Accuracy-Nudge Intervention. Psychol. Sci. 2020, 31, 770–780.
  98. Jordan, J.J.; Rand, D.G. Signaling When No One Is Watching: A Reputation Heuristics Account of Outrage and Punishment in One-Shot Anonymous Interactions. J. Personal. Soc. Psychol. 2020, 118, 57–88.
  99. Lobato, E.J.C.; Powell, M.; Padilla, L.M.K.; Holbrook, C. Factors Predicting Willingness to Share COVID-19 Misinformation. Front. Psychol. 2020, 11, 566108.
  100. Kahneman, D.; Slovic, S.P.; Slovic, P.; Tversky, A. Judgment under Uncertainty: Heuristics and Biases; Cambridge University Press: Cambridge, UK, 1982; ISBN 0-521-28414-7.
  101. Mosleh, M.; Pennycook, G.; Arechar, A.A.; Rand, D.G. Cognitive Reflection Correlates with Behavior on Twitter. Nat. Commun. 2021, 12, 921.
  102. Van der Linden, S.; Roozenbeek, J. Psychological inoculation against fake news. In The Psychology of Fake News: Accepting, Sharing, and Correcting Misinformation; Greifeneder, R., Jaffé, M.E., Newman, E.J., Schwarz, N., Eds.; Routledge: New York, NY, USA, 2021; pp. 147–170.
  103. Wakefield, J. Facebook’s Fake News Experiment Backfires. Available online: https://www.bbc.com/news/technology-41900877 (accessed on 2 October 2021).
  104. Zollo, F.; Bessi, A.; Del Vicario, M.; Scala, A.; Caldarelli, G.; Shekhtman, L.; Havlin, S.; Quattrociocchi, W. Debunking in a World of Tribes. PLoS ONE 2017, 12, e0181821.
  105. Politifact. Available online: https://www.politifact.com (accessed on 8 October 2021).
  106. Snopes. Available online: https://www.snopes.com (accessed on 8 October 2021).
  107. Nyhan, B.; Reifler, J. When Corrections Fail: The Persistence of Political Misperceptions. Political Behav. 2010, 32, 303–330.
  108. Nyhan, B.; Porter, E.; Reifler, J.; Wood, T. Taking Corrections Literally But Not Seriously? The Effects of Information on Factual Beliefs and Candidate Favorability. SSRN J. 2017.
  109. Van Bavel, J.J.; Baicker, K.; Boggio, P.S.; Capraro, V.; Cichocka, A.; Cikara, M.; Crockett, M.J.; Crum, A.J.; Douglas, K.M.; Druckman, J.N.; et al. Using Social and Behavioural Science to Support COVID-19 Pandemic Response. Nat. Hum. Behav. 2020, 4, 460–471.
  110. Bremner, C. France Aims to Ban Fake News at Election Time. Available online: https://www.thetimes.co.uk/article/france-aims-to-ban-fake-news-at-election-time-jwspzjx83 (accessed on 2 October 2021).
  111. BBC News Coronavirus: Unit Set up to Counter False Claims. Available online: https://www.bbc.com/news/uk-politics-51800216 (accessed on 2 October 2021).
  112. Pieters, J. Dutch Politicians Want EU Anti-Fake News Watchdog Scrapped. Available online: https://nltimes.nl/2018/03/06/dutch-politicians-want-eu-anti-fake-news-watchdog-scrapped (accessed on 2 October 2021).
  113. McGuire, W.J. A Vaccine for Brainwash. Psychol. Today 1970, 3, 36–64.
  114. Banas, J.A.; Rains, S.A. A Meta-Analysis of Research on Inoculation Theory. Commun. Monogr. 2010, 77, 281–311.
  115. Bad News. Available online: https://www.getbadnews.com (accessed on 8 October 2021).
  116. Go Viral! Available online: https://www.goviralgame.com/en/play (accessed on 8 October 2021).
  117. Basol, M.; Roozenbeek, J.; Van der Linden, S. Good News about Bad News: Gamified Inoculation Boosts Confidence and Cognitive Immunity Against Fake News. J. Cognit. 2020, 3, 2.
  118. Guess, A.M.; Lerner, M.; Lyons, B.; Montgomery, J.M.; Nyhan, B.; Reifler, J.; Sircar, N. A Digital Media Literacy Intervention Increases Discernment between Mainstream and False News in the United States and India. Proc. Natl. Acad. Sci. USA 2020, 117, 15536–15545.