The Half-Truth Effect and Its Implications for Sustainability: History

Half-truth is defined as “a statement that mingles truth and falsehood with deliberate intent to deceive”. 

  • fake news
  • half-truth effect
  • persuasion
  • misinformation

1. Introduction

“The lie which is half a truth is ever the blackest of lies.” [1] (St. 8)
Is Tennyson’s observation accurate? Are half-truths more insidious than lies that are not associated with truths? If so, what are the boundary conditions of the half-truth effect? What are the implications of the half-truth effect for understanding misinformation campaigns more generally, and how can people protect themselves from it? The present research addresses these questions by proposing that consumers process misinformation in a way that conforms to Tennyson’s quote. This new phenomenon is termed the half-truth effect, and some insights are offered into its moderation.
Misinformation is defined as information that is initially presented as valid but is subsequently shown to be incorrect [2][3]. The phenomenon has become widespread in online environments [4][5], as exemplified by the facts that one in four Americans has admitted to sharing false information online [6] and that falsehood spreads more quickly than truth on social networking sites like Twitter [7]. In recent years, digital misinformation has influenced the public’s perceptions of scientific topics [8][9][10][11][12][13]. For instance, false accounts regarding the safety of vaccines [14][15] and climate change [2][3] have been gaining online support. The rapid spread of false information poses a growing risk to the health of the public and the planet [16][17], so much so that the World Economic Forum has classified it among the most relevant dangers to modern society [18].
Half-truth is defined as “a statement that mingles truth and falsehood with deliberate intent to deceive” [19]. However, it may not be that simple. The current work proposes and investigates a new effect, the half-truth effect, which holds that both the structure of a message and the veracity of the elements of its argument influence belief in misinformation on topics of sustainability and genetically modified organisms (GMOs). Additionally, the following content analyzes a personal characteristic that influences the half-truth effect and a potential debiasing technique, providing a roadmap for policy makers seeking to counteract misinformation.

2. History

While there is an entire stream of research investigating how algorithms can be used to identify (and potentially remove) false information across many different online platforms [20], a separate and important stream of research looks at how humans process and use that false information. Given that artificial intelligence algorithms are unlikely to remove false information completely, this entry focuses on how humans process that information once it is encountered. Previous work on the processing of misinformation has focused on psychological antecedents that lead individuals to believe false claims. For instance, research has found that political ideology shapes belief in false claims [21].
So far, research has focused on individual differences and categorical message content, but not on how the message is structured. That is, although previous work has identified several psychological factors that influence belief in misinformation, the literature is currently silent on how the underlying message structure may influence that belief. This gap in the literature is important because message structure may be more controllable than repetition for a bad actor attempting to portray a piece of false information as true. Thus, it is important for policy makers to understand how message structure and the order of message components influence the believability of a false claim.

3. New Discoveries: The Half-Truth Effect

To close this gap in the literature, this work examines the role of message structure in shaping the perceived truthfulness of misleading claims. It is contended that the structure of the information contained in the message will elicit (or not) a greater belief in the veracity of false information. In deference to the Tennyson quote that inspired this hypothesis, this new phenomenon is called the half-truth effect. It is proposed that belief in misinformation will be shaped not only by the dictionary definition—a half-truth contains both true and false information [19]—but also by the order of the true and false claims that make up the piece of false information. Specifically, individuals should be more likely to believe a false message when the message starts with a true piece of information and then uses logical terminology to tie it to an unrelated false piece of information. Conversely, individuals should be less likely to believe misinformation when the false information is presented first, even when followed by a true element.
There are several reasons for suspecting that the half-truth effect occurs in this way. First, it has been shown that emotions induced by an a priori irrelevant event carry over to influence a subsequent (unrelated) economic decision [22]. In the same way, presenting a true claim first should encourage message recipients to perceive the communicator as more credible and trustworthy [23][24][25], and those perceptions of credibility and trustworthiness may carry over to the false statement presented next. Furthermore, receiving a true claim first should increase open-mindedness and encourage recipients to entertain the possibility that the subsequent claims are also valid [26][27]. This has been shown in another context: consumers who have been primed to make supportive elaborations about an unrelated series of propositions (i.e., primed into an acquiescence mindset) are more likely to be positively influenced by an unrelated advertisement encountered next [28][29]. Receiving a true claim first may likewise elicit an acquiescence mindset that encourages people to accept the subsequent claim as true. Extrapolating these findings into the realm of misinformation, it is hypothesized that the order of presentation should matter to the ultimate perceived truthfulness of a presented message.
That is, a message should be perceived as more truthful when it begins with a statement that is true (even when it is followed by an unrelated false statement). Conversely, the opposite order of presentation should eliminate the half-truth effect: presenting a false claim first should reduce perceptions of source credibility and induce a naysaying mindset, resulting in lower perceived truthfulness of the message. Hence, the order in which claims of mixed validity are presented should be an important moderator of the half-truth effect.
If this is true, then the half-truth effect depends on the initial evaluation of the primary element of the message. Thus, one’s ability to discern whether something is fact or fiction becomes an important moderator to consider. By default, humans want to trust things [30]. However, individuals vary in their ability to discern information that is profound from information that does not contain meaning [31]. That is, random information and buzzwords that are combined into a nonsense statement yet formatted with a standard syntactic structure are perceived to be profound by a segment of the population. Pennycook and colleagues [31] call this pseudo-profound bullshit. Drawing inspiration from quotes by Deepak Chopra, Pennycook and colleagues have validated a measure of bullshit receptivity (BSR; example items: “Hidden meaning transforms unparalleled abstract beauty”, “Good health imparts reality to subtle creativity”) and have shown that those who are receptive to pseudo-profound bullshit are not merely indiscriminate in their thoughts, but rather fail to discern the deceptive vagueness of such statements. Thus, they are more prone to believe statements are true if the syntax implies profundity.
These results highlight a potential moderator of the half-truth effect. If one is focused on syntax and not actively considering the validity of the initial statement, then there is no reason to assume that the information provided is anything but truthful. Therefore, those who readily accept statements as truthful without a second thought (i.e., high-BSR individuals) are unlikely to experience the half-truth effect because they are equally likely to believe both true and false statements. In contrast, those who are low in BSR are more likely to experience the half-truth effect, as they will seek to establish the validity of the argument early on and then hold onto that conclusion. The proposed model is depicted in Figure 1.
Figure 1. The half-truth effect model: The effect of statement order is hypothesized to be moderated by bullshit receptivity (BSR), such that the half-truth effect will be evident among those who have a low BSR, but not among those who have a high BSR.
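To make this moderation hypothesis concrete, the sketch below simulates the predicted pattern and tests it as a statement order × BSR interaction in a regression. It is a minimal illustration under stated assumptions, not the authors’ actual materials or analysis; all variable names, effect sizes, and data are hypothetical.

```python
# Minimal sketch of the Figure 1 moderation model. Hypothetical data only:
# order (1 = true claim first, 0 = false claim first), BSR (standardized),
# and rated belief in the composite message.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
order = rng.integers(0, 2, n)          # statement-order manipulation
bsr = rng.normal(0.0, 1.0, n)          # bullshit receptivity score

# Simulate the hypothesized pattern: true-first boosts belief, but the
# boost shrinks as BSR rises (high-BSR respondents believe either order).
belief = 4.0 + 1.2 * order - 0.9 * order * bsr + rng.normal(0.0, 1.0, n)

df = pd.DataFrame({"belief": belief, "order": order, "bsr": bsr})

# The half-truth effect predicts a significant order x BSR interaction:
# a positive simple effect of order at low BSR that vanishes at high BSR.
model = smf.ols("belief ~ order * bsr", data=df).fit()
print(model.summary())
```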
Although the illusory truth effect [32] and the newly proposed half-truth effect may sound similar in name, they are quite different in application. First, there is the matter of structure. The illusory truth effect focuses on a singular statement which (through repetition) forms a mental foothold on which the individual builds the illusion of knowledge, or in this case, truth [33]. Conversely, the half-truth effect examines how consumers interpret truth when they encounter multiple statements in tandem. This difference is important because, even on social media with limited character counts, messages are often complex, multi-part arguments rather than the singular statements that form the basis of the illusory truth effect. For this reason, it is important to understand how message receivers interpret statements that combine true and false elements.
Second, the illusory truth effect relies on repetition to increase belief in both true and false statements independently [34][35], but the half-truth effect is hypothesized to be evident even in a single presentation of a message.
Third, the illusory truth effect is mediated by processing fluency [36][37][38] and is found when the repetitions happen over various time intervals [34][35][39][40][41] and with various numbers of repetitions [34][40][41][42]. By contrast, the half-truth effect hypothesizes that credibility is either built or undermined by the initial statement, creating either an acquiescence mindset (when the initial statement is true) or a naysaying mindset (when the initial statement is false), and that this initial perception will carry over onto the subsequent elements of the message, even when those elements are of the opposite valence.
Given the prevalence of misinformation on both social media and traditional media, plus the fact that prior knowledge may not always save one from falling prey to misinformation [34][39][43], understanding the various elements that influence the believability of a message is extremely important. When considering misinformation in the real world, specifically in the context of sustainability, the effect of message order becomes all the more important because it bears directly on consumer judgment and decision-making.

4. Applications: Sustainability and Misinformation

Misinformation has spread both on- and offline across many important topics, including politics and sustainability. Misinformation is so prevalent on sustainability issues that public belief on the subject (e.g., climate change) does not accurately reflect the consensus among scientists. While the vast majority of climate scientists agree that climate warming is likely a result of human activities [44], only about 57% of the public believes the same [45]. Importantly, the general misunderstanding of sustainability topics leads to a (false) belief that adopting sustainable solutions is ineffective or undesirable, resulting in the reduced implementation of available solutions, such as sustainable living and GMOs (e.g., genetically modified crops), to the detriment of the environment [46].
This entry investigates how the half-truth effect may contribute to the spread of false information on sustainable living and GMOs, as well as a potential tactic to highlight the falsehoods when they are present. This is an important avenue of research given the relevant role that sustainable solutions play in benefitting the environment and humankind [46][47]. Research has documented several environmental benefits that result from adopting sustainable practices. For instance, the implementation of sustainable agricultural technologies using GMOs can result in more productive food systems, larger yields, and enhanced food security [47][48][49][50]. Additionally, sustainable living practices such as reducing energy consumption can help reduce carbon footprints and improve air and water quality [51]. At the same time, however, public perceptions of sustainable living have been negatively influenced by misinformation, leading to the (false) belief that living sustainably is possible only for high-income populations and/or by making lifestyle sacrifices [46], and the (false) belief that food grown from GMO seeds is not safe to eat [52].
Thus, sustainable living practices and GMOs are both solutions that can provide substantial benefits to the environment and society. Efforts to improve the understanding and acceptance of these ideas, however, have been undermined by the spread of false messages. The present work investigates how the half-truth effect influences false beliefs about these topics, as well as how to counter this misinformation by applying the Poison Parasite Counter of Cialdini and colleagues [53].

5. Countering the Half-Truth Effect through the Poison Parasite Counter

Given that misinformation has become an issue of growing concern in recent years, research has focused on identifying strategies to combat its proliferation. For example, some research has shown that belief in false information can be reduced through counterarguments and misinformation reminders [54][55]. However, if a debunking message is encountered without the false information having been seen first, it can result in increased belief in the misinformation relative to a no-correction condition [56]. In fact, repeated exposure to false information enhances its perceived truthfulness, even when the information is followed by corrections [42].
Recently, Cialdini and colleagues [53] introduced the Poison Parasite Counter (PPC) as a method to durably counter false information. The PPC comprises two components, the “poison” and the “parasite”. The “poison” component refers to a counterargument that can effectively offset a false claim; to be sufficiently poisonous, the counterargument can, for instance, point to the inaccuracy or dishonesty of the false claim. The “parasitic” component decreases belief in misinformation by embedding the counterargument within the false claim. That is, the parasitic component relies on associative memory: by enhancing the perceptual similarity between the counterargument and the original false message, it ensures that when one reencounters the original message, that false message acts as a retrieval cue for the counter-message. By creating an association between the false message and the counterargument, the parasitic component undermines the effects of repeated exposure to false claims. The PPC has been shown to counter misleading messages regarding political candidates [53].
It is proposed that the PPC can be effectively employed to reduce the believability of factually untrue statements, even in the face of the half-truth effect. Additionally, whereas Cialdini and colleagues focused their PPC work on messages that were completely false, the present work examines the effectiveness of the PPC in undermining belief in misleading messages composed of true and false claims linked by flawed logic. That is, even in a message composed of two true elements, if the logical connection between those statements is not sound, then the message is factually untrue in its composite. Thus, the PPC, if effective, should reduce belief in a composite message made up of factually true statements that are linked by a flawed logical relationship.
Hence, it is hypothesized that in the absence of the PPC, the half-truth effect will be prominent, such that a post that begins with a true statement (regardless of the truth of the second part of the message) will be more likely to be believed than a message that begins with a false statement (even if the second half of the message presents true information). However, when participants are presented with the PPC inoculation, belief in the original post will decrease, but the half-truth effect will remain; the PPC will produce a uniform decrease in belief across all message structures. A minimal sketch of this predicted pattern follows.
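Statistically, this hypothesis amounts to two main effects (statement order and PPC) with no interaction between them. The sketch below simulates that pattern and checks it with a 2 × 2 ANOVA; it is a hypothetical illustration under stated assumptions, not the authors’ design, and all cell means and variable names are invented.

```python
# Minimal sketch of the hypothesized PPC pattern: the PPC lowers belief
# uniformly while the order-based half-truth effect persists, i.e., two
# main effects and a null order x PPC interaction. Hypothetical data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
cells = []
for order in (0, 1):       # 0 = false claim first, 1 = true claim first
    for ppc in (0, 1):     # 0 = no counter shown, 1 = PPC counter shown
        # Hypothesized additive means: +1.0 for true-first, -1.5 under PPC.
        mu = 4.0 + 1.0 * order - 1.5 * ppc
        belief = rng.normal(mu, 1.0, 100)
        cells.append(pd.DataFrame({"belief": belief,
                                   "order": order, "ppc": ppc}))
df = pd.concat(cells, ignore_index=True)

# Expected output: significant main effects of order and ppc, with a
# non-significant order:ppc interaction (the "uniform decrease").
fit = smf.ols("belief ~ C(order) * C(ppc)", data=df).fit()
print(anova_lm(fit, typ=2))
```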

6. Results

The results show that the half-truth effect enhances belief in misinformation regarding sustainability and GMOs. Specifically, the studies indicate that the order in which true and false facts are presented influences the perceived truthfulness of the combined claim. That is, presenting a true claim first increases the believability of the overall argument, whereas presenting a false claim first reduces it. The results also indicate that the half-truth effect is moderated by individuals’ bullshit receptivity (BSR [31]), such that the effect is evident for individuals with low BSR, but not for individuals with moderate or high BSR. Additionally, the studies show that using the Poison Parasite Counter (PPC) debiasing technique [53] reduces belief in false information but does not eliminate the half-truth effect.

This entry is adapted from the peer-reviewed paper 10.3390/su14116943

References

  1. Tennyson, L.A. The Grandmother. 1864. Available online: https://collections.vam.ac.uk/item/O198447/the-grandmother-photograph-cameron-julia-margaret/ (accessed on 30 March 2022).
  2. Lewandowsky, S. Climate Change Disinformation and how to Combat It. Annu. Rev. Public Health 2021, 42, 1–21.
  3. Treen, K.M.d.; Williams, H.T.P.; O’Neill, S.J. Online Misinformation about Climate Change. WIREs Clim. Change 2020, 11, e665.
  4. Vicario, M.D.; Bessi, A.; Zollo, F.; Petroni, F.; Scala, A.; Caldarelli, G.; Stanley, H.E.; Quattrociocchi, W. The Spreading of Misinformation Online. Proc. Natl. Acad. Sci. USA 2016, 113, 554–559.
  5. Wang, Y.; McKee, M.; Torbica, A.; Stuckler, D. Systematic Literature Review on the Spread of Health-Related Misinformation on Social Media. Soc. Sci. Med. 2019, 240, 112552.
  6. Barthel, M.; Mitchell, A.; Holcomb, J. Many Americans Believe Fake News Is Sowing Confusion. 2016. Available online: https://www.pewresearch.org/journalism/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/ (accessed on 30 March 2022).
  7. Vosoughi, S.; Roy, D.; Aral, S. The Spread of True and False News Online. Science 2018, 359, 1146–1151.
  8. Hong, S.C. Presumed Effects of “Fake News” on the Global Warming Discussion in a Cross-Cultural Context. Sustainability 2020, 12, 2123.
  9. Kim, S.; Kim, S. The Crisis of Public Health and Infodemic: Analyzing Belief Structure of Fake News about COVID-19 Pandemic. Sustainability 2020, 12, 9904.
  10. Ries, M. The COVID-19 Infodemic: Mechanism, Impact, and Counter-Measures—A Review of Reviews. Sustainability 2022, 14, 2605.
  11. Scheufele, D.A.; Krause, N.M. Science Audiences, Misinformation, and Fake News. Proc. Natl. Acad. Sci. USA 2019, 116, 7662–7669.
  12. De Sousa, Á.F.L.; Schneider, G.; de Carvalho, H.E.F.; de Oliveira, L.B.; Lima, S.V.M.A.; de Sousa, A.R.; de Araújo, T.M.E.; Camargo, E.L.S.; Oriá, M.O.B.; Ramos, C.V.; et al. COVID-19 Misinformation in Portuguese-Speaking Countries: Agreement with Content and Associated Factors. Sustainability 2021, 14, 235.
  13. Farrell, J.; McConnell, K.; Brulle, R. Evidence-Based Strategies to Combat Scientific Misinformation. Nat. Clim. Change 2019, 9, 191–195.
  14. Larson, H.J. The Biggest Pandemic Risk? Viral Misinformation. Nature 2018, 562, 309.
  15. Loomba, S.; de Figueiredo, A.; Piatek, S.J.; de Graaf, K.; Larson, H.J. Measuring the Impact of COVID-19 Vaccine Misinformation on Vaccination Intent in the UK and USA. Nat. Hum. Behav. 2021, 5, 337–348.
  16. Van Der Linden, S.; Maibach, E.; Cook, J.; Leiserowitz, A.; Lewandowsky, S. Inoculating Against Misinformation. Science 2017, 358, 1141–1142.
  17. Cacciatore, M.A. Misinformation and Public Opinion of Science and Health: Approaches, Findings, and Future Directions. Proc. Natl. Acad. Sci. USA 2021, 118, 1.
  18. Charlton, E. Fake News: What It Is, and How to Spot It. 2019. Available online: https://europeansting.com/2019/03/06/fake-news-what-it-is-and-how-to-spot-it/ (accessed on 30 March 2022).
  19. Merriam-Webster. Half-Truth. In Merriam-Webster.Com Dictionary. 2022. Available online: https://www-merriam-webster-com.uc.idm.oclc.org/dictionary/half-truth (accessed on 30 March 2022).
  20. Xarhoulacos, C.; Anagnostopoulou, A.; Stergiopoulos, G.; Gritzalis, D. Misinformation Vs. Situational Awareness: The Art of Deception and the Need for Cross-Domain Detection. Sensors 2021, 21, 5496.
  21. Van Bavel, J.J.; Pereira, A. The Partisan Brain: An Identity-Based Model of Political Belief. Trends Cogn. Sci. 2018, 22, 213–224.
  22. Lerner, J.S.; Small, D.A.; Loewenstein, G. Heart Strings and Purse Strings: Carryover Effects of Emotions on Economic Decisions. Psychol. Sci. 2004, 15, 337–341.
  23. Kruglanski, A.W.; Shah, J.Y.; Pierro, A.; Mannetti, L. When Similarity Breeds Content: Need for Closure and the Allure of Homogeneous and Self-Resembling Groups. J. Pers. Soc. Psychol. 2002, 83, 648–662.
  24. Priester, J.R.; Petty, R.E. Source Attributions and Persuasion: Perceived Honesty as a Determinant of Message Scrutiny. Personal. Soc. Psychol. Bull. 1995, 21, 637–654.
  25. Priester, J.R.; Petty, R.E. The Influence of Spokesperson Trustworthiness on Message Elaboration, Attitude Strength, and Advertising Effectiveness. J. Consum. Psychol. 2003, 13, 408–421.
  26. Kruglanski, A.W.; Webster, D.M. Motivated Closing of the Mind: “Seizing” and “Freezing”. Psychol. Rev. 1996, 103, 263–283.
  27. Roets, A.; Kruglanski, A.W.; Kossowska, M.; Pierro, A.; Hong, Y. The Motivated Gatekeeper of our Minds: New Directions in Need for Closure Theory and Research. In Advances in Experimental Social Psychology; Elsevier Science & Technology: Waltham MA, USA, 2015; Volume 52, pp. 221–283.
  28. Wyer, R.S.; Xu, A.J.; Shen, H. The Effects of Past Behavior on Future Goal-Directed Activity. In Advances in Experimental Social Psychology; Elsevier Science & Technology: Waltham, MA, USA, 2012; Volume 46, pp. 237–283.
  29. Xu, A.J.; Wyer, R.S. The Role of Bolstering and Counterarguing Mind-Sets in Persuasion. J. Consum. Res. 2012, 38, 920–932.
  30. Berg, J.; Dickhaut, J.; McCabe, K. Trust, Reciprocity, and Social History. Games Econ. Behav. 1995, 10, 122–142.
  31. Pennycook, G.; Cheyne, J.A.; Barr, N.; Koehler, D.J.; Fugelsang, J.A. On the Reception and Detection of Pseudo-Profound Bullshit. Judgm. Decis. Mak. 2015, 10, 549–563.
  32. Dechêne, A.; Stahl, C.; Hansen, J.; Wänke, M. The Truth about the Truth: A Meta-Analytic Review of the Truth Effect. Personal. Soc. Psychol. Rev. 2010, 14, 238–257.
  33. Hasher, L.; Goldstein, D.; Toppino, T. Frequency and the Conference of Referential Validity. J. Verbal Learn. Verbal Behav. 1977, 16, 107–112.
  34. Fazio, L.K.; Pillai, R.M.; Patel, D. The Effects of Repetition on Belief in Naturalistic Settings. J. Exp. Psychol. Gen. 2022.
  35. Brown, A.S.; Nix, L.A. Turning Lies into Truths: Referential Validation of Falsehoods. J. Exp. Psychol. Learn. Mem. Cogn. 1996, 22, 1088–1100.
  36. Begg, I.M.; Anas, A.; Farinacci, S. Dissociation of Processes in Belief: Source Recollection, Statement Familiarity, and the Illusion of Truth. J. Exp. Psychol. Gen. 1992, 121, 446–458.
  37. Reber, R.; Schwarz, N. Effects of Perceptual Fluency on Judgments of Truth. Conscious. Cogn. 1999, 8, 338–342.
  38. Alter, A.L.; Oppenheimer, D.M. Uniting the Tribes of Fluency to Form a Metacognitive Nation. Personal. Soc. Psychol. Rev. 2009, 13, 219–235.
  39. Fazio, L.K. Repetition Increases Perceived Truth Even for Known Falsehoods. Collabra. Psychol. 2020, 6, 38.
  40. Hassan, A.; Barber, S.J. The Effects of Repetition Frequency on the Illusory Truth Effect. Cogn. Res. Princ. Implic. 2021, 6, 38.
  41. Gigerenzer, G. External Validity of Laboratory Experiments: The Frequency-Validity Relationship. Am. J. Psychol. 1984, 97, 185–195.
  42. Pennycook, G.; Cannon, T.D.; Rand, D.G. Prior Exposure Increases Perceived Accuracy of Fake News. J. Exp. Psychol. Gen. 2018, 147, 1865–1880.
  43. Fazio, L.K.; Brashier, N.M.; Payne, B.K.; Marsh, E.J. Knowledge does Not Protect Against Illusory Truth. J. Exp. Psychol. Gen. 2015, 144, 993–1002.
  44. Cook, J.; Oreskes, N.; Doran, P.T.; Anderegg, W.R.L.; Verheggen, B.; Maibach, E.W.; Carlton, J.S.; Lewandowsky, S.; Skuce, A.G.; Green, S.A.; et al. Consensus on Consensus: A Synthesis of Consensus Estimates on Human-Caused Global Warming. Environ. Res. Lett. 2016, 11, 48002–48008.
  45. Marlon, J.; Neyens, L.; Jefferson, M.; Howe, P.; Mildenberger, M.; Leiserowitz, A. Yale Climate Opinion Maps 2021. 2022. Available online: https://climatecommunication.yale.edu/visualizations-data/ycom-us/ (accessed on 30 March 2022).
  46. Butters, C. Myths and Issues about Sustainable Living. Sustainability 2021, 13, 7521.
  47. Piñeiro, V.; Arias, J.; Dürr, J.; Elverdin, P.; Ibáñez, A.M.; Kinengyere, A.; Opazo, C.M.; Owoo, N.; Page, J.R.; Prager, S.D. A Scoping Review on Incentives for Adoption of Sustainable Agricultural Practices and their Outcomes. Nat. Sustain. 2020, 3, 809–820.
  48. Teklewold, H.; Kassie, M.; Shiferaw, B. Adoption of Multiple Sustainable Agricultural Practices in Rural Ethiopia. J. Agric. Econ. 2013, 64, 597–623.
  49. Mannion, A.M.; Morse, S. Biotechnology in Agriculture: Agronomic and Environmental Considerations and Reflections Based on 15 Years of GM Crops. Prog. Phys. Geogr. 2012, 36, 747–763.
  50. Zilberman, D.; Holland, T.G.; Trilnick, I. Agricultural GMOs-what we Know and Where Scientists Disagree. Sustainability 2018, 10, 1514.
  51. Kamal, A.; Al-Ghamdi, S.G.; Koc, M. Revaluing the Costs and Benefits of Energy Efficiency: A Systematic Review. Energy Res. Soc. Sci. 2019, 54, 68–84.
  52. Pew Research Center. Genetically Modified Foods (GMOs) and Views on Food Safety; Pew Research Center: Washington, DC, USA, 2015.
  53. Cialdini, R.B.; Lasky-Fink, J.; Demaine, L.J.; Barrett, D.W.; Sagarin, B.J.; Rogers, T. Poison Parasite Counter: Turning Duplicitous Mass Communications into Self-Negating Memory-Retrieval Cues. Psychol. Sci. 2021, 32, 1811–1829.
  54. Ecker, U.K.H.; Lewandowsky, S.; Jayawardana, K.; Mladenovic, A. Refutations of Equivocal Claims: No Evidence for an Ironic Effect of Counterargument Number. J. Appl. Res. Mem. Cogn. 2019, 8, 98–107.
  55. Wahlheim, C.N.; Alexander, T.R.; Peske, C.D. Reminders of Everyday Misinformation Statements can Enhance Memory for and Beliefs in Corrections of those Statements in the Short Term. Psychol. Sci. 2020, 31, 1325–1339.
  56. Autry, K.S.; Duarte, S.E. Correcting the Unknown: Negated Corrections may Increase Belief in Misinformation. Appl. Cogn. Psychol. 2021, 35, 960–975.