Fact-checking is the process of verifying the factual assertions in non-fictional text in order to determine their veracity and correctness. It may be done either before (ante hoc) or after (post hoc) the text has been published or otherwise disseminated. Ante hoc fact-checking aims to remove errors so that the text can proceed to dissemination (or be rejected if it fails confirmation or other criteria). Post hoc fact-checking is most often followed by a written report of inaccuracies, sometimes with a visual metric from the checking organization (e.g., Pinocchios from The Washington Post Fact Checker, or TRUTH-O-METER ratings from PolitiFact). Several organizations are devoted to post hoc fact-checking, such as FactCheck.org and PolitiFact. Research on the impact of fact-checking is relatively recent, but the existing evidence suggests that fact-checking does correct misperceptions among citizens, as well as discourage politicians from spreading misinformation.
One study finds that fact-checkers PolitiFact, FactCheck.org, and Washington Post's Fact Checker overwhelmingly agree on their evaluations of claims.[1][2]
However, a study by Morgan Marietta, David C. Barker and Todd Bowser found "substantial differences in the questions asked and the answers offered." They concluded that this limited the "usefulness of fact-checking for citizens trying to decide which version of disputed realities to believe."[3]
A paper by Chloe Lim, a Ph.D. student at Stanford University, finds little overlap in the statements that fact-checkers check. Out of 1,065 fact-checks by PolitiFact and 240 fact-checks by The Washington Post's Fact Checker, there were only 70 statements that both fact-checkers checked. The study found that the fact-checkers gave consistent ratings for 56 of those 70 statements, meaning that one out of every five times, the two fact-checkers disagreed on the accuracy of a statement.[4]
Studies of post hoc fact-checking have made clear that such efforts often change the behavior of both the speaker (making them more careful in their pronouncements) and the listener or reader (making them more discerning with regard to the factual accuracy of content). Observed effects include audiences being completely unswayed by corrections of errors on the most divisive subjects, being more readily persuaded by corrections of negative reporting (e.g., "attack ads"), and changing their minds only when the individual in error was someone reasonably like-minded to begin with.[5]
A 2015 study found evidence of a "backfire effect" (correcting false information may make partisan individuals cling more strongly to their views): "Corrective information adapted from the Centers for Disease Control and Prevention (CDC) website significantly reduced belief in the myth that the flu vaccine can give you the flu as well as concerns about its safety. However, the correction also significantly reduced intent to vaccinate among respondents with high levels of concern about vaccine side effects, a response that was not observed among those with low levels of concern."[6] A 2017 study attempted to replicate the findings of the 2015 study but failed to do so.[7]
A 2016 study found little evidence for the "backfire effect": "By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments."[8] A study of Donald Trump supporters during the 2016 race similarly found little evidence for the backfire effect: "When respondents read a news article about Mr. Trump’s speech that included F.B.I. statistics indicating that crime had “fallen dramatically and consistently over time,” their misperceptions about crime declined compared with those who saw a version of the article that omitted corrective information (though misperceptions persisted among a sizable minority)."[9]
Studies have shown that fact-checking can affect citizens' belief in the accuracy of claims made in political advertisement.[10] A paper by a group of Paris School of Economics and Sciences Po economists found that falsehoods by Marine Le Pen during the 2017 French presidential election campaign (i) successfully persuaded voters, (ii) lost their persuasiveness when fact-checked, and (iii) did not reduce voters' political support for Le Pen when her claims were fact-checked.[11] A 2017 study in the Journal of Politics found that "individuals consistently update political beliefs in the appropriate direction, even on facts that have clear implications for political party reputations, though they do so cautiously and with some bias... Interestingly, those who identify with one of the political parties are no more biased or cautious than pure independents in their learning, conditional on initial beliefs."[12]
A study by Yale University cognitive scientists Gordon Pennycook and David G. Rand found that Facebook tags of fake articles "did significantly reduce their perceived accuracy relative to a control without tags, but only modestly".[13] A Dartmouth study led by Brendan Nyhan found that Facebook tags had a greater impact than the Yale study found.[14] A "disputed" tag on a false headline reduced the number of respondents who considered the headline accurate from 29% to 19%, whereas a "rated false" tag pushed the number down to 16%.[14] The Yale study found evidence of a backfire effect among Trump supporters younger than 26 years whereby the presence of both untagged and tagged fake articles made the untagged fake articles appear more accurate.[13] In response to research which questioned the effectiveness of the Facebook "disputed" tags, Facebook decided to drop the tags in December 2017 and would instead put articles which fact-checked a fake news story next to the fake news story link whenever it is shared on Facebook.[15]
A 2017 study in the journal Psychological Science identified the most effective ways to reduce misinformation through corrections.[16]
A forthcoming study in the Journal of Experimental Political Science found "strong evidence that citizens are willing to accept corrections to fake news, regardless of their ideology and the content of the fake stories."[17]
A paper by Andrew Guess (of Princeton University), Brendan Nyhan (Dartmouth College) and Jason Reifler (University of Exeter) found that consumers of fake news tended to have less favorable views of fact-checking, in particular Trump supporters.[18] The paper found that fake news consumers rarely encountered fact-checks: "only about half of the Americans who visited a fake news website during the study period also saw any fact-check from one of the dedicated fact-checking websites (14.0%)."[18]
A 2018 study found that Republicans were more likely to correct their false information on voter fraud if the correction came from Breitbart News rather than a non-partisan neutral source such as PolitiFact.[19]
A 2015 experimental study found that fact-checking can encourage politicians to not spread misinformation. The study found that it might help improve political discourse by increasing the reputational costs or risks of spreading misinformation for political elites. The researchers sent, "a series of letters about the risks to their reputation and electoral security if they were caught making questionable statements. The legislators who were sent these letters were substantially less likely to receive a negative fact-checking rating or to have their accuracy questioned publicly, suggesting that fact-checking can reduce inaccuracy when it poses a salient threat."[20]
One experimental study found that fact-checking during debates affected viewers' assessments of the candidates' debate performance and produced "greater willingness to vote for a candidate when the fact-check indicates that the candidate is being honest."[21]
A study of Trump supporters during the 2016 presidential campaign found that while fact checks of false claims made by Trump reduced his supporters' belief in the false claims in question, the corrections did not alter their attitudes towards Trump.[22]
Political fact-checking is sometimes criticized as being opinion journalism.[23][24] In September 2016, a Rasmussen Reports national telephone and online survey found that "just 29% of all Likely U.S. Voters trust media fact-checking of candidates’ comments. Sixty-two percent (62%) believe instead that news organizations skew the facts to help candidates they support."[25][26]
The Reporters' Lab at Duke University maintains a database of fact-checking organizations that is managed by Mark Stencel and Bill Adair. The database tracks more than 100 non-partisan organizations around the world. The Lab's inclusion criteria are based on whether the organization
- examines all parties and sides;
- examines discrete claims and reaches conclusions;
- tracks political promises;
- is transparent about sources and methods;
- discloses funding/affiliations;
- and has news and information as its primary mission.[27]
Among the benefits of printing only checked copy is that it averts serious, sometimes costly, problems such as lawsuits and loss of credibility. Fact checkers are primarily useful in catching accidental mistakes; they are not guaranteed safeguards against those who wish to commit journalistic fraud.
The possible societal benefit of honing the fundamental skill of fact checking has been noted in a round table discussion by Moshe Benovitz, who observes that "modern students use their wireless worlds to augment skepticism and to reject dogma," but goes on to argue that this has positive implications for values development. He argues:
"We can encourage our students to embrace information and vigorously pursue accuracy and veracity. Fact checking can become a learned skill, and technology can be harnessed in a way that makes it second nature… By finding opportunities to integrate technology into learning, students will automatically sense the beautiful blending of… their cyber… [and non-virtual worlds]. Instead of two spheres coexisting uneasily and warily orbiting one another, there is a valuable experience of synthesis…".[63]
He closes by noting that this constitutes "new opportunities for students to contribute to the discussion like never before, inserting technology positively into academic settings" (rather than it being seen purely as an agent of distraction).[63]
One journalistic controversy involved the disgraced reporter and admitted plagiarist Stephen Glass, who began his journalism career as a fact-checker. The fact checkers at The New Republic and other weeklies for which he worked never flagged the numerous fictions in Glass's reporting. Michael Kelly, who edited some of Glass's concocted stories, blamed himself rather than the fact-checkers, saying: "Any fact-checking system is built on trust ... If a reporter is willing to fake notes, it defeats the system. Anyway, the real vetting system is not fact-checking but the editor."[64]
The following is a list of individuals for whom it has been reliably reported that they have played such a fact-checking role at some point in their careers, often as a stepping stone to other journalistic endeavors or to an independent writing career: