After Meta CEO Mark Zuckerberg announced the company will be dumping its partnership with fact-checking companies in favor of an X-like, community notes-style method of combating misinformation, PolitiFact tried to fearmonger about the move. Unfortunately for writer Angela Fu, every criticism she raised could just as easily apply to PolitiFact and its partners.
On Wednesday, Fu relied on “experts” who naturally told her what she wanted to hear. Championing one side’s experts is one reason fact-checkers have burned their credibility. For example, in 2022, PolitiFact rated Sen. Mitch McConnell “false” because liberal experts disagreed with his take on a Democratic “voting rights” bill; the fact that PolitiFact also interviewed the Cato Institute’s Ilya Shapiro, who agreed with McConnell, had no impact on the rating.
Fu also warned, “Fact-checkers say that they’ve noticed misinformation go unchecked on X. Science Feedback, a fact-checking organization in the U.S. that was part of Meta’s program, analyzed X posts from the 2024 European Parliament elections. It found that out of the 894 tweets that professional fact-checkers identified as containing misinformation, only 11.7% had a Community Note attached.”
Because the European Fact-Checking Standards Network only provides its data upon request, readers are left to guess what those 894 tweets actually said, although climate-related issues are likely disproportionately represented. Furthermore, it is not as if the fact-checkers themselves fact-check everything. Likewise, while PolitiFact may find it necessary to put a label on posts claiming Donald Trump is not dead, most people do not waste their time visiting such pages.
Back home, Fu writes, “A separate analysis by Poynter and Faked Up into Community Notes made on Election Day in the U.S. found that only a small percentage of notes were rated as helpful.”
That doesn’t mean what Fu seems to think it does. A high bar for rating notes as helpful is precisely what prevents abuse by trolls or hacks with agendas to push or axes to grind.
Meta, like X, will prevent abuse by requiring users of different perspectives to agree on a note, but Fu warned:
This requirement ostensibly reduces bias — one issue Meta claimed to have with its third-party fact-checking program — but experts say that in practice, it is difficult for users of different ideologies to reach a consensus. As a result, misinformation about politics or other controversial topics often goes unchecked on X.
‘The facts don’t care about if there is a consensus about them or not,’ said Maarten Schenk, co-founder and chief technology officer of Lead Stories, a U.S.-based fact-checking organization that operates in multiple languages. ‘The shape of the Earth remains exactly the same whether social media users can form a consensus about it or not.’
Yes, and misinformation, especially left-wing misinformation, goes largely unchecked on PolitiFact. Likewise, while PolitiFact has never endorsed Flat Earth Theory, it has gotten things wrong before, sometimes embarrassingly so.
Yet, Fu further warns, “Schenk and other fact-checkers noted the discrepancy between Meta’s stated goal of reducing bias and its decision to switch to a system so vulnerable to political biases.”
It is attitudes like that that led Zuckerberg to make his decision. You can’t rate JD Vance “mostly false” by omitting the part of his quote where he agrees with you, or call Donald Trump a liar for calling Kamala Harris a Marxist while insisting Joe Biden’s comparison of Republicans to Jim Crow is “nuanced,” and then claim you are better at reducing political bias than the average social media user.