“Meta has perennially been a home for Russian, Chinese, and Iranian disinformation,” claims Gordon Crovitz, co-CEO of NewsGuard, a company that provides a tool to evaluate the trustworthiness of online information. “Now, Meta apparently has decided to open the floodgates completely.”

Again, fact-checking isn’t perfect; Crovitz says that NewsGuard has already tracked several “false narratives” on Meta’s platforms. And the community notes model with which Meta will replace its fact-checking battalions can still be somewhat effective. But research from Mahavedan and others has shown that crowdsourced solutions miss vast swaths of misinformation. And unless Meta commits to maximal transparency in how its version is implemented and used, it will be impossible to know whether the systems are working at all.

It’s also unlikely that the switch to community notes will solve the “bias” problem Meta executives are so outwardly concerned about, given that the bias seems unlikely to exist in the first place.

“The motivator for all of this changing of Meta’s policies and Musk’s takeover of Twitter is this accusation of social media companies being biased against conservatives,” said David Rand, a behavioral scientist at MIT. “There’s just not good evidence of that.”

In a recently published paper in Nature, Rand and his coauthors found that while Twitter users who used a Trump-related hashtag in 2020 were more than four times likelier to ultimately be suspended than those who used pro-Biden hashtags, they were also much more likely to have shared “low-quality” or misleading news.

“Just because there’s a difference in who’s getting acted on, that doesn’t mean there’s bias,” says Rand. “Crowd ratings can do a pretty good job of reproducing the fact-checker ratings … You’re still going to see more conservatives get sanctioned than liberals.”

And while X gets outsize attention in part because of Musk, remember that it’s an order of magnitude smaller than Facebook, with its 3 billion monthly active users, which will present its own challenges when Meta installs its own community notes–style system. “There’s a reason there’s only one Wikipedia in the world,” says Mantzarlis. “It’s very hard to get a crowdsourced anything off the ground at scale.”

As for the loosening of Meta’s Hateful Conduct policy, that in itself is an inherently political choice. It’s still allowing some things and not allowing others; moving those boundaries to accommodate bigotry does not mean they don’t exist. It just means that Meta is more OK with it than it was the day before.

So much depends on exactly how Meta’s system will work in practice. But between the moderation changes and the community guidelines overhaul, Facebook, Instagram, and Threads are careening toward a world where anyone can say that gay and trans people have a “mental illness,” where AI slop will proliferate even more aggressively, where outrageous claims spread unchecked, where truth itself is malleable.

You know: just like X.
