Oops

Facebook accidentally blocked lots of legitimate coronavirus news yesterday

It’s a symptom of a much larger issue at the social network.


A bug in Facebook’s anti-spam filters blocked some legitimate posts about COVID-19, according to reports from many Twitter users. Facebook’s Vice President of Integrity, Guy Rosen, has since said that the incorrectly removed posts have been restored, though some users have replied to his tweet saying their posts are still stuck in spam limbo.

The cause of the issue is not the real problem here: Facebook is. Despite its public touting of measures to quell the spread of misinformation, Facebook is continually thwarted by its own existence. The bug may be fixed now — but the problem won’t stop here.

Could AI be the root cause? — Alex Stamos, a former Facebook security executive, theorized that the issue was in part due to a shift to auto-moderation, as many of Facebook’s content moderators have been sent home as a measure against the spread of COVID-19. Rosen quickly responded to this speculation by saying the problem was “unrelated to any changes in our content moderator workforce.”

Facebook’s anti-spam bug explanation is plausible, but its refusal to pin any of the blame on its reduced workforce is not. How could missing a fair portion of your moderators not impact the moderation process?

Facebook needs to get its shit together — Facebook has a favorite tune to sing, and it goes something like this: we’re doing everything we can to help. Its efforts are not enough.

The company’s press release site is currently awash in stories about being helpful in the face of the coronavirus pandemic. For example, Facebook partnered yesterday with the International Fact-Checking Network to increase the network’s fact-checking capacity.

But these well-publicized investments are not fixing Facebook’s moderation problems, which reach far beyond this isolated incident. The social network has long been plagued by fake news articles and hate speech.

Facebook’s moderation problem won’t just go away after the pandemic panic has calmed down. At some point, the company will need to face facts and make large-scale changes if it wants to actually keep its users informed.