
Facebook's latest propaganda sweep takes down QAnon-linked pages

The social network also removed pages and accounts linked to VDARE, an anti-immigration website.


Facebook has published a list of all the coordinated misinformation networks that it pulled from its platform in April. Since the 2016 elections, the company has been playing cat-and-mouse with groups attempting to strategically spread false or inflammatory information, usually in hopes of generating clicks to their sites and subsequent ad revenue.

Although Facebook is currently fighting a new wave of misinformation around the coronavirus pandemic, it says the removals it's announcing today focus on networks it identified before the outbreak began. In particular, Facebook took down five pages, 20 accounts, and six groups linked to the fringe conspiracy network QAnon.

Conspiracy accounts are a real threat — QAnon's primary shtick is peddling the theory that a "deep state" is secretly plotting to take down President Trump. Trump himself has occasionally retweeted QAnon-linked accounts, amplifying some of their wilder claims.

Conspiracy theory groups like QAnon are not just an imaginary threat; they have inspired real-world violence, including the 2016 shooting at Comet Ping Pong pizzeria in Washington, D.C., which was inspired by the claim, spread on Twitter, that the Clintons were running a pedophilia ring out of the restaurant.

Facebook last month also removed accounts and pages linked to VDARE, a website that publishes anti-immigration content. In all, that takedown caught 19 pages, 15 accounts, and one group.

Cleaning up the mess — Facebook says it continuously monitors its platform for fake news networks using both automated and manual detection. It doesn't publicize every takedown, but says it prioritizes networks that attempt to manipulate public debate, such as efforts to sway elections. That makes sense, given that Facebook's reputation among Democrats took a beating after Trump won, with many saying the company was too slow to act against misinformation.

Facebook has seen its costs rise significantly as it staffs up to address hate speech, election interference, and other content moderation challenges. The company's expenses rose 51 percent in 2019 over the previous year, an increase it attributed to bulking up its content moderation and safety teams.

Despite this, the sheer scale of its platform makes it difficult for the company to keep up with the ever-changing tactics of its craftier users.