Culture

Facebook accidentally spread QAnon info while trying to limit it

First a delay in its official response. Now a glitch. Mark Zuckerberg's company can't seem to get its anti-extremism project right.


Facebook's Redirect Initiative is, according to the company at least, about steering QAnon-curious people away from the extremist conspiracy movement and toward the Global Network on Extremism and Technology's (GNET's) website. Mark Zuckerberg's network went live with the program this summer, but this week announced that a "glitch" had exposed Facebook users to QAnon content even when they weren't looking for it.

"When we first launched the Redirect Initiative for QAnon today there was a glitch that caused people to see information about this topic when they searched for unrelated terms. We’ve paused this Redirect while we fix the issue," Facebook announced on Twitter.

Redirecting users to GNET resources was meant to "help inform them of the realities of QAnon and its ties to violence and real world harm," Facebook said. But thanks to the apparent technical snafu, people were instead served content from the movement, which passionately believes in conspiracies such as: that a secret cabal of powerful people in domestic and foreign governments runs a child trafficking ring; that these figures rely on sacrificial bloodshed for power; that there is a global scheme to undo the United States; and more.

The movement — with its increasingly disturbing theories — has drawn the scrutiny of the Federal Bureau of Investigation, which has identified it as a domestic terrorism threat.

Get it together — Facebook's current stance toward anything and everything related to QAnon directly contradicts its previous conduct. In the past, the company's support for free speech — which frequently bordered on absolutism — allowed groups and pages affiliated with the QAnon movement to thrive.

While Facebook's modus operandi changed only after public pressure and severe rebukes, QAnon content is the same as it was before: unhinged. The same supporters being banned right now were openly calling for violence against politicians back then, peddling hoaxes about Black Lives Matter protests, and awaiting a supposed day of reckoning called "The Storm," on which Donald Trump would carry out revenge against pedophilic politicians — among other ideas so ludicrous they make David Icke's suggestion that the British royal family are lizards look practically reasonable.

Facebook doesn't deserve praise for its nascent commitment to tackle the problem its silence and hands-off approach helped boost in the first place. Instead, it deserves to be investigated for its ongoing unwillingness to take responsibility for the violent, racist, and extremist content its platform continues to harbor and defend.