Facebook is finally, maybe, kind of toning down political content

Too little too late — and without really addressing the platform's underlying issues.

Facebook said today that it will be changing its algorithms to show users less political content, beginning this week with a small portion of users in Canada, Brazil, and Indonesia. Those algorithm tweaks will also come to the United States in the next few weeks, the company said in a blog post.

“As Mark Zuckerberg mentioned in our recent earnings call,” writes Aastha Gupta, Product Management Director at Facebook, “one common piece of feedback we hear is that people don’t want political content to take over their News Feed.”

Gupta says that Facebook initially plans to run a series of tests to get a better sense of how much political content people actually want in their feeds. The company plans to explore “a variety of ways to rank political content” using a number of markers and then use that information strategically moving forward. COVID-19 information will be exempt from these tests.

Phew. After many, many years of political content on Facebook causing real-life turmoil, the company is finally ready to take a step back and say, hmm, yeah, perhaps it would be nice if our algorithms didn’t prioritize that quite so much. As usual, Facebook’s approach to moderation is reactive rather than proactive, and time and time again we’ve watched that strategy play out as too little, too late. Besides, political content isn’t really the issue here. Moderation is.

You might not even see a difference — Today’s announcement makes it far from clear what, exactly, Facebook will be changing. The company’s approach is extremely hesitant: hoping to please the full spectrum of its users, Facebook says it’s trying a variety of approaches to toning down political content.

Gupta says in her blog post that Facebook’s goal is to “preserve the ability to find and interact with political content on Facebook, while respecting each person’s appetite for it at the top of their News Feed.” This seemingly personalized approach has a significant downside: there’s a good chance you won’t notice a difference at all. An overly diplomatic approach could leave these algorithm changes amounting to little more than the status quo.

Skirting the issue — While it’s true that many users feel they’re seeing too much political content in their feeds, that isn’t the real problem at hand. The sheer amount of political content is far less concerning than the many, many allowances Facebook has made for the spread of conspiracy theories, misinformation, hate speech, and incitements to violence.

Facebook’s problem isn’t too much political content showing up in users’ feeds — it’s that so much of that content is inherently harmful. Facebook struggles not just with choosing which content to moderate but also with actually moderating it. Those large-scale platform issues aren’t going to be solved by slightly reducing the amount of political content you see each day.

Facebook is, above all else, a business. There’s a fair chance the results of these anticipated tests will prove inconclusive or simply too varied to act on. In that case, Facebook might choose to make minimal changes to its algorithms — or none at all — in order to keep the most users and businesses happy.