Facebook announced today that it will alert people when they're about to share a news article that's more than three months old. The company is addressing the way in which old articles sometimes recirculate, confusing people into thinking outdated information is current.
People don't read before sharing — Studies have repeatedly shown that many Facebook users re-share articles based on the headline alone, without ever opening the link. That's a problem because an article may reflect the state of the world at the time it was published, not the present. A user might see a story reporting that growth in new coronavirus cases is slowing in Florida, miss the publication date, and assume it's still accurate, when in fact cases are spiking in the state following its reopening.
"Over the past several months, our internal research found that the timeliness of an article is an important piece of context that helps people decide what to read, trust and share," said Facebook. "News publishers in particular have expressed concerns about older stories being shared on social media as current news, which can misconstrue the state of current events."
Recent studies have found that 45 percent of Americans get their news from Facebook, highlighting the scale of the problem. The company is trying to address this situation somewhat with its recently launched News tab, where users are presented with a feed of fresh articles with an emphasis on local news outlets.
Some publications, such as the Guardian, have tried to address the old-article issue on their own by placing prominent labels atop older stories. If you're reading one of the Guardian's stories from 2015 about President Donald Trump, for example, you'll see a label above the article, highlighted in yellow, that reads, "This article is more than 3 years old." The Guardian's social media cards also get updated with a new photo indicating the article is old. But implementing this requires engineering work on the part of cash-strapped media companies.
Facebook has made other changes to combat confusion surrounding news articles, such as placing context buttons on links that provide Wikipedia information on the news outlet publishing the story.
One step forward, two steps back — At the same time, however, Facebook has been under intense scrutiny from the public and its own employees over its content policies. The company has refused to follow Twitter's lead and place disclaimers on President Trump's posts of dubious accuracy, with CEO Mark Zuckerberg saying that "Facebook shouldn't be the arbiter of truth of everything people say online." It also continues to allow white supremacist groups on the platform, particularly in groups, where content moderation is less strict and misinformation can thrive. Facebook's own internal research has found that 64 percent of people who joined an extremist group did so because its algorithms recommended it.
It's increasingly clear that Facebook is willing to moderate its platform, but not at the risk of angering conservatives, who cry censorship despite dominating the platform. And it's unclear how much alerting users to old articles will matter if Facebook continues to allow so much unreliable content to spread.
Twitter recently made its own change to address the way people share articles, prompting users on Android to open articles before retweeting them. It's still a limited test, however, and it's not known whether Twitter will expand the feature to iOS or make it permanent.