
Facebook adds broad location data to Pages with large audiences in a push for transparency

Ostensibly meant to fight misinformation on Facebook and Instagram, the feature is unlikely to help much.


Facebook will start providing more background information on Pages and Instagram accounts with large audiences, the company said today in a blog post. Facebook says the new feature is meant to increase transparency around the sources of information being spread on its platforms.

Right now the feature is being piloted in the U.S., rolling out first for accounts based outside the country that have large American followings. Facebook specifically says the feature is meant to combat political misinformation, as part of a larger series of updates intended to “protect” the 2020 U.S. elections.

The increased transparency is useful and, truth be told, it should have arrived long ago. Facebook and Instagram users have a right to know who’s behind these accounts. That said, this feature alone won’t be enough to stem the flow of misinformation on either Facebook or Instagram. It’s a band-aid rather than a solution.

Context clues — The transparency this update provides is really only the bare minimum of information, which puts the onus on users to decipher whether or not any given account is a reliable source.

The “About This Page” popup displays only broad location data about the page in question. Users are then meant to take this context into account when deciding whether or not to believe information from the source.

How will this help? — Facebook hopes that providing location data about high-reach Pages and Instagram accounts will help users judge those accounts’ credibility for themselves.

The obvious example here is political information: if you click on a page and see it’s based in Russia, you’ll probably decide not to trust it. This might be helpful in limited circumstances.

Nowhere near enough — This increased transparency is severely limited in its ability to stop misinformation from reaching users.

For one thing, the feature is ripe for misuse. If an organization trying to spread misinformation knows its location will be used to judge its credibility, it will likely find a way to base its accounts in the United States. That could easily persuade readers the information is more credible — the opposite of the feature’s intended effect.

Even if the feature works as intended, it doesn’t do much to address Facebook’s misinformation problem. If anything, that problem seems to be getting worse, with reports cropping up that coronavirus misinformation is slipping through the company’s content filters. Concerted efforts to spread misinformation — like the many anti-quarantine groups linked to one gun-rights group — are somehow allowed to keep operating, even after their origins have been brought to Facebook’s attention.

If Facebook wants to stop misinformation from spreading through its enormous user base, it’s going to need changes far more sweeping than some simple location transparency. Probably best not to hold our breath waiting for that to happen.