
Facebook employees aren't impressed with its election preparation

49%

The percentage of Facebook employees who don't believe the company is having a positive impact on the world.


Despite beefing up its fact-checking and misinformation policies, only a slight majority of Facebook employees, 51 percent, think the social network is having a positive impact on the world. That information comes from an October survey of more than 49,000 employees, as seen by BuzzFeed News.

The most common employee comments concerned hate speech and misinformation on the platform, along with worries that leadership is focusing on the wrong metrics. Employee performance is often evaluated based on growth metrics, such as increasing usage of a new feature.

Despite the growing discontent, 69 percent (nice) of employees reported that Facebook is a favorable place to work, and the average employee intends to stay at the company for 4.3 years.

Not good enough — Facebook has implemented a variety of measures to try to stem misinformation surrounding today's election, such as temporarily blocking any new political ads. It also recently banned the QAnon conspiracy movement, which claims a "Deep State" is trying to take down President Trump.

But on internal message boards, employees continue to flag issues, like a Washington Post story reporting that right-wing pages are given preferential treatment in order to appease Trump, and another report indicating that the Biden campaign has been charged more for Facebook ads than the Trump campaign.

As concerns rise that mail-in ballots may slow a final result, Facebook and Twitter will have to grapple with how to address premature declarations of a winner. For its part, Facebook says it will rely on six outlets, including the Associated Press and Fox News, to decide when an outcome is official. It will also label any premature victory announcements, though whether users actually pay attention to those labels is a big question mark. The real danger is that Trump may declare victory and then claim the election was stolen from him should he end up losing; Facebook has largely refused to take down his posts. More aggressive labeling could further anger conservatives, who have been attacking social media companies over false claims of censorship.

Ironically, for all of CEO Mark Zuckerberg's talk about wanting to support free speech on Facebook, employees were recently barred from discussing politics internally. The company felt the discussions were creating too much division, failing to understand that politics and business are inextricably linked. Facebook has a significant influence on the world, and its decisions are political. The division has emerged largely because Facebook presents a set of values but doesn't follow them, which creates tension among employees who want to see the company live up to the standards it sets for itself.

Content cop — Nobody is especially confident that Facebook is more prepared than it was in 2016. No matter what the company says, it allows sensationalist content to spread rapidly and often manages to take it down only after the damage is done. Its platform was designed to make sharing easy, and that design conflicts with ensuring integrity. Facebook doesn't want to be in the position of policing content, so it has always done so slowly and begrudgingly.

The next few days will be a big test of whether or not Facebook's investments in fighting misinformation have paid off.