Facebook is creating new teams focused on studying and implementing fixes for racial biases on both Facebook and Instagram, according to sources familiar with the matter. The formation of these teams is a fairly radical departure for the company, which has largely chosen to sweep issues of race and civil rights under the rug whenever possible.
An Instagram representative confirmed the new teams for both Instagram and the core Facebook app, though Facebook has not yet made a formal announcement.
“The racial justice movement is a moment of real significance for our company,” said Instagram’s head of product Vishal Shah. “Any bias in our systems and policies runs counter to providing a platform for everyone to express themselves.”
The creation of task forces with the sole mission of proposing solutions to racial biases on Facebook and Instagram isn’t just necessary — it’s long overdue. That much is immediately clear from Facebook’s damning civil rights report published earlier this month. The new teams will certainly have their work cut out for them.
Equity and inclusion — The newly formed team at Instagram is tasked with tackling “equity and inclusion,” according to The Wall Street Journal. That includes examining how Black, Hispanic, and other minority users are affected by Facebook’s algorithms and machine-learning systems. The team will compare how those experiences contrast with those of white users on Instagram.
Some collaboration is planned — The team working on Facebook’s core app — which is currently being called the “Inclusivity Product Team” — will look at similar issues.
That team is expected to consult with “a council of Black users and experts on race,” Facebook says. The Inclusivity Product Team will also work directly with other product teams at Facebook in order to design features to directly support minority users on the platform.
Algorithmic bias is still a hazy area — At this point, it’s well documented that software can encode and amplify discrimination. But meaningful action to reverse those biases remains rare, largely because companies keep the algorithms themselves under wraps, out of reach of outside scrutiny.
Facebook’s two-year-long civil rights audit, while quite comprehensive overall, did not assess potential bias in the company’s algorithms because the auditors were not given access to internal research and models. Facebook has made similar moves in the past; at one point last year the company even barred employees from studying racial impacts of its algorithms without special permission from top-level executives.
It’s easy to be cynical about these new teams, given Facebook’s treatment of civil rights issues thus far. But, unlike most of its social justice initiatives, Facebook hasn’t made the creation of these teams a PR move. We can hold onto some small hope that they’re being formed in a (rare) bid to actually effect change at Facebook. The company’s civil rights issues are surely far-reaching enough to warrant it.