Facebook reiterates to the Senate its inability to keep kids safe
“The question that haunts me is: How can we, or parents, or anyone, trust Facebook?”
Sen. Blumenthal (D-Conn.), in opening remarks at today's hearing.
Today, members of the Senate’s consumer protection subcommittee took turns questioning Facebook’s head of global safety, Antigone Davis, on how Facebook and Instagram are protecting their youngest users from harm. The subcommittee hearing was called in response to The Wall Street Journal’s damning report about Facebook’s “toxic” effects on teenagers.
The hearing, which lasted nearly two and a half hours, ended up being somewhat repetitive, perhaps because there are only so many interesting questions one can ask about this particular topic. The vast majority of the questioning attempted to clarify why Facebook had been sitting on all this research showing the harm it causes some of its users without really doing much at all about it.
“Facebook routinely puts profits ahead of kids’ online safety, chooses the growth of its products over the wellbeing of our children, and it’s delinquent to take action,” said Sen. Richard Blumenthal (D-Conn.) by way of introduction. “It’s failing to hold itself accountable.”
In her own introductory statements, Davis explained that her team works “tirelessly” to make sure kids have a positive experience. Her responses throughout the hearing attempted to bolster this assertion, but, by and large, they only served to underscore that Facebook’s problems are of its own making.
A systemic problem — Facebook’s pushback to the WSJ report has mostly revolved around what it deemed a “simply inaccurate” analysis of leaked research. Most of today’s hearing amounted to a back-and-forth about why, exactly, Facebook hadn’t done anything about this information, which clearly points to Instagram and Facebook being harmful. (Though, if prompted, Facebook will still argue with you about the implications of that data.)
One refrain underscored Davis’ entire testimony: Facebook is doing everything it can. Among other measures, she noted that targeted advertising is limited for teenagers, and she pointed to forthcoming pilot features for “nudging” users toward uplifting posts and for encouraging users to “take a break” from its platforms.
Dangerous by design — The problem with this framing is that adding features and restricting access to others can never really mitigate the mental health effects of being on Facebook and Instagram in the first place. The harms we’re discussing are caused by the platforms’ most basic functions.
Liking photos, suggestion algorithms, influencers looking surreally beautiful, depressing news — those are the bedrock upon which Facebook and Instagram are built. With each of its platforms, Facebook’s goal is to keep users scrolling so they can see the ads from which it makes money. Algorithms are fine-tuned to serve up whatever drives engagement, and that’s often harmful, shocking, or enraging content. To really “fix” the problem would, from Facebook’s side of things, mean totally reassessing how its platforms work and what they choose to prioritize.
So how do we fix it? — The Senate subcommittee’s purpose for today’s hearing was muddled at best. Some questions — like Sen. Dan Sullivan’s (R-Alaska) meandering ones about the Chinese Communist Party — bore only passing relevance to the pre-determined task of “protecting kids online.”
Because Facebook’s toxicity is very much rooted in its base functions, there’s no way for the company to mitigate these harms on its own. That’s exactly why the Senate has stepped in. The hope is that rooting through these issues with lawmakers will accelerate the policymaking process. Laws that govern how internet companies deal with and protect children are outdated. The Children’s Internet Protection Act (CIPA) was enacted in 2000 — an eternity in internet time. Facebook and Instagram didn’t even exist at the turn of the millennium.
Facebook needs to play ball — Sen. Blumenthal made the subcommittee’s hopes clear in his introductory statements: that Facebook would not only cooperate with updated policies but also help in pushing for those regulations to pass. If Facebook is genuine in its assertions that it wants to protect children and foster a positive environment, it ought to accept regulation with open arms. It won’t, though. It will only change the status quo when not doing so will cost it money, and enough money to matter. Paltry fines of a few million dollars won’t do it — Facebook makes billions each year.
In today’s hearing, Davis was asked specifically whether Facebook would commit to supporting updated bills like the Kids Internet Design and Safety (KIDS) Act. Her response? The standard deflection: “I’d be happy to follow up.” Which is Facebook speak for, “Not unless we’re forced to.” It’s time to ditch the conciliatory carrot and crack the legislative whip instead.
The next part of the subcommittee’s hearing will feature testimony from a whistleblower who reportedly leaked “thousands” of research documents to the Senate. That hearing will be held on Tuesday, October 5, at 10 a.m. ET.