Culture

Activists are using facial recognition on the police

"For a while now, everyone was aware the big guys could use this to identify and oppress the little guys, but we’re now approaching the technological threshold where the little guys can do it to the big guys."

Andrew Maximov, Belarusian technologist, speaking to the NYT

Nathan Howard/Getty Images

Thanks to advanced artificial intelligence, facial recognition technology has swiftly become one of law enforcement’s favorite tools for identifying and tracking down protestors and other people of interest. Now activists are giving cops a taste of their own medicine by turning facial recognition technology against them instead.

Law enforcement continues to use the technology unchecked in much of the United States. Large police forces like the New York Police Department have conducted thousands of searches with it and aren’t planning to stop any time soon. The result is that facial recognition has reinforced, and often worsened, existing power imbalances.

But this growing use has also brought facial recognition to a reckoning of sorts in recent months. A handful of big tech companies, including IBM, Microsoft, and Amazon, have either shuttered their facial recognition divisions or at least placed temporary moratoriums on them.

Activists around the world have taken matters into their own hands. If the public is going to be subject to surveillance software, why shouldn’t the cops be as well? A few of these activists spoke to The New York Times about their efforts, which go beyond simply turning law enforcement’s own weapons against it — they shine a spotlight on the problems inherent in facial recognition technology as a whole.

Power to the people — Christopher Howell, a self-taught coder and long-time activist, began wondering this summer whether he could use facial recognition technology to reveal police officers’ identities, much the same way law enforcement had been using it against protesters across the United States. After Portland banned the use of facial recognition technology for public surveillance in September, Howell asked the city’s mayor whether the ban would apply to citizens as well, he tells The New York Times.

After the mayor confirmed the ban wouldn’t affect individuals, Howell began using publicly available platforms — like Google’s TensorFlow — to build machine-learning models. He’s collected thousands of images of Portland police officers from the news and social media after finding their names in public records. He says Facebook, in particular, has been helpful in these efforts.
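The approach Howell describes, matching new photos of faces against a labeled gallery, typically boils down to comparing face embeddings. The sketch below is a minimal, hypothetical illustration in plain Python: the `identify` function, the toy three-number "embeddings," and the placeholder names are all invented here, not taken from Howell's system. In a real pipeline the embeddings would be high-dimensional vectors produced by a trained model, such as one built with TensorFlow.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, threshold=0.6):
    """Return the gallery name whose embedding best matches the probe,
    or None if nothing clears the similarity threshold."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Toy gallery: in practice these would be embeddings computed by a
# face-recognition model from the labeled photos in the collection.
gallery = {
    "officer_a": [1.0, 0.0, 0.0],
    "officer_b": [0.0, 1.0, 0.0],
}

print(identify([0.9, 0.1, 0.0], gallery))  # closest to officer_a
print(identify([0.0, 0.0, 1.0], gallery))  # no match above threshold: None
```

The threshold is the crucial dial: set too low, the system produces confident misidentifications; set too high, it matches nothing. That tradeoff is the same one critics raise about police use of the technology.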

And around the world — It isn’t only the United States that’s taken notice of the dangers of this technology. In France, the Italian artist Paolo Cirio published 4,000 faces of police officers for an online exhibit called Capture. Cirio says the project is the first step in an effort to create his own facial recognition app.

France’s interior minister threatened legal action, even though the photos were already publicly available on the internet. Cirio took the exhibit down, but he plans to republish the images.

“It’s childish to try to stop me,” Cirio said, “as an artist who is trying to raise the problem, instead of addressing the problem itself.”

A public reckoning — Facial recognition software has progressed far beyond the policies created to govern it. There is no real framework for using the technology in a way that protects privacy while preserving its usefulness. That policy vacuum means law enforcement is free to run wild with facial recognition, individual privacy be damned.

Software like Clearview AI has given those in power more reach than ever before. In the pursuit of what they believe to be justice, law enforcement and other arms of government are willing to trample personal liberties in the search for what’s supposedly “right.” Perhaps they’ll be more circumspect about facial recognition technology when it’s used to target them, and threaten their privacy, for a change.