ACAB

LAPD's new data-driven policing is just as racist as its old ways

Turns out predictive policing by any name ends up with similarly biased results. Who would've thought?

[Photo: LOS ANGELES, CA - OCTOBER 14, 2021: A member of the LAPD leaves their headquarters on 1st St. in dow... Credit: Mel Melcon/Los Angeles Times/Getty Images]

Documents newly released as part of a Stop LAPD Spying Coalition report show the full extent to which both Operation LASER and PredPol reinforced existing racial biases, including decisions to concentrate patrols in predominantly Black and Brown neighborhoods.

The Los Angeles Police Department ended its flagship predictive policing programs, Operation LASER and PredPol, in 2019 and 2020 respectively, after public outcry over their tendency toward racial bias. But the department’s replacement programs have largely shared the same flaws and failed to create any meaningful change in the organization.

Last year, the LAPD cited COVID-19 budget constraints as its reason for discontinuing PredPol, not the concerns experts had raised about the program’s biases and broader privacy implications. The Stop LAPD Spying Coalition, one of the leading activist groups opposing PredPol, said at the time that the decision was plainly the result of sustained community organizing.

Attempts at reforming police departments from the inside rarely provide the transparency and technological innovation we’re promised. Maybe it’s time to try a new tactic.

Flawed from the start — The LAPD’s crime-prediction programs were considered huge successes among law enforcement in the early 2010s. Operation LASER (Los Angeles Strategic Extraction and Restoration) used historical data to map areas with high crime levels, allowing the department to patrol those areas more frequently and, ostensibly, lower crime rates in the process.

This system was fundamentally flawed. Relying on historical data just sent more cops to the places where people had called the police most often in the past, creating a vicious cycle. A 2019 report from the LAPD’s inspector general showed that the vast majority of people flagged by LASER’s bulletins were Black or Latinx. As in 85 percent of all reports. Combined, those groups make up less than 60 percent of L.A.’s population.
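To see why the cycle is vicious, consider a toy simulation. This is a sketch with invented numbers, not LASER’s or PredPol’s actual model: two areas have identical true crime rates, but one starts with more recorded reports simply because it was patrolled more heavily in the past.

```python
import random

# A toy model of the feedback loop described above. Purely illustrative;
# not Operation LASER's actual scoring, and every number is invented.
TRUE_CRIME_RATE = 0.1   # identical in both areas
reports = [50, 10]      # biased historical record: area 0 was over-policed
BUDGET = 100            # patrol shifts available per year

random.seed(1)

for year in range(5):
    # Hotspot-style allocation: the top-ranked area gets most patrols.
    hot = 0 if reports[0] >= reports[1] else 1
    patrols = [BUDGET // 4, BUDGET // 4]
    patrols[hot] = BUDGET - BUDGET // 4
    # Police mostly record crime where they are deployed, so observed
    # incidents track patrol intensity, not the (equal) true rates.
    for area in (0, 1):
        reports[area] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols[area])
        )
    print(f"year {year}: patrols={patrols}, cumulative reports={reports}")
```

Run it and area 0 tops the ranking every single year: the initial bias never washes out, it just keeps attracting patrols and generating new reports.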

If you’re wondering what kind of twisted company would build such a program, we have an easy enough answer for you. Operation LASER was built with software from Palantir (you know, the data-mining firm whose CEO has said he has no qualms about his technology being used to kill people).

And the new programs aren’t helping — New documents from Stop LAPD Spying’s latest report confirm just how problematic Operation LASER was. Beyond its racial bias, the program’s mechanics were disorganized and inaccurate. When stopping someone, police were prompted to fill out “field cards” with personal information. That information was fed into Palantir’s databases whether or not the person had committed any crime.
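That ingestion logic is easy to picture in code. Here’s a minimal sketch of it, with hypothetical field and function names (the real card layout and database schema aren’t public): a record enters the database, and therefore the crime statistics, with no check on whether the stop involved any offense at all.

```python
from dataclasses import dataclass

@dataclass
class FieldCard:
    # Hypothetical fields for illustration only; the actual cards and
    # schema used by the LAPD and Palantir aren't public.
    name: str
    neighborhood: str
    charged_with_crime: bool

def ingest(card: FieldCard, database: list) -> None:
    # The documented flaw: every stop is stored and counted. There is
    # no filter like `if card.charged_with_crime:` before this append.
    database.append(card)

database = []
ingest(FieldCard("J. Doe", "Skid Row", charged_with_crime=False), database)

# J. Doe committed no crime, yet the stop still inflates the
# neighborhood's numbers in any "risk" tally built on this table.
print(sum(1 for c in database if c.neighborhood == "Skid Row"))  # 1
```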

After the dissolution of PredPol and Operation LASER, the LAPD created new programs to similar ends, like something called “data-informed community-focused policing” (DICFP). But DICFP borrows much of its groundwork from Operation LASER, including its focus on increased policing in neighborhoods with already-high call rates.

DICFP’s biggest problem is the same as that of Operation LASER: It only serves to automate law enforcement’s existing logic systems. As an organizer from Stop LAPD Spying puts it, that includes “targeting poor people, targeting unhoused people, targeting Black, Brown, and disabled people.”

Privacy advocates worry that any data-driven programs for predictive policing will end up with similar — if not worse — results. Tech companies like Palantir shouldn’t be left to create safeguards, either. That’s how Los Angeles ended up in this mess.