Culture

This art project uses AI to imagine the next victims of police violence

‘Future Wake,’ one of Mozilla's 2021 Creative Media Award winners, gives audiences a dark inversion of predictive policing data.

Around 10:30 a.m. on November 12, Brooklyn police were dispatched to investigate a report of a suicidal individual. When they arrived, they found a Black man holding a knife to his chest. At some point during the confrontation, officers tased the man before shooting and killing him. Details remain murky as to why they felt deadly force was necessary.

This isn’t a summary of an event from earlier this year, but a prediction generated by AI software drawing on a dataset of roughly 30,990 instances of police brutality. Designed by two anonymous artist-technologists who were awarded one of Mozilla’s 2021 Creative Media Awards, Future Wake is a new project that turns predictive policing on its head: instead of estimating which demographics are supposedly most likely to commit crimes, the data is used to predict where and when police violence is most likely to occur, and who its next victims are likely to be.

Holding a (black) mirror up to police — In a joint statement, the project’s two anonymous creators describe how “Future Wake turns the application of predictive policing upside down. Rather than predicting crimes committed by the public, it focuses on future fatal encounters with the police. To predict future events, Future Wake uses historical data of past victims of police violence to predict where, when, who and how the next victim will die.”

Instead of reducing these tragedies to simple statistics and data points, the artists hoped to “connect viewers to depictions of these predicted future civilian-police encounters through human-driven storytelling.”

Built by AI, brought to life by humans — To accomplish their goal, Future Wake’s designers drew on two datasets spanning roughly 20 years, Mapping Police Violence and Fatal Encounters, to build a system of three AI models that “predict” how many people would be killed by police per day, the hotspots where those deaths might occur, and the likely manner in which they would die. Using StyleGAN2-ADA and the First Order Motion Model (the technique behind many deepfakes), the artists were then able to generate faces, eye movements, and backstories for their fictional victims.
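To make that pipeline a little more concrete, here is a minimal, hypothetical Python sketch of the kind of statistical modeling described: fitting a daily rate to historical incident counts, then sampling when and where the next incident “occurs.” The file name and column names below are illustrative assumptions; the actual Future Wake models and dataset schemas are not described in detail by the creators.

```python
# Hypothetical sketch of the kind of modeling Future Wake describes:
# estimate a daily rate of fatal encounters from historical records,
# then sample a count and a location for "tomorrow."
# The CSV name and columns ("date", "city") are assumptions for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=0)

# Assume a CSV of historical incidents, one row per death.
incidents = pd.read_csv("fatal_encounters.csv", parse_dates=["date"])

# 1) How many per day: fit a simple Poisson rate to daily counts
#    (resampling by day fills in zero-incident days).
daily_counts = incidents.set_index("date").resample("D").size()
rate = daily_counts.mean()  # maximum-likelihood Poisson rate
predicted_deaths_tomorrow = rng.poisson(rate)

# 2) Where: sample a "hotspot" from the empirical distribution
#    of past incident locations.
city_probs = incidents["city"].value_counts(normalize=True)
predicted_city = rng.choice(city_probs.index, p=city_probs.to_numpy())

print(f"Predicted deaths tomorrow: {predicted_deaths_tomorrow} "
      f"(likely hotspot: {predicted_city})")
```

A real system would presumably condition on far more than raw counts, but even this toy version shows why the project’s framing works: the same historical data that “predictive policing” tools use to target the public can just as easily be pointed back at police encounters themselves.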

From there, Future Wake relies on humans to bring these uncanny stories to life. Volunteers submitted recorded readings for each “wake” displayed on the site, and the designers are still looking for volunteers to voice future AI-generated stories; anyone interested is encouraged to head over to the Future Wake website for contact details.

Poignant and unsettling — Although deceptively simple (from an audience’s standpoint), Future Wake presents a disquieting, complex look at America’s epidemic of police brutality. The statistics paint a grim, accurate portrait of law enforcement bias, a bias reinforced by shoddy methodology like predictive policing, but seeing these all-too-real, AI-generated stories and faces provides an entirely new way to envision the issues at hand.