
This horrifying AI model predicts future instances of police brutality

A searing critique of predictive policing, Future Wake uses past data on police violence to predict where it might occur in the future—and who will be targeted.

[Photo: NeONBRAND/Unsplash]

BY Mark Sullivan | 1 minute read

Two artists sponsored by the Mozilla Foundation have flipped the script on law enforcement’s troubled history of using big data to anticipate where future crimes might be committed. Their project, called Future Wake, uses artificial intelligence and data on past instances of police violence to predict where police brutality might strike next.

Future Wake is an interactive website featuring the images and stories of fictional people who, the data suggests, could be victims of police brutality in the future. The artists trained computer vision and natural language processing models on historical records of police violence to generate the fictional likenesses and words of the potential victims. The characters, all of them computer-generated, look something like deepfakes.

The AI models also predict the location and manner of the police brutality. Each victim tells the story of how they will be targeted by police and describes the event that leads to their death.

“Officers with the Violent Crimes Task Force will come to my home to serve a warrant to me, as I am wanted for a felony,” says a Latino man who the project predicts will be a victim of police violence in Los Angeles. “The officers will enter my home, and I will pull out a handgun and we will begin to shoot each other. The officers will shoot and kill me.”

[Screenshot: Future Wake]
The duo who created the Future Wake project, who have decided to remain anonymous, say the work is intended to “stir discussions around predictive policing and police-related fatal encounters.”

Over the past decade, police departments around the country have experimented with using big data analytics to predict where future crimes might occur, or to identify individuals who are likely to commit crimes or become victims of crimes. The practice has come under scrutiny because biases within the historical crime data analyzed by the algorithms can be perpetuated in their predictions.

The data used to train the Future Wake models came from Fatal Encounters, which contains records of 30,798 victims killed by police in the U.S. between January 2000 and September 2021. The project also used data from Mapping Police Violence, which contains details on 9,468 victims killed by police in the U.S. from January 2013 to September 2021.

The work and the website, which went live on October 14, are funded by Mozilla’s Creative Media Awards.



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.
