This horrifying AI model predicts future instances of police brutality

Two artists sponsored by the Mozilla Foundation have flipped the script on law enforcement’s troubled history of using big data to anticipate where future crimes might be committed. Their project, called Future Wake, uses artificial intelligence and data on past instances of police violence to predict where police brutality might strike next.

Future Wake is an interactive website featuring the images and stories of fictional people who, the data suggests, could be victims of police brutality in the future. The artists trained computer vision and natural language processing models on historical records of police violence to generate the fictional likenesses and words of the potential victims. The characters, all of them computer generated, look something like deepfakes. The AI models also predict the location and manner of the police brutality. The victims describe how they will be targeted by police and the event that will lead to their death. “Officers with the Violent Crimes Task Force will come to my home to serve a warrant to me, as I am wanted for a felony,” says a Latino man who the project predicts will be a victim of police violence in Los Angeles. “The officers will enter my home, and I will pull out a handgun and we will begin to shoot each other. The officers will shoot and kill me.”

[Screenshot: Future Wake]

The duo who created the Future Wake project, who have chosen to remain anonymous, say the work is intended to “stir discussions around predictive policing and police-related fatal encounters.” Over the past decade, police departments around the country have experimented with using big data analytics to predict where future crimes might occur, or to identify individuals who are likely to commit crimes or become victims of crime. The practice has come under scrutiny because biases within the historical crime data analyzed by the algorithms can be perpetuated in their predictions.

The data used to train the Future Wake models came from Fatal Encounters, which contains records of 30,798 victims killed by police in the U.S. between January 2000 and September 2021. The project also used data from Mapping Police Violence, which contains details on 9,468 victims killed by police in the U.S. from January 2013 to September 2021. The work and the website, which went live on October 14, are funded by Mozilla’s Creative Media Awards.

https://www.fastcompany.com/90689806/ai-police-brutality-predictions-future-wake?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created: Nov 2, 2021, 12:21:38

