In the wake of the Uvalde shooting, surveillance tech is not the answer

When an indescribable tragedy was inflicted upon Robb Elementary School in Uvalde, Texas, we saw the predictable platitudes. We saw well-worn promises of changes that will never come. And we saw growing frustration that so many political leaders are willing to sacrifice children on the altar of the Second Amendment. But we also saw a comparatively newer element of the post-shooting ritual rear its ugly head: the increasingly insistent claims that technology can magically keep our kids safe. It won’t.

In the days since the attack, we’ve heard from seemingly every surveillance vendor on the market, each promising that their tech will somehow be the solution to America’s gun-violence problem. Just days before the Uvalde attack, the CEO of the controversial metal detector firm Evolv positioned his firm in the Washington Post as the panopticon panacea, claiming his product would “democratize security.” In response to criticism that his error-plagued metal detector is just as likely to alert officers to laptops as to guns, Evolv hid behind claims that public scrutiny might endanger the company’s proprietary algorithm. Once again, the tech companies are frantic to ensure the public won’t look behind the curtain.

Evolv is far from the only firm trying to profit from tragedy. Clearview AI, which first received global condemnation for taking billions of photos from social media and other websites to create its facial recognition software, announced the same day as the Uvalde shooting that it was working to bring facial recognition to schools. The move came just weeks after the company agreed to stop sales to private companies as part of a settlement with the American Civil Liberties Union (ACLU).

Tech companies weren’t the only ones claiming that surveillance would be the solution. For feckless politicians unwilling to stand up to gun companies, the magical thinking of surveillance “solutionism” offers an easy out. In his speech last week before the National Rifle Association (NRA)’s annual convention, former President Donald Trump called for “strong exterior fencing, metal detectors, and the use of new technology” in order to prevent more mass shootings. At the same conference, Texas Senator Ted Cruz said schools should simply “buy every metal detector known to man.”

But the truth is that none of those measures would work to prevent tragedies like the Uvalde shooting. After all, Robb Elementary itself had taken preventative surveillance measures. The Uvalde school district had doubled security spending in the years leading up to the attack, paying for its own police force, expanded police partnerships, drug- and firearm-sniffing dogs, security vestibules, new fences, and the now-ubiquitous lockdown drills. But there were also digital measures, including social-media-surveillance software from the firm Social Sentinel, Raptor Technologies’ visitor management software, metal detectors, cameras, threat-assessment teams, and reporting programs. Yet none of that worked against an 18-year-old with an AR-15-style rifle.

Social-media-monitoring software is a powerful way to racially profile students and replicate past police discrimination, but it consistently fails to identify those who will actually pose a threat. Instead, the systems lead police and prosecutors to imprison young Black, Latinx, and Muslim men for offhand comments and out-of-context remarks.
These tools often rely on simplistic keyword analysis or repurposed algorithms that were initially used for unrelated tasks, and they simply cannot show us who is going to commit a crime before they actually do. And even when schools deploy more sophisticated forms of AI, there’s no way to effectively avoid the bias that has plagued predictive policing in the past.

But even as the technology fails, vendors face no consequences, and school officials (unwittingly) reward them for their inefficacy through even more spending. Uvalde’s metal detectors, cameras, and social-media monitoring didn’t save anyone on May 24, and yet the vendors who sold those systems will likely make more money than ever this year. And politicians opposed to gun-safety regulation will continue to hold up empty surveillance answers as our salvation because it’s easier than acknowledging that only actual gun safety will protect our kids from mass shootings.

If we want to protect kids, we know the gun-safety measures that will do it. But until we’re ready to act, we’ll be stuck with thoughts, prayers . . . and surveillance tech that doesn’t work.

Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.), a New York City-based civil rights and privacy group, a TED fellow, and a visiting fellow at Yale Law School’s Information Society Project. He can be reached on Twitter at @FoxCahn.

https://www.fastcompany.com/90755634/in-the-wake-of-the-uvalde-shooting-surveillance-tech-is-not-the-answer?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 3y ago | June 1, 2022, 4:20:46

