In the wake of the Uvalde shooting, surveillance tech is not the answer

When an indescribable tragedy was inflicted upon Robb Elementary School in Uvalde, Texas, we saw the predictable platitudes. We saw well-worn promises of changes that will never come. And we saw growing frustration that so many political leaders are willing to sacrifice children on the altar of the Second Amendment. But we also saw a comparatively newer element of the post-shooting ritual rear its ugly head: the increasingly insistent claims that technology can magically keep our kids safe. It won’t.

In the days since the attack, we’ve heard from seemingly every surveillance vendor on the market, each promising that its tech will somehow be the solution to America’s gun-violence problem. Just days before the Uvalde attack, the CEO of the controversial metal detector firm Evolv positioned his company in the Washington Post as the panopticon panacea, claiming his product would “democratize security.” In response to criticism that his error-plagued metal detector is just as likely to alert officers to laptops as to guns, Evolv hid behind claims that public scrutiny might endanger the company’s proprietary algorithm. Once again, the tech companies are frantic to ensure the public won’t look behind the curtain.

Evolv is far from the only firm trying to profit from tragedy. Clearview AI, which first received global condemnation for taking billions of photos from social media and other websites to create its facial recognition software, announced on the same day as the Uvalde shooting that it was working to bring facial recognition to schools. The move came just weeks after the company agreed to stop sales to private companies as part of a settlement with the American Civil Liberties Union (ACLU).

Tech companies weren’t the only ones claiming that surveillance would be the solution. For feckless politicians unwilling to stand up to gun companies, the magical thinking of surveillance “solutionism” offers an easy out. In his speech last week before the National Rifle Association (NRA)’s annual convention, former President Donald Trump called for “strong exterior fencing, metal detectors, and the use of new technology” to prevent more mass shootings. At the same conference, Texas Senator Ted Cruz said schools should simply “buy every metal detector known to man.”

But the truth is that none of those measures would work to prevent tragedies like the Uvalde shooting. After all, Robb Elementary itself had taken preventative surveillance measures. The Uvalde school district had doubled security spending in the years leading up to the attack, paying for its own police force, expanded police partnerships, drug- and firearm-sniffing dogs, security vestibules, new fences, and the now-ubiquitous lockdown drills. There were also digital measures, including social-media-surveillance software from the firm Social Sentinel, Raptor Technologies’ visitor management software, metal detectors, cameras, threat-assessment teams, and reporting programs. Yet none of that worked against an 18-year-old with an AR-15-style rifle.

Social-media-monitoring software is a powerful way to racially profile students and replicate past police discrimination, but it consistently fails to identify those who will go on to pose a threat. Instead, the systems lead police and prosecutors to imprison young Black, Latinx, and Muslim men for offhand comments and out-of-context remarks.
These tools often rely on simplistic keyword analysis or repurposed algorithms that were initially used for unrelated tasks, and they simply cannot show us who is going to commit a crime before they actually do. And even when schools deploy more sophisticated forms of AI, there’s no way to effectively avoid the bias that has plagued predictive policing in the past.

But even as the technology fails, vendors face no consequences, and school officials (unwittingly) reward them for their inefficacy through even more spending. Uvalde’s metal detectors, cameras, and social-media monitoring didn’t save anyone on May 24, and yet the vendors who sold those systems will likely make more money than ever this year. And politicians opposed to gun-safety regulation will continue to hold up empty surveillance answers as our salvation because it’s easier than acknowledging that only actual gun safety will protect our kids from mass shootings.

If we want to protect kids, we know the gun-safety measures that will do it. But until we’re ready to act, we’ll be stuck with thoughts, prayers . . . and surveillance tech that doesn’t work.

Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.), a New York City-based civil rights and privacy group, a TED fellow, and a visiting fellow at Yale Law School’s Information Society Project. He can be reached on Twitter at @FoxCahn.

https://www.fastcompany.com/90755634/in-the-wake-of-the-uvalde-shooting-surveillance-tech-is-not-the-answer?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 3y | 01.06.2022, 04:20:46

