Why Tinder’s background check is a major backfire

It’s a pattern we’ve seen too many times: Real-world violence migrates online, and after protracted denial, the platforms that profit from abuse promise to save us. Their “solutions” are convenient (and conveniently profitable) but only make matters worse. The latest example in the news is particularly wrenching: Tinder’s flailing effort to address the all-too-real threat of intimate partner violence with unproven, error-prone background checks. Not only will the company’s new surveillance software fail to keep users safe, it will put even more of us at risk.

It’s hard to capture every way that Tinder’s plan is poised to fail. First, background checks don’t work, at least not at telling you who is likely to abuse. Background checks are great at telling you about drug use and financial difficulties, particularly when users are Black, Brown, or from other over-policed communities. Want to know if someone has a history of marijuana use? Tinder is here to help. But if you want to know whether someone is likely to commit an act of intimate partner violence, suddenly the data doesn’t look so good. That’s because the vast majority of abusers are never charged for their violence, and when they are, the charges fit the same pattern of discrimination that defines every dimension of American policing. Not only are white abusers more likely to go free, but BIPOC survivors are often arrested alongside their abusers. So if you are a wealthy white abuser, statistically, Tinder will give you a green light nearly every time.

Even worse, using these faulty checks re-victimizes survivors of intimate partner violence, placing the burden of preventing attacks on them. Soon, users who don’t run the free trial check, or who exhaust the trial and are unable to pay, could be blamed for failing to predict their own attack. This will become the latest justification for police, university officials, and others in positions of power to ignore survivors, silence their complaints, and deny them support. AI may be ineffective at preventing crime, but it is very effective at preserving the status quo.

Background checks ignore the lived experience of survivors, buying into the outdated narrative that the cycle of abuse could somehow have been stopped if only people had known their partner had a criminal history from the start. Just as damning, the system relies on the broken logic of broken-windows policing and the belief that someone convicted of a crime should be completely exiled from society, their whole life discarded. Tinder’s digital chivalry is just as antiquated as the analog patriarchs of the past.

These performative protections ignore the reality that Tinder is built on a business model that puts survivors at risk. Tinder and other dating apps are fueled by our most intimate data and most intimate moments. That information is a constant stream of data for advertisers and anyone else willing to pay. And that data is increasingly accessible, not just to those who want to track us around the internet with ads, but to abusers who wish to track us to our businesses and homes. Tinder can’t claim credit for keeping users safe when it continues to sell the very data that puts them at risk. For just a few hundred dollars, anyone could purchase datasets that include Tinder users’ location history. They could reconstruct users’ movements and romantic lives . . . and worse. It’s a niche market for the tech-savvy stalker.

But the threat to Tinder users doesn’t come only from those who break the law; it also comes from those who claim to enforce it.
Increasingly, law enforcement agencies are purchasing the very same location data as advertisers. It’s scary enough when your local police department can buy a record of every place you’ve been; it’s outright terrifying when ICE can do the same.

If Tinder truly wanted to protect its users, it wouldn’t invest in this new, misguided form of user surveillance; it would stop enabling surveillance of its users. And if it wanted to help those targeted by abusers both on and off its platform, it would invest in the countless community-based groups that provide survivors what they need most: low-tech resources like a safe place to sleep when escaping an abuser.

Yes, technology has made this problem worse, putting people in harm’s way, but the solution isn’t more unproven, discriminatory technology. Instead, the solution is to listen to survivors. Real protection prioritizes their expressed needs, like supporting financial independence, data security, and the groups that fight for and with survivors of intimate partner violence every day.

Albert Fox Cahn is the founder and executive director of the Surveillance Technology Oversight Project (S.T.O.P.), a New York-based civil rights and privacy group, and a visiting fellow at Yale Law School’s Information Society Project. Sarah Roth is an advocacy and communications intern at S.T.O.P., a recent graduate of Vassar College, and a prospective JD candidate.

https://www.fastcompany.com/90746616/why-tinders-background-check-is-a-major-backfire

Created: Apr 30, 2022, 08:20:53

