The TSA’s facial recognition program could protect privacy rights. Here’s how

The FAA Reauthorization Act passed last month, but one element was missing: a bipartisan amendment that would have paused the TSA's ongoing facial recognition program over concerns about travelers' privacy rights.

There are legitimate reasons to implement the TSA's facial recognition technology. Ideally, it would speed up the flow of the 2 million passengers who travel through TSA checkpoints each day while improving identity verification. As airports grow more crowded and identity fraud increases, implementing state-of-the-art security protocols is essential.

But our faces are our most personal and identifiable features, more so than our names, fingerprints, or even our voices. This makes facial recognition technology a prime target for bad actors. As a privacy professional, it's my job to advocate for the protection of sensitive data. And as AI cloning technology improves and consumers increasingly use their faces as passwords, safeguarding the privacy and security of biometric data is more crucial than ever.

The TSA's facial recognition program (currently in use at 84 airports, with plans to expand to over 400) is still in its pilot stages. However, the agency has indicated that this technology may one day be the norm.

The privacy advocacy community doesn’t have to see this as a failure. Privacy rights and security protections are not mutually exclusive. The TSA can adopt this program while also protecting the privacy rights of travelers. But the agency must take a page out of the privacy best practices handbook—implementing strict data governance protocols and oversight, and offering more reasonable and accessible opportunities to opt out of the program. Here’s what that would look like. 

Minimizing data collection and adopting strict data governance protocols

The TSA says it will not store the biometric data collected and used for identity verification at security checkpoints. One exception is data shared with the TSA's parent agency, the Department of Homeland Security (DHS), to test the efficacy of its facial recognition technology.

The DHS said in January that it has protocols in place to define what biometric data the TSA can share for testing, how it is securely transferred, as well as who can access and use it. It also said that any data saved for evaluation has strict retention periods, after which the department will delete it. These are all promising signs that the TSA takes biometric data collection seriously. 

However, the DHS does have a history of security lapses. In 2018, a data breach exposed the personally identifiable information (PII) of over 240,000 current and former employees. In 2019, hackers stole thousands of traveler photos. I’m hopeful that the DHS learned from these incidents and improved its data security protocols. Yet opting not to store traveler information at all—a practice known as data minimization in the privacy world—remains the safest option. 

Providing clear and fair opt-out opportunities

I recently traveled through Amsterdam's Schiphol Airport, which uses biometric data verification at passport checkpoints. While a large notice informed me of my privacy rights, I was forcefully herded into the biometric verification line. When I visited the website listed on the airport signage to learn more, I found only a generic overview of my GDPR rights. A security agent even reprimanded me for photographing the privacy notice. I left with the sense that while I theoretically had the option to opt out, doing so would be a major inconvenience.

The TSA has an opportunity to make its consent practices outstanding in comparison. For example: 

  • Adding clear and abundant signage both in person and on the TSA’s website that describes the program and travelers’ right to opt out 
  • Making airport staff available to answer clarifying questions, or provide translated documents  
  • Creating separate, clearly marked, and well-staffed lines for travelers choosing to participate or opt out 
  • Providing travelers an opportunity to revoke consent without facing consequences

When the TSA designs an appropriate system for opting out, the agency must also consider seniors (who may be unfamiliar with what biometric data collection entails), minorities (who are at greater risk of misidentification by AI systems), and travelers with limited English proficiency (who may struggle to understand checkpoint signage).

Routine audits and public oversight 

Even the most robust data governance programs need routine audits and third-party oversight. The TSA should conduct routine privacy impact assessments (PIAs) to examine the risks of data collection and storage, and verified third parties, such as a public oversight committee, should review these PIAs. Privacy rights groups like EPIC and the CDT, along with the 14 senators who have raised concerns about the TSA program, are all excellent candidates for such a committee. An oversight committee would also help build trust in the program, so more travelers feel comfortable opting in.

Wide adoption of facial recognition technology is inevitable, but that doesn’t mean we have to be complacent about how organizations collect and govern our biometric data. The rollout of this facial recognition program will set a baseline standard for how other federal agencies adopt and govern similar tools. If privacy-protective best practices are the norm from the very beginning, future iterations of this technology will face the same set of checks and balances. 

The TSA has the opportunity to set a new standard for America’s security. Will they take it? 

https://www.fastcompany.com/91142081/the-tsas-facial-recognition-program-could-protect-privacy-rights-heres-how

Created: June 20, 2024
