The TSA’s facial recognition program could protect privacy rights. Here’s how

The Federal Aviation Administration reauthorization bill passed last month, but one element was missing: a bipartisan amendment that would have paused the TSA’s ongoing facial recognition program over concerns about travelers’ privacy rights. 

There are legitimate reasons for the TSA’s facial recognition technology. Ideally, it would speed the flow of the 2 million daily passengers through TSA checkpoints while improving identity verification. As airports grow more crowded and identity fraud increases, state-of-the-art security protocols are essential.  

But our faces are our most personal and identifiable features—even more so than our names, fingerprints, or even our voices. This makes facial recognition technology a prime target for bad actors. As a privacy professional, it’s my job to advocate to protect sensitive data. And as AI cloning technology improves, and consumers increasingly use their faces as passwords, safeguarding the privacy and security of biometric data is more crucial than ever. 

The TSA’s facial recognition program (currently in use at 84 airports, with plans to expand to over 400) is still in the pilot stages. However, the agency has indicated that this technology may one day become the norm. 

The privacy advocacy community doesn’t have to see this as a failure. Privacy rights and security protections are not mutually exclusive. The TSA can adopt this program while also protecting the privacy rights of travelers. But the agency must take a page out of the privacy best practices handbook—implementing strict data governance protocols and oversight, and offering more reasonable and accessible opportunities to opt out of the program. Here’s what that would look like. 

Minimizing data collection and adopting strict data governance protocols

The TSA says it will not store the biometric data collected and used for identity verification at security checkpoints. One exception is data shared with and used by the TSA’s parent agency, the Department of Homeland Security (DHS), to test the efficacy of its facial recognition technology. 

The DHS said in January that it has protocols in place defining what biometric data the TSA can share for testing, how that data is securely transferred, and who can access and use it. It also said that any data saved for evaluation is subject to strict retention periods, after which the department will delete it. These are all promising signs that the TSA takes biometric data collection seriously. 

However, the DHS does have a history of security lapses. In 2018, a data breach exposed the personally identifiable information (PII) of over 240,000 current and former employees. In 2019, hackers stole thousands of traveler photos. I’m hopeful that the DHS learned from these incidents and improved its data security protocols. Yet opting not to store traveler information at all—a practice known as data minimization in the privacy world—remains the safest option. 

Providing clear and fair opt-out opportunities

I recently traveled through Amsterdam’s Schiphol Airport, which uses biometric data verification at passport checkpoints. While a large notice informed me of my privacy rights, I was forcefully herded into the biometric data verification line. When I visited the website listed on the airport signage to learn more, I found only a generic overview of my GDPR rights. A security agent even reprimanded me for taking a photograph of the privacy notice. I left with the sense that while I theoretically had the option to opt out, doing so would be a major inconvenience.

The TSA has an opportunity to make its consent practices outstanding in comparison. For example: 

  • Adding clear and abundant signage both in person and on the TSA’s website that describes the program and travelers’ right to opt out 
  • Making airport staff available to answer clarifying questions and provide translated documents  
  • Creating separate, clearly marked, and well-staffed lines for travelers choosing to participate or opt out 
  • Providing travelers an opportunity to revoke consent without facing consequences

When the TSA designs an appropriate system for opting out, the agency must also consider seniors (who may be unfamiliar with what biometric data collection entails), minorities (who are at greater risk of misidentification by AI systems), and travelers with limited English proficiency (who may struggle to understand checkpoint signage). 

Routine audits and public oversight 

Even the most robust data governance programs need routine audits and third-party oversight. The TSA should conduct routine privacy impact assessments (PIAs) to examine the risks of data collection and storage, and verified third parties—like a public oversight committee—should review these PIAs. Privacy rights groups like the Electronic Privacy Information Center (EPIC) and the Center for Democracy & Technology (CDT), along with the 14 senators who raised concerns about the TSA program, are all excellent candidates for such a committee. An oversight committee would also help build trust in the program so more travelers feel comfortable opting in.

Wide adoption of facial recognition technology is inevitable, but that doesn’t mean we have to be complacent about how organizations collect and govern our biometric data. The rollout of this facial recognition program will set a baseline standard for how other federal agencies adopt and govern similar tools. If privacy-protective best practices are the norm from the very beginning, future iterations of this technology will face the same set of checks and balances. 

The TSA has the opportunity to set a new standard for America’s security. Will it take it? 

https://www.fastcompany.com/91142081/the-tsas-facial-recognition-program-could-protect-privacy-rights-heres-how

Published June 20, 2024
