Why AI-powered hiring may create legal headaches

Even as AI becomes a common workplace tool, its use in hiring raises serious concerns that employers can’t afford to ignore.

Recent research suggests companies are being overwhelmed by AI-generated résumés. LinkedIn reports 11,000 applications per minute submitted through its platform, a 45% increase over the past year. The temptation for hiring managers to rely on off-the-shelf generative AI tools like ChatGPT is strong, but a new study published on Cornell University’s preprint server arXiv warns that doing so could open companies to claims of bias if a rejected candidate challenges the decision.

The study evaluated several state-of-the-art large language models (LLMs) from tech giants including OpenAI, Anthropic, Google, and Meta, analyzing both their predictive accuracy and fairness using impact ratio analysis across declared gender, race, and intersectional subgroups. These AI systems were tested on around 10,000 real-world job applications, revealing that the off-the-shelf tools most businesses would likely use to sift through résumés show significant bias.

While some LLMs, such as GPT-4o, showed near-perfect gender parity in candidate assessments, they demonstrated racial bias. When both gender and race were considered together, none of the models succeeded in achieving fair hiring outcomes, according to the researchers’ own evaluations. (The researchers did not respond to Fast Company’s requests for comment.)

The models’ impact ratios, a metric that gauges disparate impact between groups and is central to fair hiring practices, fell as low as 0.809 for race and 0.773 for intersectional groups. Both figures sit well below perfect parity (1.0), and the intersectional figure falls under the 0.8 “four-fifths” benchmark commonly used in U.S. employment law to flag potential disparate impact.
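The impact-ratio calculation behind these figures is straightforward: divide each group’s selection rate by the rate of the most-selected group, then compare the result against the 0.8 four-fifths benchmark. A minimal sketch (the group names and rates here are illustrative, not taken from the study):

```python
def impact_ratios(selection_rates: dict[str, float]) -> dict[str, float]:
    """Impact ratio: each group's selection rate divided by the
    highest group's selection rate. Values below 0.8 are commonly
    treated as evidence of potential disparate impact (the
    'four-fifths rule')."""
    top = max(selection_rates.values())
    return {group: rate / top for group, rate in selection_rates.items()}

# Illustrative numbers: selection rate = candidates advanced
# divided by candidates screened, computed per group.
rates = {"group_a": 0.5, "group_b": 0.4}
ratios = impact_ratios(rates)
flagged = [g for g, r in ratios.items() if r < 0.8]
```

On these made-up numbers, group_b’s ratio is exactly 0.8, so it sits right at the threshold rather than below it; the study’s reported 0.773 for intersectional groups would be flagged.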

The findings offer little comfort to those who study organizational behavior and workplace dynamics. “The jobs market is chilly enough at the moment, so inflicting too much inhuman AI on job seekers seems like a cruel blow,” says Stefan Stern, visiting professor in management practice at Bayes Business School. (Stern was not involved in the study.) “There is a case for efficiency but there should also be humanity, especially if we are still interested in hiring human beings.”

Beyond legal risk, relying on AI in hiring can also alienate successful applicants, fostering a sense of distrust that can hurt the organization in the long run. Stern argues that candidates might reconsider joining a company that uses AI to screen them. “Why work for a firm that isn’t interested enough in you to get a fellow human to interview and assess you?” he asks.

In a world where artificial intelligence is becoming the norm, Stern believes that emotional intelligence—thoughtfully applied by hiring managers and leadership—can significantly improve employee well-being and retention. It can also shape a company’s culture and business practices moving forward.

“Too much heavy-handed use of AI would be a ‘red flag’ to me as a job hunter,” he says. “I want to work for and with other humans, not for and with machines.”

https://www.fastcompany.com/91365346/bosses-think-twice-before-letting-ai-make-hiring-decisions

Created 2 months ago | July 11, 2025, 12:50:02

