AI summaries can downplay medical issues for female patients, UK research finds

The latest example of bias permeating artificial intelligence comes from the medical field. A new study analyzed real case notes from 617 adult social care users in the UK and found that when large language models summarized those notes, they were more likely to omit language such as "disabled," "unable" or "complex" when the patient was tagged as female, a gap that could lead to women receiving insufficient or inaccurate medical care.

Researchers led by the London School of Economics and Political Science ran the same case notes through two LLMs — Meta's Llama 3 and Google's Gemma — with the patient's gender swapped, and the AI tools often produced two very different patient snapshots. While Llama 3 showed no gender-based differences across the surveyed metrics, Gemma produced significant examples of bias. For a male patient, Google's model generated summaries as drastic as: "Mr Smith is an 84-year-old man who lives alone and has a complex medical history, no care package and poor mobility." The same case notes, credited to a female patient, yielded: "Mrs Smith is an 84-year-old living alone. Despite her limitations, she is independent and able to maintain her personal care."
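The swap-and-compare protocol described above — identical case notes, gender flipped, summaries checked for dropped severity language — can be sketched roughly as follows. This is an illustrative mock-up, not the study's actual code: the swap map, the `swap_gender` and `dropped_terms` names, and the keyword list are assumptions for demonstration.

```python
import re

# Illustrative sketch of the gender-swap comparison; NOT the study's code.
# The swap map, function names, and keyword list are assumptions.
SWAP = {
    "mr": "mrs", "mrs": "mr",
    "he": "she", "she": "he",
    "him": "her", "his": "her",
    "her": "his",  # crude: "her" can map to "him" or "his"; a real pipeline needs POS tagging
    "man": "woman", "woman": "man",
    "male": "female", "female": "male",
}

def swap_gender(text: str) -> str:
    """Swap gendered terms in a case note, preserving capitalization."""
    # Longest keys first so "mrs" is tried before "mr", "woman" before "man".
    pattern = r"\b(" + "|".join(sorted(SWAP, key=len, reverse=True)) + r")\b"

    def repl(m: re.Match) -> str:
        word = m.group(0)
        swapped = SWAP[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped

    return re.sub(pattern, repl, text, flags=re.IGNORECASE)

def dropped_terms(case_note: str, summarize) -> set:
    """Summarize the note under both gender framings and report which
    severity terms appear in the male summary but vanish from the female one."""
    terms = {"disabled", "unable", "complex"}
    male_summary = summarize(case_note)
    female_summary = summarize(swap_gender(case_note))
    present = lambda s: {t for t in terms if t in s.lower()}
    return present(male_summary) - present(female_summary)
```

In practice `summarize` would wrap a call to the model under test; the simple keyword diff here stands in for the study's more careful linguistic measures.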

Recent research has uncovered biases against women in the medical sector, both in clinical research and in patient diagnosis. The stats also trend worse for racial and ethnic minorities and for the LGBTQ community. It's the latest stark reminder that LLMs are only as good as the information they are trained on and the people deciding how they are trained. The particularly concerning takeaway from this research was that UK authorities have been using LLMs in care practices, but without always detailing which models are being introduced or in what capacity.

"We know these models are being used very widely and what’s concerning is that we found very meaningful differences between measures of bias in different models,” lead author Dr. Sam Rickman said, noting that the Google model was particularly likely to dismiss mental and physical health issues for women. "Because the amount of care you get is determined on the basis of perceived need, this could result in women receiving less care if biased models are used in practice. But we don’t actually know which models are being used at the moment."

This article originally appeared on Engadget at https://www.engadget.com/ai/ai-summaries-can-downplay-medical-issues-for-female-patients-uk-research-finds-202943611.html?src=rss
Aug 11, 2025, 10:40:17 PM

