Google is indexing ChatGPT conversations, potentially exposing sensitive user data

Google is indexing conversations with ChatGPT that users have sent to friends, family members, or colleagues—turning private exchanges intended for small groups into search results visible to millions.

A basic Google site search using part of the link created when someone proactively clicks “Share” on ChatGPT can uncover conversations where people reveal deeply personal details, including struggles with addiction, experiences of physical abuse, or serious mental health issues—sometimes even fears that AI models are spying on them. While the users’ identities aren’t shown by ChatGPT, some potentially identify themselves by sharing highly specific personal information during the chats.

A user might click “Share” to send their conversation to a close friend over WhatsApp or to save the URL for future reference. It’s unlikely they would expect that doing so could make the chat appear in Google search results, accessible to anyone. It’s unclear whether those affected realize that clicking the Share button has made their conversations with the bot publicly accessible; presumably they think they’re sharing with only a small audience.

Nearly 4,500 conversations come up in results for the Google site search, though many don’t include personal details or identifying information. This is likely not the full count, as Google may not index all conversations. (Because of the personal nature of the conversations, some of which divulge highly personal information including users’ names, locations, and personal circumstances, Fast Company is choosing not to link to, or describe in significant detail, the conversations with the chatbot.)

The finding is particularly concerning given that nearly half of Americans surveyed say they’ve used large language models for psychological support in the last year. Three-quarters of respondents sought help with anxiety, two in three looked for advice on personal issues, and nearly six in 10 wanted help with depression. But unlike conversations with a real-life therapist, transcripts of chats with the likes of ChatGPT can turn up in a simple Google search.

Google indexes any content available on the open web, and site owners can remove pages from search results. ChatGPT’s shared links are not intended to appear in search by default and must be manually made discoverable by users, who are also warned not to share sensitive information and can delete shared links at any time. (Both Google and OpenAI declined Fast Company’s requests for comment.)

One user described in detail their sex life and unhappiness living in a foreign country, claiming they were suffering from post-traumatic stress disorder (PTSD) and seeking support. They went into precise detail about their family history and their relationships with friends and family members.

Another conversation discusses the prevalence of psychopathic behaviors in children and the age at which they can appear, while another user discloses that they are a survivor of psychological programming and are looking to deprogram themselves to mitigate the trauma they experienced.

“I’m just shocked,” says Carissa Veliz, an AI ethicist at the University of Oxford. “As a privacy scholar, I’m very aware that that data is not private, but of course, ‘not private’ can mean many things, and that Google is logging in these extremely sensitive conversations is just astonishing.”

Similar concerns have been raised about competing chatbots, including those run by Meta, which began sharing user queries with its AI systems in a public feed visible within its AI apps. Critics said at the time that user literacy was not high enough to recognize the dangers of sharing private information publicly—something that later proved correct as personal details surfaced on the social feed. Online safety experts also highlighted the gap between how users think app features work and how the companies running the apps actually make them work.

Veliz fears this is an indication of the approach big tech will take to privacy. “It’s also further confirmation that this company, OpenAI, is not trustworthy, that they don’t take privacy seriously, no matter what they say,” she says. “What matters is what they do.”

OpenAI CEO Sam Altman warned earlier this month that users shouldn’t share their most personal details with ChatGPT because OpenAI “could be required to produce that” if legally compelled to do so by a court. “I think that’s very screwed up,” he added. The conversation, with podcaster Theo Von, didn’t address users willingly opening up their conversations to indexing by OpenAI.

“People expect they can use tools like ChatGPT completely privately,” says Rachel Tobac, a cybersecurity analyst and CEO of SocialProof Security, “but the reality is that many users aren’t fully grasping that these platforms have features that could unintentionally leak their most private questions, stories, and fears.”

https://www.fastcompany.com/91376687/google-indexing-chatgpt-conversations?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 12d | Jul 30, 2025, 20:30:04

