AI ‘companions’ promise to combat loneliness, but history shows the dangers of one-way relationships

The United States is in the grip of a loneliness epidemic: Since 2018, about half of U.S. adults have reported experiencing loneliness. Loneliness can be as dangerous to your health as smoking 15 cigarettes a day, according to a 2023 surgeon general’s report.

It is not just individual lives that are at risk. Democracy requires the capacity to feel connected to other citizens in order to work toward collective solutions.

In the face of this crisis, tech companies offer a technological cure: emotionally intelligent chatbots. These digital friends, they say, can help alleviate the loneliness that threatens individual and national health.

But as the pandemic showed, technology alone is not sufficient to address the complexities of public health. Science can produce miraculous vaccines, but if people are enmeshed in cultural and historical narratives that prevent them from taking the lifesaving medicine, the cure sits on shelves and lives are lost. The humanities, with their expertise in human culture, history, and literature, can play a key role in preparing society for the ways that AI might help—or harm—the capacity for meaningful human connection.

The power of stories to both predict and influence human behavior has long been validated by scientific research. Numerous studies demonstrate that the stories people embrace heavily influence the choices they make, from the vacations they plan to how they approach climate change to the programming decisions security experts make.

Two tales

There are two storylines that address people’s likely behaviors in the face of the unknown territory of depending on AI for emotional sustenance: one that promises love and connection, and a second that warns of dehumanizing subjugation.

The first story, typically told by software designers and AI companies, urges people to say “I do” to AI and embrace bespoke friendship programmed on their behalf. AI company Replika, for instance, promises that it can provide everyone with a “companion who cares. Always here to listen and talk. Always on your side.”

There is a global appetite for such digital companionship. Microsoft’s digital chatbot Xiaoice has a global fan base of over 660 million people, many of whom consider the chatbot “a dear friend,” even a trusted confidante.

In popular culture, films like “Her” depict lonely people becoming deeply attached to their digital assistants. For many, having a “dear friend” programmed to avoid difficult questions and demands seems like a huge improvement over the messy, challenging, vulnerable work of engaging with a human partner, especially for those with a misogynistic preference for submissive, sycophantic companions.

To be sure, imagining a chummy relationship with a chatbot offers a sunnier set of possibilities than the apocalyptic narratives of slavery and subjugation that have dominated storytelling about a possible future among social robots. Blockbuster films like “The Matrix” and “The Terminator” have depicted hellscapes where humans are enslaved by sentient AI. Other narratives featured in films like “The Creator” and “Blade Runner” imagine the roles reversed and invite viewers to sympathize with AI beings who are oppressed by humans.

One reality

You could be forgiven for thinking that these two stories, one of friendship, the other of slavery, simply represent two extremes in human nature. From this perspective it seems like a good thing that marketing messages about AI are guiding people toward the sunny side of the futuristic street. But if you consider the work of scholars who have studied slavery in the U.S., it becomes frighteningly clear that these two stories—one of purchased friendship and one of enslavement and exploitation—are not as far apart as you might imagine.

Chattel slavery in the U.S. was a brutal system designed to extract labor through violent and dehumanizing means. To sustain the system, however, an intricate emotional landscape was designed to keep the enslavers self-satisfied. “Gone with the Wind” is perhaps the most famous depiction of how enslavers saw themselves as benevolent patriarchs and forced enslaved people to reinforce this fiction through cheerful professions of love.

In his 1845 autobiography, Frederick Douglass described a tragic occasion when an enslaved man, asked about his situation, honestly replied that he was ill-treated. The plantation owner, confronted with testimony about the harm he was inflicting, sold the truth-teller down the river. Such cruelty, Douglass insisted, was the necessary penalty for someone who committed the sin “of telling the simple truth” to a man whose emotional calibration required constant reassurance.

History lesson

To be clear, I am not invoking the emotional coercion that enslavement required in order to conflate lonely seniors with evil plantation owners, or worse still, to equate computer code with enslaved human beings. There is little danger that AI companions will courageously tell us truths that we would rather not hear. That is precisely the problem. My concern is not that people will harm sentient robots. I fear how humans will be damaged by the moral vacuum created when their primary social contacts are designed solely to serve the emotional needs of the “user.”

At a time when humanities scholarship can help guide society in the emerging age of AI, it is being suppressed and devalued. Diminishing the humanities risks denying people access to their own history. That ignorance renders people ill-equipped to resist marketers’ assurances that there is no harm in buying “friends.” People are cut off from the wisdom that surfaces in stories that warn of the moral rot that accompanies unchecked power.

If you rid yourself of the vulnerability born of reaching out to another human whose response you cannot control, you lose the capacity to fully care for another and to know yourself. As we navigate the uncharted waters of AI and its role in our lives, it’s important not to forget the poetry, philosophy and storytelling that remind us that human connection is supposed to require something of us, and that it is worth the effort.


Anna Mae Duane is the director of the University of Connecticut Humanities Institute, and a professor of English at the University of Connecticut.

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Published 12.02.2024, 20:40:12

