YouTube Shorts algorithm steers users away from political content, study finds

YouTube Shorts, the short-form video platform from Google-owned video giant YouTube, has seen massive success since its launch in September 2020. Today, an estimated 1% of all waking human hours are spent watching Shorts, with videos amassing around 200 billion views daily.

But what users watch is ultimately shaped by YouTube’s algorithm, and a new study posted to arXiv, the Cornell University preprint server, suggests that the algorithm nudges viewers away from politically sensitive content.

“When you start [watching] a political topic or specific political topics, YouTube is trying to push you away to more entertainment videos, more funny videos, especially in YouTube Shorts,” says Mert Can Cakmak, a researcher at the University of Arkansas at Little Rock and one of the study’s authors.

Cakmak and his colleagues scraped between 2,100 and 2,800 initial videos across three themes: the South China Sea dispute, Taiwan’s 2024 election, and a broader “general” category. They then followed 50 successive recommendations for each video under three viewing scenarios, which varied how long a simulated user watched: 3 seconds, 15 seconds, or the full video.
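The chain-following procedure is simple to express in code. The sketch below is a minimal illustration of the method as the paper describes it, not the authors’ actual crawler; fetch_next_short is a hypothetical placeholder for the data-collection step, which in practice would require browser automation to play a Short and scrape whatever YouTube serves next.

```python
import time
from dataclasses import dataclass

# Watch-duration scenarios described in the study.
SCENARIOS = {"skim": 3, "partial": 15, "full": None}  # None = watch to the end
CHAIN_LENGTH = 50  # recommendations followed per seed video


@dataclass
class Short:
    video_id: str
    title: str
    duration_s: float


def fetch_next_short(current: Short, watch_seconds: float) -> Short:
    """Hypothetical placeholder: in practice this requires browser automation
    (e.g., Selenium or Playwright) to play `current` for the given time and
    scrape the next Short that YouTube recommends."""
    raise NotImplementedError


def follow_chain(seed: Short, scenario: str) -> list[Short]:
    """Simulate one viewing session: watch each Short for the scenario's
    duration, then record whatever the algorithm recommends next."""
    watch = SCENARIOS[scenario]
    chain = [seed]
    for _ in range(CHAIN_LENGTH):
        dwell = watch if watch is not None else chain[-1].duration_s
        time.sleep(dwell)  # simulated watch time
        chain.append(fetch_next_short(chain[-1], dwell))
    return chain
```

Running follow_chain for every seed video under all three scenarios yields the recommendation trails the researchers analyzed.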

The researchers tracked how YouTube presented 685,842 Shorts videos. Titles and transcripts were classified by topic, relevance, and emotional tone using OpenAI’s GPT-4o model. When engagement began with politically sensitive themes like the South China Sea or Taiwan’s 2024 election, the algorithm quickly steered users toward more entertainment-focused content. The emotional tone, as assessed by AI, also shifted—moving from neutral or angry to mostly joyful or neutral. Early in the recommendation chain, videos with the highest view counts, likes, and comments were favored, reinforcing a popularity bias.
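The classification step can be sketched with the standard OpenAI Python SDK. This is a minimal illustration of labeling one video’s title and transcript by topic, political relevance, and emotional tone via a GPT-4o chat call; the prompt wording and label set here are assumptions for illustration, not the study’s exact ones.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative emotion labels; the study's exact label set is not given here.
EMOTIONS = ["joy", "anger", "sadness", "fear", "neutral"]


def classify_short(title: str, transcript: str) -> str:
    """Ask GPT-4o to label one Short by topic, relevance, and emotion."""
    prompt = (
        "Classify this YouTube Short.\n"
        f"Title: {title}\n"
        f"Transcript: {transcript}\n"
        "Return JSON with keys: topic, politically_relevant (true/false), "
        f"emotion (one of {EMOTIONS})."
    )
    resp = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # deterministic labels for repeatable analysis
    )
    return resp.choices[0].message.content
```

Applied across all 685,842 videos, labels like these make it possible to chart how topic and tone drift along each recommendation chain.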

“Maybe some people were aware of this, but I’m sure the majority of people are not aware what the algorithm is doing,” Cakmak says. “They are just going and watching.”

Neither YouTube nor its parent company, Google, responded to Fast Company’s request for comment on the study’s findings.

Cakmak doesn’t believe this is a deliberate effort to suppress political discourse, but rather a design choice focused on user engagement. “What YouTube is trying to do,” he says, “[is] remove you from that area or topic, and push you [to a happier] topic so that it can increase . . . engagement [and] earn more money.”


https://www.fastcompany.com/91369323/youtube-shorts-algorithm-steers-users-away-from-political-content?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss
