After the Trump assassination attempt, Elon Musk’s X is a misinformation factory

The moments after Donald Trump was shot by a would-be assassin in Pennsylvania on July 13 were confusing for those on the ground. For those following the events as they unfolded on X (formerly known as Twitter), things were no clearer, as misinformation flooded the platform.

Despite X owner Elon Musk and CEO Linda Yaccarino’s claims that X would be a nimbler, more reliable source of information during breaking news events, it was neither. The platform’s AI-powered summaries of trending posts—a feature introduced after Musk laid off staff who handled misinformation—were a mess. Among the AI-generated summaries that appeared on people’s X feeds: speculation that the person shot wasn’t Trump but Vice President Kamala Harris, and that an actor from the movie Home Alone 2 had been shot in the melee. (Both summaries may have been grounded in real-life events: Trump appeared in Home Alone 2, and Joe Biden had recently misspoken, calling Harris “Vice President Trump.”)

Handling a fast-moving situation like the one that unfolded over the weekend is a difficult task at the best of times. But on a social media platform that saw the bulk of its staff cut loose after Elon Musk took over in 2022, things have gotten even harder. “A situation like [the assassination attempt] is a nightmare for content moderation on social media,” says Melissa Ingle, a former senior data scientist at the company who was fired when Musk took over. “In many ways it’s the perfect storm—high visibility, politically charged, and with potentially dire real-world consequences.”

Making matters worse, Musk’s decision last year to end free access to X’s API—which lets third-party developers gather data and was a critical tool for researchers studying misinformation online—means it’s harder than ever for academic experts to get a high-level look at how and why bunk stories travel online. “Mainstream platforms have rolled back trust and safety teams and made it harder for researchers to access data,” says Joe Bodnar, an independent disinformation analyst. “At the same time, politics have become increasingly hostile. It’s a bad mix.”

While X came in for the most criticism for how it handled the raft of speculation following the shooting, including falling for trolls impersonating the shooter and sharing AI-altered images showing a Secret Service agent smiling next to a bloodied Trump, every platform had its own failings, says Bodnar. “It’s obviously not easy for platforms to surface credible information when there isn’t any,” he says. Yet despite that, some platforms fared better than others. Meta-owned Facebook seemed largely to highlight links to traditional news websites. Meta’s other social network, Threads, has a policy to limit the spread of any political content, though Ingle says the Trump assassination attempt was “the first time I saw political information break containment [on Threads] with misinformation and speculation rampant.”

But X still stood out from its rival mainstream networks, Bodnar says, as it seemingly “acted as [a megaphone] for conspiracies.” And the misinformation started from the top at X: Musk himself seemed to hint, without evidence, that there may have been a “deliberate” failure by the Secret Service to stop the shooter.

According to a NewsGuard analysis, there were 308,000 mentions of the word “staged” on X between Saturday and Sunday, which marks a 3,924% increase over the previous two-day period. In addition, there were 83,000 mentions on X of the phrase “inside job”—a 3,228% increase over that same time span.

By late Sunday, X’s algorithm was still pushing conspiracies into people’s feeds, including a likely false claim, emanating from the online message board 4chan, that a police sniper was not granted permission to shoot at the would-be assassin, Thomas Matthew Crooks, until after Crooks had fired at Trump.

That one post highlights the larger challenge social media platforms face when confronted with major news events like this: the drive to automate humans out of the process comes with costs. “No content moderation system could fully contain the spread of misinformation when an event like this occurs, but X appeared completely overwhelmed,” says Ingle. “This showcases the weakness of their minimal strategy to contain misinfo, which amounts to mostly AI and a small amount of human moderation.”

Update, July 15, 2024: This article has been updated with information from a NewsGuard analysis.

https://www.fastcompany.com/91156193/misinformation-on-x-following-trump-assassination-attempt
