Katie Couric, Rashad Robinson, and Chris Krebs say it’s time to pull immunity for social media platforms

Misinformation hit a crescendo during the pandemic, sowing distrust in COVID-19 vaccines and inciting riots at the Capitol. Now a coalition of experts on misinformation and disinformation is making a specific set of recommendations to lawmakers on how to fix the issue, and Big Tech might not be so happy. Most notably, the proposal calls for changes to Section 230, the controversial part of the 1996 Communications Decency Act that protects online platforms from getting sued over user-generated content.

The Aspen Institute, a research center and think tank, brought together a who’s who commission of experts on disinformation to illuminate the problem and offer strategic steps to address it. The commission’s chairs include journalist Katie Couric; civil rights leader and president of Color of Change Rashad Robinson; and Chris Krebs, the former director of the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency.

Spread via the internet, disinformation and its close cousin misinformation have contributed to a series of public harms over the past decade, including interference in the 2016 U.S. elections, disruption of pandemic-related public health efforts, fomentation of genocide in Myanmar, and the January 6 siege of the Capitol. Disinformation, intentionally misleading information, is engineered to go viral, taking advantage of social media algorithms that favor outrageous perspectives. Misinformation, false information with no clear intent to deceive, similarly keeps slipping past social media companies’ efforts to curtail it.

Last month, Facebook whistleblower Frances Haugen helped explain why those mitigation strategies fail. The former Facebook product manager and member of the company’s civic integrity group called out the social network for misleading the public about how much it actually does to protect users from harmful content.
“The thing I saw at Facebook, over and over again, was there were conflicts of interest between what was good for the public and what was good for Facebook,” Haugen told 60 Minutes. “And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

For years, Facebook has ducked responsibility for content on its platform, assuring regulators and the public that it is doing its best to balance free speech with reining in misinformation and speech that incites violence and hate. Haugen’s account suggests that is not the full picture.

Social media companies have not yet been held to account, shielded by Section 230. Legislators have threatened to change the law (rhetoric that reached a fever pitch after the Capitol riots), but so far haven’t touched it. The Aspen Institute’s Commission on Information Disorder Final Report suggests removing this immunity for content that advertisers have paid to promote, as well as any content that has gone viral because of a platform’s recommendation algorithms. The commissioners also note that while free speech is a constitutional right, private platforms are not the public square, and companies have the right to restrict speech.

The commission’s recommendations are thorough, going much further than simply suggesting amendments to Section 230. The report faults the federal government for failing to understand the issues and create meaningful rules that protect the public. (“Congress…remains woefully under-informed about the titanic changes transforming modern life,” the authors write.) The commission also notes that despite Big Tech’s pleas to be regulated, industry leaders have “outsized influence in shaping legislative priorities favorable to its interests.” To guide future legislative efforts, the commission suggests the government force social media platforms to be more transparent through data audits.
One of the biggest hurdles to understanding both the effects of disinformation and the magnitude of the problem has been a lack of cooperation from the platforms themselves. Researchers often struggle to get the depth of information they need. (Facebook has been known to outright ban researchers who attempt to get this information without the company’s explicit participation.)

The report says social platforms should be required to disclose “categories of private data to qualified academic researchers, so long as that research respects user privacy, does not endanger platform integrity, and remains in the public interest.” It also says there should be federal protections for researchers and journalists who investigate social platforms in the public’s interest, even if they violate the platforms’ terms and conditions in the process. It suggests that Congress require social media companies to publish transparency reports that include content, source accounts, reach, and impression data for posts that reach large audiences, and to offer regular disclosures of key data points about digital ads and paid posts that run on their platforms. And it calls for clear content moderation practices, as well as an archive of moderated content that researchers can access.

In addition to these transparency measures, the commission asks the federal government to establish a strategic approach to countering misinformation and disinformation, and to create an independent organization devoted to developing well-informed countermeasures. This could include efforts to educate the public on misinformation and how to discern between fact and propaganda online.

Finally, the report calls for investment in local newsrooms and diversity measures, both in newsrooms and at social media companies. To support newsrooms, the report points to the creation of a digital advertising tax, much like the one Maryland passed.
The report says some of those proceeds should go toward struggling local newsrooms to bolster reputable reporting, and it suggests incentivizing donations to local news operations through tax credits.

The report also recommends that platforms hire diverse workforces to ensure that a broad spectrum of experiences is considered when companies design rules and content mitigation strategies. Rashad Robinson, president of Color of Change and one of the report’s commissioners, says the government could play a role here. “Diversity should be part of how the government evaluates these companies,” especially their efforts to protect users, he says. Robinson has worked for years on civil rights issues related to the web and has spent a fair amount of time talking to regulators. “These are recommendations that I fundamentally believe are actionable,” he says.

https://www.fastcompany.com/90696655/katie-couric-rashad-robinson-and-chris-krebs-say-its-time-to-pull-immunity-for-social-media-platforms?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 4y | Nov 15, 2021, 11:21:32 AM
