What Public Discourse Gets Wrong About Social Media Misinformation
In 2006, Facebook launched its News Feed feature, sparking a seemingly endless and contentious public debate about the power of the “social media algorithm” in shaping what people see online.
Nearly two decades and many recommendation algorithm tweaks later, this discourse continues, now laser-focused on whether social media recommendation algorithms are primarily responsible for exposure to online misinformation and extremist content.
Researchers at the Computational Social Science Lab (CSSLab) at the University of Pennsylvania, led by Stevens University Professor Duncan Watts, study Americans’ news consumption. In a new article in Nature, Watts, along with David Rothschild of Microsoft Research, Ceren Budak of the University of Michigan, Brendan Nyhan of Dartmouth College and Emily Thorson of Syracuse University, reviews years of behavioral science research on exposure to false and radical content online. They find that exposure to harmful and false information on social media is minimal for all but the most extreme users, despite a media narrative that claims the opposite.
A broad claim like “it is well known that social media amplifies misinformation and other harmful content,” recently published in The New York Times, might catch people’s attention, but it isn’t supported by empirical evidence, the researchers say.
“The research shows that only a small fraction of people are exposed to false and radical content online,” says Rothschild, “and that it’s personal preferences, not algorithms, that lead people to this content. The people who are exposed to false and radical content are those who seek it out.”
Misleading statistics
Articles debating the pros and cons of social media platforms often use eye-catching statistics to claim that these platforms expose Americans to extraordinary amounts of false and extremist content and subsequently cause societal harm, from polarization to political violence.
However, these statistics are usually presented without context, the researchers say.
For example, in 2017, Facebook reported that content made by Russian trolls from the Internet Research Agency reached as many as 126 million U.S. citizens on the platform before the 2016 presidential election. This number sounds substantial, but in reality, this content accounted for only about 0.004% of what U.S. citizens saw in their Facebook news feeds.
“It’s true that even if misinformation is rare, its impact is large,” Rothschild says. “But we don’t want people to jump to larger conclusions than what the data seems to indicate. Citing these absolute numbers may contribute to misunderstandings about how much of the content on social media is misinformation.”
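To see why context matters, consider a rough back-of-the-envelope calculation in Python. Every quantity below is an assumption chosen purely to illustrate the arithmetic; none of these totals come from Facebook, the Nature article or the researchers.

# Illustrative arithmetic only: a large absolute "reach" figure can still be a
# tiny share of everything users see. All quantities below are assumed.

troll_impressions = 500_000_000        # assumed impressions of troll-made content
us_users = 200_000_000                 # assumed U.S. users on the platform
items_per_user_per_day = 200           # assumed average feed stories seen daily
days = 365                             # assumed one-year window

total_impressions = us_users * items_per_user_per_day * days
share = troll_impressions / total_impressions

print(f"Troll content as a share of all feed impressions: {share:.4%}")
# Roughly 0.0034% under these assumptions -- the same order of magnitude as the
# 0.004% figure cited above, even though the raw reach number sounds enormous.

The point is not the particular totals but the shape of the calculation: dividing a headline-grabbing absolute number by the vastly larger denominator of everything users actually see.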
Another popular narrative in discourse about social media is that platforms’ recommendation algorithms push harmful content onto users who wouldn’t otherwise seek out this type of content.
But researchers have found that recommendation algorithms tend to push users toward more moderate content and that exposure to problematic content is heavily concentrated among a small minority of people who already have extreme views.
“It’s easy to assume that algorithms are the key culprit in amplifying fake news or extremist content,” says Rothschild, “but when we looked at the research, we saw time and time again that algorithms reflect demand and that demand appears to be a bigger issue than algorithms. Algorithms are designed to keep things as simple and safe as possible.”
Social harms
There has been a recent trend of articles suggesting that exposure to false or extremist content on social media is the cause of major societal ills, from polarization to political violence.
“Social media is still relatively new, and it’s easy to correlate social media usage levels with negative social trends of the past two decades,” Rothschild says, “but empirical evidence does not show that social media is to blame for political incivility or polarization.”
The researchers stress that social media is a complex, understudied communication tool and that there is still a lot to learn about its role in society.
“Social media use can be harmful and that is something that needs to be further studied,” Rothschild says. “If we want to understand the true impact of social media on everyday life, we need more data and cooperation from social media platforms.”
To encourage better discourse about social media, the researchers offer four recommendations:
1. Measure exposure and mobilization among extremist fringes.
Platforms and academic researchers should identify metrics that capture exposure to false and extremist content not just for the typical news consumer or social media user but also at the fringes of the distribution. Focusing on tail exposure metrics (see the illustrative sketch after this list) would help hold platforms accountable for creating tools that allow providers of potentially harmful content to engage with and profit from their audience, including monetization, subscriptions and the ability to add members and group followers.
2. Reduce demand for false and extremist content and amplification of it by the media and political elites.
Audience demand, not algorithms, is the most important factor in exposure to false and extremist content. It is therefore essential to determine how to reduce, for instance, the negative gender- and race-related attitudes that are associated with the consumption of content from alternative and extremist YouTube channels. We likewise must consider how to discourage the mainstream press and political elites from amplifying misinformation about topics such as COVID-19 and voter fraud in the 2020 U.S. elections.
3. Increase transparency and conduct experiments to identify causal relationships and mitigate harms.
Social media platforms are increasingly limiting data access, even as greater researcher data and API access is needed to enable researchers outside the platforms to more effectively detect and study problematic content. Platform-scale data are particularly necessary to study the small groups of extremists who are responsible for both the production and consumption of much of this content. When public data cannot be shared due to privacy concerns, social media platforms could follow the ‘clean room’ model used to allow approved researchers to examine, for example, confidential U.S. Census microdata in secure environments. These initiatives should be complemented by academic–industry collaborations on field experiments, which remain the best way to estimate the causal effects of social media, with protections including review by independent institutional review boards and preregistration to ensure that research is conducted ethically and transparently.
4. Fund and engage research around the world.
It is critical to measure exposure to potentially harmful content in the Global South and in authoritarian countries where content moderation may be more limited and exposure to false and extremist content on social media correspondingly more frequent. Until better data is available to outside researchers, we can only guess at how best to reduce the harms of social media outside the West. Such data can, in turn, be used to enrich fact-checking and content moderation resources and to design experiments testing platform interventions.
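As a concrete illustration of what a tail exposure metric (recommendation 1 above) might look like, here is a minimal Python sketch. The data are simulated and the thresholds are arbitrary; this is an assumption-laden sketch of the idea, not a method taken from the Nature article or from any platform.

# Minimal sketch of a "tail exposure" metric: report how concentrated exposure
# to flagged content is at the extreme end of the distribution, not just the
# average. The toy data below are simulated; real analyses would use
# platform-scale impression logs.

import numpy as np

rng = np.random.default_rng(0)

# Heavy-tailed toy draw mimicking the pattern the research describes: most
# users see almost none of this content, while a small minority sees a great deal.
impressions = rng.pareto(a=1.3, size=100_000)

mean_exposure = impressions.mean()              # the "typical user" view
p99 = np.percentile(impressions, 99)            # exposure at the 99th percentile
top1_share = impressions[impressions >= p99].sum() / impressions.sum()

print(f"Mean exposure per user:           {mean_exposure:.2f}")
print(f"99th-percentile exposure:         {p99:.2f}")
print(f"Share of exposure held by top 1%: {top1_share:.1%}")

Reporting figures like the top 1% share alongside the average is one way platforms could make concentration at the fringes visible, rather than letting the typical user’s near-zero exposure stand in for everyone.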
[The Annenberg School for Communication first published this piece.]
The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.