Social networking sites such as Facebook have a feature that allows users to ‘share’ posts, and news or topics likely to catch people’s attention are shared widely, reaching many users. However, a study by a research team led by Professor Shyam Sundar of Pennsylvania State University found that ‘75% of posts containing URLs on Facebook are shared without reading the content of the linked page.’
Sharing without clicking on news in social media | Nature Human Behaviour
https://www.nature.com/articles/s41562-024-02067-4
Study finds 75% of Facebook shares are made without reading the content
https://www.psypost.org/study-finds-75-of-facebook-shares-are-made-without-reading-the-content/
Social media allows ordinary people to share URLs and spread particular news and topics, but some of the news spread this way is fake or biased. Because posts can be shared with just a few taps, people sometimes judge a post’s reliability and importance only by its headline and number of ‘likes,’ spreading it without ever looking at the linked content. There are concerns that such behavior could fuel the spread of fake news, especially in the political sphere.
So Sundar and his research team investigated sharing on social media. Sundar told the psychology outlet PsyPost, ‘In my opinion, sharing is one of the most influential behaviors on social media. Not only does it create a synergistic effect of spreading information through personal networks, but it has also exacerbated the spread of online misinformation in recent years.’ He added, ‘What most people don’t realize is that our friends and family on social media aren’t trained journalists who vet and double-check facts before sharing. We tend to be influenced by whatever they share.’
To understand the phenomenon of ‘URL sharing without clicking on the link,’ Sundar and his colleagues analyzed a massive dataset released in collaboration between Facebook and Harvard University’s academic organization Social Science One. The dataset included billions of interactions for more than 35 million URLs shared on Facebook from 2017 to 2020.
First, the researchers used a machine learning classifier trained to identify political keywords to sort URLs into political and non-political content. The former included URLs related to elections, candidates, and other partisan topics; the latter ranged from entertainment to general news.
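The classification step can be illustrated with a toy sketch. This is not the researchers’ actual pipeline — the training headlines, labels, and model choice below are invented for illustration — but it shows the general shape of a keyword-driven political/non-political text classifier:

```python
# Illustrative only: a tiny political/non-political headline classifier.
# The training data and model choice are invented, not the study's method.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Senate passes election funding bill",        # political
    "Candidate outlines immigration policy",      # political
    "Governor signs partisan redistricting map",  # political
    "Ten easy weeknight dinner recipes",          # non-political
    "Local team wins championship game",          # non-political
    "New phone model released this fall",         # non-political
]
train_labels = [1, 1, 1, 0, 0, 0]  # 1 = political, 0 = non-political

# Bag-of-words features plus logistic regression: the model learns which
# keywords ("election", "recipes", ...) signal each class.
clf = make_pipeline(CountVectorizer(), LogisticRegression())
clf.fit(train_texts, train_labels)

print(clf.predict(["Polls open for the presidential election"]))
print(clf.predict(["Ten new dinner recipes to try"]))
```

A production classifier would be trained on a far larger labeled corpus before being applied to the headlines behind millions of URLs, but the pipeline structure is the same.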
They also classified Facebook users by political leaning, such as liberal, neutral, or conservative, and analyzed whether each user’s political ideology and the content of the shared URL were related to ‘shares without clicks.’ In addition, they specifically examined URLs that had been fact-checked in order to track the spread of misinformation.
The analysis showed that about 75% of Facebook posts containing URLs are shared without the link being clicked. In other words, many users share such posts without reading the linked content, relying instead on superficial cues such as the post’s headline, author, and number of likes.
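Because the study works from aggregate counts rather than individual browsing logs, a ‘share without a click’ rate is essentially a ratio of aggregate fields per URL. A minimal sketch, with invented field names and numbers (not the dataset’s real schema):

```python
# Hypothetical aggregate records per URL; field names are invented.
urls = [
    {"url": "example.com/a", "shares": 1000, "clicks_before_share": 240},
    {"url": "example.com/b", "shares": 500, "clicks_before_share": 130},
]

for row in urls:
    # Shares not preceded by a click on the link, as a fraction of all shares.
    no_click = row["shares"] - row["clicks_before_share"]
    rate = no_click / row["shares"]
    print(f'{row["url"]}: {rate:.0%} shared without a click')
```

Computed this way across tens of millions of URLs, such ratios can reveal population-level patterns without ever observing what any single user read.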
This trend was especially strong for posts at the political extremes: both liberal and conservative content was shared without a click more often than neutral content. Users were also more likely to share content aligned with their own political ideology without clicking, suggesting that headlines confirming existing biases are particularly attractive. In addition, URLs identified as false by fact-checkers were more likely to be shared without a click than factual content.
‘The key takeaway is that most links shared on Facebook are shared without the sharer reading the content,’ said Sundar. ‘This suggests that social media users only need to glance at the headline or blurb before deciding to push a news link to their networks. This can have a multiplier effect, spreading information rapidly to millions of people online. This can fuel misinformation and spread fake news and conspiracy theories.’
It should be noted that this study relied solely on aggregate data and did not directly observe the behavior of individual users. For example, it’s possible that users found a link to a news article they’d already read elsewhere and shared it without clicking because they already knew the content. Also, because this study focused only on Facebook, it’s unclear whether similar patterns exist on other platforms, such as X (formerly known as Twitter) or Instagram.
Based on their findings, the research team argues that social media platforms could help reduce the spread of misinformation by encouraging users to ‘read the post before sharing it.’ ‘If platforms could warn users that content may be false and make them aware of the risks, this could help people think before sharing,’ Sundar said.