
One Billion Posts, One Election

Annenberg School Professor Sandra González-Bailón and colleagues analyzed the spread of over one billion Facebook posts to reveal how information flowed on the social network.

The 2020 U.S. presidential election took place amidst heightened concern over the role of social media in enabling the spread of misinformation. Facebook's role drew particular scrutiny, given earlier concerns about its impact on the 2016 election.

In a study published in the journal Sociological Science, Annenberg School for Communication Professor Sandra González-Bailón and colleagues analyzed over one billion Facebook posts published or reshared by more than 110 million users during the months preceding and following the 2020 election.

“Social media creates the possibility for rapid, viral spread of content,” González-Bailón said. “But that possibility does not always materialize. Understanding how and when information spreads is essential because the diffusion of online content can have downstream consequences, from whether people decide to vaccinate to whether they decide to join a rally.”

Sandra González-Bailón

The research team paid particular attention to whether political content and misinformation spread differently from other content on the platform. They also examined whether Facebook’s content moderation policies significantly impacted the spread of information.

They found that Facebook Pages, rather than individual users or Groups, were the primary drivers of content distribution on the platform, as their posts reached large audiences simultaneously. However, misinformation spread mainly through direct user-to-user sharing, suggesting an enforcement gap in the platform’s content moderation when it came to user-transmitted messages.

“A very small minority of users who tend to be older and more conservative were responsible for spreading most misinformation,” González-Bailón said. “We estimate that only about 1% of users account for most misinformation reshares. However, millions of other users gained exposure to misinformation through the peer-to-peer diffusion channels that this minority activated.”

The research highlighted three paths by which content made its way to a user’s Feed on Facebook during the 2020 election.

One involved content flowing directly from friends. The second was from Pages, the typical mechanism for celebrities, brands, and media outlets to share content. The third was Groups, which users join to connect with one another.

“We estimate that only about 1% of users account for most misinformation reshares.”

Content shared via friends, Pages, and Groups generates different propagation patterns, which the researchers mapped using “diffusion trees,” representations of the width and depth of information sharing. In addition to these patterns, the researchers also analyzed the reach of that propagation, or the number of people exposed to a given post.
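To make the "width and depth" idea concrete, here is a minimal sketch of how a single diffusion tree, built from reshare relationships, could be summarized. It is not code from the study; the edge list, identifiers, and metric definitions are illustrative assumptions.

```python
from collections import defaultdict, deque

def tree_metrics(edges, root):
    """Compute depth (longest chain of reshares) and width (largest
    number of copies at any single level) for one diffusion tree.

    edges: list of (parent_id, child_id) reshare relationships.
    root:  id of the original post.
    """
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)

    depth, width = 0, 1
    level = deque([root])
    while level:
        width = max(width, len(level))
        next_level = deque()
        for node in level:
            next_level.extend(children[node])
        if next_level:
            depth += 1
        level = next_level
    return depth, width

# Hypothetical example: a post reshared in two short chains.
edges = [("post", "a"), ("post", "b"), ("a", "c"), ("c", "d")]
print(tree_metrics(edges, "post"))  # (3, 2): depth 3, width 2
```

In this framing, a Page post reshared once by many followers yields a wide, shallow tree, while a message passed friend to friend yields a narrow, deep one.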

“Most people online are lurkers, which means that most users view but rarely produce or reshare content,” González-Bailón said, “so merely calculating the number of reshares doesn’t show the whole picture of what happens on social media. That’s why we looked at exposures, that is, the number of views a given post or message accumulated.”
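As a sketch of the distinction she describes, reshares can be counted from the tree's edges while exposures sum the views each copy accumulated. The data structures and numbers below are illustrative assumptions, not the study's actual measures.

```python
def reach(edges, views, root):
    """Contrast reshare counts with exposure counts for one post.

    edges: (parent_id, child_id) reshare pairs forming the diffusion tree.
    views: mapping from post/reshare id to the number of users who saw it.
    """
    reshares = len(edges)                                 # times the post was passed on
    nodes = {root} | {child for _, child in edges}
    exposures = sum(views.get(n, 0) for n in nodes)       # times any copy was seen
    return reshares, exposures

# Hypothetical numbers: few reshares, many views per copy.
edges = [("post", "a"), ("post", "b")]
views = {"post": 5000, "a": 120, "b": 40}
print(reach(edges, views, "post"))  # (2, 5160)
```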

González-Bailón's study found that misinformation trees are predominantly initiated by user posts.

During the study period, Facebook employed emergency measures that intensified its content moderation. These measures were known as “break-the-glass” because, as the name implies, they were designed to respond to extreme circumstances and mitigate heightened risks, like the “Stop the Steal” campaign that erupted immediately after the election. The researchers found that periods of high-intensity content moderation were generally associated with drops in information propagation and, specifically, in exposure to misinformation. These drops indicate the influence that content moderation efforts may have at crucial junctures, including the moments when those efforts are rolled back.

Social media platforms are evolving rapidly, adopting AI and other emerging technologies. With these changes comes the potential for misinformation to spread in new ways, along with opportunities to discover more effective ways to curtail it. According to González-Bailón, platforms should collaborate with external researchers to understand these changes and assess the effectiveness of their content moderation policies.

“The ability to control information flows gives much power to platforms, and this power should not be exercised outside of public scrutiny,” she said. “The public can only assess how effective platforms are in their content moderation efforts through publicly shared data and analyses.”

In this video, Professor Sandra González-Bailón provides insight into her research on how information propagated on social media during the 2020 U.S. presidential election.