First Findings from US 2020 Facebook & Instagram Election Study Released

Unprecedented research by Prof. Sandra González-Bailón and colleagues reveals the influence of Facebook's algorithms on political news exposure.

  • In the context of the 2020 presidential election, Facebook's algorithms were extremely influential in users’ on-platform experiences. 
  • There is significant ideological segregation in political news exposure on Facebook.
  • Facebook algorithms that determined what users saw did not sway political attitudes during the 2020 election period.

Today, academics from U.S. colleges and universities, including Annenberg School for Communication Professor Sandra González-Bailón, working in collaboration with researchers at Meta, published findings from the first set of four papers as part of the most comprehensive research project to date examining the role of social media in American democracy.

The papers, which focus primarily on how critical aspects of the algorithms determining what people see in their feeds affect what they see and believe, were peer-reviewed and published in Science and Nature.

González-Bailón, the Carolyn Marvin Professor of Communication and Director of the Center for Information Networks and Democracy, is the lead author of the Science paper entitled “Asymmetric Ideological Segregation in Exposure to Political News on Facebook.” She is also a co-author on the remaining three papers.

The academic team proposed and selected specific research questions and study designs with the explicit agreement that Meta could reject such designs only for legal, privacy, or logistical (i.e., infeasibility) reasons. Meta could not restrict or censor findings, and the academic lead authors had final say over writing and research decisions.

With this unprecedented access to data and research collaboration, the team found:

  1. Algorithms are extremely influential in determining what people see and in shaping their on-platform experiences.
  2. There is significant ideological segregation in political news exposure.
  3. Three experiments conducted with consenting participants during the 2020 election period suggest that although algorithm adjustments significantly change what people see and their level of engagement on the platforms, the three-month experimental modifications did not notably affect political attitudes.

The core research team included 15 additional academic researchers with expertise in the four areas this project focused on: political polarization, political participation, (mis)information and knowledge, and beliefs about democratic norms and the legitimacy of democratic institutions.

The team worked with Meta researchers to design experimental studies with consenting users who answered survey questions and shared data about their on-platform behavior. The team also analyzed platform-wide phenomena based on the behavior of all adult U.S. users of the platform. Platform-wide data was made available to the academic researchers only in aggregated form to protect user privacy.

Ideological Segregation on Facebook

“Asymmetric Ideological Segregation in Exposure to Political News on Facebook,” led by González-Bailón and David Lazer from the University of Pennsylvania and Northeastern University, respectively, analyzed on-platform exposure to political news URLs during the 2020 U.S. election, comparing the inventory of all the political news links U.S. users could have seen in their feeds with the information they actually saw and the information with which they engaged.

“This begins to answer questions about the complex interaction between social and algorithmic choices in the curation of political news and how that played out on Facebook in the 2020 election,” said Sandra González-Bailón.

Key Findings

  • Many political news URLs were seen, and engaged with, primarily by conservatives or liberals, but not both.
  • Ideological segregation associated with political news URLs posted by Pages and in Groups was higher than that associated with content posted by users.
  • There was an asymmetry between conservative and liberal audiences: far more political news URLs were seen almost exclusively by conservatives than were seen almost exclusively by liberals.
  • The large majority (97%) of political news URLs on Facebook rated as false by Meta’s third-party fact-checking program (among URLs posted at least 100 times) were seen by more conservatives than liberals, although the proportion of political news URLs rated as false was very low.

Among the co-authors on the paper are Deen Freelon, the Allan Randall Freelon Sr. Professor of Communication at the Annenberg School for Communication, as well as three Annenberg alumni: Emily Thorson, Magdalena Wojcieszak, and Natalie Jomini Stroud.

Impacts of Removing Reshared Content on Facebook

“Reshares on Social Media Amplify Political News but Do Not Detectably Affect Beliefs or Opinions,” led by Professors Andrew Guess from Princeton, and Neil Malhotra and Jennifer Pan from Stanford, studied the effects of exposure to reshared content on Facebook during the 2020 U.S. election.

Key Findings

  • Removing reshared content on Facebook substantially decreased the amount of political news and content from untrustworthy sources people saw in their feeds, decreased overall clicks and reactions, and reduced clicks on posts from partisan news sources.
    • Removing reshares reduced the proportion of political content in people’s feeds by nearly 20% and the proportion of political news by more than half.
    • Although content from untrustworthy sources made up only 2.6% of Facebook feeds on average, removing reshares reduced the amount of that content by 30.6%.
  • Removing reshared content on Facebook decreased news knowledge among study participants but did not significantly affect political polarization or other individual-level political attitudes.

Impacts of Altering Feed Algorithms from Personalized to Chronological

In “How Do Social Media Feed Algorithms Affect Attitudes and Behavior in an Election Campaign?,” also led by Guess, Malhotra and Pan, the team investigated the effects of Facebook and Instagram feed algorithms during the 2020 U.S. election by comparing the standard feed to a chronologically ordered feed.

Key Findings

  • Replacing study participants’ algorithmically ranked feeds on Facebook and Instagram with a simple chronological ranking, meaning that they saw the newest content first, substantially decreased the time participants spent on the platforms and how much they engaged with posts there.
    • Participants in the Algorithmic Feed group spent 73% more time on the platforms each day, on average, than U.S. monthly active users, while the Chronological Feed group spent only 37% more.
  • The chronologically ordered feed significantly increased content from moderate friends and sources with ideologically mixed audiences on Facebook; it also increased the amount of political and untrustworthy content relative to the default algorithmic feed. The chronological feed decreased uncivil content.
    • When presented in chronological order, political content (appearing in 13.5% of participants’ feeds on Facebook and 5.3% on Instagram, on average) increased by 15.2% on Facebook and 4.8% on Instagram.
    • When participants viewed the chronological feed, content from untrustworthy sources, making up 2.6% of Facebook feeds and 1.3% of Instagram feeds on average, increased by 68.8% and 22.1%, respectively.
    • Posts with uncivil content on Facebook (estimated as 3.2% of participants’ feeds on average) decreased by 43% when participants saw a chronological feed. Posts with uncivil content on Instagram (estimated as 1.6% of participants’ Instagram feeds on average), however, did not decrease.
  • Despite these substantial changes in participants’ on-platform experience, the chronological feed did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes during the three-month study period. 

Impacts of Deprioritizing Content from Like-minded Sources on Facebook

Finally, the “Like-minded Sources on Facebook Are Prevalent but Not Polarizing” study, led by Professors Brendan Nyhan from Dartmouth, Jaime Settle from William & Mary, Emily Thorson from Syracuse, and Magdalena Wojcieszak from the University of California, Davis, presented data from 2020 for the entire population of active adult Facebook users in the U.S. The data show that content from politically like-minded sources constitutes the majority of what people see on the platform, though political information and news represent only a small fraction of these exposures. The study then reduced the volume of content from like-minded sources in consenting participants’ feeds to gauge the effect on political attitudes.

Key Findings

  • Posts from politically “like-minded” sources constitute a majority of what people see on the platform, although political information and news represent only a small fraction of these exposures.
    • The median Facebook user received a majority of their content from politically like-minded sources (50.4%, versus 14.7% from cross-cutting sources, i.e., liberals seeing content from conservatives or vice versa). The remainder came from friends, Pages, and Groups classified as neither like-minded nor cross-cutting.
  • Reducing the prevalence of politically like-minded content in participants’ feeds during the 2020 U.S. presidential election had no measurable effects on attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims.

Co-Authors and Future Projects

Academics from Dartmouth, Northeastern University, Princeton, Stanford, Syracuse University, University of California, Davis, University of Pennsylvania, University of Virginia and William & Mary are the lead authors of these initial studies. The lead researchers from the Meta team were Pablo Barberá for all four papers and Meiqing Zhang for the paper on ideological segregation. Meta project leads are Annie Franco, Chad Kiewiet de Jonge, and Winter Mason.

In the coming year, additional papers from the project will be publicly released after completing the peer-review process. They will provide insight into the content circulating on the platforms, people's behavior and the interaction between the two.