
Evolving Shape and Growing Strength of Political Disinformation on Social Media
By Gabrielle D. Beacken
Introduction
The theme of this year’s seminar touched upon the dilemma of the ouroboros: an infinite loop in which we are, in effect, ‘eating our own tail.’ Our systems of media, technology, policy, governance, and diplomacy are deeply interconnected, yes, but the problem is not their continual dependence on one another. It is that our efforts to build and refine these systems to further bolster international cooperation and democracy are not just ‘stuck’; they are currently letting us down, eating our own tail. Our two-day seminar focused on various facets of our struggling media, technology, and policy systems: continued attacks on independent journalism in authoritarian, democratic, and competitive authoritarian arenas; the amplified problem of state-led disinformation through the introduction of artificial intelligence and the lowered guardrails of social media platforms; the heightened dependency of governments on a centralized group of tech leaders to envision, design, and enforce their hyper-tailored version of the future; media policy loopholes exploited by governments; the extremism of political ideology not so subtly exposing itself within our everyday, mainstream politics; and much more.
On the first day of our seminar, panelists posed the question: are we in the apocalypse now? Ultimately, it was posited that we are not. Panelists considered the long tail of history, and the remaining global resilience against anti-democratic pursuits, as a source of hope and fruitful collaboration for the future. Our seminar strived to come up with solutions: something that those who wish to pursue democratically informed and encouraging work can act upon. There were several such bright moments, to name a few: the discussion of diversifying a media organization’s portfolio to stay fiscally healthy and operational; suggestions for including environmental costs and consequences within AI policy and for restructuring the policy discussion around resource extraction; and the proposed analytic framework shift from ‘disruption’ to ‘resilience’ when studying the intersection of information flows and evolving technologies. Yet huge challenges remain, even if ‘apocalypse now’ is still an open question. In the meantime, junior scholars like me can attempt to stave off its formidable homecoming.
Shape and Strengths of Disinformation Online
As a budding scholar of propaganda and disinformation, I was particularly interested in both the anticipation of future trends and the analysis of existing ones within these information control and influence campaigns, especially as these questions revolve around emerging technologies and political proceedings, elections in particular. As 2024 was dubbed the ‘super year of elections,’ much of the discussion throughout our panels touched upon the integrity of the information environment and why it matters to the functioning of, and trust in, electoral processes across the globe.
Disinformation on social media, knowingly false information spread with the intent to deceive and cause harm, oftentimes using covert tactics to achieve a political aim (Freelon & Wells, 2020; Prochaska et al., 2023), is spread not just through bottom-up approaches (though these do occur) but through a combination of top-down and bottom-up tactics. That is, groups of hired propagandists, genuine ideological fans, or simply a paid troll farm can engage in coordinated behavior to exploit platform algorithms and maximize a disinformation campaign’s reach, visibility, and potential effectiveness (Woolley & Howard, 2018). Yet a key element of many of these disinformation campaigns is their reliance on an ‘influential node’ to propel these grassroots campaigns from simple chatter online to trending, omnipresent stories. As discussed throughout our two-day seminar, actors with political power (whether elected, unelected, or formerly elected to a role of authority) continue to engage in and subsequently bolster these disinformation campaigns: it can be a simple retweet of a conspiracy theory, the re-sharing of a hateful meme, or even a validating response to a particular worldview from a well-known political figure (politician, journalist, podcaster, influencer). With the help of these accounts with large or meaningful followings, disinformation campaigns garner vaster reach, eyeballs, attention, and engagement. The goal, as disinformation scholars posit in a sending ‘up the chain’ framework, is for potentially fringe disinformation campaigns to reach mainstream platforms and eventually be picked up and covered by the mainstream media (Donovan, n.d.; Marwick & Lewis, 2017). This is easier to do now, scholars argue, as newsrooms are shrinking in staff and increasingly relying on social media discourse for story ideas and sources (Zimdars, 2020). Political authorities, elites, and influencers play a crucial role in the circulation, engagement, and potential believability of modern social media disinformation campaigns.
This has consequences for democratic institutions and processes. One is the continued degradation of what was once considered the ‘digital public sphere,’ though it should be noted that the information sphere (on or offline) was never ‘pure’ and has always been full of truths, lies, and half-truths (DiResta, 2018). However flawed, voters, activists, journalists, researchers, political figures, and more continue to engage with social media as a space to communicate political ideas, connect with audiences, and (at times) engage in political discussion. These spaces are still relied upon by many as a means to receive, understand, and share political information (Tucker et al., 2018). Disinformation campaigns that aim to stifle critical voices, muddy the information space, and sow doubt or fear in the minds of citizens attack this flawed but still significant space for information deliberation and meaning making. The spaces where we consume, process, and break down information online are strategically attacked by political actors who spread disinformation to push their own disfigured political realities and interpretations.
A further consideration of these disinformation campaigns on social media, messaging apps, and elsewhere is their amplification (‘on steroids,’ as our seminar described it) through increasingly sophisticated integrations of generative AI. Many instances of AI use by both foreign and domestic entities, from robocalls and audio and video deepfakes to the creation of fake news articles and books, were present in the 2024 elections (Gursky et al., 2024). Using generative AI in disinformation campaigns potentially amplifies their quantity and refines their personalized quality. Deepfake videos can be crafted with greater ease through advancing and openly distributed technologies, while varied iterations of these videos can be produced according to targeted audience demographics, interests, locations, and more. However, as discussed in our seminar, tech companies, the media, and civil society organizations are also concerned with the accurate detection and tracing of these AI disinformation campaigns. Figuring out the source of a generative AI disinformation campaign and properly labeling it as such remains an obstacle. Furthermore, policymakers are left with a difficult question: once we determine that a deepfake is indeed a deepfake, and perhaps that it did come from a foreign or domestic entity, then what? Beyond the exacerbated inauthenticity that generative AI disinformation campaigns bring to social media spaces, there are many practical and important questions left to be answered.
Conclusion
Contemporary disinformation campaigns’ amplification through emerging technologies, their increasing sophistication and use, and top-down involvement pose serious threats to our information and electoral environments. This is a large, complex web of interconnected domains (technology, media, policy, governance, economy) that requires a holistic, interdisciplinary approach. That is why the many expert minds across disciplines present at our seminar are needed to approach and tame this problem, one of many present in our social and political structures today. As political actors’ tactics and uses of technology evolve, so do the shape and strengths of disinformation campaigns. We too must evolve and grow in our efforts to address, understand, and fight these deceitful endeavors. We must bolster our resilience against, and ‘disrupt,’ the perpetual fate of the ouroboros.
References
DiResta, R. (2018). Computational propaganda: If you make it trend, you make it true. The Yale Review, 106(4), 12–29. https://doi.org/10.1353/tyr.2018.0030
Donovan, J. (n.d.). The Lifecycle of Media Manipulation. DataJournalism.Com. Retrieved November 30, 2024, from https://datajournalism.com/read/handbook/verification-3/investigating-disinformation-and-media-manipulation/the-lifecycle-of-media-manipulation
Freelon, D., & Wells, C. (2020). Disinformation as Political Communication. Political Communication, 37(2), 145–156. https://doi.org/10.1080/10584609.2020.1723755
Gursky, J., Riedl, M., Rebelo, K., Saviaga, C., Savage, S., & Knight, T. (2024). Generative Artificial Intelligence and Elections. Center for Media Engagement. https://mediaengagement.org/research/generative-artificial-intelligence-and-elections/
Marwick, A. E., & Lewis, R. (2017). Media manipulation and disinformation online (United States of America) [Report]. Data & Society Research Institute. https://apo.org.au/node/135936
Prochaska, S., Duskin, K., Kharazian, Z., Minow, C., Blucker, S., Venuto, S., West, J. D., & Starbird, K. (2023). Mobilizing Manufactured Reality: How Participatory Disinformation Shaped Deep Stories to Catalyze Action during the 2020 U.S. Presidential Election. Proceedings of the ACM on Human-Computer Interaction, 7(CSCW1), 1–39. https://doi.org/10.1145/3579616
Ryan-Mosley, T. (2023, October 4). How generative AI is boosting the spread of disinformation and propaganda. MIT Technology Review. https://www.technologyreview.com/2023/10/04/1080801/generative-ai-boosting-disinformation-and-propaganda-freedom-house/
Tucker, J., Guess, A., Barbera, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3144139
Woolley, S., & Howard, P. N. (2018). Computational propaganda: Political parties, politicians, and political manipulation on social media. Oxford University Press.
Zimdars, M. (2020). Introduction. In K. McLeod & M. Zimdars (Eds.), Fake News: Understanding Media and Misinformation in the Digital Age. The MIT Press. https://direct.mit.edu/books/edited-volume/4625/Fake-NewsUnderstanding-Media-and-Misinformation-in