Milton Wolf Seminar on Media and Diplomacy

Rewiring Democracy: Disinformation, Media, and Diplomacy in the Age of AI

By Xénia Farkas

The 2025 Milton Wolf Seminar on Media and Diplomacy focused on the infinite loop of media, democracy, and diplomacy. Across six sessions, experts from journalism, diplomatic practice, and academia addressed urgent challenges, including the erosion of public trust in media and institutions, the proliferation of disinformation, and the implications of the largely unregulated AI revolution. This blog post offers an overview of the seminar’s key discussions and insights, supplemented with additional theoretical background.

Francis Fukuyama’s (1989; 1992) controversial thesis on “the end of history” served as a thought-provoking starting point. Although the idea that today’s global disorder is apocalyptic was critically challenged, there is no reason to rest easy either. The growing influence of information warfare manifests in disinformation campaigns, hybrid threats, and the normalization of influence operations worldwide. Disinformation, in particular, blurs the traditional boundaries between soft and hard power, and between peace and conflict (Pamment et al., 2025). In this environment, truth becomes contested and unstable, a shift best captured by the concept of the “post-truth” era, in which objective facts matter less than emotional appeals and personal beliefs (McIntyre, 2018). Declining trust in both democracies and democratic institutions goes hand in hand with these trends.

In this context, social media platforms, unlike traditional media, offer minimal gatekeeping and operate on a logic of engagement rather than verification, making them fertile ground for manipulation. Moreover, AI-generated content is an integral part of social media disinformation campaigns: doctored images and videos (Vaccari & Chadwick, 2020) deceive the public and create a viral, altered reality. Beyond content creation, generative AI is used to gather all kinds of information and, through AI-driven disruptions, contributes to a broader technological shift that is redistributing power across societal domains. The relationship between states and technology companies has also become increasingly complex and influential in the age of AI: major tech firms maintain close ties with governments, shaping public discourse around key AI issues such as automation, data use, and intellectual property. Big Tech plays a significant role in policy formation, and these dynamics reflect broader trends in the datafication of state functions and a shift toward governance through data, infrastructure control, and rentier logics (Dencik, 2025). As a result, the risks associated with AI technologies are increasingly prominent in policy and scholarly debates. While the European Union’s AI Act introduces a tiered, risk-based regulatory model, a more comprehensive conceptual framework remains elusive. Drawing on insights from disaster risk studies, scholars suggest that breaking down AI risk into the components of hazard, exposure, and vulnerability could enhance our capacity to identify threats and design targeted interventions, particularly in the case of general-purpose and experimental AI systems (Zanotti et al., 2024).

The United States presents a distinctive context in which to study these processes, shaped as they are by historical, political, and technological dynamics. At the heart of this landscape lies a broken consensus model, in which the shared understanding of truth has fractured. The rise of social media, echo chambers, filter bubbles (Rhodes, 2022), and the widespread use of bots (Keller & Klinger, 2019) have fundamentally altered how information circulates and is received. In this environment, truth is no longer stable; it has become increasingly contested, fragmented, and shaped by the platforms through which it is consumed. Conspiratorial communities thrive in this volatile space, where disinformation serves as both a symptom and a tool of division. Interestingly, many forms of disinformation are not entirely new but rather modern reconfigurations of old narratives, adapted to current concerns and cultural anxieties. These narratives often help construct a form of shared reality within ideological groups, deepening the divides between “good” and “bad,” conservative and liberal, insider and outsider (Marwick & Partin, 2024).

However, these trends are also present in countries with different cultural and historical backgrounds. In India, for example, the intersection of media and oligarchy is reshaping democratic structures through the personalization and concentration of political and economic power (Roy, 2024). Oligarchy here involves the systemic use of extreme wealth to secure political advantage, distinct from isolated acts of corruption in its scale and entrenchment. Wealthy individuals now control significant portions of the media landscape, contributing to a “lapdog media” environment marked by collusion and cheerleading narratives that frame oligarchic dominance as national strength. This totalizing media control, enabled by free-market dynamics, signals broader democratic backsliding and necessitates a fundamental rethinking of liberal market solutions and the balance between state and corporate power. Similar dynamics have emerged in the illiberal Hungarian media landscape, where media pluralism has dramatically decreased. The state creates the illusion of media diversity by distributing funds to loyal private owners in exchange for pro-government content. These media owners face no real market risks, as their operations are fully supported by state financing; the aim is not profit alone but the strengthening of the market position of regime-friendly actors. This concentration has enabled the dissemination of pro-government narratives and contributed to democratic backsliding, mirroring broader trends across illiberal regimes (Bátorfy & Urbán, 2020).

As these examples show, the challenges are not limited to social media. The relationship between media and democracy is also under increasing pressure. This development is best captured by the phenomenon of “media capture” (Ferreira, 2024): the process by which political and/or economic actors gain control over media outlets, effectively colonizing the media market to consolidate and maintain power. This capture not only threatens press freedom but also undermines public trust, particularly when journalism becomes entangled in the broader project of democratic backsliding. In this context, the role of journalism is more crucial than ever. On the one hand, journalists may fail to fulfill their pro-democratic role by not reporting critically on anti-democratic developments. On the other, they may actively resist or challenge these threats through investigative reporting and public accountability efforts. These dynamics complicate the media’s watchdog role and highlight the urgent need to defend independent journalism as a safeguard of democracy. We therefore need to uphold the value of news media, ensure that journalists are protected, and think courageously.

Nevertheless, solutions must go beyond journalism. The disinformation crisis cannot be solved with fact-checking alone, nor can we rely solely on platform regulation or algorithmic transparency. There is an urgent need for interdisciplinary, long-term strategies that address the cultural and systemic roots of disinformation. These include investing in media literacy from an early age, not merely as a defensive strategy but as a way to empower citizens to navigate complex information environments; protecting journalists and promoting press freedom in fragile democracies; strengthening democratic education; supporting resilient media ecosystems; and developing new institutional safeguards to protect public trust. Importantly, solutions must be grounded in both resistance and reconstruction: resisting authoritarian narratives while rebuilding democratic structures that can adapt to technological change.

The 2025 Milton Wolf Seminar made clear that the future of democracy hinges on our ability to understand and navigate the complex relationships between media, technology, and power. By fostering critical dialogue across disciplines and sectors, the seminar offered both a diagnosis of our current moment and a call to action: to think boldly, act collectively, and safeguard the democratic values that are increasingly under siege.

Bátorfy, A., & Urbán, Á. (2020). State advertising as an instrument of transformation of the media market in Hungary. East European Politics, 36(1), 44–65. https://doi.org/10.1080/21599165.2019.1662398

Dencik, L. (2025). ‘Rescuing’ data justice? Mobilising the collective in responses to datafication. Information, Communication & Society, 1–16. https://doi.org/10.1080/1369118X.2025.2465874

Ferreira, R. R. (2024). “It Forces You to Publish Some Shit”: Toward a Taxonomy of De-Democratizing Journalistic Practices. The International Journal of Press/Politics, 19401612241266556. https://doi.org/10.1177/19401612241266557

Fukuyama, F. (1989). The End of History? The National Interest, 16, 3–18. https://www.jstor.org/stable/24027184

Fukuyama, F. (1992). The end of history and the last man. Free press.

Keller, T. R., & Klinger, U. (2019). Social Bots in Election Campaigns: Theoretical, Empirical, and Methodological Implications. Political Communication, 36(1), 171–189. https://doi.org/10.1080/10584609.2018.1526238

Marwick, A. E., & Partin, W. C. (2024). Constructing alternative facts: Populist expertise and the QAnon conspiracy. New Media & Society, 26(5), 2535–2555. https://doi.org/10.1177/14614448221090201

McIntyre, L. (2018). Post-Truth. MIT Press.

Pamment, J., Smedberg, M., & Isaksson, E. (2025). National security and public diplomacy. In S. Aday (Ed.), Handbook on Public Diplomacy (pp. 449–461). Edward Elgar Publishing. https://doi.org/10.4337/9781803926568.00048

Rhodes, S. C. (2022). Filter Bubbles, Echo Chambers, and Fake News: How Social Media Conditions Individuals to Be Less Critical of Political Misinformation. Political Communication, 39(1), 1–22. https://doi.org/10.1080/10584609.2021.1910887

Roy, S. (2024). The political outsider: Indian democracy and the lineages of populism. Stanford University Press.

Vaccari, C., & Chadwick, A. (2020). Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News. Social Media + Society, 6(1), 2056305120903408. https://doi.org/10.1177/2056305120903408

Zanotti, G., Chiffi, D., & Schiaffonati, V. (2024). AI-Related Risk: An Epistemological Approach. Philosophy & Technology, 37(2), 66. https://doi.org/10.1007/s13347-024-00755-7
