
Tech Elites, Climate, and the Politics of Imagination
By Laura Bullon-Cassis, PhD
The growing power of technology elites - sometimes described as a new form of oligarchy - is widely recognized as a defining feature of contemporary politics. While their influence over elections, speech, and diplomacy has been extensively debated and researched, far less attention has been paid to their role in environmental governance. And yet, it is at this intersection that some of the most pressing political challenges and imaginative constraints of our time are emerging: climate politics cannot be disentangled from the infrastructures, ideologies, and imaginaries produced by these elites. Their influence is reshaping not only governance but also the horizons of what can be imagined as politically possible.
The current moment is often cast in apocalyptic terms, but as various participants at the 2025 Milton Wolf Seminar noted, the word’s original meaning - “revelation” - may offer a more useful lens. What is being revealed, among other things, is the extent to which environmental crises and technological disruption are not parallel developments but deeply entangled ones. These dynamics are driven by overlapping political economies - such as platform capitalism and financialization - shaped by similar visions of progress, and circulated through the same media infrastructures. The seminar’s guiding metaphor, the ouroboros, evokes this recursive and thus limiting pattern - a system that devours itself under the illusion of constant renewal.
From Platform Power to Political Imagination
To understand this entanglement, we must first trace how digital infrastructures are transforming the conditions of environmental knowledge and political agency. Climate change and media transformations are often treated as separate domains. One is managed through frameworks of mitigation, risk, and resilience. The other is approached through concerns about disinformation, surveillance, and platform governance. But this division obscures a deeper convergence. Technology firms now control many of the infrastructures through which environmental knowledge is produced, circulated, and legitimized - for example, the Earth observation data central to climate adaptation is increasingly gathered and hosted by private firms.
Their hold on data and discourse is not simply a matter of access: it is reshaping the conditions of political agency itself. Artificial intelligence (AI) is a particularly stark example of this shift. The global AI race is driven by logics of growth, national sovereignty, and innovation, which leave little room for ecological limits or democratic debate. In these frameworks, systemic change gives way to promises of efficiency, optimization, and scale.
What makes this power so consequential is that it is not only infrastructural but also epistemic. Large language models and other generative systems are often presented as neutral tools, but their architectures are shaped by the political, social, and economic imperatives of their makers. They do not merely catalog knowledge; they shape it: not simply knowledge aggregators but belief-making machines, systems that influence what counts as credible, authoritative, or true. As algorithmic systems mediate more aspects of daily life, they increasingly blur the line between information and instruction. The authority to define knowledge itself is being quietly consolidated.
Shrinking Spaces for Climate Action
This entanglement coincides with a broader weakening of the public institutions tasked with climate governance, raising the stakes of private influence. The dismantling of environmental departments within multilateral bodies such as the UN reflects more than budgetary pressure resulting from populist and nationalist policies. It signals a political shift, reinforced by media dynamics that prioritize spectacle, conflict, and technological breakthroughs over structural insight. In short, climate is becoming a marginal concern precisely when it demands central attention. Attending the most recent session of the Intergovernmental Panel on Climate Change this past February, the first ever held without the United States, was a sobering reminder of the growing fragility of global climate multilateralism.
Meanwhile, the public sector is increasingly dependent on private technology firms. Public cloud services, procurement contracts, and digital outsourcing reveal an evolving entanglement. As one seminar panel reflected, the state no longer simply regulates platform capitalism - it co-produces its infrastructure. This convergence has been accelerated by crises like the COVID-19 pandemic, during which governments turned to platforms for communication, education, and welfare delivery.
Epistemic Authority and the Limits of Transparency
As power becomes more opaque, calls for transparency alone are insufficient. What is at stake is who defines the terms of knowledge, trust, and legitimacy. In other words, the implications of this transformation are both material and epistemological. The infrastructures that sustain AI systems - data centers, rare earth mining, energy-intensive computation - are rarely scrutinized in public debates. These are not just technical issues but political questions about who builds the future, and for whom. Transparency alone cannot address this: even open-source code and public licensing regimes, as used by organizations like the messaging app Signal, operate within an ecosystem dominated by platform dependencies and economic asymmetries.
There are examples that contrast with opaque systems: Signal’s model is one of them, offering a glimpse of what it means to resist the dominant paradigm. As a nonprofit in a world of for-profit tech giants, it is structured around values of privacy, transparency, and verifiability. Its commitment to end-to-end encryption and open-source code - despite the operational challenges of scaling - serves as a “protective layer” against commercial logics. But Signal is the exception that proves the rule: most technologies depend on surveillance-based business models that monetize data and centralize power. That surveillance-based model, born out of policy decisions like the U.S. Telecommunications Act of 1996, normalized advertising as the economic engine of the internet and legitimized mass data collection. As AI advances, we are told to trust systems that see everything, learn from everything, and increasingly anticipate what we will do. This is not just surveillance; it is a reconfiguration of authority.
Civil Disobedience and Counterpublic Futures
In response, new forms of resistance are emerging. These include movements engaged in climate civil disobedience, which now target not only fossil fuel companies but also media monopolies and digital infrastructures. These actions recognize that visibility, legibility, and influence are the terrain on which ecological futures will be contested. Media systems, long relegated to the periphery of climate politics, are now acknowledged as central.
At stake is not only regulatory reform but narrative power: as Sarah J. Jackson emphasized during the seminar, soft power creates the hegemonic conditions under which certain futures appear legitimate and others unthinkable. Engaging with the role of tech oligarchs - often framed as libertarian visionaries - is essential to understanding the deeper consolidation of influence. We are not just confronting a policy vacuum but a form of techno-feudalism, where a handful of firms define infrastructure, identity, and belief at planetary scale.
This epistemic dominance - arguably the defining feature of today’s oligarchy - manifests through forms of epistemic injustice that structure how power operates in contemporary digital systems (Zuboff 2019; Birhane 2021). As with historical forms of concentrated power, it is being met with proposals for resistance: regulation, collective bargaining, infrastructural alternatives, and counter-narratives. Yet the transnational scope of today’s platforms presents new challenges. When a handful of actors control everything from education to communication to finance, the terrain of common cause expands - but so does the difficulty of organizing. These acts of resistance are not only political interventions. They are also acts of collective imagination, gestures that point beyond critique and toward the construction of alternative futures.
Imagination as Infrastructure
This is precisely what Stories of the Future, a science communication project funded by the Swiss National Science Foundation, set out to explore. In this project, close to 1,000 schoolchildren across Switzerland co-wrote speculative stories about the political and social futures they want to inhabit - around AI, ethics, and democracy. It also functioned as a participatory intervention, fostering civic agency and deliberation among young publics. The project affirms that imagination is not a luxury but a political capacity. It is also central to work on AI literacy, which asks how publics can interpret and intervene in the systems that increasingly shape their lives. If this is indeed a “revelatory” moment, it demands more than critique. It requires reclaiming narrative, renewing institutional capacity, and expanding democratic agency.
References
Birhane, A. (2021). “Algorithmic injustice: A relational ethics approach.” Patterns, 2(2).
Cooper, M. (2008). Life as Surplus: Biotechnology and Capitalism in the Neoliberal Era. University of Washington Press.
Couldry, N., & Mejias, U. (2019). The Costs of Connection: How Data Is Colonizing Human Life and Appropriating It for Capitalism. Stanford University Press.
Crawford, K. (2022). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. Yale University Press.
Zuboff, S. (2019). The Age of Surveillance Capitalism. PublicAffairs.