
Reimagining the Internet’s Future to 2075: Technology, Democracy, and Power

From Utopian Promise to Critical Divide

The Internet was originally envisioned as a tool that would tear down structures of power and circulate knowledge at a rate never previously possible. As its infrastructure evolved, however, its design produced an ecosystem of surveillance and control. This essay examines that change through both critical analysis and creative imagination, illustrating a dystopian vision of what 2075 might look like if we continue on this course. The future described is not merely fictional but an extrapolation of the moral and political decisions embedded in today’s technologies.

Understanding this shift in thought must begin with how utopia and dystopia were initially defined. According to Eduardo Barreto (2020), utopia is a pursuit of social perfection and is frequently imagined through ideal societies based on equality. Dystopia, on the other hand, highlights the risks of these same goals, demonstrating how striving for perfection can lead to oppression and conformity. Barreto (2020) argues that utopia and dystopia are moral equivalents rather than opposites, with utopias expressing hope for a better future and dystopias serving as cautions against abuse of power. These ideas serve as the foundation for recognizing how academics have viewed the internet as a place of both control and freedom.

Fisher and Wright (2001) use William Ogburn’s theory of cultural lag to explain the dichotomy of optimism and fear that characterized the early debate surrounding the internet. They argue that when society encounters a new technology, discourse reflects ideology rather than understanding: “both utopian and dystopian accounts of technologies such as the Internet are more likely to reflect authors’ own preferences and values rather than an account of the technology’s impact” (Fisher & Wright, 2001).

Theorists like Yochai Benkler and Manuel Castells presented utopian views of the networked age within this cultural lag. Castells (2000) describes a “network society” in which information technologies replace hierarchies with flexible and decentralized networks. In contrast to the “centred, hierarchical forms of organization” that dominated the industrial age, he contends that these networks represent “a new social structure” (Castells, 2000, p. 695).

Benkler (2006) expands on these concepts by describing the internet as a “networked public sphere” that facilitates communication and challenges power systems. This enables users to avoid centralized power, whether it be “held by authoritarian governments or by media owners,” in contrast to the mass-media model (Benkler, 2006, p. 271). For him, the digital public sphere represents an evolutionary advancement over mass media: a communication system that is more resilient, participative, and resistant to centralized control (Benkler, 2006, pp. 271-272).

From Networked Freedom to Platform Power

As the internet developed, this utopian narrative was challenged by academics who framed it as a structure of control, commercialization, and surveillance. Christian Fuchs (2022) argues that capitalism has colonized and feudalized the internet as a digital public space. Rather than empowering users, it perpetuates “economic, political, and cultural asymmetries of power” (Fuchs, 2022, p. 283). The commodification of data and attention subjects users to digital alienation, as their behaviours are exploited (Fuchs, 2022, p. 284).

Merlyna Lim (2025) expands on this by examining how the designs of digital platforms uphold authoritarianism and inequality. According to Lim, the “rich-gets-richer” dynamic of social media’s “scale-free networks” naturally concentrates power among dominant nodes (Lim, 2025, p. 3). Her theory of algorithmic politics describes how social media platforms favour content that is emotional and polarizing, producing “algorithmic enclaves” that strengthen ideological homogeneity (Lim, 2025, p. 10). These enclaves are the product of platforms governed by algorithmic bias and marketing logic. Social media’s transformation into a tool of “authoritarian resilience” has weakened the potential of a digital public sphere (Lim, 2025, p. 6).

In his interview with The David Pakman Show, Douglas Rushkoff describes the digital economy as a continuation of exploitative structures dating back to monopolies that valued financial profit for the elite over public well-being (David Pakman Show, 2019). Developers employ strategies like “captology,” the use of digital technology to persuade users and capture their attention. Rushkoff claims that corporate platforms have displaced the internet’s original peer-to-peer community, manipulating users with “addiction algorithms” and replacing genuine contact with exploitative engagement (David Pakman Show, 2019).

The Politics Embedded in Technology

The theoretical framework that connects these shifts is from Langdon Winner (1980). Winner (1980) argues that “the machines, structures, and systems of modern material culture… embody specific forms of power and authority” (p. 121), challenging the notion that technologies are neutral instruments. He makes a distinction between technologies that are deliberately developed to accomplish social objectives, like Robert Moses’s low overpasses that barred bus passengers from public areas, and those that are fundamentally political and require particular organizational structures to function, like the atomic bomb’s need for hierarchical control (Winner, 1980, pp. 124, 131).

This way of thinking makes the transition clear. Castells (2000) and Benkler (2006), for example, initially anticipated that these technologies would democratize society; Fuchs (2022) and Lim (2025), however, revealed that their designs reflected the inequality and control of the capitalist systems into which they are integrated. Winner’s (1980) claim that “technological innovations are similar to legislative acts… establishing a framework for public order that will endure over many generations” (p. 127) is best illustrated by the internet’s development from decentralized network to corporate control. The internet will not evolve on its own; the principles ingrained in its design will shape its potential.

The Grid and the Architecture of Control

By 2075, the Internet is the main framework of social order rather than a supplementary tool. Neural nodes connect each person to the network’s central systems, continually transmitting information, communication, and decision-making. What Castells (2000) once described as a “network society” has developed into an extensive, integrated framework that controls the state, the economy, and individual identity. Authority is no longer exercised through visible organizations or leaders; rather, it is built into the architecture of the Internet itself. Society is governed not by overt authoritarianism but by the structure itself. Participation in the Internet system, now called the Grid, is mandatory.

Early on, the Grid was welcomed as a neutral infrastructure that would advance democracy and knowledge. Echoing the utopian ideals that accompanied the early Internet, it was envisioned as an open system that would empower people, dismantle hierarchies, and enable new kinds of community and collaboration. Technology itself, it was believed, would serve as an impartial platform for human interaction and creation, free from ingrained systems of domination. Instead, the Grid exemplifies what Winner (1980) called a political artifact: far from neutral, its entire design upholds authority and order (p. 121). Originally intended as a tool for liberation, it has evolved into an artifact that structures power, embedding governance in its everyday practices and strengthening the very hierarchies it was expected to dismantle.

Every individual has a distinct data identity, an evolving record of analytics that establishes one’s eligibility for civic rights, employment, and healthcare. The Grid’s structure reflects what Lim (2025) identified as a scale-free network, in which influence concentrates around dominant hubs. Like the major platforms of today, these hubs have developed into centralized groups that grow stronger with engagement and visibility. The result is a “rich-get-richer” hierarchy that ties credibility and value to popularity (Lim, 2025, p. 3). Nodes with significant influence are rewarded with still more influence and privileges, while those with less are suppressed.

By 2075, hubs are the foundation of social classification. Lim’s (2025) concept of algorithmic enclaves, digital environments that perpetuate homogeneity (p. 10), has become the main mode of interaction. Communication between different hubs is limited by algorithms designed to maintain internal consistency and reduce ideological conflict. Supported by systems that encourage conformity and penalize deviation, these enclaves uphold their ideologies while prioritizing emotionally charged, divisive content to sustain engagement with the Grid.

[Image: binary code artwork evoking Merlyna Lim’s “algorithmic enclaves”]
“Coded” by Merlyna Lim (2018), part of the Hands: Medium & Massage series.

Life Inside the Grid

This pattern contradicts Benkler’s (2006) initial optimism that the Internet would become a public space where “everyone is free to observe, report, question, and debate” (p. 272). Instead, the Grid reinforces ideological division. Members of these communities interact largely with content that mirrors and amplifies their profiles, rarely encountering viewpoints different from their own. This dynamic, according to Rushkoff, is part of the internet’s “anti-human agenda,” in which developers replace real interaction with exploitation for engagement (David Pakman Show, 2019). Within the Grid, this logic prioritizes efficiency, behavioural control, and emotional regulation; unpredictability is treated as a problem to be fixed rather than a human characteristic.

Early promises of democracy have given way to automated participation. Analytics now process public data collected through the neural nodes and produce policy outcomes directly. Predictive governance and automated participation demonstrate how systems developed more quickly than social structures could respond. Everyone can still participate, but there is no longer actual choice, especially as hubs are the main mode of interaction. This contradicts Castells’ (2000) argument that networks inherently foster democratic adaptability (p. 695).

Economically, Fuchs’ (2022) critique of digital capitalism emphasizes how online attention has been transformed into a commodity. Productivity in the Grid is assessed in terms of interaction and attention rather than actual labour; each activity is tracked and exchanged in the Algorithmic Exchange, where attention serves as the main currency. The dominance of media companies and celebrities, as noted by Fuchs (2022, p. 286), shows that Internet platforms have consolidated control over visibility and discussion rather than creating a truly participatory culture. Emotional content is amplified for financial benefit, leading to a political economy of attention. In this way, users are like tenants, whose contributions keep the system going while value builds up around a few powerful hubs.

The Grid’s functionality shows that politics is built into the technology. Its rules act like a digital constitution, limiting what people can see, do, and say. Algorithms govern the flow of information, emotions, and interactions, replacing traditional leadership with code. Technology never evolves in a vacuum; it reflects the social, economic, and political beliefs of its creators. By 2075, these forces are fully established within the Grid. This future is not a random outcome but the result of decisions that prioritize control over freedom and efficiency over empathy. Whether it remains fiction depends on whether the politics embedded in our machines today are recognized, confronted, and dismantled.

References

Barreto, E. (2020, April 24). Lecture 3: Utopias and dystopias. YouTube. https://www.youtube.com/watch?v=q4BC95DGEYc

Benkler, Y. (2006). Political freedom part 2: Emergence of the networked public sphere. In The Wealth of Networks (pp. 212–272). Yale University Press.

Castells, M. (2000). Toward a sociology of the network society. Contemporary Sociology, 29(5), 693–699. https://doi.org/10.2307/2655234

David Pakman Show. (2019, May 10). Is our technology future utopian or dystopian? YouTube. https://www.youtube.com/watch?v=ZGPdkP4gdgs

Fisher, D. R., & Wright, L. M. (2001). On utopias and dystopias: Toward an understanding of the discourse surrounding the internet. Journal of Computer-Mediated Communication, 6(2). https://doi.org/10.1111/j.1083-6101.2001.tb00115.x

Fuchs, C. (2022). The structural transformation of the public sphere and alienation. In Digital Democracy and the Digital Public Sphere (pp. 271–301). Routledge.

Lim, M. (2025). Introduction. In Social Media and Politics in Southeast Asia (pp. 1–11). Cambridge University Press. https://doi.org/10.1017/9781108750745

Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136.

Author’s Bio

Erin Ferguson is a fourth-year Communication and Media Studies student at Carleton University. Her academic interests include media culture, digital communication, and critical approaches to understanding complex social and cultural issues. She is especially interested in research, writing, and problem-solving as tools for engaging with contemporary media environments.