THE POWER YOU CAN’T SEE: HOW DIGITAL DEPENDENCE WORKS

NETWORK POWER • SOCIABLE POWER • FLOW GOVERNANCE • INFRASTRUCTURE

Switching messaging platforms is never just a technical choice: the network lives where your people are, and exit becomes socially costly.

Anyone who has tried to switch messaging platforms knows the problem isn’t technical. Signal works just as well as WhatsApp. The problem is that your contacts are elsewhere, your work groups live elsewhere, your everyday conversations happen elsewhere. The choice of tool doesn’t depend on its features, but on where the people you want to talk to already are.

This dynamic has a precise name in the theory of digital power, but it’s not a single mechanism. Network power is the constraint produced by coordination on a standard: the more a network converges on one tool, the more rational it becomes to comply, because compatibility is where social and operational value accumulates. In network-society terms, that pressure is enforced through different layers at once: networking power (access and exclusion), network power (protocols and rules of compatibility), networked power (asymmetries inside the network—admins, gatekeepers, dominant nodes), and network-making power (the ability to program/reprogram networks and “switch” connections between them). What looks like preference is often the compounded effect of these layers.

The dependence it creates isn’t only infrastructural. It’s social. The point isn’t “which app is better,” but where relational life concentrates: conversations, groups, invitations, implicit confirmations, response times. A power that doesn’t order you to stay, but makes exit socially dysfunctional. You drop out of coordination: you miss threads, timing, and shared context. The network doesn’t shut the door: it lets you leave, but makes you pay in isolation, friction, marginality.

Digital dependence as a network effect: people, chats, and groups converge on a dominant standard that makes exit socially costly.
IMAGE 1 — THE CHOICE ISN’T TECHNICAL: IT’S WHERE THE NETWORK LIVES.

I — THE MECHANISM: WHY WE CONVERGE WITHOUT BEING FORCED

In 2008, legal philosopher David Singh Grewal published Network Power: The Social Dynamics of Globalization, systematically analyzing this dynamic. The starting question: why do certain standards spread globally even without direct imposition?

The answer overturns the common intuition about how power works. Explicit coercion becomes unnecessary when you can create dependence through the structure of the network itself. The key mechanism is what economists call “network externalities”: the value of a good increases with the number of people who use it. The telephone is the classic example. An isolated device is useless. When everyone has one, not having it means being unreachable. Every new user who joins increases the value for those already inside, attracting more users in a self-reinforcing cycle.
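The self-reinforcing cycle can be sketched as a toy simulation. This is illustrative only: the population size, contact-sample size, and majority threshold are arbitrary assumptions, not figures from Grewal.

```python
# Toy model of a network externality: each person adopts the platform
# that a majority of their sampled contacts already use. A small initial
# edge compounds into near-total convergence -- no one is forced.
# All numbers here are invented for illustration.
import random

random.seed(1)

N = 1000                                   # population size
on_a = set(random.sample(range(N), 550))   # platform A starts at 55%

def step(on_a):
    """One round: everyone re-evaluates based on where their contacts are."""
    adopters = set()
    for person in range(N):
        contacts = random.sample(range(N), 20)     # who they coordinate with
        if sum(c in on_a for c in contacts) > 10:  # majority already on A
            adopters.add(person)
    return adopters

for _ in range(10):
    on_a = step(on_a)

print(f"share on platform A after 10 rounds: {len(on_a) / N:.2f}")
```

Run the same sketch with an initial 45 percent and the dynamic reverses: the outcome tracks the starting coordination, not the quality of either platform.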

CONVERGENCE DOESN’T NEED A BAN: THE INCENTIVE STRUCTURE IS ENOUGH.

Grewal calls this mechanism “coordination power.” A choice becomes progressively inevitable when everyone else has already made it: the context shifts, alternatives degrade, convergence appears as natural rationality. Laws aren’t required. The incentive structure is enough.

This dynamic cuts across different phenomena: the spread of English as an international lingua franca, the dominance of Windows in the 1990s, the hegemony of contemporary digital platforms. The underlying logic is identical: whoever reaches critical mass first builds an advantage that competitors struggle to dislodge even when they offer technically superior solutions.

The telephone took decades to trigger this mechanism. Digital platforms compressed the same process into just a few years. WhatsApp has two billion users; Google handles over ninety percent of global searches. Numbers that don’t represent simple commercial success: they are coordination points where convergence becomes necessary to participate in conversations, transactions, and information exchanges that happen there—and only there.


II — NETWORK POWER: WHO GOVERNS NETWORKS? CASTELLS’ CONTRIBUTION

Sociologist Manuel Castells explains who governs that convergence.

Castells, author of the trilogy The Information Age and Communication Power (2009), starts from a basic observation: we live in a “network society,” a society organized through digital networks that mediate growing aspects of social, economic, cultural, and political life. But networks are not neutral grounds where the best simply emerges. Someone builds them, programs them, decides which functions to embed in the architecture.

To program a network is to organize collective attention. It means deciding what becomes visible and what stays in the shadows, what is frictionless and what meets resistance. When YouTube changes its recommendation algorithm, it is choosing which content to amplify, which creators to reward, which videos to circulate. When Instagram privileges Reels over photos, it steers the cultural production of millions of people. The platform doesn’t dictate what to publish; but those who ignore algorithmic preferences progressively lose relevance. In the digital world, where attention is scarce, irrelevance equals disappearance.

Castells introduces the concept of “switching power”: the ability to activate, deactivate, and reroute flows inside the network. Google determines the hierarchy of search results. Facebook filters which posts reach the feed. Apple decides which apps can exist on iOS. This governance of flows operates through interfaces that look technical and functional, but the consequences are political: they determine who gets visibility, access, opportunity.

POWER DOESN’T ORDER YOU TO STAY — IT MAKES EXIT IMPRACTICAL.

The difference from traditional power lies in how it operates. The state works through laws, sanctions, and visible coercive apparatuses. Network power works through control of standards and platforms that make certain choices too costly to sustain. Alternatives remain formally available, but practically unusable. Power is not exercised on the network from the outside: it is built into the architecture, into operating rules that appear as technical constraints rather than contestable decisions.

Identifying it, contesting it, resisting it becomes difficult precisely because it has become the environment we operate in.


III — FOUR DIMENSIONS OF NETWORK POWER

The Grewal–Castells framework makes it possible to map how network power operates in contemporary digital life. It appears through four overlapping and reinforcing dimensions: identity, distribution, visibility, infrastructure.

Identity. “Sign in with Google” across dozens of services: one password instead of many, one click instead of repetitive forms. Convenience creates dependence that only becomes visible when something breaks. If an account is shut down—often by automated systems, under opaque criteria, with ineffective appeal procedures—you lose access to everything tied to that identity: email, storage, authentication for external services. Google and Apple have become guarantors of online identity, necessary intermediaries between people and the digital ecosystem. A corporation unilaterally deciding who retains their digital existence. Switching power in its purest form.

Distribution. On iPhone every app goes through the App Store, where Apple decides what can be distributed and takes thirty percent on every purchase. The Epic Games case (2020–2021) shows how difficult it is to challenge this control even with significant resources. Android theoretically allows sideloading, but over ninety percent of users rely exclusively on Google Play. The default path—integrated, secure, with automatic updates—becomes the only path actually used. Coordination power at work: the choice remains formally free, but the incentive structure makes it obligatory.

Visibility. Google Search handles over ninety percent of global searches. Three quarters of users never go past the first results page; the top three links capture sixty percent of clicks. Appearing—or not—on that first page determines public existence online. The ranking algorithm includes hundreds of factors that no one outside the company fully understands. In the digital world, algorithmic visibility shapes perceived reality.

Infrastructure. Amazon Web Services, Microsoft Azure, and Google Cloud together hold over sixty-five percent of the global cloud market. They provide the computational backbone of much of the internet. Thousands of organizations depend on these providers; switching takes months of work and carries significant operational risk. When AWS terminated Parler’s hosting in January 2021, the social network stayed offline for weeks. The deepest layer of network power: control over the material conditions that make any digital activity possible.

These dimensions intertwine, amplify one another, and create stratified dependencies.

Layers of network power: login, app stores, ranking, and cloud as coordination infrastructures that determine access and visibility.
IMAGE 2 — POWER ISN’T A “BAN”: IT’S A STACK OF LAYERS THAT MAKES EXIT IMPRACTICAL.

IV — NETWORK POWER: INFORMATION ECOSYSTEMS

The phenomenon scales differently when multiple services from the same company integrate with one another. Dependencies don’t add up: they multiply.

Meta controls Facebook, Instagram, WhatsApp, Messenger. Apps presented as distinct, but with growing technical integration: a single account, automatic cross-posting, converging messaging. Exiting one platform compromises the others, disperses audiences built over years, and abandons configured campaigns.

Google offers an even more pervasive integration: Search, YouTube, Gmail, Maps, Chrome, Android, Workspace, Cloud. A single account spanning search, video, communication, navigation, browser, operating system, productivity, server infrastructure. Synchronized preferences, accumulated data, habits sedimented over years. Losing that account means simultaneously losing correspondence, documents, photos, a history that functions as external memory, and authentication for third-party services.

DEPENDENCIES DON’T ADD UP — THEY MULTIPLY.

YouTube embodies network power on three levels at once. Creators can’t migrate their audience: followers don’t transfer, monetization is platform-bound. Viewers can’t easily give up the world’s largest video catalog. Advertisers can’t ignore where the public actually watches video.

These integrations are experienced as convenience—and they are. But every added component increases the cost of separation. Switching apps becomes rebuilding workflows, identities, cognitive habits.
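One way to see why integration multiplies rather than adds: count not just the services you must migrate, but the pairwise integrations you must unwind. A back-of-the-envelope sketch, with cost units invented purely for illustration:

```python
# Rough sketch: exit cost grows superlinearly with bundled services,
# because every pair of integrated services (shared login, synced data,
# cross-posting) adds its own unwinding work. Cost units are invented.
from math import comb

def exit_cost(n_services, per_service=1.0, per_integration=0.5):
    # n standalone migrations + one unwind per integrated pair
    return n_services * per_service + comb(n_services, 2) * per_integration

for n in (1, 2, 4, 8):
    print(f"{n} bundled services -> exit cost {exit_cost(n):.1f}")
```

Under these assumptions, eight bundled services cost not eight times but twenty-two times the single-service exit: the pairwise term dominates.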


V — THE FRAGMENTATION OF PERCEPTION

The filter bubble Eli Pariser described in 2011 was only the beginning. TikTok and X represent an evolution toward more sophisticated forms of fragmentation.

TikTok built something different from earlier platforms. Its For You algorithm doesn’t primarily show content from people you follow; it shows content the system predicts will hold attention the longest. Learning happens quickly—what gets watched to the end, what gets skipped, what triggers interaction. A few sessions are enough to build a precise behavioral profile.

The result is engagement-effective personalization that pushes users into increasingly narrow niches. Someone interested in a topic is exposed to progressively more intense versions, because emotionally charged content holds attention longer. The system doesn’t pursue an ideological agenda; it nonetheless produces a progressive specialization that can take on traits of radicalization. This applies to any area: fitness, finance, relationships, diets, spirituality.
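The narrowing dynamic can be caricatured in a few lines. This is not TikTok’s actual system: it is a generic epsilon-greedy loop with invented topics and watch times, showing how pure engagement-maximization concentrates exposure onto a niche.

```python
# Caricature of engagement-driven recommendation (NOT TikTok's algorithm):
# an epsilon-greedy loop serves whichever topic has the highest average
# watch time so far. Exposure collapses onto whatever holds attention.
import random
from collections import Counter

random.seed(0)

TOPICS = ["fitness", "finance", "cooking", "travel", "music"]

def watch_time(topic):
    # hidden preference (invented): this user lingers on finance clips
    return random.random() + (0.5 if topic == "finance" else 0.0)

totals = {t: 0.0 for t in TOPICS}
counts = {t: 0 for t in TOPICS}
served = []

for _ in range(300):
    if random.random() < 0.1 or 0 in counts.values():
        topic = random.choice(TOPICS)  # occasional exploration
    else:
        # exploit: serve the topic with the best engagement estimate
        topic = max(TOPICS, key=lambda t: totals[t] / counts[t])
    totals[topic] += watch_time(topic)
    counts[topic] += 1
    served.append(topic)

# the tail of the feed is dominated by a single topic
print(Counter(served[-60:]))
```

The loop pursues no agenda beyond predicted watch time, yet the last stretch of the feed is overwhelmingly one topic: specialization as a by-product of optimization.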

Facebook still provides a minimum anchor to pre-existing networks: you see what people you know share. On TikTok, social mediation disappears: the algorithm becomes the only curator.

X under Musk introduced different dynamics with convergent effects. Verification, converted into a paid subscription, grants algorithmic priority: visibility becomes purchasable regardless of quality. Engagement monetization rewards content that provokes strong reactions, including negative ones. Anger becomes an economic strategy.

The algorithm tends to show each group the most irritating versions of others’ positions, because irritation drives engagement. Different groups see distorted representations of one another. Genuine encounter between divergent views is replaced by mediated exposure to the least defensible version of “the other.”

The shared informational space dissolves. In its place: algorithmic bubbles that display different realities based on prior behavior and pre-existing affiliations.


VI — CULTURE AS INFRASTRUCTURE: AND HOW TO ORIENT YOURSELF

Netflix and Amazon Prime Video together reach hundreds of millions of users, investing tens of billions per year in production. They are among the world’s largest cultural producers. Concentration produces concrete effects on what gets created, distributed, and kept accessible.

Production is commissioned based on algorithmic forecasts: which content will generate retention and new subscribers. This favors consolidated formulas and genres with measurable track records. Experimentation becomes harder: it requires investment without data to support success probability. An economic logic that systematically leans toward what predictive models deem “safe.”

Original content remains exclusive platform property. It doesn’t migrate elsewhere, doesn’t enter syndication, doesn’t become available as a single purchase. To watch later seasons, the subscription must stay active—turning an optional choice into a continuing necessity.

Discovery happens primarily through algorithmic recommendations: around eighty percent of what gets watched on Netflix comes from system suggestions. Two users see different homepages. Netflix continuously tests different cover images, showing each user the thumbnail the algorithm predicts will maximize clicks. The same film is presented in radically different ways to different users.

The shared schedule disappears. Cultural experience fragments into personalized paths that reinforce existing tastes.

Shows get canceled according to internal metrics not shared with creators. Unlike traditional TV, where a show could find a new home, streaming content remains contractually tied to the platform. Narratives get cut off, stories remain incomplete. Creators can’t take their work elsewhere, and viewers can’t follow it.

Recognizing these dynamics doesn’t mean rejecting the technologies that embed them. It means developing awareness of how the digital environment shapes everyday possibilities.

Immediate convenience has a counterpart in dependence that accumulates over time. Free services are sustained through data extraction and attention capture, generating lock-in that makes moving costly. Every design choice—unified authentication, recommendation algorithms, app store policies, cancellation systems—encodes a distribution of power that favors whoever controls the infrastructure.

Responses depend on circumstances: diversifying the platforms where you build presence, keeping independent backups, supporting regulatory initiatives for interoperability and portability, exploring open alternatives where feasible.

What’s at stake is whether digital spaces remain environments where power can be contested and alternatives can emerge, or whether they consolidate as territories governed by a few actors controlling the mandatory passage points. A political—not technological—question: it concerns the distribution of power, the governance of infrastructures, the rights of those who use them.

UNDERSTANDING NETWORK POWER IS THE FIRST STEP. WITHOUT IT, WE REMAIN GOVERNED BY INVISIBLE ARCHITECTURES, TOWARD OUTCOMES WE DID NOT CHOOSE.

Post scriptum

Media sociologist Fausto Colombo calls this apparatus “sociable power”: a power that hides behind ease of use and apparent equality, producing soft and continuous constraints.

Colombo warns that it will be difficult to navigate freely online without recognizing “the subtle web of its sociable power.” Network power is not only convergence toward technological standards, but a regime of social coordination that turns exit into friction. It operates through interface defaults, ranking logics, and governance decisions—then stabilizes through peer expectations (presence, reputation, responsiveness). Dependence becomes structural: technology plus relationships plus habits.
