Australia: banning social media for minors is hitting the wrong target
The social media ban for under-16s doesn’t solve the problem. It’s the proprietary architecture of the platforms that needs to be regulated.
Social media ban: the mirage of the simple solution
Australia has passed a law banning minors under 16 from social networks, effective December 10, 2025. Facebook, Instagram, TikTok, X, YouTube, Snapchat, Reddit, Kick: all off-limits. The fine for non-compliant platforms? Up to 49.5 million Australian dollars per violation.
A response that seems decisive. Unequivocal.
Yet it’s the wrong answer to the right question.
Concern for young people’s mental health is real. Documented. A 2022 survey by Headspace – Australia’s national youth mental health foundation – found that 57% of young Australians believe their mental health is worsening. 42% point the finger at social media, up from 37% in 2018.
The alarming data is there. But the alarm risks generating wrong reactions.
The South Korean precedent: when prohibition becomes evasion
History has already tried this. And it has already failed.
2011, South Korea: the “Shutdown Law” arrives, nicknamed the “Cinderella Law”. It prohibits minors under 16 from playing online games between midnight and 6 am. The goal? To prevent video game addiction and guarantee adequate hours of sleep.
Ten years later, in 2021, the law is repealed.
The balance sheet? A 2011-2012 study certified it: an effect that was statistically significant but practically irrelevant – an increase of barely 1.5 minutes of sleep. Young people bypassed the restrictions with embarrassing ease: accounts created with parents’ credentials, foreign servers, identity theft. Meanwhile, the ban crippled the local video game industry – Minecraft, because of its Xbox Live account integration, suddenly became an adults-only game in Korea – while mobile games, not covered by the law, exploded to 54% of the market by 2018.
The Korea Herald commented on the repeal without mercy: scientific evidence of the restriction’s effectiveness was lacking. The real cause of Korean students’ sleep deprivation? Academic pressure, not late-night gaming.
Social media ban: the failure of prohibitionism
Turning off social media is hitting the wrong target.
The problem isn’t interaction in digital environments. It’s the subjugation of the architecture of social and entertainment platforms to economic and proprietary interests.
Manuel Castells, in his “Communication Power” (2009), defines contemporary society as a “network society” – a social structure built around (but not determined by) digital communication networks. His central question: where does power reside in the network society?
The answer is clear: power resides in those who control the “programs” of networks and the “switches” between different networks.
Castells identifies two fundamental forms of power: “network-making power” – the ability to program and reprogram networks according to specific objectives – and “network power” – the power that derives from inclusion or exclusion from networks.
Social platforms embody exactly this dual form. They program the algorithms that determine what we see. They decide who can participate and under what conditions. Power doesn’t lie in technology itself, but in the proprietary control of the “programs of the networks”.
On users’ autonomy within those networks, Castells is explicit: “This autonomy is not given automatically, but must be conquered”.
Fausto Colombo, in his “Il potere socievole. Storia e critica dei social media” (2013), pushes the contradiction further. Social media are not neutral tools but carriers of “precise entrepreneurial philosophies” and of “market economic turns”. Behind their apparent openness and democracy hide “less obvious, and perhaps more obscure powers”.
Social surveillance is no longer just vertical. It has become “horizontal” or “lateral”: users themselves become active subjects of power through the “voluntary” surrender of personal data in exchange for services.
Jürgen Habermas fears that in the digital public sphere “the possibility of an inclusive, society-wide discourse – which, despite all the risks of domination and manipulation, still existed in the mass media public sphere – will be irretrievably lost”.
The data that unmask the ban’s ineffectiveness
Data collected by the Australian eSafety Commissioner in February 2025 reveal the extent of the problem – and the illusory nature of the proposed solution.
80% of Australian children aged 8 to 12 used one or more social media services in 2024, despite the existing age limits. That’s approximately 1.3 million children. The most popular platforms? YouTube (68%), TikTok (31%), Snapchat (19%). And 84% of the young users said their parents knew about their accounts.
Julie Inman Grant, eSafety Commissioner, admits it without mincing words: “Few platforms have truly rigorous measures to accurately determine age at sign-up, so nothing prevents a fourteen-year-old from entering a false birth date and creating an adult account without restrictions”.
The engineering of addiction: anatomy of harmful design
Proprietary platforms don’t just “host” content. They implement design mechanisms specifically engineered to create addiction.
The American Psychological Association documented it in a 2024 report. Mitch Prinstein, the APA’s Chief Science Officer: “Over half of adolescents report at least one symptom of clinical addiction to social media”.
Infinite scroll: Eliminates natural stopping points. The design exploits what psychologists call a “variable-ratio reinforcement schedule” – the same reinforcement pattern that makes slot machines so addictive (a minimal simulation follows this list of mechanisms).
Autoplay: On Instagram Stories and TikTok, the next video starts automatically. Removes the need for a conscious decision. Keeps users in a passive state of flow.
Personalized recommendation algorithms: Meta uses algorithms that serve content on a “variable reward schedule” – the same slot-machine strategy – triggering the release of dopamine, the neurotransmitter associated with the anticipation of reward.
Default push notifications: Enabled out of the box, they keep pulling users back in. A 2022 American Academy of Sleep Medicine survey found that 93% of Gen Z students delayed bedtime because of social media.
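To see why the “variable-ratio reinforcement schedule” is so effective, here is a minimal simulation – an illustrative sketch of the dynamic, not any platform’s actual code; the function name and the 25% payoff probability are assumptions chosen for the example. Each scroll has an unpredictable chance of delivering a “reward”, so the gap between rewards varies wildly – and that unpredictability is precisely what keeps people pulling the lever.

```python
import random

def variable_ratio_feed(scroll_count, reward_probability=0.25, seed=0):
    """Simulate scrolling a feed where rewarding posts arrive unpredictably.

    Hypothetical sketch: under a variable-ratio schedule, the number of
    actions between rewards varies around an average rather than being
    fixed -- the pattern most resistant to extinction in
    operant-conditioning research.
    """
    rng = random.Random(seed)
    scrolls_since_reward = 0
    gaps = []  # scrolls between consecutive rewards
    for _ in range(scroll_count):
        scrolls_since_reward += 1
        if rng.random() < reward_probability:  # unpredictable payoff
            gaps.append(scrolls_since_reward)
            scrolls_since_reward = 0
    return gaps

gaps = variable_ratio_feed(scroll_count=1000)
print(f"average scrolls per reward: {sum(gaps) / len(gaps):.1f}")
print(f"gap between rewards ranges from {min(gaps)} to {max(gaps)}")
```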
Frances Haugen, the Facebook whistleblower, revealed that Meta knew perfectly well that Instagram was “toxic for teenage girls”. She compared Instagram’s effect on adolescents to “cigarettes”: it creates addiction while damaging health.
A 2024 Harvard study: social platforms generate nearly 11 billion dollars a year in advertising revenue from ads targeted at users aged 0 to 17. Every incentive points toward keeping the harmful practices in place.
Toward real regulation: architecture before prohibition
The goal should have been to impose “public” rules on the management of these spaces. Not a ban on access, but a transformation of the logics that govern the platforms.
Ban or limit proven addictive features: New York’s Stop Addictive Feeds Exploitation (SAFE) for Kids Act requires parental consent before minors can receive notifications between midnight and 6 am. These interventions regulate infrastructure, not access.
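As an illustration of how such a rule targets infrastructure rather than access, here is a minimal sketch of a notification curfew check – the function and parameter names are hypothetical, and the midnight-to-6-am window with a parental-consent exception is assumed from the Act as described above:

```python
from datetime import time

CURFEW_START = time(0, 0)  # midnight
CURFEW_END = time(6, 0)    # 6 am

def may_push(local_time, is_minor, has_parental_consent):
    """Return True if a push notification may be delivered now.

    Illustrative reading of the rule: overnight notifications to
    minors require verifiable parental consent.
    """
    in_curfew = CURFEW_START <= local_time < CURFEW_END
    if is_minor and in_curfew:
        return has_parental_consent
    return True  # adults, and minors outside the window, are unaffected

# A minor at 2:30 am without parental consent: the notification is held.
print(may_push(time(2, 30), is_minor=True, has_parental_consent=False))  # False
```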
Mandate independent algorithmic audits: States should require social platforms to undergo algorithmic risk audits conducted by independent third parties and to publicly disclose the results.
Allow opt-out from the algorithm: Recent studies demonstrate that chronological “Following Only” feeds produce healthier consumption patterns than algorithmic feeds.
Limit targeted advertising and data tracking: The surveillance-advertising economic model – generating 11 billion dollars a year from minors – is the real problem, not users’ age.
Create public and decentralized alternatives: The Fediverse – the network of decentralized social platforms like Mastodon – operates without proprietary engagement-maximizing algorithms, without targeted advertising, with chronological feeds and community governance.
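The difference is visible at the API level. As a small illustration using Mastodon’s documented public-timeline endpoint (the instance URL is just an example), posts come back newest-first – there is no proprietary ranking layer to disable:

```python
import requests

INSTANCE = "https://mastodon.social"  # any instance with a public timeline

# Mastodon's REST API returns public statuses in reverse-chronological
# order, with no engagement-based re-ranking in between.
resp = requests.get(f"{INSTANCE}/api/v1/timelines/public", params={"limit": 5})
resp.raise_for_status()

for status in resp.json():
    print(status["created_at"], "@" + status["account"]["acct"])
```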
The evidence that regulation works
The European Digital Services Act: Article 38 of the DSA obliges very large platforms to offer at least one recommendation option not based on profiling – in practice, typically a chronological feed.
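In practice, compliance comes down to a switch between two ranking functions. A minimal sketch of the idea – in no way any platform’s real implementation; Post, predicted_engagement and rank_feed are hypothetical names for the example:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float             # unix seconds
    predicted_engagement: float  # output of a profiling model

def rank_feed(posts, profiling_enabled):
    """Toggle between a profiled ranking and a non-profiled feed."""
    if profiling_enabled:
        # Engagement-maximizing order, driven by a model of the user.
        return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)
    # The non-profiled option: plain reverse-chronological order.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

posts = [Post("a", 100.0, 0.9), Post("b", 300.0, 0.1), Post("c", 200.0, 0.5)]
print([p.author for p in rank_feed(posts, profiling_enabled=True)])   # ['a', 'c', 'b']
print([p.author for p in rank_feed(posts, profiling_enabled=False)])  # ['b', 'c', 'a']
```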
Evidence on the effectiveness of chronological feeds: A 2025 CESifo Working Paper analyzes the introduction of Instagram’s algorithmic feed in 2016. The result is unequivocal: the algorithm had a “negative impact on adolescents’ mental health”.
The evidence converges: when you regulate the proprietary architecture – algorithms, addictive design mechanisms, surveillance-based economic models – you get measurable results. When you simply ban access, young people find ways around the restrictions and migrate to even less regulated spaces.
The Australian ban for minors under 16 is yet another example of technological solutionism applied to a complex problem that demands an articulated response.
Social media ban: platform wars
The enemy is an economic model that monetizes attention, that turns personal data into a commodity, that sacrifices users’ wellbeing on the altar of engagement. It’s this system that needs to be regulated, not users’ age.
The alternative to prohibition isn’t digital laissez-faire. It’s imposing a public logic on the management of digital spaces: algorithmic transparency, limits on targeted advertising, platform liability for the harms caused, investment in critical literacy, the creation of quality public alternatives.
It’s building a neutral, open network, governed by democratic principles rather than by the logic of profit.
As long as we continue treating social media as the problem rather than as one of many symptoms of a digital architecture enslaved to private interests, we’ll keep shooting in the dark. And the youngest – who are natives, not guests, of this network – will continue paying the price of our myopia.
Social media ban – Sources
- Headspace National Youth Mental Health Survey 2022
- eSafety Commissioner – Australian Youth and Social Media Report 2025
- Korea Times – Shutdown Law Abolished After 10 Years
- Quarterly Journal of Economics – South Korea Gaming Regulation Study
- Manuel Castells – Communication Power (2009)
- Fausto Colombo – Il potere socievole. Storia e critica dei social media (2013)
- Jürgen Habermas – A New Structural Transformation of the Public Sphere (2022)
- Theory, Culture & Society – Digital Public Sphere Study (2022)
- Nature Human Behaviour – Meta-Analysis of Algorithmic Audits (2023)
- Frontiers in Psychology – TikTok Misogynistic Content Study (2025)
- CESifo Working Paper – Instagram Algorithm Impact on Mental Health (2025)
- American Psychological Association – Social Media and Youth Mental Health Advisory (2024)
- Meta Platforms Inc. – SEC Filings on Youth Advertising Revenue
- Frances Haugen Congressional Testimony – Facebook Whistleblower (2021)
- American Journal of Law & Medicine – Social Media Revenue from Minors Study (2024)
- QUT Digital Media Research Centre – Age Verification Technology Analysis (2025)
- European Union – Digital Services Act (DSA) Official Text
- Internet Policy Review – DSA Data Access for Researchers (2023)
- New York State Senate – SAFE for Kids Act (2024)
- California Attorney General – Age Appropriate Design Code
- ACM FAccT Conference – Software Design for Regulatory Compliance (2025)
- arXiv – Algorithmic Transparency Reports for Platforms (2025)
- Mastodon – Decentralized Social Network Model
- American Academy of Sleep Medicine – Gen Z Social Media Sleep Survey (2022)