100 days of Australia’s social media age restrictions: lessons for the next phase of policy

Australia’s under-16 social media ban is a starting point, not an end point. The first 100 days offer early lessons on what will matter most if the policy is to work for young people.

20 March 2026

It’s been 100 days since Australia implemented world-first legislation restricting under-16s from holding accounts on major social media platforms. Since its launch, millions of accounts have been deactivated, and Australia has been positioned as a global reference point for digital age regulation.

However, early implementation reveals significant behavioural adaptation by young users, raising critical questions about whether the ban alone can achieve its intended outcome.

To realistically achieve its goals, the next phase of policy must move beyond simply counting account closures. It must instead focus on whether the ban improves young people’s safety and wellbeing, supports privacy-preserving enforcement and addresses the social drivers of young people’s online engagement.

Accounts closed

The social media ban prohibits under-16s from holding accounts on platforms whose primary function is online social interaction, including Instagram, Snapchat, TikTok, X, YouTube and Reddit.

It was driven by growing evidence linking heavy adolescent social media use with mental health risks, exposure to harmful content and addictive platform design.

In the first 30 days, 4.7 million accounts were deactivated: Snapchat reported over 400,000 account removals, TikTok 200,000 and Meta 550,000. Australia has approximately 2.5 million 8–16-year-olds, many of whom held multiple social media accounts. Without baseline data on the number of under-16 accounts before the ban, or on how many have been created since, the actual reduction in participation remains unclear.

This uncertainty is compounded by media reports and anecdotal accounts from parents, educators and researchers suggesting widespread workarounds: young people sharing family accounts, misrepresenting their age and migrating to platforms outside the legislation’s scope.

However, one clear early outcome is that Australia has become a live policy laboratory for governments around the world exploring children’s access to social media. Indonesia, Spain and Denmark are now exploring similar models. Australia has made its mark.

What the first 100 days suggest

Though the policy has been framed by government as reshaping digital environments, early implementation suggests that regulating online behaviour is fundamentally distinct from regulating physical spaces.

Challenge 1: The internet has no front door

Unlike alcohol sales, which can be restricted at a physical point of entry, the internet is a distributed digital ecosystem with no fixed boundaries, where users move easily between platforms. Restrictions in one place are often bypassed through alternative pathways.

International experience demonstrates that when access is restricted, user behaviour often adapts rather than disappears. In China, gaming restrictions for minors were circumvented through adult accounts. In the UK, the introduction of age verification checks saw the use of VPNs surge to bypass them.

Effective digital regulation must therefore anticipate behavioural adaptation and include mechanisms to monitor, evaluate and respond to it.

Challenge 2: It’s a social, not just technical, problem

The current policy treats social media use as a technical problem requiring technical solutions, such as age verification and account controls.

However, young people’s use of social media is not simply technical. It is deeply social and cultural, a space where identity, communication, peer relationships and belonging are formed.

Restricting access does not address the underlying social motivations driving use.

Effective responses to social media harms therefore require broader social strategies alongside regulation, including youth engagement, digital education and family support.

Challenge 3: What workarounds teach young people about rules

Young people develop their understanding of rules, responsibility and authority through everyday experiences.

If children learn that online restrictions can be easily bypassed through misrepresenting their age or using alternative platforms, they may begin to see digital rules as flexible or optional.

This thinking can extend to future policy changes in technology safety and beyond, which means consideration must be given not only to whether regulations are technically enforceable, but also to the behavioural messages they send about rules and responsibility.

Challenge 4: The ban vs parental agency

A blanket ban removes parental agency in deciding whether their child may use social media before the age of 16. For some families, this creates tension between government regulation and parental responsibility.

Online safety is shaped by regulation but also by family values, cultural expectations and parenting approaches. Families navigate children’s technology use depending on their circumstances, raising an important governance question about how regulation can support child safety while maintaining trust and cooperation with parents who guide children’s online behaviour.

Challenge 5: Privacy vs protection

Enforcing age restrictions often requires users to upload government identification or biometric data to digital platforms. This creates tension between improving safety online and protecting citizens’ privacy and cybersecurity.

It also runs counter to long-standing cyber safety lessons that adults and children have learnt about not sharing personal details and identification online.

To maintain public trust, age assurance systems must verify age while minimising the collection and storage of sensitive personal data.

Next phase: reframing success

Success remains undefined. Account closures are not the goal; they are a limited enforcement metric that, at best, buys time.

Adolescence is short and foundational, and we do not have years to waste while a generation learns that digital rules are optional. What’s needed now is precise, informed action.

Young people will inherit a digital world we cannot yet imagine. They will need to navigate its challenges and seize its opportunities. The question is not whether we can keep them out, but whether policy equips young people to live well within it. Therefore the goal needs to shift from restriction to readiness.

I propose five long-term indicators that would meaningfully assess whether the policy is achieving its intended purpose:

  1. Measuring digital displacement, not just the platforms covered by the restriction: Are former users now on less regulated, riskier platforms? Are they hidden within family accounts or anonymous apps with fewer safeguards? If workarounds push young people into invisible, unregulated spaces, the policy has failed its core purpose. This will require ongoing monitoring and reporting led by the eSafety Commissioner, supported by independent research.
  2. Social media literacy, not just risk awareness: Embedding social media literacy into education would improve knowledge about platform design, data privacy and the difference between healthy and compulsive use. This is not about teaching fear, but about developing young people’s agency so they can make independent and well-informed choices. This sits primarily with state education departments and schools, supported by the Australian Communications and Media Authority’s public awareness initiatives.
  3. Family-guided engagement, restored agency: Parents need support, not just regulation. Success means families feel equipped to have ongoing conversations about online activity, to model healthy use and to guide their children through digital dilemmas. This restores parental agency rather than removing it. Delivering this will require coordinated action across education and health sectors, alongside the eSafety Commissioner, to develop national evidence-based guidance and community-led delivery.
  4. Privacy-preserving age assurance: The tension between protection and privacy must be resolved. Success means developing and mandating age verification systems that confirm age without collecting or storing sensitive personal data or biometric information. Public trust depends on this. This will require leadership from the Department of Infrastructure, Transport, Regional Development, Communications, Sport and the Arts, working with regulators and industry.
  5. Platform accountability for healthy design: Ultimately, the burden should not fall solely on children and families. Success means platforms are incentivised, through regulation and public pressure, to design environments that prioritise wellbeing over engagement. Age restrictions are one tool, but they are meaningless if the underlying design remains harmful. This will require coordinated action with regulators, including the eSafety Commissioner, the Australian Communications and Media Authority, and the Office of the Australian Information Commissioner, alongside industry.

Restriction is a starting point, not an end point. But if this legislation fails to account for the complexity of young people’s social media use, then we are simply buying time.

Worse, we risk teaching an entire generation that legislation can be bypassed and that government policy lacks credibility. That would be actively harmful.

If we use the findings from the first 100 days to build literacy, family guidance and healthier platforms, we will give young people the ability to live well in a digital world.

The next phase of policy must therefore begin with a clear goal: that success is measured not by what we take away, but by what we build.

 

Dr Jo Orlando is an Associate Professor at Western Sydney University whose work focuses on AI and digital wellbeing, with a particular emphasis on education, childhood and family life. She holds a PhD from the University of Technology Sydney, where her research examined children, technology, social change and education.

She is currently an APPI Policy Fellow, where she is developing the Panoramic EdTech Policy Framework, which examines how education technology governance can better address emerging and often competing policy challenges.

Image credit: Canva
