Four Months Into Australia’s Social Media Ban: What We’re Learning

“I love the Australia ban,” a parent said to me recently. “I wish we would do something like that in our country.” 

“Say more,” I responded with curiosity. 

“Honestly, it just feels good to see someone actually do something. I’m not sure if it will work, but it just feels good to take action.” 

The Relief of Taking Action

I hear this kind of reaction from parents all over the country. Doing something feels a lot better than doing nothing. Parents are inundated with news headlines about technology’s harm, frustrated by a maze of parental controls, and grappling with screen-related challenges at home. Awareness of a problem without collective response strategies is not our optimal state of being. 

As someone who spends a lot of time digging into the data (and as a parent of two adolescents), I’m not as certain that something is always better than nothing. Yet in the absence of meaningful legislation holding tech platforms accountable here in the U.S., the Australia approach certainly scratches an understandable itch among many caregivers.

I wrote before the ban about the case for and against age restrictions. Here we are four months into Australia’s experiment. This is a time to resist early assessments of whether a policy “succeeded” or “failed.” Any quick judgment is likely more about our own biases than the messy data and insights that emerge from complex systems over time. 

This is, however, an opportunity to pay very close attention. For many, the ban felt good to pass. Many other countries are following suit. But what’s working so far? What’s not? What are the emerging benefits? What are the seen and unforeseen consequences? 

How’s It Going So Far?

News outlets have been talking to Australian youth on the ground, and their experiences are mixed. One young person told the BBC, “I feel free” from the pressures of Snapchat streaks, while another told the Guardian, “Getting around the ban was so much easier than expected.” Some young people acknowledge the issues with social media but resent the ban. As one teen told Teen Vogue, “Banning the apps feels like avoiding the real problem instead of helping us deal with it.”

Now Australia’s eSafety Commissioner has released an official report, Social Media Minimum Age: Compliance Update, that gives us some very early clues. What did they find?

  • There has been a reduction in the proportion of children ages 8-15 having accounts across ten age-restricted platforms. 
  • This reduction was mainly driven by platforms deactivating accounts for children. 
  • Despite overall reductions, a substantial proportion of children under 16 still have accounts. 
  • Of the parents who reported that their child had an account prior to the ban, around 70% reported that their child still had an account on Facebook, Instagram, Snapchat, or TikTok. 
  • Children aged under 16 continue to report harm to eSafety and there has not been a discernible drop in overall numbers.
  • The government has not observed a notable change in the number of cyberbullying and image-based abuse complaints involving age-restricted platforms in January and February 2026 as compared to the same period last year.

Pointing at Platforms

This early compliance data shows that tech platforms have not done the best job implementing the new law. The report outlines several poor practices, including relying on self-declared age, allowing children to repeatedly attempt age assurance methods until they gain access, and unreliable mechanisms for reporting and following up on reports of underage accounts. 

It makes sense that a relatively new set of age verification technologies haven’t been immediately successful at this scale. With legislation and regulation spreading quickly across the world, more reliable age assurance methods may improve over time. Australia is also investigating tech platforms for failing to comply with the new law and committing more resources to enforcement.

Yet in reading through this report, there is another storyline that sits right below the compliance data: a reminder of the limits of simple solutions to complex problems. 

Predictable Concerns?

While bold action might feel compelling, advocates across the world didn’t greet the Australia approach with universal enthusiasm last December. Many child-safety advocates urged caution prior to the ban. They pointed to evidence of young people’s mixed experiences online, the likelihood of workarounds, and the possibility that restrictions on major platforms might just push youth toward other, less regulated platforms. Under the policy, young users without accounts can also still access harmful online content without logging in. 

All of this becomes especially concerning when you take into account that age restrictions alone don’t push tech firms to invest in child safety, content moderation, or to consider youth mental health in the design of their products. Indeed, if the companies can argue that, on paper, children under sixteen are no longer their responsibility, the pressure to design for user wellbeing in general may diminish.

Youth Mental Health Is Not Just a Compliance Problem

For anyone who pinned a lot of hope on a simple and fast solution, this initial data is discouraging. On one hand, it’s only been a few months and moving the needle will take time. There is more that we don’t know yet about the impact of this policy than we do know.

On the other hand, if the ultimate goal is healthy development in the digital age, compliance with and enforcement of social media minimum age restrictions are unlikely to get us there. Even Australia knows this. They’re currently developing a digital duty of care, an acknowledgment that keeping kids off certain platforms isn’t the same as keeping kids safe online. Safety also isn’t the same thing as thriving. 

The architecture of children’s wellbeing online can’t be built on a single policy lever, especially one that focuses only on keeping children away from harm rather than also addressing prevention or preparation. Our nation’s pediatricians are pointing in the same direction. They recently released a new policy statement on kids and screens. The title? Digital Ecosystems, Children and Adolescents.

The ecosystem approach is a powerful reminder that kids’ outcomes won’t be shaped by bans alone. A system-wide approach to supporting healthy development in the digital age means asking questions like: 

  • How do we take into account the evidence showing that different children experience both benefits and harms online?
  • How do we invest in broad systems of support for children and families? Investing in libraries, community centers, and paid parental leave may impact “screen time issues” as much as tech policy. 
  • How do we center children’s healthy development at the start of the tech design process rather than as a legislative or litigative afterthought?
  • How do we shift responsibility for healthy development upstream?

The questions worth asking go well beyond how to keep children out of a small subset of platforms until a certain age. 

What are we willing to create for and with kids?

Mister Rogers went into television not because he loved it, but because he didn’t like what he saw in commercial media. The show he created was a master class in quality programming. But it wasn’t because he was a master of technology. It was because he understood children. He said, “Please think of the children first. If you ever have anything to do with their entertainment, their food, their toys, their custody, their day or night care, their healthcare, their education – listen to the children, learn about them, learn from them. Think of the children first.” 

We are only four months into experiments with social media age restrictions, and it’s too early to declare victory or defeat in some imagined battle. But in our desire to do something about adolescent mental health, let’s not put all our chips on simple solutions or ramped up enforcement and compliance. This is a time to broaden our view and ask bigger questions. What are we willing to build for and with children and youth? Are we willing to think of them first? We all benefit from the kinds of ecosystems that follow.