The Australian eSafety Commissioner has put Facebook, Instagram, Snapchat, TikTok and YouTube officially on notice.

Three months.

That is all it has taken for the Australian eSafety Commissioner, Julie Inman Grant, and her team to do what most governments have spent years talking about and never had the courage to actually pursue: to place some of the most powerful companies on earth on notice for something measurable, provable, and for too long deliberately ignored.

The evidence is being gathered. The cases are being built. Penalties of up to $49.5 million are on the table, and the Commissioner has made clear that non-compliance will follow these platforms into every boardroom, every government meeting, and every market they operate in globally. 

This is what it looks like when a regulator refuses to blink. 

This reform, in her own words, is “unwinding twenty years of entrenched social media practices”, and that does not happen in a news cycle. It does not happen in a petition, a pile-on, or a LinkedIn post with twenty people tagged at the bottom. It happens the way all durable, generational change happens: slowly, deliberately, and with enough evidence that no one can argue their way out of it. We should all be watching. We should all be grateful. And we should all be very, very clear about what has just begun.

On the other side of the line drawn by the Australian Social Media Minimum Age Law on December 10th 2025, statements are no doubt being drafted, talking points are being polished, and a pivot is being rehearsed. If you have been watching this industry for more than five minutes, you already know exactly how this goes. It wasn't us. It was the app stores. It was the devices. It was the parents. It was, honestly, a bit of a grey area when you really think about it. Have you considered that perhaps the real issue here is the definition of "reasonable steps"? Our team of forty-seven lawyers certainly has.

What you will not see, at least not while there is a quarterly earnings report to protect, is anyone standing up and saying: we had the technology, we had the resources, and we made a choice. Instead, you will see a masterclass in the slow, expensive, choreographed art of not quite admitting anything while also not quite denying it. Responsibility and accountability will be reframed, and the centre of gravity will shift just enough, just far enough, that by the time anyone works out who owns the failure, everyone has quietly left the building. When you read the compliance report carefully, the pattern becomes impossible to miss. This is not a failure of technology. It is a failure of will.

For those of us who have spent years in this space, who have sat in the rooms and read the research and watched the decisions get made in slow motion, that pattern is not a surprise. We always knew, and we also knew that Julie Inman Grant and her team needed the evidence to be undeniable enough that no one could look away. Now they have it.

The report documents that platforms prompted children to attempt age verification even after those children had already declared themselves underage. Yes, that is happening. Children are telling these platforms how old they are, and the platforms, knowing this and knowing the Australian law, are offering them another go. That is not a glitch. That is not an algorithm misfiring in some distant server room. This is deliberate, designed, and monetised. It is the digital equivalent of a bouncer leaning down to a fourteen-year-old outside a pub and whispering: try again through that other door, maybe this time you'll get in. The only difference is that the bouncer would lose their job. The report also documents that children could attempt the same age verification process over and over until the system yielded. Not once. Not twice. As many times as it took. This is persistence rewarded, a door that was built to open.

And then there are the parents. The people who are actually trying to do something are being met with walls like this: Want to report your child's account? Fill out this form. Then this one. Provide a legal letter. Wait. Receive an automated response that says nothing and does nothing. If you have ever tried to have a child removed from one of these platforms you know exactly how this feels. It feels like the whole system was designed to make you give up. Because it was.

We have struggled to say out loud for almost two decades that children are not just users in these systems. They are future revenue streams. The earlier a child enters the ecosystem, the deeper the behavioural patterns take hold. More time spent means more data collected. More data means more precise targeting. More precise targeting means a more valuable user. It is an elegant, efficient, and until now largely unchallenged loop. So when we see evidence that children are being guided through age gates rather than stopped by them, we are not looking at oversight. We are looking at a system behaving exactly as it was built to behave.

The Commissioner has been unambiguous about what comes next. The platforms have the capability to comply. They simply do not have the will. And this, she notes, is straight out of the playbook: the same one automakers ran in the seventies when seatbelts threatened their margins, the same one Big Tobacco ran for decades while quietly building its next generation of customers. The industry will push back. It will argue that app stores should verify age, that devices should carry the burden, that the policy is flawed. But the Commissioner has drawn a line that is simple, profound, and world leading. If you profit from the environment, you own the door. We do not blame the brewery for what happens inside the bar. We hold the bar accountable.

For those demanding instant results, the report offers something more valuable than a headline. It offers a reality check. The New Mexico Attorney General's case against Meta took three years from investigation to verdict. Three years of quiet, methodical, unglamorous work. The kind that does not trend. The kind that wins.


Parents who have felt powerless now have something they did not have before. A law that backs their instincts. A regulator willing to act. A narrative that finally shifts the burden back to where it belongs. Educators who have watched children arrive exhausted, distracted, and emotionally frayed now have a framework that recognises those impacts as systemic: not incidental, not a parenting failure, not a school problem.

The platforms are being asked, for the first time, to prove not what they say but what they do. There will be pushback. There will be carefully worded statements about innovation and user choice and unintended consequences. There always are. But underneath all of that noise sits a truth that is no longer deniable.

A feedback and submissions form, including a full description of what the eSafety Office can and cannot do, can be found here: Social Media Minimum Age form | eSafety Commissioner

