The Australian Minimum Age Regulatory Framework Decoded

The Social Media Minimum Age (SMMA) legislation was passed in 2024. Until last week, the details of how it would work were unclear. With the release of the regulatory framework, the expectations are now visible and unambiguous.

Here’s what it means in practice:

  • If a child under 16 is using an “age-restricted” social media platform, that platform must remove their account.

  • If a child tries to create an account, the platform must stop them before they get through the sign-up process.

  • If a removed user tries to create a new account, the platform must detect and block that too.
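
To make the shape of these obligations concrete, here is a minimal sketch of the three duties as a single enforcement gate. Everything in it (the names, the types, the return values) is hypothetical; the framework specifies outcomes, not implementations.

    from dataclasses import dataclass

    @dataclass
    class SignupRequest:
        user_id: str
        estimated_age: int        # produced by an age assurance system
        previously_removed: bool  # matched against removed-account signals

    def decide_signup(request: SignupRequest) -> str:
        """Return the action required for a sign-up attempt."""
        if request.previously_removed:
            return "block"   # re-registration must be detected and stopped
        if request.estimated_age < 16:
            return "block"   # under-16s stopped before sign-up completes
        return "allow"

    def decide_existing_account(estimated_age: int) -> str:
        """Accounts already held by under-16s must be removed."""
        return "remove" if estimated_age < 16 else "retain"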

This applies to children who are ordinarily resident in Australia. Not tourists. Not short-term visitors. This is about kids who live here.

December 10th is not a finish line. It’s the start.


For Schools, Parents, and Communities

The release of this framework means schools and families need to prepare. Thousands of children under 16 will lose access to their social media accounts in December. Some will be relieved. Some will be angry. Many will be caught off guard.

The questions are inevitable:

  • What if they’re nearly 16?

  • What if their whole friendship group is online?

  • What if this is the only place they feel seen?

These questions matter. And they need to be met with respect, not dismissal.

But the line is firm. Social media accounts are not a right. They are environments designed with adult users in mind. The new law simply draws the line where developmental science already has. To download our free guide on how to prepare, click here.


A Systems Change, Not a Gesture

This framework forces the tech industry to move from rhetoric to infrastructure.

Platforms will have to:

  • Build real detection systems

  • Prove they’re effective

  • Respect privacy in the process

  • Continuously adapt as technology and risks evolve

The message is simple: if you want to operate in Australia, you must protect Australian children. Not through promises. Through enforceable, auditable systems.
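
What “prove they’re effective” could look like in practice is measurable error rates over audited samples. The sketch below is one plausible shape for such a report; the framework does not prescribe these metrics, and every name here is invented for illustration.

    # Hypothetical audit over a sample of accounts whose true age was
    # verified out-of-band. Each record: (true_age, system_allowed_account).

    def audit_metrics(sample: list[tuple[int, bool]]) -> dict[str, float]:
        under_16 = [allowed for age, allowed in sample if age < 16]
        over_16 = [allowed for age, allowed in sample if age >= 16]
        return {
            # under-16s wrongly let through: the regulator's headline number
            "under16_leakage_rate":
                sum(under_16) / len(under_16) if under_16 else 0.0,
            # eligible users wrongly blocked: the cost of over-enforcement
            "over16_block_rate":
                sum(not a for a in over_16) / len(over_16) if over_16 else 0.0,
        }

A platform that can produce numbers like these, and show them improving over time, is in a far stronger position than one offering assurances.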

What Counts as an “Age-Restricted Social Media Platform”?

Any platform where:

  • Social interaction is a key purpose

  • Users can follow, friend, tag, message or comment on each other

  • Users can post content (videos, photos, stories, etc.)

This covers the platforms you’re thinking of (TikTok, Instagram, Snapchat), but also any new or emerging service that fits the same risk profile.
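
Read as a test, the definition above is a conjunction: a service is captured when all three features are present. A minimal sketch of that reading follows; the function and parameter names are ours, not the legislation’s.

    def is_age_restricted_platform(social_interaction_is_key_purpose: bool,
                                   users_can_link_and_interact: bool,
                                   users_can_post_content: bool) -> bool:
        # All three criteria from the definition above must hold.
        return (social_interaction_is_key_purpose
                and users_can_link_and_interact
                and users_can_post_content)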

How Will Platforms Know Who’s Under 16?

The framework makes it clear: platforms must move beyond self-declaration and install age assurance systems, tools that estimate, verify, or infer a person’s age.

These systems can include:

  • Facial age estimation (AI analysis of a face image to estimate age)

  • Behavioural inference (e.g. looking at content patterns, language, login times)

  • Date of birth checks against verifiable data (e.g. from a device or app store)

  • Multi-step ‘successive validation’: starting with light checks and escalating only where needed (sketched in the example below)
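
Of these, ‘successive validation’ is the most implementation-shaped idea, so here is a hedged sketch of an escalating pipeline. The function names, the confidence threshold, and the return convention are all invented for illustration; the framework describes the principle, not an API.

    from typing import Callable, Optional

    # Each check returns (age_estimate, confidence), or None when the
    # signal is unavailable for this user.
    AgeCheck = Callable[[str], Optional[tuple[float, float]]]

    def successive_validation(user_id: str,
                              checks: list[AgeCheck],
                              min_confidence: float = 0.9) -> Optional[float]:
        """Run checks from lightest to heaviest; stop once one is confident."""
        for check in checks:
            result = check(user_id)
            if result is None:
                continue          # signal unavailable, try the next check
            age, confidence = result
            if confidence >= min_confidence:
                return age        # confident enough; no further friction
            # borderline (e.g. an estimate close to 16): escalate
        return None               # no confident estimate; treat cautiously

The ordering is the point: most users clear the first, lightest check, and only ambiguous cases ever see the heavier ones.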

Platforms must use the most reasonable, proportionate, and privacy-aware combination of tools for their service. The following are not enough on their own:

  • A date-of-birth dropdown

  • Parental vouching without validation

  • Systems that demand government-issued ID without offering a privacy-preserving alternative


What Happens to the Kids?

The guidance requires accounts to be removed with kindness, care, and clear communication (the Australian government has requested this, though it cannot be guaranteed). That includes:

  • Notifying the child before removal

  • Offering a way to download their data

  • Giving them a path to challenge the decision if they believe it’s wrong

This is not about punishment. It’s about protection. And platforms must build wellbeing resources, review options, and transparent messaging into the process.
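
Laid out as a sequence, the required ordering is clear: notice first, data export available, review path open, deactivation last. The helper functions below are placeholder stubs; the framework specifies the steps, not an API.

    def send_removal_notice(account_id: str) -> None:
        print(f"[{account_id}] removal notice sent")        # stub

    def offer_data_download(account_id: str) -> None:
        print(f"[{account_id}] data export link issued")    # stub

    def open_review_window(account_id: str) -> None:
        print(f"[{account_id}] review/appeal path opened")  # stub

    def deactivate_account(account_id: str) -> None:
        print(f"[{account_id}] account deactivated")        # stub

    def remove_underage_account(account_id: str) -> None:
        send_removal_notice(account_id)   # 1. notify the child before removal
        offer_data_download(account_id)   # 2. a way to download their data
        open_review_window(account_id)    # 3. a path to challenge the decision
        deactivate_account(account_id)    # removal itself comes last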

What Happens If Platforms Don’t Comply?

The penalties are severe. Fines of up to $49.5 million. Public exposure. Court orders.

Non-compliance won’t be judged on whether a handful of underage users slip through. It will be judged on whether the platform can demonstrate it took reasonable, evidence-based steps to comply.