
Why Under-16s Can Still Access Social Media in Australia — and the Crucial Detail Most Parents and Educators Are Missing

  • Kirra Pendergast
  • 5 days ago
  • 4 min read

Confusion has a way of spreading faster than truth, and right now, confusion is spreading through our school communities about Australia’s "ban" on social media for children under 16 years. 

I can hear it in the voices of parents who are worried their child has done something illegal. I can see it in students who believe they are one login away from punishment. I can feel it among staff who are unsure where their legal responsibilities begin and end. 

Australia did not ban children from social media. Australia banned social media companies from providing services to children below the minimum age. 

That distinction matters more than most headlines ever allowed. 

Under the Online Safety Act 2021 (available in full at https://www.legislation.gov.au/Details/C2021A00076), as amended by the Online Safety Amendment (Social Media Minimum Age) Act 2024, the legal responsibility sits with platforms. The law requires social media services to take reasonable steps to prevent access by minors under 16. It regulates industry behaviour. It does not criminalise childhood, and it does not ban the internet.

Somewhere along the way, this became publicly shortened to “the ban.” It is a convenient phrase. It is also misleading. When language is careless, fear fills the gap. 

If your child entered a fake date of birth to open an account, they have not committed a crime. If your thirteen- or fourteen-year-old has a social media profile, they are not breaking Australian law. You, as parents, have not committed a criminal offence. Educators are not legally liable because a student under sixteen has an account. The legislation does not create criminal penalties for young people who are on platforms below the minimum age.

This law was designed to shift accountability upward, not downward. It targets corporate systems, not children navigating a digital world built by adults for them. 

The need for this clarity is not abstract. It is practical and urgent.

Many young people under 16 in Australia can still access social media. Regulatory reform does not happen at the flick of a switch. It embeds over time. Platforms adjust age assurance processes. Enforcement frameworks evolve. Compliance tightens in stages. Sometimes access changes slowly. Sometimes it shifts overnight.

We have been preparing students for 10 December 2025, assuming that digital access, account continuity, and platform reach could change quickly, because responsible risk governance does not gamble on best-case scenarios. It prepares for disruption before it arrives. Exposure does not move in a straight line. Platform restrictions can intensify without warning. The only defensible position for a school community is readiness grounded in education, not panic driven by rumour. 

But there is another layer to this that concerns me even more. 

When young people believe they will be punished simply for being on social media under the age of 16, they go quiet when something goes wrong. Silence is the oxygen of harm. Cyberbullying festers in private messages. Image-based abuse circulates in hidden folders. Grooming thrives in secrecy. Sexual extortion escalates in isolation.

If a child believes admitting they are on a platform will get them into trouble, they are less likely to report when a conversation turns threatening, or an image is shared without consent. That delay can be the difference between swift intervention and lasting trauma. 

We need students to understand a fundamental concept. Their safety matters more than a sign-up form. If something online makes them feel uncomfortable, frightened, pressured or unsafe, they will not be punished for having an account. They will be supported. 

Parents, the same principle applies at home. If your child comes to you about an online safety issue, the priority is their well-being, not whether they complied with a platform’s age requirement. The first response must be protection and care. Consequences can be discussed later if necessary. Safety cannot. 

Schools play an important and steady role in:

  • guiding students to build safe and responsible digital habits
  • responding thoughtfully when online issues affect well-being
  • supporting families as they navigate digital challenges and foreseeable risks
  • meeting legal reporting responsibilities where serious harm arises
  • nurturing a culture where students feel safe to seek help early rather than carrying concerns alone.

If a child is experiencing cyberbullying, image-based abuse, sexual extortion or any form of online harm, support exists beyond the school gates. The eSafety Commissioner provides reporting tools and direct assistance at https://www.esafety.gov.au. The Australian Centre to Counter Child Exploitation operates through the Australian Federal Police at https://www.accce.gov.au. Kids Helpline offers confidential counselling for young people on 1800 55 1800 and at https://kidshelpline.com.au.

These services exist because online harm is real, measurable and growing. According to the eSafety Commissioner’s own research, a significant proportion of Australian children report negative online experiences before the age of sixteen. This is not a theoretical risk. It is a lived experience. 

Clarity is protection. Precision is protection. Calm, accurate language is protection. 

Let us place responsibility where the law places it: on the companies that design, profit from, and control these environments. Let us prepare our young people not with fear, but with literacy, boundaries and open channels of communication.

We can build a culture where a child who is scared of what is happening on a screen feels safer walking into a classroom or a kitchen and saying, "I need help."

That is the standard we need to set. 


 
 
 
