The ones left in the middle
- Kirra Pendergast

“I’m 16 but I look 14, so how will I get past the facial recognition? It estimates your age, and it will say I’m too young. Is there another way to confirm it, like a driver’s licence? Because social media is how I talk to all my friends, and I don’t use numbers.”
This was an email I received this week, and it hasn’t left me. I could feel the panic in the text and the need for support. I received six more messages from this young person in the following hour through our contact email at www.ctrlshft.global.

Beneath the noise of social media age delay, there’s a quieter story. One that’s not being told loudly enough. It’s the story of the young people in between. Sixteen-year-olds who look a lot younger. Teenagers standing in the gap between policy and lived experience.
With just 47 days left until the Australian Social Media Minimum Age Delay kicks in, we have to step up and help young people move through this transition. This isn’t only a tech change. For many, it’s a social life change. The place they talk, laugh, learn, and belong is shifting under their feet.
This young man wasn’t worried about losing followers. He was worried about losing his friends. Because when you look younger than sixteen, algorithms don’t see your fear of losing connection, they just see your face. And if the system gets it wrong, you’re suddenly cut off from your entire peer world.
Are they the forgotten ones in this?
The kids who are old enough to know what they’re losing but too young to have any control over the systems deciding their access?
We can back the delay and still back them. That means creating safe alternative spaces, giving clear information, and truly listening to their concerns so no one gets left behind in the name of safety.
We also need to help them talk to their friends urgently. Help them understand how to stay connected and support anyone who suddenly loses access while they’re proving their age or waiting to get their account back. Most of all, we need to guide them through this change now, showing them where to go, how it will work, and that they won’t be alone in it.
We have just 47 days until December 10th, and just a couple of weeks after that comes the longest school holiday of the year… we have a lot of work to do.
Below are the questions that came from my back-and-forth conversation with that young person, and a few others this past week, along with clear answers on what the new law actually means.
Please read them. Share them. And remember that behind every account, every verification check, is a young person who just wants to stay connected safely, honestly, and with dignity.
Does removal only happen if you’re actually under 16, or can it happen if you’re just suspected?
The obligation is to remove/deactivate accounts that belong to Australians under 16. To work that out, platforms will use “age checking” systems and signals. If you’re flagged by mistake, platforms must give you accessible review options to show you are 16 or over (for example via a face‑based age estimate, an ID check, or other options).
Put simply, you could be asked to prove your age if you’re suspected, but there must be an easy way to fix errors before or after any action is taken.
Are Snapchat, Instagram, TikTok and Facebook covered?
Yes. Mainstream social networks where people post, connect and interact meet the law’s definition of an “age‑restricted social media platform,” so they’re in scope in Australia. Private messaging or gaming chat is only covered if it meets the same “social platform” test.
Facial age estimation is only one option, and you can upload an ID if they get it wrong. What does that mean?
There isn’t one single age check. Platforms can use a mix of methods:
Face‑based or voice‑based age estimate (a quick check that guesses your age range from a selfie or voice sample),
Age inference (patterns from your activity and signals the platform already has), and
Age verification (checking a trusted source for your date of birth, e.g., an ID).
You must be given a choice.
If a platform offers an ID upload, it cannot make government ID the only option. It has to offer a non‑ID alternative (for example, a face‑based estimate or other review paths). This is in the law and the eSafety.gov.au guidance. This is not a Digital ID requirement and not government surveillance.
If a check is wrong, you can ask for a review and choose another method to prove your age. Platforms should keep this simple and explain clearly what to do.
How long do I get to prove my age before my account is removed?
The guidance doesn’t set a fixed number of days. Instead, it requires platforms to give clear warnings, explain what’s happening, tell you how to download your data, and provide a fair, timely review process (including telling you how long a response will take). Each platform’s notice will spell out the deadline you’re given.
If I complain to eSafety and prove I’m 16, will eSafety restore my account if it was deleted?
eSafety doesn’t run an individual appeals service for getting accounts back. The guidance says platforms themselves must handle your reports, reviews and disputes. eSafety oversees whether platforms are taking reasonable steps overall and can enforce against systemic failures. For privacy complaints, you can also go to the Office of the Australian Information Commissioner (OAIC).
eSafety has told platforms to handle this with care. For existing under‑16s, there’s a strong preference for deactivation/suspension (with your data kept safe) rather than permanent deletion, so you can reactivate at 16 or if a mistake is fixed. Whether a deleted account can be restored depends on each platform’s process, but you must be allowed to download your data and there must be a way to challenge the decision.
How will they determine my age? (Step‑by‑step)
Platforms are expected to use a layered approach so that most people pass quickly and only uncertain cases are asked for more. In everyday terms, this is what you’re likely to see:
If you’re signing up after 10 December 2025
If there’s any doubt, you’ll be offered a choice of quick checks (e.g., a face‑based or voice‑based estimate).
If the result is still unclear, or the estimate says “likely under 16”, you’ll be offered stronger checks, such as an ID‑based check or another non‑ID option.
If you’re under 16, you won’t be able to create the account; if you’re 16+, the account proceeds. Platforms should explain what they did and why.
If you already have an account
Platforms will use signals they already have (age on file, account age, patterns) to find likely under‑16 accounts.
If your account is flagged, you should get a clear in‑app notice explaining why and offering options to prove you’re 16+ (face estimate, ID, or other alternatives).
If you don’t respond, or the checks show you’re under 16, your account will be deactivated or removed with care: you’ll be told how to download your data and how to appeal.
Where do you prove your age?
Inside the app/website through official, clearly branded flows. The guidance asks platforms to make reporting and review easy, in‑service, and to explain what genuine checks look like to avoid scams. Avoid following links from random messages; use the platform’s official prompts.
If eSafety approves my age, how will platforms know?
eSafety doesn’t “approve your age” for a specific account and doesn’t send platforms a pass/fail about individual users. Your platform handles your review and is expected to fix mistakes quickly. eSafety’s role is to monitor and enforce the overall system (and can investigate or take action against platforms that aren’t meeting the standard).
My mates say if you’re suspected under 16 they’ll delete your accounts and you can never get them back.
That’s not what the guidance expects. Platforms must use fair processes, give warnings, offer review options, and let you download your data. eSafety even highlights a preference for suspension or deactivation (so accounts can be reinstated at 16) rather than outright deletion for existing under‑16 users. But no one really knows whether the platforms will actually do that, so download and save your data just in case. And yes, if you’re actually under 16, the platform must remove or deactivate the account.
What this means for you vs. for platforms
For platforms (Snapchat, Instagram, TikTok, Facebook, etc.): take reasonable steps to detect and remove under‑16 accounts, offer non‑ID options, build clear review and support paths, and communicate clearly (including how to download data). Major penalties of up to $49.5 million apply for systemic non‑compliance.
For you (a user 16+)
If flagged, you’ll get a notice and choices (face estimate, ID, or another path) to prove your age.
If there’s a mistake, use the in‑app review and download your data if they plan to deactivate.
If the issue is about privacy, you can complain to the Office of the Australian Information Commissioner (OAIC); eSafety monitors whether platforms are doing the right thing system‑wide.
Will this only apply if I’m actually under 16?
It targets under‑16 accounts, but you might be asked to prove you’re 16+ if the system isn’t sure. There must be a fair review process.
How long do I get to prove my age?
No fixed number in the guidance. Platforms must give clear warnings and tell you the timeframe in the notice.
If I prove I’m 16, can I get my account back even if deleted?
Appeal with the platform first. The guidance encourages deactivation/suspension (so you can reactivate) and requires data download options; make sure you download everything just in case. Restoration after deletion depends on the platform’s process, but mistakes should be corrected. eSafety does not run personal reinstatements.
Where will I upload ID / prove my age if there’s a mistake?
Inside the app or official website through the platform’s review flow. You must be offered a non‑ID alternative if you don’t want to use government ID.
If eSafety “approves” me, how do platforms know?
eSafety doesn’t approve individual ages; the platform must handle your review and fix errors.
The whole point of the law is child protection and harm reduction, not punishing older teens or adults. That’s why the guidance requires choice of age‑check methods, clear communication, easy appeals, data downloads, and a careful approach to deactivations.