What Parents Deserve to Know, and What Centres and Schools Must Confront
- Kirra Pendergast
- 5 days ago
- 5 min read

Last week, like so many others, I sat with the weight of the news. The devastating revelations of abuse in early childhood education centres across Australia weren’t just shocking, they were heartbreaking. But what struck me most was what didn’t make headlines. The quiet architecture of risk. The systems that allowed harm to go unnoticed, not out of malice, but out of assumption.
I did what I always do in moments like this: I went down a rabbit hole, trying to find the fixable fault lines. I opened the publicly available policies on providers’ websites and downloaded dozens of them, documents on privacy, digital safety, social media, and codes of conduct. I wanted to see how our systems are protecting children as the physical and the digital blur more by the minute.
The truth? Most aren’t.
What I found were policies that haven’t been meaningfully updated since 2016. Several still included references to “facsimile machines.” Most treated digital risk like an afterthought, something to file under “IT” or “parent permissions,” not child protection.
Digital life isn’t optional anymore. It’s the operating system of childhood. Children aren’t just visiting digital spaces, they’re living in them. Learning through them. Being documented within them from before they can speak. Yet the systems meant to keep them safe in those spaces are still built for an internet that existed 15 years ago.
It’s not just the apps. It’s the architecture firm that posts the site plan of a new school build. The tradesperson on-site, unvetted, because they were only there for an hour. The educator capturing photos on a personal phone. The parent uploading birthday pictures and end-of-year concert photos. The group chat with no boundaries. The “update” that never came.
This is where harm hides. Not in the dark corners, but through the front door.
Not because people don’t care, but because the systems weren’t built to hold the complexity of the lives they’re meant to protect.
Educators are doing extraordinary work. Every day, they navigate apps, parent messaging, consent forms, privacy settings, digital learning tools. But they’re being asked to carry a level of risk they were never trained to hold, without the technical language, policy infrastructure, or professional support to do so safely.
And into that vacuum steps a new market: a wave of well-meaning consultants on LinkedIn offering “policy updates” and promising to prevent abuse, most without experience that spans what’s actually required. Child safety. Privacy law. Digital governance. AI. Ethics. Online harm. It’s not one lens that’s needed; it’s all of them. Templates are being sold and policies get a cosmetic refresh. But what’s missing is the architecture of protection, the kind we at CTRL+SHFT build every day, across systems, sectors, and at scale.
The policies I read weren’t bad because the people behind them didn’t care. They were bad because they were built for a world that no longer exists.
The digital world moves fast. Harm moves with it. Unmonitored apps, messages sent between staff, photos shared to Facebook pages without informed consent, from parents who don’t understand that an image can be screenshotted and become a deepfake nude used for CSAM in under three minutes. Photos can even be taken by what looks like a pair of reading glasses. And still, many systems rely on outdated codes of conduct, blanket consent forms, and compliance checklists that offer the illusion of safety but none of the substance. What parents are asking for isn’t panic, it’s preparation.
They will want to know:
Who is thinking further ahead than next week’s newsletter?
Who has asked the hard questions about what happens to their child's data?
Who is standing at the digital gates, not just assuming they’re locked?
And those questions aren’t just for early childhood services. They’re for every primary school, every secondary college, every board, every system.
This is about leadership. Leadership that shows up not in reaction, but in full review and redesign.
Digital safety isn’t a feature; it’s foundational. And failing to treat it as such is becoming more and more indefensible. So if you are a school leader, a centre director, a board member, or a member of the P&C, this is the invitation. Not to defend what was. But to build what’s needed.
And if you’re ready to build, really build, we are here to help. That’s what we do. Not because it’s our job. Because it’s the only work that matters now.
Five Areas for Immediate Reassessment
1. Education Platforms Are Collecting More Than We Realise
Modern EdTech tools do more than support learning. They may also capture:
Location data, device type, and login behaviour
Facial imagery, voice recordings, written work, and shared photos
Learning patterns, emotional tone, participation levels
Some of this data may be used for purposes beyond education—such as product development, AI training, or third-party analytics—often without clear visibility to the school or the family.
“The scale of data collected is enough to build a full digital biography of a child—identity, behaviours, abilities, and vulnerabilities.”—UK Digital Futures Commission
Schools are doing their best. But the nature of these tools means that even with the best intentions, visibility into how it all works can be limited.
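To make this concrete, below is a minimal, hypothetical sketch of the kind of event record a learning platform might log each time a child opens an activity. Every field name is invented for illustration; real platforms differ widely in what they collect and how.

```python
# Hypothetical sketch only: the shape of a single analytics event a
# learning app might record when a child opens an activity.
# All field names and values are invented for illustration.
telemetry_event = {
    "student_id": "stu_48213",                      # persistent identifier
    "timestamp": "2025-07-14T09:32:11Z",
    "event": "activity_opened",
    "device": {"type": "tablet", "os_version": "17.5"},
    "network": {"ip": "203.0.113.42"},              # enough for coarse location
    "engagement": {"session_minutes": 14, "taps_per_minute": 22},
    "media": {"camera_used": True, "audio_recorded": True},
}

# One event is mundane. Thousands of them, joined on student_id across
# years and products, is the "full digital biography" the Digital
# Futures Commission describes.
```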
2. Behaviour Tracking Is Becoming the Norm, Quietly, and Automatically
Many apps now include behaviour tracking features:
Points systems and “badges” for compliance
Mood indicators or real-time engagement scores
Behavioural data that may be seen by other families or educators
These are designed to support learning. But over time, they can create profiles of children that may follow them, misrepresent them, or reduce them to patterns.
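As a sketch of why this matters, consider how quickly individual behaviour events aggregate into a durable label. Everything below, names and numbers alike, is hypothetical.

```python
# Hypothetical sketch: how day-to-day behaviour "points" roll up into a
# persistent profile. All identifiers and labels are invented.
from collections import Counter

daily_events = [
    ("stu_48213", "off_task"), ("stu_48213", "helpful"),
    ("stu_48213", "off_task"), ("stu_48213", "off_task"),
]

profile = Counter(label for _, label in daily_events)
print(profile)  # Counter({'off_task': 3, 'helpful': 1})

# A handful of noisy data points becomes "mostly off task", a pattern
# that can follow a child into reports, dashboards, and future classrooms.
```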
3. Consent Practices Are Overdue for a Refresh
Most schools rely on consent policies drafted years ago, often before AI, analytics, or hybrid learning were common practice. As a result:
Consent is too broad and too passive when it needs to be fully informed
Parents and educators may not know where a child’s data is going
Teachers may be working without clear guardrails
This isn’t a failure of schools. It’s a reflection of how much the environment has changed.
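For contrast, here is a minimal sketch of what purpose-specific, reviewable consent could look like as a record, instead of a single blanket tick-box. All field names are invented for illustration.

```python
# Hypothetical sketch of purpose-specific consent. Field names are
# invented; the point is granularity and a safe default of "no".
consent_record = {
    "child_id": "stu_48213",
    "granted_by": "parent",
    "last_reviewed": "2025-07-01",
    "purposes": {
        "photos_shared_in_class_app": True,
        "photos_on_public_social_media": False,
        "data_used_for_ai_training": False,
        "data_shared_with_third_parties": False,
    },
}

def is_permitted(record, purpose):
    """Default to 'no' for any purpose the family was never asked about."""
    return record["purposes"].get(purpose, False)

print(is_permitted(consent_record, "photos_on_public_social_media"))  # False
print(is_permitted(consent_record, "facial_recognition"))  # False: never asked
```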
4. External Links Can Introduce Invisible Data Flows
Many educational platforms integrate with or link to:
YouTube, Google Maps and more
Cloud-based storage providers
Tools with their own privacy and advertising models
Schools don’t always have the ability to audit or restrict these flows, especially when they’re part of “core functionality.” And families are rarely told when their child crosses into a less protected zone.
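One practical way to surface these flows is simply to list every third-party domain a page pulls resources from. The sketch below does this crudely for a snippet of saved HTML using only the Python standard library; the HTML and domain names are placeholders, not a real audit.

```python
# Minimal sketch of a crude "invisible data flow" check: list external
# domains a page loads resources from. Standard library only; the HTML
# snippet and domains are placeholders.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalResourceFinder(HTMLParser):
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.external = set()

    def handle_starttag(self, tag, attrs):
        # Any src/href pointing off-site is a request to a third party.
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http"):
                domain = urlparse(value).netloc
                if self.own_domain not in domain:
                    self.external.add(domain)

page = '<iframe src="https://www.youtube.com/embed/abc"></iframe>'
finder = ExternalResourceFinder("example-school.edu.au")
finder.feed(page)
print(finder.external)  # {'www.youtube.com'}
```

Every domain such a scan surfaces receives a request, and often a cookie, each time a family opens that page.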
5. Schools/Centres Hold the Legal Duty
Legally, schools/centres are classified as data controllers, responsible for student privacy and protection. But in reality:
EdTech contracts are often fixed and non-negotiable
Training on digital governance is rare or outdated
Platform complexity makes it difficult to track what’s being shared or stored
Laws already exist under which mishandling photos can attract large fines
_________________________________________________________________
1200+ organisations have turned to our team for expert support and policy updates that match the moment. Contact hello@ctrlshft.global or visit www.ctrlshft.global