Nine Out of Ten School Apps Are Already Tracking A Child Before They Log In
- Kirra Pendergast


After years in staffrooms and leadership briefings, rewriting and updating policy, and working through risk reviews of EdTech tools and social media, I have watched the same thing quietly surface almost every time, only in different words.
Many decisions about which technologies to use are being made by people who simply have not been given the chance to know what is actually happening inside the tools we are putting in front of children. Not in the gentle, hand-wavy sense that "nobody really knows how any of this works." In the specific, practical sense: the adults who are responsible for these children, the ones who have signed the permission slips, approved the procurement, and installed the software on the school-issued devices, cannot yet tell you what that software is tracking. And that is not because they haven't cared enough. Most of the adults in the room are doing their absolute best with the information they have been given, which is not enough, and they are not always sure what questions they should be asking. That is often why they ask me to sit down with them and help work it through.
For a long time, raising this has been treated as a bit intense, given the risk and policy background I come from. The responses that have come back more times than anyone could count are familiar: "The school is using a free version of the platform." "Parents just sign off on things without fully understanding them." "The benefits outweigh the risks." "The department has already checked." "Schools have policies and departments have protocols, and that is enough." None of that has ever been held against the people who said it. They were doing what reasonable adults do when they trust the system around them. What we need to look at now, gently but honestly, is whether that system has earned the trust it has been given.
Because we have evidence now.
Real, independent, Australian evidence. The UNSW study, which anyone can read in full at https://www.ndss-symposium.org/wp-content/uploads/usec26-25.pdf, has laid it out in a way that is no longer easy to set aside. Nearly nine in ten educational apps begin transmitting data before a child does anything at all. Not after login. Not after a consent form is signed by a parent who has been given the chance to read it properly. Not after a teacher has introduced the platform and explained what it does. The data moves immediately. The identifiers are attached. The profile begins to build. All of this happens before a child has pressed a single button that any reasonable adult would recognise as interaction. That finding confirms something many of us have been trying to help schools understand for more than a decade, and something that we have been quietly building a practical solution for. The risk is not what children do inside these environments. The risk is what the environment is doing to them from the moment they enter it, and what that looks like years from now when those identifiers are still moving.
In the professional development sessions we run, there is always a point where the conversation turns to the gap between policy and proof. Schools have policies. They have acceptable use agreements folded into enrolment packs. They have consent forms signed by parents at the start of every year. But almost none of them have real visibility into what data is actually being collected, where it is going, when it starts, or whether any of it aligns with what the school was told by the vendor. And parents are very rarely giving genuinely informed consent. They do not know where the data is stored, how it is stored, or how they would retrieve or remove their child's data if they ever needed or wanted to, now or later.
The UNSW research shows that this gap is systemic. Only about a quarter of the apps examined were fully consistent with their own privacy policies. Many claimed minimal or no data collection while actively transmitting persistent identifiers within seconds of launch. This is the precise reason I have been offering, to anyone who will sit down with us, a different way of thinking about this. Compliance on its own is not the same as safety. Somewhere in the middle is a staff member using something they don't fully understand, or a young person whose parents have been relying on a whole series of measures put in place by people who, through no fault of their own, do not always know what they do not know, and do not always know how to ask. Right now, our schools are operating inside a model built on assumed trust. If it is approved, it is safe. If it has a policy, it is compliant. If it is widely used, it is trusted. And underneath every one of those assumptions, the technology itself is built on continuous data extraction. Those two realities are fundamentally misaligned, and children are sitting in the middle of the misalignment, waiting for the adults around them to notice.
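For readers who want a concrete sense of what "transmitting identifiers before login" looks like in practice, here is a minimal sketch of the kind of check a school IT team could run over a network capture taken while an app starts up. The capture entries, endpoint names, and identifier parameter names below are all invented for illustration; a real audit would use an actual traffic capture (for example, one exported from an intercepting proxy) and a vetted list of known tracking parameters.

```python
# Hypothetical sketch: flag requests an app made before any user interaction,
# given a simplified capture of (timestamp, url) entries.
from urllib.parse import urlparse, parse_qs

# Assumed list of query-parameter names that commonly carry persistent
# device or advertising identifiers. A real audit would use a fuller list.
IDENTIFIER_KEYS = {"device_id", "advertising_id", "idfa", "aaid", "android_id"}

def requests_before_login(capture, login_time):
    """Return capture entries recorded before the first user interaction."""
    return [entry for entry in capture if entry["t"] < login_time]

def identifier_params(url):
    """Return any query-parameter names that look like persistent identifiers."""
    params = parse_qs(urlparse(url).query)
    return sorted(k for k in params if k.lower() in IDENTIFIER_KEYS)

# Invented capture: app launched at t=0, child logged in at t=12 seconds.
capture = [
    {"t": 1.4, "url": "https://analytics.example.com/collect?aaid=abc-123&event=launch"},
    {"t": 2.0, "url": "https://cdn.example.com/assets/logo.png"},
    {"t": 15.0, "url": "https://api.example.com/lesson?user=42"},
]

for entry in requests_before_login(capture, login_time=12.0):
    ids = identifier_params(entry["url"])
    if ids:
        print(f"pre-login request at t={entry['t']}s sent identifiers: {ids}")
```

Run against this invented capture, the check would surface the analytics request at 1.4 seconds, made long before the child logged in. The point of the sketch is not the code itself but the shape of the question it asks: what left the device, and when, relative to anything the child actually did.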
I am very much pro the right technology: the kind that has had genuine privacy and risk assessment done on it in the most current and thorough way available. What sits underneath all of this is a deep, professional concern about how uncritically some of these systems have been embedded into classrooms. We are placing tools into the hands of children and educators that shape behaviour, track interaction, build identifiers, and operate largely invisibly. We are doing it in environments where children already have less autonomy than they do almost anywhere else in their lives. Teachers are under enormous pressure to adopt. Schools are expected to manage risk without being given the tools to see it. Parents think it is connection, and in some ways it is. And the children themselves are being presumed to understand something that the adults around them cannot yet fully explain.
The UNSW researchers called this a culture of compliance. It is worth extending that. It is also a culture of outsourced responsibility, and that is said without blame, because outsourcing is what reasonable people do when they are overwhelmed and under-supported.
Parents assume the schools have vetted the tools. Schools assume the departments have vetted them. Departments assume the vendors are compliant. And underneath all of that quiet, well-meaning assumption, the data is already moving. It has been moving the whole time. Before anyone has asked a single meaningful question.
This is the point where the conversation has to shift, together. We now have empirical, Australian evidence showing that data collection is immediate, consent is effectively bypassed, policies are unreadable in any practical sense, and the alignment between what these platforms say and what they actually do is inconsistent at best. This is not a future problem. It is core infrastructure, and it is sitting inside the rooms where children are learning to read.
After all these years inside the space, what surfaces now is a quiet sense of relief that the ground has finally caught up with what some of us have been seeing, feeling and trying to articulate for years without being able to point to anything this concrete. Now we can point to it and change the conversation to help. Because from here, it is no longer just about raising awareness. It is about accountability, about foreseeable risk, and about whether we are prepared, as parents, as educators, as a country, to properly look after the systems we have allowed into the most important environments our children occupy. This is work that no one person, and no one school, should have to do alone. It is the kind of work that happens best when the people who have been in the rooms for a long time sit down with the people who are trying to hold those rooms together now, building, in the process, something more considered than what we inherited.
The tools are already inside the classroom. What remains is whether the adults responsible for those classrooms are willing to ask, properly and out loud, what those tools are really doing. And when they are ready to ask, there are those of us who have been waiting a very long time to help answer.
Contact us at hello@ctrlshft.global for information on how we can help.


