“Who here is still able to access all the apps?”
- Kirra Pendergast
- Feb 6
- 8 min read

For the past two days, I opened each student session with the same questions:
“Who here is still able to access all the apps?”
And for the older students:
“Who has a younger sibling still with access?”
In the first session, Years 9/10, there was hesitation. A few brave hands. I could feel the uncertainty in the room, maybe even shame or fear from those still under 16 that they might get in trouble.
I broke it open and spoke the way I have become known for on LinkedIn and in international media over recent months. I reframed the law to reflect what should have been reported before Big Tech's PR spin took hold. I told them how I correct adults, governments, and leaders:
The Australian Government didn’t ban under 16s from social media. It banned social media companies from accessing Australian children under 16.
The energy in the room shifted immediately. Because when you tell young people the truth without fear, without condescension, and without calling the Social Media Minimum Age Law a failure, something remarkable happens. They listen. Not because they’re scared. Because they’re smart.
I explained the why. The government banned Big Tech from using persuasive design to manipulate young minds. From harvesting their data. From nudging them through dark algorithmic loops. From monetising their moods. From twisting their bodies into shapes that the algorithm tells them are worthy. From shaping their identities to serve a machine that has never loved them, only to feed on their attention and grow profit and power.
I explained that calling it a “ban” was strategic. Probably coined by a Big Tech PR team. The word normalised and stuck so fast that even the Prime Minister uses it, because that’s what people recognise. And that’s the point. If Big Tech paints the government as the villain, they get to keep being the architect of harm. Unaccountable. Hiding behind parental controls and layers of “child online safety washing” that were never built to work. And they know it.
We gently cracked open the truth. The pull of the attention economy. The way the attachment economy has been reshaped less by families and more by profit. We talked about how, in the absence of enough time, presence, or support, many young people now turn to their devices not just for entertainment, but for comfort. For connection. And increasingly, they’re met not by a friend or a safe adult, but by a chatbot trained to mimic care, while harvesting emotion as data. We talked about how none of this is accidental. It’s surveillance capitalism.
We spoke about the systems behind the screen. How the simple act of their parents saying “Happy Birthday, darling, I’m so proud of you” on Facebook has been reframed as the modern ritual of love and how parents feel guilty if they don’t perform it online.
In the Year 7 and 8 session, I asked how many had just received their first phone “for safety.” A third raised their hands. Their honesty was humbling. But not surprising. I talked about the myth that giving a child a smartphone is always an act of safety, when a phone with less functionality is often all they need. All too often, the gift of a smartphone to the young is an unexamined handover to a billion-dollar industry with no duty of care for a child’s emotional life.
In the Year 11/12 session, we went deeper.
I told them that boys are not the problem. They’re the target market. Vulnerable. Curious. So often misunderstood. Swept into a digital storm that no adult fully prepared them for. What we’re seeing now is the impact of a system carefully designed to prey on adolescent uncertainty. The online world doesn’t wait for boys to grow into themselves. It pushes. Repeats. Rewards. It draws them into content that appears bold, funny, and rebellious, and slowly becomes darker.
No 14-year-old boy is born with hatred. Misogyny isn’t innate. It’s learned. Drip by drip. Hidden in jokes, laced through online personalities, offered as belonging. One click out of curiosity becomes a pattern. A moment of feeling lost becomes a digital door. And behind that door are echo chambers where empathy is mocked, and bravado becomes a currency.
We must stop assuming bad intent. We must see what’s really going on. Because many boys are not choosing hate. They’re being groomed into it quietly, cleverly, by systems that care more about screen time than their well-being. Our job isn’t to blame them. It’s to reach them. Gently, truthfully, before the algorithms do. I taught the seniors how to identify it if it has happened to them and how they can talk to younger brothers and cousins who may be caught up or showing the signs.
I told the girls the truth about online consent, filters, dysmorphia, hustle culture, and the new insidious failure narrative that if you’re not a billionaire by 22, if you’re not flying private, if you’re not selling self-branded content and counting your passive income before breakfast, you’re failing. This isn’t fringe. These are the dominant narratives on their feeds.
We talked about algorithmic bias and digital echo chambers; about AI and its environmental impact; about building in a pause to resist the outrage that divides communities and countries; about protecting their cognitive sovereignty; and about how feeds differ across continents, why that is, and why we need to understand what is happening.
I told them the truth about the Social Media Minimum Age Law: it's not there to punish kids. It's there to hold tech companies accountable. So, if something goes wrong, and they're under 16, they won’t get in trouble for speaking up. They should speak up because their safety always comes first, and I told them that while change might feel slow, systemic public health shifts always are. I told them that when I was a kid, cars didn’t have seatbelts. It took time, pressure, and public awareness to make them a safety standard. That this is their seatbelt moment. That yes, on December 10th, they might have lost access. Maybe not. Maybe next week. Maybe in three months. But we need to be prepared for the fact that it may happen when they least expect it, and they deserve to understand why. It was never about them.
I spoke about 1984 in Byron Bay. How, at 14, I caught little bits about the Cold War in the background of the 6 pm news as I walked through the family room when my dad was watching. That was about as bad as it got for me. Not out of nostalgia, but to give them context. Now, they live inside it. A 24/7 livestream of wars, Epstein files, adult content that is not love, violence, misinformation, hate, climate catastrophe, polarisation, and manufactured identity.
If you don’t get it right, they roll their eyes so hard you can hear it. I don’t scare kids. I inform them truthfully, and when you do that, they hear you.
In rooms of 300+ students in each session, there was barely a whisper of distraction. When they did chatter, it was talking to a friend about what I had just said. Curiously unpacking it in their personal space. It happens when someone they don’t know shows them respect and treats them like the capable, intelligent human beings they are. Young humans whose lives are already deeply entwined in digital systems that most adults barely understand. The kind of respect they get from their teachers who know them well, but rarely from a complete stranger.
I don’t usually speak directly to students anymore. I spend most of my time with leadership teams, educators, governance and risk teams and policymakers because systems shape behaviour, and this is a system-wide problem that I can help guide leadership through so it filters through the organisation.
At St Ignatius College in Adelaide, I work with their students because the incredible leadership team and educators, under the thought leadership of Principal Lauren Brooks, understand that this isn’t a tech issue, a discipline issue, or a one-off assembly fix. They are deeply invested in shifting digital culture across the school and their wider community.
St Ignatius didn’t just book a once-a-year speaker for their students; I also met with all of the leadership, heads of house, and curriculum teams. I have presented to parents and hosted all-staff PD, and they participate in our CTRL+SHFT+AAA program year-round. St Ignatius continues to positively amplify the shift they made last year and their leading Tech Smart framework. They understand that the line between online and offline no longer exists. This is just life now, and it needs to be lived well. That means changing how we use devices in school, how we educate about the ethical and safe use of technology, and how we keep the human at the centre, not once a term or once a crisis, but all year, across the whole school and parent community.
This is what real change looks like. Systemic. Sustained. Embedded.
After the sessions, one student said, “I’ve seen so many cyber safety talks, but that was the best.” Another said, “That was the first time someone actually explained things I didn’t know.” One more told me, “At my old school, it was the exact same presentation every year.” Students told me they’d be deleting their accounts not out of fear, but because, for the first time, they understood why. Many came up to say thank you. Quietly. Even while I was waiting for my taxi.
We need to understand something urgently. This generation is not desensitised. They’re overwhelmed. They’re not disengaged, but some are drowning in noise and AI slop, desperately scanning for a signal that keeps them deeply, messily, and gloriously human.
My final student session at St Ignatius was a combined Year 5/6 session, and it went to a whole new level I did not see coming.
We discussed safer, more effective ways to use the games they love (I never say "don’t"). We also discussed climate change and AI's impact on the planet. The students led the conversation into environmental engineering, energy transfer, data centres, and the invisible architecture behind the digital world they were born into. I told them how cities like Helsinki don’t waste the hot water that comes out of cooling data centres; they redirect it. The waste heat is captured and channelled to warm entire neighbourhoods. In winter. In sub-zero temperatures.
The room erupted. “That’s genius,” they said.
And before I could ask another question, they were asking theirs.
Could we do that here?
Could Adelaide become a city that thinks like that?
This was Year 5 and 6!
This is what happens when we speak to kids like they are already part of the future, because they are. When we move past finger‑wagging and fear and offer them the real, raw brilliance of human innovation, their minds ignite.
When you tell them about the extraordinary innovation of Australia’s First Nations peoples like the boomerang, a throwing stick designed to return so it could be used again, or ancient stone fish traps engineered for sustainable food, the room lights up with recognition. With the understanding that some of the most sophisticated innovation in human history was imagined, tested, and perfected right here thousands of years before them, with no computers or AI in sight.
When we do the work. When we show them the whole picture and invite them into it, they don’t just care. They lead. Not because we told them to, but because we trusted them to.
Our job isn’t to hand them watered-down online safety warnings. It’s to give them better truths to stand on and then step back gradually, so they can build what comes next.
