- Understanding Incels
If you've been watching Netflix’s Adolescence and are trying to grasp the weight of the term "incel," you’re not alone. The word is often tied to news stories of violence and extremism, but there’s a much bigger conversation we need to have. One that isn’t driven by fear but by reality.

Watching it, I felt sick. Not just because of what unfolded on screen, but because I have seen and experienced these behaviours firsthand from children. Children under 13, some as young as 10. Late last year, I recorded myself sobbing after witnessing this exact behaviour. A raw, unfiltered heartbreak, caught on film and shared online, over what’s happening to this generation. It went viral because I wasn’t alone. Thousands of people contacted me, primarily teachers, who see the same thing and feel the same gut-wrenching fear for our kids' futures. Yet parents keep buying devices to ensure kids don't feel socially isolated, or pretending this isn’t an emergency.

What kids say, how they flirt, fight, cope, or collapse, it’s not random. It’s been engineered. The rage, the numbness, the warped views on sex, love, and identity? That’s not just teen angst, it’s programmed behaviour. They’re not choosing this. It’s choosing them.

Incel rhetoric? Misogyny? Nihilism disguised as “edgy humour”? It’s not coming from some dark corner of the web. It’s right there on TikTok, YouTube, and Discord. Mainstream. Normalised. Embedded in memes, lyrics, livestreams, and “relatable” content. The algorithms didn’t just feed it to them, they trained them. Trained them to crave validation, to perform outrage, to confuse cruelty with confidence, to see intimacy as weakness, and in some cases to see women as enemies or trophies.

And here’s the kicker: parents are funding the whole thing. Handing over smartphones, tablets, and consoles, not out of ignorance, but fear. Fear that if their kid isn’t online, they’ll be left out. Isolated. Uncool.
So they buy the iPhone, they install the apps, they turn a blind eye, because “all the other kids have it.” They’re not just buying access. They’re buying exposure. No line between online and offline, just one continuous, hyper-connected reality where this poison flows 24/7. By the time a parent, teacher, or anyone else clocks the shift, the damage is already done. This isn’t a glitch in the system. This is the system.

So what can we do about it? First, we need to understand what we are dealing with.

"Incel" stands for involuntary celibate, a term used by specific online communities of men who feel rejected by women and believe they are doomed to a life without romantic or sexual relationships. Some incels quietly battle loneliness, but others spiral into blame, resentment, and misogyny. In the most extreme cases, their frustration manifests in radicalisation and violence. But here’s what isn’t often talked about: most incels are not violent extremists. They are young men drowning in loneliness, depression, and a profound lack of belonging.

What the Research Says

A groundbreaking UK government study, the largest of its kind, revealed a crucial truth: incels need mental health support, not just counter-terrorism interventions. The research found that:
Many incels suffer from deep depression, anxiety, and self-hatred, not just anger toward women.
They feel socially isolated and struggle with self-esteem.
The most significant risk factor for extremism isn’t ideology; it’s hopelessness and a sense of disconnection from the world.

A small fraction become radicalised, yes. But the vast majority? They are in distress. They need help. Not punishment. Not demonisation. Not shame. Help.

Adolescence is already a minefield. Add social media, isolation, and toxic online spaces, and you have a recipe for a generation of boys struggling to find their place in the world.
If a teen, or even a preteen, seems withdrawn, hopeless about relationships, or angry about gender issues, they might already be engaging with incel ideology. Digital boundaries matter more than ever. Education, digital, AI, and algorithmic literacy matter more than ever. We’ve been saying it for years: tech boundaries at home aren’t just about screen time. They’re about safety. Mental health. Critical thinking.

Here’s what parents can do:

No Phones in Bedrooms at Night – Late-night scrolling is when kids get pulled into the internet's darkest corners. Set a household rule: devices charge outside bedrooms overnight.

Regular Tech Talks – Handing a child a smartphone isn’t just giving them a device, it’s handing them the entire world. And giving the entire world access to them. Start conversations early. Keep them going.

Understand Online Codes (Emojis Matter!) – Many online groups, including incel communities, use emoji-based language to communicate in ways adults won’t recognise:
💊 (Red Pill Emoji) – "Seeing the truth" about gender dynamics, often linked to anti-feminist beliefs.
💯 (100 Emoji) – Used for the "80/20 rule" in incel theory (the idea that 80% of women only want the top 20% of men).
🫘 (Kidney Bean Emoji) – Self-identification within incel groups.
🤡 (Clown Face Emoji) – Used to mock others, often in an incel context.
💔 (Broken Heart Emoji) – Symbolises resentment toward relationships or women.
💙 (Blue Heart Emoji) – Often used by men who feel "emotionally numb" or disconnected.
🖤 (Black Heart Emoji) – A sign of deep anger, nihilism, or rejection of mainstream society.
💜 (Purple Heart Emoji) – Sometimes used to signal lust or a desire for physical intimacy.
💛 (Yellow Heart Emoji) – Represents platonic relationships or "friend-zoning."
💚 (Green Heart Emoji) – Can indicate jealousy or bitterness about relationships.
❤️ (Red Heart Emoji) – Often used sarcastically in incel contexts to mock love or "normies" in relationships.
If you see your child using these symbols frequently, don’t panic. Ask: "Hey, I’ve noticed you use that emoji a lot. What does it mean to you?" The goal is dialogue, not shutting them down.

Keep the Conversation Open – If your son expresses frustration about dating, self-worth, or feeling "unwanted," listen. Dismissing or shaming him only pushes him further into these spaces.

Encourage Real-World Socialising – Isolation fuels incel beliefs. Help your child build confidence through hobbies, sport, group activities, and in-person friendships.

Teach Digital Awareness – Many incels are radicalised online. Help your child understand that algorithms reinforce negativity. Teach them to question what they consume.

Offer Mental Health Support – If they show signs of depression, hopelessness, or extreme frustration, professional help is critical.

Challenge Negative Beliefs – If they start expressing resentment toward women or society, ask questions. Instead of arguing, try: "What makes you feel that way?" "What experiences led you to that belief?"

Avoid Judgment and Shaming – Shaming your child will not "fix" their beliefs. It will just make them hide them. Approach these conversations with curiosity, not condemnation.

This isn’t just about "dangerous men on the internet." This is about boys, kids, growing up in an online world that moves faster than their emotional development. A world that tells them they are either "winners" or "losers," and that if they fall into the second category, they are doomed. But they are not doomed. They need guidance. They need connection. They need us to show up.

The question isn’t whether your child will be exposed to these ideas. They will be. The real question is: will they be able to talk to you about it? Make sure the answer is yes.
- Instagram just launched a School Partnerships Program
Instagram just launched a School Partnerships Program in the US, inviting schools to sign up for “prioritised reporting,” “educational resources,” and a badge that shows they are a “trusted” partner of Meta. This is another public relations stunt dressed up as student safety. This program doesn’t protect students. It protects Instagram’s reputation at a time when public trust is collapsing and scrutiny is rising.

The pitch: schools can escalate harmful content faster. They’ll receive resources to help guide students. And their Instagram profiles will get a visible banner to show they’re working with the platform. But this entire setup depends on one thing: that schools continue posting student data on one of the most controversial platforms on earth. And that’s exactly the problem. Schools are being asked to believe that a company profiting from algorithmic addiction, biometric surveillance, and emotional manipulation now wants to help protect kids. It’s absurd.

Let’s stop pretending Instagram is a neutral tool. It isn’t. It’s a machine designed to monetise attention and manipulate behaviour. It doesn't just show people content; it decides who gets seen, who gets buried, who gets reported, and who gets ignored. It’s opaque by design, unaccountable by structure. Schools participating in this “partnership” will lend it legitimacy, telling students and families that this platform is safe, trustworthy, and endorsed by their educators. That is a staggering failure of responsibility.

Schools say they care about digital wellbeing. They run assemblies about online safety. They tell students not to post personal information or share too much. And then they turn around and post those students’ full names, faces, uniforms, school locations, and achievements to a platform designed to extract value from visibility. Once that photo goes up, it’s already been scraped. Indexed. Trained into a model. Matched against databases. Students don’t even have a say.
Their digital identities are being built for them, without consent, without limits, and without the ability to erase any of it. This is not communication; it is exposure. And if anything goes wrong, if a face ends up in the wrong dataset, or gets pulled into a deepfake, or is matched to a future search result, guess who’s liable? Not the platform. The school.

We need to stop kidding ourselves that this is harmless. Every school account on Instagram is a quiet endorsement of systems that prey on young people’s attention, hijack their self-worth, and convert every scroll, every like, every click into a data point for profit. It lures schools deeper into dependence on a platform that has never prioritised student wellbeing, only user engagement. It trades safety for symbolism. It wraps surveillance in school colours and calls it community.

We don’t need more badges. We need boundaries. We need to model digital behaviour that puts students before systems. That means pulling back from platforms designed to manipulate. That means building school communication on our own terms, not Meta’s. So, no, the “School Partnership Program” is not a step forward, and anyone who says it is, is probably on the Meta payroll.
- The Ban Is Coming. And the Irony?
Let’s start with the part no one wants to say out loud. Schools are still posting student photos and events on platforms that students themselves are being banned from. Schools still upload assemblies, sports carnivals, class photos, and Principal’s messages to Facebook and Instagram, even as governments prepare to roll out an under-16 ban. Meta (Facebook + Instagram + WhatsApp) is under fire worldwide for its role in harming mental health, spreading misinformation, and harvesting user data. Yet somehow, schools still use those very platforms as one of their primary communication channels.

If a school hasn’t revisited the platforms’ community standards recently, they probably don’t realise just how much has shifted, and not in a good way. The rules have been quietly rewritten to give platforms more power to suppress entire voices. That means whole swathes of your community may be being shadowbanned, deprioritised, or outright removed without any transparency or recourse. And the school, by continuing to post and engage there, unknowingly aligns with, and is seen to support, those invisible, unaccountable systems.

“Take digital wellbeing seriously,” we say, while doing the opposite. How can we expect students to protect themselves online when we’re the ones putting them in danger?

Last year, I was sent hundreds of screenshots from a State Education Department that had posted full names, faces, and even the schools of kids on a public Facebook page. They were celebrating twins starting school. Sweet, right? Except now, any predator, stalker, or data broker knows precisely where to find them. That’s not just careless. That’s digital negligence, and it came from the adults in charge.

“We’re just showcasing school pride. Celebrating student achievement. Promoting our community.” But behind every smiling photo, every student achievement post, and every uniformed kid standing in front of a school sign, something darker is happening.
“But the parents want it.” That excuse doesn’t hold up anymore. Parents might want an easy communication channel. However, that should never cancel a school’s legal and ethical responsibility. You wouldn’t post medical records because a parent asked. You wouldn’t hand out addresses for a “community moment.” So why are you posting student faces, permanently feeding facial recognition systems and creating the potential for a child to be deepfaked? This isn’t harmless. It’s exposure. And if something goes wrong, guess who’s liable? Not the parents. The school. Being trusted with children means doing what’s right, not what’s popular.

Schools are turning students into data. And not just any data. Biometric data. Every face posted to a public account becomes another line of code in someone else's machine. It trains facial recognition systems. Feeds algorithms. Builds digital profiles students never asked for and can’t erase. Even if a school deletes the photo, it’s already too late. AI web crawlers have scraped it. It’s been indexed. Archived. Monetised. Forever.

And what happens next? Those same student faces could be:
Matched to future online accounts
Linked to public records, arrest logs, or news articles
Pulled into surveillance databases without consent

And schools could be held liable. Posting a child’s identifiable image without airtight informed consent? Without up-to-date permission-to-publish forms that highlight all the risks you may not know exist? In some jurisdictions, that’s not just reckless. It’s legally risky. We’re not just talking about privacy anymore.

Facebook and Instagram are not neutral tools. They’re precision-engineered machines designed to:
Hijack attention
Feed addiction
Harvest data
Manipulate emotion
Polarise communities

They’re not built for education. They’re built for engagement at any cost. And when schools post there, they’re not just “sharing updates.” They’re endorsing these systems. They’re legitimising these values.
And students are watching every move. “Limit screen time. Be kind online. Don’t compare yourself to others. The number of likes doesn't matter.” That contradiction is not lost on young people. It undermines every wellbeing lesson. It teaches them that what we say matters less than what we do. Schools say they’re “future-focused,” building resilient, critical thinkers ready for a digital world. But how can we say that while their attention spans, mental health, and empathy are burned to the ground? You can’t teach digital literacy in the morning and feed the algorithm by lunch. If we’re serious about raising ethical digital citizens, we must model ethical digital behaviour. That means getting off platforms designed for exploitation, not education.

What Can Schools Do Instead?

Build a clean mailing list. Send high-impact updates, newsletters, and event invites straight to parents and carers. No algorithm gatekeeping.
Make your school website your home base. Keep it updated, beautiful, and student-centred. Get them involved. Virtual tours, testimonials, and galleries are all in your control.
Host real-world events. Art shows, how-to-use-tech nights, and cultural days. Create moments that matter, not posts that disappear in 24 hours.
Use secure, school-specific tools built for privacy, not profit.
Use local media. Regional radio, local newspapers, and community podcasts are powerful channels that reach your community.
Harness word of mouth. Still the most powerful marketing tool on the planet. Create experiences worth sharing, and your school community will do the talking.
Go analogue. Think billboards, flyers, murals, and chalkboards outside school gates. A bold message at your local train station has far more impact than a buried Reel.

Every decision a school makes sends a signal. If we keep aligning with platforms undermining student wellbeing, attention, safety, and trust, we’re not preparing students for the future. We’re prepping them for manipulation. You don’t need to be everywhere.
You need to be intentional. You don’t need to go viral. You need to be valuable. It’s time to walk the talk. Get off the platforms. Build something better. And let your community see what leadership looks like.
- Look Up!
Yesterday, I was out for a walk with my love and our wildly cute, fiercely independent Scottish Terrier, Monte, who struts through our adopted home of Florence like she owns the place, collecting every “bellina!” thrown her way like it’s her royal due. She’s small, but she’s got presence. More than most people, honestly. We passed through Piazza della Signoria, the beating heart of Florence. There, in the middle of the Piazza, stands a 3.6-metre sculpture by British artist Thomas J Price. A girl, staring down at her phone. Larger than life. Back turned to everything. The Uffizi. The Loggia dei Lanzi. The Palazzo Vecchio. To Venus, to Medusa, to David. She’s part fiction, part technology, part traditional craft — but she’s very real. She’s us. Everywhere I looked, the irony played out in real time. People standing right in front of masterpieces — heads down, scrolling. Lining up for selfies, but never stepping inside the Uffizi. Taking photos of themselves with buildings they’ll never learn the names of. I’ve been documenting this for a while now, with my Nikon (not my phone), my eye, and a growing frustration. The world is right here. Alive. Glorious. And people are missing it. At lunch yesterday, the woman next to us was playing solitaire on her phone while her boyfriend sat in silence. At dinner a few months back, the guy at the next table was swiping through Tinder. I could see the screen clearly — tables are close in Europe — while his mother, visiting from the U.S., tried to have a real conversation. It seems most locals hate the statue. Almost as much as they hate the thousands of people eating massive Instagram-famous sandwiches on the steps of their cathedrals, totally unaware of the level of disrespect they are showing to the people of this city, completely checked out, face in phone, sandwich in hand, crumbs on stone that’s older than some countries. This is a city built by the hands of giants. Not metaphorical giants — real ones.
Michelangelo carved his David just a few steps from here, chiselling defiance into marble that still dares you to look away. Leonardo da Vinci walked these same cobblestones, sketching the future while studying the flight of birds. Botticelli gave us The Birth of Venus — a goddess emerging from the sea with grace so eternal it still quiets rooms centuries later. Brunelleschi raised the Duomo’s dome against all architectural odds and made the impossible stand. Dante reshaped language. Galileo redefined the stars. Machiavelli changed politics forever — right here in the shadow of the Palazzo Vecchio. And now? People come all this way to sit on cathedral steps, scrolling TikToks about places they don’t bother to look at. They don’t see the stories, the struggle, the staggering beauty created by minds that cracked open the world, because their heads are buried in screens, chasing digital dopamine instead of wonder. That sculpture? Temporary. Maybe it’ll be gone in a month. But what it reflects is a crisis that’s already rooted deep. We’re not just looking down. We’re checking out. From each other. From the moment. From meaning. From the kind of beauty that used to demand reverence. And here’s what we’re forgetting, as Laith and I discussed yesterday: boredom is essential. Boredom is the quiet room where creativity shows up. It’s the pause that gives space for an idea to land, a memory to rise, or a question to take shape. Without boredom, there is no depth. Just noise. The great artists and thinkers who walked these streets before us? They were bored sometimes. They stared at the ceilings. They wandered. They waited. And in that stillness, brilliance emerged. We’ve lost that. We fear boredom now like it’s a failure instead of recognising it as the birthplace of imagination. We’re scrolling past our own lives. We need to look up. Not just for art. For context. For connection. For a shot at waking up to what it means to be alive. Monte, at least, gets it.
She struts through Florence like a queen who knows the weight of her kingdom. Tail up, ears perked, fully present. No screen, no feed. Just the world, in all its messy, magnificent realness. Maybe we should follow her lead. Because while you're chasing likes, life....real life.....is walking right past you. Tail high. Nose in the air. And not waiting for you to catch up.
- Empty Promises, Exploitation, and What Actually Helps
Right now, across Australia and around the world, vulnerable people are being targeted twice. First by sextortionists, and then by opportunistic companies promising to "make it all go away" for a fee.

A recent exposé from the United States revealed disturbing practices at Digital Forensics Corporation (DFC), a company that claimed to help sextortion victims. Instead, it reportedly used high-pressure sales tactics, fear-based marketing, misleading “helplines,” and extortionate fees to prey on people in crisis. Similar tactics are emerging here in Australia, and the trend is growing. It’s not just these for-profit “digital clean-up” firms. Increasingly, apps and platforms are surfacing that claim to block abuse, remove explicit content, or "protect you from predators", but only if you pay. And if they’re free? You're paying with your personal data.

Let’s be clear: this isn’t support. It’s digital ambulance chasing. When you’re in crisis, fear becomes a currency. And too many are cashing in. There is no magical fix. No button. No software that can erase sextortion material once it's in the hands of criminals, especially those operating anonymously or across borders. Be wary of anyone claiming they can "delete files from a scammer’s device" or "scrub the internet clean." Even major platforms like Meta, Google, and TikTok, with all their resources, can't remove what hasn’t been uploaded or what circulates in private, encrypted channels.

When companies charge thousands for “content removal plans,” or push monthly fees for abuse-blocking services, they’re not solving the root problem. They’re monetising fear. And too often they’re making promises no ethical professional would dare to offer. They’re not empowering people. They’re profiting off panic.

If You or Someone You Know Is a Victim of Sextortion in Australia, Here’s What Actually Helps

Cut Off Contact – Do not reply, do not pay, do not negotiate. Block them everywhere.
Report Immediately –
eSafety Commissioner: www.esafety.gov.au/report
Australian Centre to Counter Child Exploitation (if under 18): www.accce.gov.au
The eSafety Commissioner and ACCCE have real regulatory power. Their help is free, confidential, and trauma-informed.

Do Not Pay – Paying a blackmailer rarely ends the abuse; it usually escalates it. Instead, take screenshots, save usernames and URLs, and share this evidence with the authorities listed above.

Secure Your Accounts – Change all passwords. Enable two-factor authentication. If you’ve shared login details, assume those accounts are compromised.

Talk to Someone You Trust – Shame is a weapon these criminals rely on. You are not to blame. Speak to a trusted adult, friend, GP, teacher, or support helpline.

Reach Out to Official Support –
Kids Helpline: 1800 55 1800
Lifeline: 13 11 14
1800RESPECT: for trauma-informed counselling

Watch for Red Flags –
Promises of permanent deletion or “guaranteed removal”
Urgent, high-pressure “phases” of expensive service
Requests for your passwords or social media access
Demands for secrecy (don’t tell parents, don’t tell police)

What Actually Works? Education. Technology evolves rapidly, but long-term safety isn’t found in fear-based solutions or false guarantees; it’s rooted in knowledge. We must invest in:
Students, with honest, age-appropriate education on risks
Parents, with the tools to have open, shame-free conversations at home
Teachers, with resources to embed online safety into the curriculum
Workplaces, with clear policies on reporting digital abuse

You Are Not Alone – If you’re facing sextortion, you didn’t cause this. You are not naïve. You have options. And real help never comes with a price tag. We must collectively reject the predatory businesses profiting from people’s darkest moments and instead build a future grounded in up-to-date education, truth, safety, and support.
- "We Don’t Care About Privacy" Is The Biggest Lie This Generation Has Been Tricked Into Believing
I hear it all the time. From kids, from parents, from entire classrooms. "Privacy? Who cares?" I can almost hear the collective eye roll when I mention it, like I’m just another adult warning them about "stranger danger" on the internet. But here’s the thing. Within minutes, those same kids are sitting up, leaning in, and actually listening. Why? Because I don’t just repeat the same tired advice. I know exactly how this system works from the inside out. I understand the tech, the algorithms, and the psychological tricks being used to make them not care. And when I lay it out in real, undeniable terms, the way their data is being harvested, how AI is already shaping their future, how Big Tech sees them as a product, not a person, it stops being boring. It becomes personal. And that’s when it clicks.

They don’t care about privacy because they were never given the chance to. From the moment their parents picked up a screen, privacy was dead to them. Their data was collected before they could say "data." Their faces were scanned before they understood what AI was. Their preferences, emotions, and behaviours were mapped, sold, and manipulated before they even knew what privacy meant. And now? They’ve been conditioned to believe privacy is irrelevant. That it’s old-fashioned. That it doesn’t matter. Not because they decided that, but because Big Tech, AI, and surveillance capitalism decided for them.

How They Hijacked an Entire Generation

They Made Privacy Inconvenient – Ever tried adjusting privacy settings? It’s a nightmare on purpose. The more confusing it is, the less likely kids are to care.

They Made Oversharing a Status Symbol – Viral culture rewards exposure. The more raw, personal, and outrageous their content, the more the algorithm favours them. Privacy? That’s for people who don’t want to be seen. And in their world, not being seen = social death.
They Profited From Their Digital Footprints – Everything they type, watch, like, share, or even pause on is tracked, analysed, and sold. Every habit, insecurity, and desire is stored, not for them, but to control them.

They Built AI That Knows Them Better Than They Know Themselves – Imagine a version of them that exists in Big Tech’s servers, one that knows what they’ll buy before they do, what will make them angry, what will keep them scrolling. That’s the real them now.

So How Do You Make Them Care About Privacy?

Show, Don’t Tell – If you just say, “Privacy matters,” they’ll roll their eyes. Instead, show them how their data is being used against them. Find a creepily personalised ad. Show them how facial recognition tracks them even if they delete posts. Make it real.

Make It About Power, Not Fear – They hate being controlled. Privacy isn’t about paranoia; it’s about outsmarting the system. Challenge them: “Right now, you’re letting companies own your identity. Are you cool with that?”

Why We Have to Change the Way We Teach This

A one-and-done internet safety talk from an out-of-touch adult isn’t cutting it. Kids tune out at lectures. They roll their eyes at outdated advice. We need a complete overhaul of how we teach digital privacy. Privacy education must be constant, relevant, and built into their world year-round, not just a single session. Teachers, parents, and schools must treat it like any other life skill, like financial literacy or critical thinking. Because privacy is power.
- Why Schools and Parent Groups Should Stop Using WhatsApp for Communication
WhatsApp has become a default communication tool for many school communities, offering a quick and familiar way for parents to stay connected. However, while it may seem convenient, WhatsApp is not designed for school-related communication. Using it for parent groups comes with serious risks, including privacy breaches, misinformation, and legal liability, that many people don’t realise. If your school or parent group relies on WhatsApp, it’s time to rethink that decision.

One of the biggest problems with WhatsApp parent groups is the complete lack of moderation tools. Unlike dedicated school communication platforms, WhatsApp does not allow group admins to filter messages, flag inappropriate content, or prevent harmful discussions before they escalate. This creates major risks:
Harmful or false information spreads unchecked – Anyone in the group can post messages, including misinformation about school policies, unverified rumours, or even defamatory statements about school staff.
Admins have no real control – The only way to remove harmful content is to delete messages (which is only possible within a limited time frame) or remove members after the damage is done.
Legal liability for admins – In some cases, group admins have faced legal consequences for failing to prevent harmful discussions or inappropriate content, especially when minors are involved.

WhatsApp’s automatic media-saving feature also means that images sent in a group are stored in members’ camera rolls by default unless individuals manually change their settings. This creates huge risks if sensitive student-related images are shared, as they can quickly be forwarded outside the group without consent. Most WhatsApp parent groups start with good intentions: sharing school updates, organising events, and supporting each other. But without constant oversight, they quickly spiral into negativity, conflict, or misinformation.
To be brutally honest, we have had so many complaints and requests for assistance with out-of-control WhatsApp groups that I have lost count. Common problems include:
Unverified complaints about teachers or school policies
Arguments between parents over discipline, classroom concerns, or student behaviour
Misinformation spreading about school events, safety issues, or even medical advice

Even if a group is “unofficial”, if it becomes widely used by parents in a particular school, it can become official by default, and the school’s reputation can be affected. Schools have faced serious challenges when parents share false or inflammatory information that damages trust in teachers, administrators, or other families. What starts as a helpful group chat can quickly become a breeding ground for unnecessary divisions and conflicts within the school community.

WhatsApp also exposes every group member’s phone number, creating major privacy concerns. This is especially problematic for:
Parents in sensitive situations (e.g., domestic violence survivors, custody disputes) who may not want their contact details shared.
Families who value digital privacy and prefer not to share personal information beyond what is necessary, especially with apps known to gather as much data as possible about their users.
Situations where phone numbers are later misused for unwanted messages or personal conflicts.

Beyond exposing personal contact details, WhatsApp groups can easily cross legal lines by sharing sensitive school-related information. Common mistakes include:
Posting private details about students (academic performance, behaviour, birth dates, health issues).
Sharing complaints about teachers in a public group.
Discussions that could violate privacy laws, such as the Australian Privacy Act or the GDPR (General Data Protection Regulation) in Europe.

Schools and parents must take these obligations seriously to avoid potential legal risks.
Meta, WhatsApp’s parent company, has openly admitted to collecting data from WhatsApp group chats to improve its advertising and AI models. This means that even so-called “private” discussions in parent groups may be analysed for advertising purposes. For schools and families who care about protecting student and parent data, this should be a major red flag. Encrypted does not equal safe and private.

If the goal is to improve communication between parents and schools, there are far safer and more effective options. Platforms like Sentral, SkoolBag, ClassDojo, and Edmodo are specifically designed for secure and structured school communication. They offer:

– Moderation tools to prevent harmful content.
– Privacy protections that keep personal data safe (if the service is paid for; if it is free, you and all of your data are the product).
– School oversight to ensure accurate information is shared.

Schools can send regular email updates or use a secure online portal to share important news without relying on third-party apps. Many education departments offer secure messaging platforms for direct parent-teacher communication. These systems are regulated, ensuring privacy and compliance with school policies.

Parent demand should not be the primary driver of decisions regarding the use of WhatsApp in schools. Student privacy should. While parents may advocate for WhatsApp due to its widespread use and convenience, the platform lacks adequate data protection. Schools operate across varying legal requirements, making it essential to prioritise secure, education-specific communication platforms that offer end-to-end encryption, appropriate protections, and institutional control over student data. Allowing parental preference to dictate these decisions can expose schools to legal risks, data breaches, and safety concerns, undermining their duty to protect student privacy in a globally compliant manner.
If your school needs help updating its policies, reach out for details on our advisory packages: hello@safeonsocial.com
- Should Devices Be in Your Kid’s Bedroom at Night?
Short answer: absolutely not (unless it is used as a medical device). Here’s why.

Their Phone Is Robbing Their Sleep (and Sanity)

That glowing screen isn’t just harmless fun, it’s wrecking their sleep. Blue light shuts down melatonin production, tricking their brain into thinking it’s still daytime. Instead of winding down, they’re wired up, scrolling past midnight. And trust me, they’re not reading Shakespeare... they’re deep in TikTok spirals, group chat chaos, or bingeing “just one more” episode.

Midnight Messages = Total Exhaustion

You know those hundreds of pointless group chat messages blowing up at 12 AM, 1 AM? If you think they’re ignoring them, you’re fooling yourself. And what are they? A blurry ceiling. Someone’s half-eaten burger. A random “wyd.” Nothing. But they have to check. Whether they’re replying or just scrolling to see if they got mentioned, their sleep is under attack, all night long.

This isn’t just about being a little tired. Sleep-deprived teens are running on empty. Their brains can’t focus. Their emotions are on edge. Their grades are slipping. Their mental health? On a downward spiral. And for what? A streak of unread nonsense that’ll be forgotten by morning.

Anxiety in Their Pocket

Social media doesn’t shut off at bedtime. Not even close. Think parental controls help? There are thousands of YouTube tutorials showing how to bypass them in seconds. Changing the time zone to bypass app limits? Easy. Using a secondary account parents don’t even know about? Standard. If their phone is within reach, they’re checking notifications, overanalysing why their friend hasn’t responded in 60 seconds, and sometimes doomscrolling through war, violence, and chaos. Graphic, unfiltered, and relentless. Most of us didn’t go to bed with a front-row seat to global atrocities. But they do. Every. Single. Night. Instead of winding down, for some, their brain is stewing in negativity, spiralling into stress, anxiety, and depression.
Their body is in bed, but their brain may be stuck in fight or flight, all night long.

Their Grades (and Mood) Take a Hit

Less sleep = worse focus, lower grades, and more irritability. Teens running on empty don’t just struggle in school; they’re also more impulsive, more emotional, and more prone to burnout. It’s not just about tired mornings, it’s about long-term well-being.

The Easy Fix... Get Phones Out of the Bedroom

– Buy an actual alarm clock (yes, they still exist).
– Charge phones outside the bedroom, in the kitchen, living room, anywhere but bedside.
– Encourage books, music, or journaling instead of screens at least an hour before bed.

Your kid might resist at first, but protecting their sleep is one of the most powerful things you can do for their mental health, focus, and happiness. And let’s be honest... we all need the reminder to do this for ourselves, too.
- It’s Not in the Script
When you are at the age where you are quoting your grandparents, it’s fair to say your youth is behind you. While the years ahead are fewer than ever before, I take comfort in knowing that I have gotten most of my material life lessons out of the way and am maximizing my age-garnered wisdom. That said, realizing why my grandparents said the things they did, and wishing I had listened more closely, is a feeling I cannot quantify. Imagining how much easier my life would have been if I had actually listened to all that good advice sends me down a rabbit hole that goes for miles.

My latest ‘when I was a kid’ moment left me shaking my head at the way the world has changed. A friend’s daughter helped me with a project. The extra hands were helpful and appreciated, so I wrote her a thank you note and put a $50 bill inside so she could treat herself to something. (If you are reading this in the U.S., that will merely buy her a dozen eggs and a gallon of milk, but that’s another ‘back in my day’ story for another time.)

The fifteen-year-old was a bit bemused when I handed her the card. I knew she wasn’t expecting anything, but the way she looked at the card with her name on it seemed out of place. She hesitantly opened the envelope, took out the notecard, and looked inside. Her expression changed yet again, this time to what I can only describe as complete confusion. She had the $50 bill and the card in her hands, and she stared. And stared some more. First at the money, then at the card. Her brow furrowed. She looked from the card to her mom, to me, and then back to the card.

“I can’t read this,” she said.

Now it was my turn to be confused. This girl is an honors student. Of course she can read. “What do you mean, Honey?” was the only natural response, so I asked.

“This writing; I can’t read it,” she said.

As someone who went to Catholic school, I take pride in my penmanship, which I admit is a lost art, so my own confusion deepened.
I know my writing is not only legible, it’s quite lovely. “What do you mean?” I asked again.

“I can’t understand it. It’s all loopy and curvy,” she said. “I can’t read this.”

It was then that her mom and I realized that she couldn’t read cursive. She can only read printed words, not script. I was truly speechless. This is an articulate, straight-A student. How could this be? I knew schools had stopped teaching penmanship, but to not even be able to read it? Apparently, that’s the case. So, I took the card and read it to her. She was appreciative but still not fully relaxed. I asked her if everything was all right, and she sheepishly asked me if I could Venmo her the $50, as she didn’t own a wallet and wasn’t sure stores “took paper money anymore.”

There are no words for how antiquated I felt in that moment. So much for being the cool grownup. At least I have a Venmo account...
- The Kids Are Watching.
I had Yum Cha for lunch today with one of my oldest and dearest friends, one of the wisest women I know. A mother of a spectacular young woman in her early twenties with the world at her feet. My friend told me what I am hearing almost every day, something my friend Maggie Dent and I discussed at lunch this past week: the kids are scared.

Yes, they are. They are watching a world that feels like it’s unravelling. Every day, there’s something new. Fires raging, floods, ecosystems breaking, economies trembling, institutions stretched to their limits, even the Pope’s illness. In my old rock chick mind, the timing of all of this unravelling uncannily started when David Bowie died, and it hasn’t stopped.

Kids are not just hearing whispers of it in the background. It’s in their online feeds, all day, every day. They scroll through their feeds, and it’s relentless. Endless discussions about collapse, not from conspiracy theorists or doomsday preppers, but from some of the most intelligent, most informed people alive. Researchers. Scientists. Economists. Historians. People who understand patterns, who study cycles, who can see what’s coming before it arrives.

And the kids are asking amongst themselves, just as I did during the Cold War (though we did not have devices in our hands connecting us to it 24/7): Is this it? Is this how it all ends?

And the answer? Maybe. Because collapse isn’t a singular, cataclysmic moment. It isn’t an explosion that leaves nothing but dust. Collapse is a process. A slow-motion breaking of the old, a crumbling of systems that have outlived their usefulness. It’s unsettling. It’s terrifying. It’s real. But we need to get educated and get talking to them. Now. Because here’s the part no one tells them: collapse is also a door. And doors swing both ways. One way leads to deeper chaos. To panic. To fighting over scraps instead of planting new seeds.
If we let fear dictate our response, if we cling to the old ways just because they are familiar, we risk losing the opportunity that stands in front of us. The other way? Reinvention. Because what we see isn’t just destruction; it’s the clearing of space. The failing systems were built for a different time, for a different world. They have been failing us for a while now. Maybe, just maybe, they were never designed to last forever. Maybe they were never meant to.

And the online world, the same one feeding the fear, is also proof that we are already rewriting the future in real time. Ideas spread in seconds. People connect across borders, sharing knowledge faster than institutions ever could. The internet has fueled a broken attention economy, amplified division, and gamified outrage, but it has also given us direct access to some of the greatest minds alive. To new ways of thinking, new ways of working, and new systems of value.

Yes, some of what we’ve built is broken. Some of it needs to collapse. But not all of it. There are brilliant minds already working on solutions, proving that innovation doesn’t come from institutions; it comes from people willing to think beyond them. If we accept that collapse is happening, then we must also accept that we have a choice: we don’t have to rebuild what was. We don’t have to do what we have always done. We can build something better. If the structures around us are failing, let them. Burn them down. Let something new take their place, something more sustainable, more just, and more human.

We don’t have to wait for permission to create change. Communities are already forming new ways of living, working, and supporting each other. Change does not come from the top; it comes from those of us who are bold enough to see what could be instead of clinging to what is or was. We are not just witnesses to this moment. We are the architects of what follows. And let’s talk about money for a second. Because, yes, people need to be paid.
But how much do you really need? At some point, wealth stops being about survival and starts being about hoarding. If you’re a billionaire... congratulations, you’ve won the game. Now start giving it away. Invest in people. In ideas. In solutions. Share knowledge, fund innovation, fund a teen with a great idea to change the world for the better before they end up somewhere and are taught to think another way. Hire free-range thinkers. Because power and wealth aren’t in how much you accumulate; they’re in what you help create.

The future isn’t written by those who panic; it’s written by those who prepare. Learn. Innovate. Organise. Shift power to the people who will actually build what comes next. Those who aren’t so caught up in their own egos that they refuse to share what they’ve learned. This is the moment to rethink everything, from how we grow food to how we power our homes, how we care for one another, and how we define success.

To the kids who see what’s happening and feel the weight of it all: I won’t tell you not to be afraid. I won’t tell you that everything is fine. Because it’s not. But fear is not the final word. Fear is a signal that something is shifting. And when something shifts, that means there is space to create something new. Look around. The old world is trembling, but the new world is waiting. And it belongs to you... the ones who are brave enough to build it.
- How Social Media Stole Our Presence and Why We Must Reclaim It
There’s a particular kind of silence that comes when you step away from social media. Not just muting notifications but truly disconnecting. It’s unsettling at first, like walking into a room you forgot you left. The mind, accustomed to the constant hum of updates, arguments, and algorithmic suggestions, twitches for stimulation. But then, something profound happens... you begin to notice the world again. The rustling of leaves. The shadows. The colour of an apple. The weight of your thoughts settling in. The long-lost patience for a whole conversation without the impulse to check a screen.

In its current form, social media is more than a mere distraction. It’s a force eroding our ability to be present, think critically, and take meaningful action. It has reshaped not only how we communicate but also how we process reality itself. And in doing so, it fuels the very crises we should be addressing: climate collapse, mental health struggles, and political instability.

Social media platforms thrive on outrage, fear, and impulsivity because these emotions drive engagement, which in turn fuels profits. A slow, thoughtful conversation? That doesn’t go viral. An in-depth examination of a complex issue? The algorithm isn’t interested. What it wants is division, urgency, and compulsive scrolling. We are trapped in a system that sells us the illusion of connection while isolating us from real human engagement. It numbs us to the urgency of climate collapse, reducing the biggest existential crisis of our time to bite-sized infographics and fleeting hashtags. It exacerbates our mental health crisis, feeding us a relentless stream of comparison, unattainable ideals, and dopamine-driven validation. And it keeps us locked in political polarisation, where outrage replaces real discourse and meaningful change. All the while, it strips us of the ability to be present in our own lives.

What if the collapse of social media as we know it isn’t a tragedy but an opportunity?
A breaking point that forces us to step away from the algorithm-driven chaos and reclaim a more intentional way of living? Radical reconnection doesn’t mean rejecting technology altogether; it means rejecting its control over our attention, our emotions, and our sense of reality. That means:

– Relearning deep attention. Reading books, having uninterrupted conversations, allowing ourselves to sit with boredom without reaching for a screen.
– Prioritising real-world activism over performative online engagement. Organising, voting, showing up. Real actions that can’t be reduced to a viral tweet.
– Reclaiming community. Gathering in spaces where conversations unfold organically, without the distortions of algorithms that prioritise conflict over connection.
– Detoxing from the dopamine economy. Teaching ourselves to find joy in the slow, the quiet, the unmonetised moments of life.

So How Do We Talk About This With Young People?

For those who have never known a world without social media, this conversation is even more urgent. Young people are often the most immersed in these platforms, yet they are also the ones experiencing the highest levels of anxiety, loneliness, and burnout. The solution isn’t to scold or shame them for their screen time but to engage in honest, meaningful discussions about how social media shapes their lives.

– Ask, don’t lecture. Instead of telling students and young people that social media is ruining their minds, ask them how it makes them feel. Do they feel more connected or more isolated? More informed or more overwhelmed?
– Help them see the design of the system. Explain how algorithms work, how engagement is monetised, and why platforms are designed to be addictive. When young people understand that they are not the problem (the system is), they become more empowered to make conscious choices.
– Model presence. If we want young people to engage more deeply in the real world, we have to do it ourselves.
That means putting our own phones down, prioritising face-to-face conversations, and showing them that there is life beyond the scroll. Challenge them (and yourself) to take breaks from social media, even if just for a weekend. Help them notice what changes: do they sleep better? Feel less stressed? Connect more deeply with friends?

The cost of staying plugged into this system is clear. But the reward of stepping away? Presence. Depth. A return to what it truly means to be human. And maybe, just maybe, that’s the reconnection we’ve been waiting for.
- Childcare Provider and School Apps: Do You Know Exactly What You Have Signed Up For?
As parents, especially new parents, we go through many emotions. Handing our children over to be cared for while we attend to work or other life commitments can be daunting and sometimes even terrifying. We worry about whether they will get the same care and attention we give them, or what will happen if they get sick or hurt. Local community social media groups frequently feature callouts from parents asking for recommendations for the best childcare centre, or sometimes advice on which ones to avoid. Many centres are recommended based on their centre App and how engaged the service is, with the App keeping you up to date on everything your child is doing during the day. The App, you will be told, makes life so much easier. You’ll be sold on how much information is put into it: general newsletters, the weekly menu, photos of your child, observations, and reflections. You’ll know just how much they slept and ate, and all about their toilet habits.

App and software developers have seen the benefit of creating childcare (and school) Apps. The industry is lucrative and seemingly has an infinite future supply of user accounts. According to the most recent data published by the Australian Government Department of Education (September quarter 2021), 1,398,050 children attended a childcare service. As our population grows and our economy continues to change, this number will only increase yearly.

When we as adults download an App, we consent to the Terms and Conditions, whether we read them or not, and honestly, how many people really read the fine print? By simply downloading and ticking that little ‘I agree’ box, we are consenting for our own data to be used by the App, as well as consenting to permissions such as access to our photos, comments, phone contacts, and sometimes even our location. We understand that we are adding to our own digital footprint, and for the most part, that’s fine; we’re adults, and we get to make that decision.
But what happens to the rights of our children? What data are the Apps collecting, and who can see it? How will the data these Apps collect affect them in the future? These are the questions we raise regarding the Apps that most childcare centres enforce for communication purposes when a child first attends their centre, or when the centre chooses to introduce one.

WHAT APPS ARE CURRENTLY USED IN AUSTRALIA?

The list is extensive, with new Apps popping up all the time; however, some of the most popular Apps that childcare providers are currently using include (but are not limited to): Xplor, OWNA, HiMama, Brightwheel, KidKare, Sandbox, and Kidsoft. Primary and secondary schools are also mandating the use of Apps, including Sentral, SeeSaw, School Stream, Skool Loop, and SkoolBag, to name a few.

The use of Apps in childcare and school settings is very commonplace, but what many don’t realise is that when we as parents and/or carers agree to these Apps on behalf of our children, we are aiding in the creation and building of their digital footprint, a footprint that they have no control over, and this footprint can be very sensitive. Arguably, your childcare service provider does not have control over this footprint either. Who does, you may ask? Well, when your childcare service provider enters into an agreement with an App provider, the contract of service is between the service provider and the App provider. The parent or guardian is merely a user, not a party to the contract. In basic legal terms, if you are not a party to a contract, you cannot enforce it or seek a remedy for any breach or damage.

WHAT HAPPENS WHEN YOU SIGN YOU AND YOUR CHILD UP TO YOUR SERVICE PROVIDER’S APP?

The process is very similar across most Apps. When you submit your completed enrolment forms, the childcare service provider enters the personal details of parents/guardians in the administrators’ side of the App.
Your children’s data is also entered into the App, including sensitive medical information, their Medicare card details, any medical providers, and medical reports. The provider then uses the App to upload daily routines, toileting data, sleep data, feeding data, photos, observations, stories, and incident reports, including behavioural notes and other tailored reports. Typically, a parent/guardian can comment on the entries made. Sometimes other children will be included in these entries. And yes, mistakes can be made: a simple click on the wrong child or parent can see your child’s photo and sensitive information shared with other families.

Many Apps allow you to invite ‘family’ to view your child’s journal, which also includes other children if they are featured in your child’s account (which is highly likely). There is no vetting of who can receive these invitations; it’s on the user to invite others. This could be an aunt, an uncle, the grandparents, or even the man who lives next door, ‘just in case.’ This means that someone you haven’t consented to is seeing your child, and any other child featured in the entries they have permission to view. You only get to approve the additional users you choose to give access to yourself, not those added or given access by other parents. Additionally, there is rarely two-factor authentication in these Apps to protect login details and to ensure that people are who they say they are. Let’s not forget it is common practice to share login details with family rather than have them set up their own profiles, so commentary, viewing, and communications can come from someone who is not the intended user.

WHAT ABOUT PRIVACY, THE SECURITY OF THE DATA COLLECTED, AND YOUR CHILD’S RIGHTS?

Mandating the use of Apps is becoming standard for the convenience of childcare providers. Recently, we have been made aware of more and more After School Care services mandating the use of Apps.
In some cases, terms even state that the service provider can only be notified of absences via the App; otherwise, an ‘administration fee’ will apply. This is also applicable to many primary and secondary schools. What’s the problem with using a childcare App if, seemingly, every other industry has one? Very little research has been conducted into the childcare App market. It doesn’t even raise eyebrows, as these Apps are seen as convenient and a timesaver for the provider and a key engagement driver for parents. And the Apps aren’t silly; some are designed with the whole mummy/daddy guilt in mind, played on by the developers, so parents feel that if they don’t sign up they will suffer major FOMO and a massive case of the guilts.

A study, ‘A Privacy and Security Analysis of Mobile Child Care Apps’, was released in March 2022. The study looked at 42 Apps and found a direct threat to privacy from tracking mechanisms embedded in the applications. Another risk it noted was information leakage. We forget that children cannot consent to their data being stored. That in itself raises privacy issues. As the authors of this study state, it is the job of the parents and educators to act with caution. Always remember that you are the product if something is free to use. So, if your child’s daycare centre is using a free App to keep you up to date with what your child is doing, that means that you could be paying with external access to all the data stored within it.

Serious questions need to be considered about the mandatory use of these Apps when considering Australia’s anti-discrimination laws and the privacy rights of a child. There appears to be a giant black hole where these two do not meet. On a cursory glance, forcing an individual or family to use an App for convenience appears to be nothing short of indirect discrimination against those exercising their rights to keep their lives offline.
The Human Rights Commission defines indirect discrimination as occurring “where an unreasonable rule or policy is the same for everyone but has an unfair effect on people who share a particular attribute.” Yet many families will not satisfy the protected attributes required under our discrimination laws, such as gender, disability, or race (amongst others). Some will, however; for example, the vision impaired, who may not be able to utilise these Apps, or perhaps families who are not from an English-speaking background. Those of us who have, for want of a better expression, a ‘conscientious objection’ to putting our children’s data online have no protection.

Interestingly, Article 16 of the Convention on the Rights of the Child states that:

“1. No child shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, nor to unlawful attacks on his or her honour or reputation, and
2. The child has the right to the protection of the law against such interference or attacks.”

These Apps that have been designed for convenience do not seem to consider either of these aspects, and there is a gaping hole, at least in Australia, in how we can protect our children from the storage and usage of their data. The concerns with this are many, including that most services are not consulting with parents/guardians about using an App or moving to an App, or, in the case of new families to a service, even providing them with an option to use the App or informing them that one is being used. Some families have no choice due to vacancy rates for centres in specific geographical locations. At best, you’ll be presented with a Privacy Policy, or referred to the Centre’s Privacy Policy on their website, which concerns the Centre itself and how it handles your and your child’s personal information.
In one instance that has been observed, the Centre’s Privacy Policy is very vague, stating “… realises the importance of privacy to families/guardians and as such does not release any information of records stored to a third party for their use without the account holders’ authority unless required by law”. Arguably, the transmission of personal information is not being ‘used’ by the third party. Still, this Privacy Policy makes no mention that the service utilises an App and that, as part of the Terms of Use of the service, personal details will be disclosed. You also need to agree to the App’s Terms and Conditions and its Privacy Policy. One service provider states in its Terms and Conditions that whilst it takes care to provide services and ensure that the App and website are free of any virus or malware, it is not responsible for damage caused, and that you indemnify the developer from all liabilities, costs, and expenses. In relation to privacy, it states that whilst it aims to take due care, it does not warrant and cannot ensure the security of any information provided, and information is transmitted solely at your risk.

CAN THEY REALLY MAKE AN APP MANDATORY? AND WHAT MIGHT THIS MEAN FOR YOUR CHILD IN THE FUTURE?

The first generation of social media users are now parents, and they are the target market for these Apps. This generation had little or no guidance about the risks of what they were using when they were in school in the early 2000s, and now that they are becoming parents, many don’t know the questions they should be asking, because it all just feels like ‘the way it is’. The problem with some of these Apps, especially for those who are not tech savvy, is that you may not know what questions to ask regarding what vulnerabilities these Apps may have, what trackers are in place for analytics, or how they are used.
We also don’t know how our data is used to ‘better the product,’ how secure the cloud storage is, or what country it is located in (many Apps state only that data is ‘stored in an external data storage facility’). And what happens to photographs and videos uploaded of our children? Who can download them, and who is taking screenshots? Who are the other parents, grandparents, siblings, aunties, or uncles who now have access to photos of your children through the App? For parents with no choice of provider, their children’s data, along with their own, is effectively held to ransom, because use of the App is mandated by using the service.

On top of all this, we have no idea what this data will look like for our children in the future. Already, some life insurers are using the fact that a person has accessed mental health services, including youth mental health services, even without a formal diagnosis of a mental health condition, to decline or heavily restrict insurance coverage for an individual. This has been identified in various complaints to State Anti-Discrimination Tribunals and highlighted in a 2021 report by the Public Interest Advocacy Centre. Whilst there is clear discrimination here, some insurers are finding their way around the Anti-Discrimination laws. Often, consumers do not know or understand their rights to challenge the decisions made. In addition to these reports, Safe on Social has been informed of cases where individuals, including young adults, have been declined insurance coverage for having had episodes of anxiety.

What about the possible impact of this data being released or otherwise located by our children’s potential employers down the track? How will the possible disclosure of this information impact their job-seeking? We already know that employers are searching for prospective employees, with many scrutinising their social media presence, and the perceived reality of a candidate is an influencing factor for some employers.
Adding an extra layer of sensitive information could be devastatingly adverse for some individuals. There is no suggestion that these Apps or the data collected are being misused at present, but any data collection is open to exploitation. Data such as sleep patterns, toileting, and what our children eat during the day is important for some parents, but does it need to be collected and stored, especially when we don’t know who the App developers may share the data with, or sell it to? Photographs of our children are lovely, but do they need to be stored in a third-party Application that cannot guarantee our children’s privacy? Who monitors or vets who is allowed to use the App, and what invitations can be sent to which family members or friends?

Other issues are more subtle but of equal importance. For starters, forcing a parent to utilise an App may disadvantage some members of our society (the vision impaired or the non-English speaking), potentially resulting in some form of indirect discrimination. It may even go further, discriminating against parents with family responsibilities who feel they have no choice but to remove their children from services because of a desire to protect their privacy. Those parents are forced to choose between work and placing their child in the care of a service provider who mandates storing their child’s data and who has no control over the disclosure of that data.

We all sign Permission to Publish forms for our children, and there used to be a choice: if you opted out, you would be emailed the photo or given a printed copy. But lately, Safe on Social has been contacted more and more by parents who feel discriminated against. For example, a parent contacted us who was very upset that she had to pull her child from an early childhood after-school activity because she didn’t agree to photos of her child being published online. She had escaped domestic violence and did not want photos of her child online.
She was told that her child could not participate if they could not be photographed and published on the business’s social media pages.

Newer parents are learning from older parents and becoming wiser. Many are making conscious choices to keep their children’s data offline; they’re not posting photos and are, thankfully, becoming more cyber-street-smart. But are they thinking about the Apps used to keep track of their children’s activities during the day in childcare, or only about not sharing on the major social media platforms? We teach our children about being safe online, and generally we are becoming more cautious about when we allow them online and what we allow them to do. These Applications, and the mandating of their use, take that control away from parents, who cannot make informed decisions about what or how their data and their children’s data is being used. Isn’t it time we had a conversation about how best to manage the delicate balance between convenience and our children’s privacy?

BEFORE YOU SIGN ON, UP AND OVER YOUR CHILD’S DATA, HERE ARE SOME KEY QUESTIONS YOU SHOULD ASK YOUR SERVICE PROVIDER (IN ADDITION TO THOSE RAISED ABOVE):

1. Is the App paid for by the service provider, or are they using a free version? (Remember, if something is free to use, your data becomes the product. If it is paid for by the parent, the use of the data may have further protection under the Office of the Australian Information Commissioner.)

2. Who has access to the App and its data? Where is it stored, and can it be deleted if you or your child want it all deleted in the future?

3. How are the people accessing your and your child’s data vetted?

4. Can the photos be saved or screenshot?

5. Is there a Social Media Policy in place that advises parents not to share photos from within the App on their personal Facebook pages etc. if other children are in the image?

6. Does the service provider have a way to email photos to parents who choose not to allow their child to be published on the service provider's Facebook/Instagram, and if not, why not?

7. If an opt-out is allowed, do they take photos and blur the child’s face in things they publish online, or exclude them completely? (This way, a child can still feel included and their parents can be emailed the photo, but if blurred out, the child cannot be identified online.)

8. What happens to the photos and the data when a child leaves the service provider?

9. Can a parent ask for all data to be destroyed, and if so, how and when does that happen?

10. Is the use of the App mandatory? Is there another way you and your service provider can communicate and share information without using a third-party App?

Co-written by Andrea Turner and Kirra Pendergast.

Andrea Turner is a Lawyer and Cyber Safety Educator for Safe on Social. She is based in Cairns, Australia. Kirra Pendergast is the Founder of the Safe on Social Group of Companies and splits her time between Australia, the UK, and Italy.

If you have any questions or concerns about the Application(s) that your childcare service, after-school activity provider, or School is using, or if you are a childcare service provider, after-school activity provider, or School that needs advice on policy and training, please get in touch with Safe on Social at wecanhelp@safeonsocial.com