

Safe on Social
The Future of Online Safety & AI Education
What We Do
We exist to transform how online safety is taught in a world where the digital environment is no longer optional — it shapes identity, relationships, learning, and opportunity.
We lead globally in student, educator, and parent education by going beyond surface-level tips and addressing the systems beneath the screen. Our work embeds digital systems awareness and algorithmic literacy as core life skills, equipping communities to understand how platforms influence behaviour, belief, and self-worth.
Every day, thousands of messages move through children’s feeds, quietly shaping how they see themselves and the world.
We teach how to recognise that influence, question it, and take back control.
Why Our Independence Matters
The organisation that became Safe on Social was founded in Byron Bay, Australia, in 2008. From the outset, we have not accepted funding, sponsorship, advisory retainers, affiliate arrangements, equity, promotional amplification, or any other commercial benefit from social media platforms. As supporters of Australia’s 16+ Social Media Minimum Age rule, we apply the same standard to any organisation whose services are restricted under that law or regulated under Australia’s Online Safety Codes and Standards.
Many online safety educators and organisations try to balance their relationships with platforms against public commentary on harm. We do not.
For years, major technology companies have shaped public debate through media, sponsorships, partnerships, and advisory access. When those contributing to the problem also fund the conversation about it, trust erodes. In our home country of Australia, we saw this influence play out in real time. A law designed to stop social media platforms from accessing children under 16 was repeatedly framed as “a ban on kids.”
Australia did not ban children from social media. Australia banned social media companies from accessing our children. The law places responsibility on platforms — not children, parents, or educators — yet the narrative was spun in ways that shifted focus from big tech and softened their accountability.
When messaging blurs responsibility, clarity matters even more.
Children’s safety outweighs brand access, industry visibility, and financial opportunity.
Our independence remains absolute.