
Hello, I'm Kirra 

I did not arrive at this work through a conference. I did not read a white paper and decide to dedicate my life to digital safety. I arrived the way most people arrive at their most important work — through something that cracked me open and refused to let me look away.


My Story


In 1991, when the internet was still emerging, I began my career in Sydney in information technology. Not in the glamorous end of it — in the foundations. Networking. Infrastructure. The cables and protocols and authentication systems that the modern world would eventually depend on completely, without ever thinking about them. That invisibility was the point. When the architecture works, nobody notices it. When it fails, everything stops.
 

I spent my first years learning the bones of digital systems across firms where the work was technical, unglamorous, and consequential. Cabletron Systems. Anixter. Megatec. Bridgepoint. Business Aspect. Seven years building the kind of knowledge that doesn't come from a textbook — the kind that comes from being inside systems when they are being built, and occasionally when they break.
 

By 1998 I was at VeriSign, the certificate authority that provided the cryptographic trust infrastructure underpinning the early internet. If you have ever seen a padlock icon in a browser — the small symbol that tells you a transaction is secure — you have seen the work of the systems I was part of. VeriSign had more than three million certificates in operation, covering everything from military applications to financial services. I managed the Northern Region for Queensland and the Northern Territory at a moment when internet security architecture was being constructed at scale, in real time, with no template to follow.
 

That experience — being inside the infrastructure when the infrastructure is still being invented — shapes how I think about everything. Systems are not neutral. They are designed by people with priorities, and those priorities are not always the ones they advertise.

The Years Before Anyone Called It Online Safety
 

From VeriSign I moved into enterprise consulting. Avanade — the joint venture between Microsoft and Accenture — then Director roles in Queensland at CGI Group and Capgemini, both multinational firms operating at the intersection of government, technology, and institutional risk.
 

This is the period in my career that most people skip over, because it doesn't fit the narrative arc they expect from someone who now works in digital safety. But it is the period that made everything else possible.
 

I led identity and access management programs impacting hundreds of thousands of government employees across eight state departments. I contributed to counter-terrorism digital reform strategies in Queensland in the aftermath of September 11, 2001 — work that required me to understand not just technology, but the institutional decision-making that sits around it. The way governments assess risk. The distance between what a policy says and what a system actually does. The gap between a security framework on paper and the human behaviour it is supposed to govern.
 

I was advising at Premier level in Queensland before "online safety" had entered the room. Before it was a policy category. Before it was a funding stream. Before the platforms had PR teams dedicated to appearing to care about it. I was in that same room when the decisions were being made about digital infrastructure that would eventually touch the lives of millions of people — and I was watching, even then, what was being prioritised and what was being ignored.

 

The Crack in Everything
 

In 2013, the internet turned on me personally.
 

A sustained cyber abuse campaign. More than two years. Relentless, coordinated, and designed to destroy. I experienced firsthand what I had spent years understanding theoretically: the mechanisms by which digital systems can be weaponised against individuals, and the complete inadequacy of institutional responses when they are. I could have left the field. A reasonable person might have.
 

Instead I made a decision that redirected the rest of my career. I would not become a cautionary tale. I would not become a commentator. I would become the person who understood these systems well enough — technically, legally, psychologically, institutionally — to change them.


What I had survived had a name. What the institutions around me lacked had a name too. I was going to build what didn't yet exist.
 

Building What Didn't Exist
 

In 2009 I had founded the world's first consulting firm dedicated to social media security, privacy, and risk management. That same year I completed my first eRisk review for an Australian government agency with more than 12,000 staff. I assessed staff social media use, identified risks, and built a governance response. Nobody else was doing that work at that level. I built the methodology because the methodology didn't exist.


Safe on Social grew from that foundation, formalised in 2014. I am also the Founder of Ctrl+Shft, with partners including Maggie Dent, Dr Brad Marshall, and Madeleine West.
 

Every one of these organisations operates without funding from social media platforms or AI companies. No grants, no partnerships, no speaking fees from the industries my work is designed to hold accountable. In a field that is increasingly shaped by funded voices — by platforms and tech companies investing in the appearance of responsibility without the substance of it — independence is not a footnote in my biography. It is the entire architecture of my credibility.
 

Over 1,200 organisations across five continents now operate with my frameworks, tools, and policies embedded in their daily practice. My programs reach millions of people annually. Departments of Health, Education, and Justice. Corporate boards. Professional sports organisations. Elite school networks. Diocesan systems. Every one of them came to me because something wasn't working — and left with a system that held.

Seven Frameworks. No Equivalents.
 

I am the architect of seven proprietary frameworks. All of them designed, developed, and owned by my organisations. None of them available anywhere else in the market. I know that because I built them to fill gaps that nobody else had filled.


The eRisk Review methodology is now the benchmark standard across hundreds of Australian education institutions. It is the only auditing methodology of its kind offering this level of legal alignment, institutional specificity, and operational applicability. Mapped to the GDPR, ISO 45003, BOSE, KCSIE, the UNCRC, and more, it transforms what most institutions treat as a compliance checkbox into a living governance system, developed with the tech team at Ctrl+Shft — one that holds up when a regulator arrives, when a parent threatens legal action, when a parliamentary committee asks questions.


The DEAP (Digital Ethics and Accountability Program) is the world's first restorative justice framework for online harm. Legally defensible. Non-punitive. Designed with Dr Brad Marshall and Maggie Dent from Ctrl+Shft to replace the suspension-and-expulsion models that schools default to — models that satisfy the need to be seen to act without actually producing any change in behaviour. DEAP produces measurable outcomes. No equivalent framework exists anywhere internationally. 
 

Ctrl+Shft also operates Online Impact Economics, designed to give boards, regulators, and insurers what they have always needed and never had: a quantitative framework for measuring the true cost of digital harm. Not narrative reporting. Not anecdote. Evidence-based cost modelling across legal, financial, reputational, and wellbeing dimensions.
 

Human Digital Risk Intelligence (HDRI) is my agentic AI safety technology platform — merging digital literacy, online reputation management, and preventative risk tools at institutional and individual scale.


The Online Safety Coach provides sustained, tiered professional development for educators and institutional leaders. Not a one-session intervention. A capability-building system.


The Digital House Framework and eReady Programs — eReady Kids, eReady Teens, eWork Ready — provide age-staged digital readiness curriculum now embedded in national school education frameworks. Built on a metaphor-driven pedagogical model designed from the ground up for cross-cultural and multilingual adaptation.

 

Policy Architecture: Where Law Meets Life
 

Most institutions have policies. Far fewer have systems. The difference is what I build. Through Ctrl+Shft my governance work connects legislative obligations to operational reality, aligning policy architecture with the full landscape of current and emerging regulation: the EU General Data Protection Regulation, the EU Digital Services Act, the EU AI Act, ISO 45003 on psychosocial risk, Australia's Online Safety Act 2021, the UK Keeping Children Safe in Education guidelines, the UK Age Appropriate Design Code, and the United Nations Convention on the Rights of the Child General Comment No. 25 — the global standard on children's rights in the digital environment.
 

More than 350 eRisk and policy reviews completed. Each one designed to produce not paperwork but defensible governance — structures that hold their ground under regulatory scrutiny, parliamentary examination, litigation, and live institutional crisis simultaneously.
 

I am the voice in the room before a new digital tool is rolled out. The filter between an app adoption and a student's lived experience. The clarity a board relies on to understand its exposure — and act — before a regulator, an inspector, or a parent demands answers.
 

I do not comment on policy from a distance. I have contributed to it directly.

Government, Parliament, Legislation

I have given evidence at the Australian Parliamentary Inquiry into Law Enforcement Capabilities in Relation to Child Exploitation. I have provided evidence in New Zealand parliamentary inquiries. I have been appointed Expert Advisor to Standards Australia's Child Safety in the Metaverse Standard, contributing to the standards that will govern how children experience immersive digital environments.
 

I served as a member of the Australian Federal Government Age Assurance Technology Trial Stakeholder Advisory Board, contributing to the legislative framework that became Australia's age verification requirements for social media platforms. Previously I served as Chair of the Queensland eSecurity Industry Cluster Board.
 

I have served as expert witness in cross-jurisdictional litigation involving image-based abuse and platform liability. I have provided direct crisis support to institutional systems navigating live incidents involving sexual extortion, deepfake abuse, and technology-facilitated harm against children.
 

This is not theoretical knowledge. This is frontline experience in high-consequence environments where institutional reputation, legal liability, and human wellbeing are simultaneously at stake.
 

The Media, The Book, The Global Stage
 

I am a regular expert commentator across international media — BBC, CNN, Bloomberg, ABC, The Washington Post, The Guardian, and more. I translate complexity into clarity. That is not a small skill. Most people who understand these systems at the depth I understand them cannot explain them to a parent sitting at a kitchen table at 11pm, afraid for their child. I can do both.

I am a keynote speaker with an international program across Australia, New Zealand, Hong Kong, and Europe. I have presented at the European Digital Education Network Conference in Dublin and EduTech in Amsterdam on generative AI, neurodiverse learners, and the governance of digital education.
 

More than 400 educators are enrolled in my Ctrl+Shft Online Safety Coach program.
 

I am a published writer on digital safety, platform accountability, cognitive sovereignty, AI and children's rights, and the intersection of technology design and human wellbeing.
 

My first book will be published by Macmillan in 2026.
 


Where I Work. How I Think.
 

I am based between Florence, Italy; my hometown of Byron Bay and Sydney when in Australia; and our London office in Mayfair. The geography is deliberate. The work is global. The independence is absolute.
 

I have spent thirty-four years watching the distance between what digital systems promise and what they deliver. I have spent the last decade building what closes that gap — for schools trying to protect students, for governments trying to regulate platforms that have more power than most countries, for boards trying to understand risk that was never accounted for when the technology was adopted.
 

The urgency of this moment is not lost on me. Artificial intelligence is reshaping every digital environment at a speed that outpaces regulation, outpaces institutional capacity, and outpaces the understanding of the people making the decisions. The children navigating these systems today are navigating something that was not designed with their wellbeing in mind.
 

That is the work. And I have never been more certain that it matters.
