***Trigger warning*** Three Dead Teenagers. One Common Thread. Apple’s iMessage Failed Them.
- Kirra Pendergast

This is not a story about technology. This is a story about neglect. Over the weekend I read an article in The Wall Street Journal that you can read here:
https://www.wsj.com/tech/personal-tech/sextortion-scam-teens-apple-imessage-app-159e82a8?st=TWuAaq&reflink=desktopwebshare_permalink
In three separate homes, in three different US states, three families are now living with the unimaginable. Their sons were targeted by sexual extortion scammers. Strangers who knew exactly what they were doing. These criminals manipulated, coerced, and terrorised their victims using the most unassuming, widely trusted platform in the Western world: iMessage.
Apple’s default messaging app, pre-installed on every iPhone, every iPad, every Mac. The blue bubbles that feel familiar. Safe. Polished. Private. It’s the very sheen of iMessage that makes it so dangerous.
Because unlike WhatsApp, Instagram, or even Telegram, platforms that are frequently criticised for harbouring criminals, iMessage offers no functional infrastructure for users, especially minors, to report crimes. There is no button to flag suspected sexual extortion. No alert to moderators. No connection to law enforcement. Just a quiet, meaningless option to “report junk,” which disappears into a void with no confirmation, no tracking, no hope.
This is what that means. A child being blackmailed with explicit images, receiving threats to expose them to their family, school, or followers, has no pathway to ask for help through the very tool they’re being attacked on. Instead, they’re left to fend for themselves. To block account after account as the abuser cycles through endless new iCloud identities. Because Apple allows unlimited, anonymous account creation. And because there is no intervention system in place, the attacks keep coming.
Three teens. Gone. One of them, a 17-year-old from Michigan, received over a hundred messages in a single night. Demands. Threats. Warnings. When he tried to ignore them, they escalated. When he blocked them, they reappeared. He didn’t tell his parents. Not because he didn’t love them. But because shame is a cage, and these criminals know how to lock it tight.
He died the next day.
His story is one of hundreds now being investigated across the U.S. and globally, in cases tied to online sexual extortion, an epidemic so rapid and so insidious that the FBI, the ACCCE, the AFP, and other global law enforcement agencies have issued repeated public alerts, and NCMEC (the National Center for Missing & Exploited Children) has warned that the psychological trauma inflicted by these schemes is leading directly to suicide.
But here’s where the story twists. NCMEC received just 250 reports of child exploitation from Apple platforms last year. Meta, the company behind Facebook and Instagram, submitted over 5 million.
This is not about who has the most users. It’s about who has the most denial.
Apple’s number isn’t low because the abuse isn’t happening on its platforms. It’s low because Apple has systematically failed to build reporting infrastructure that would allow it to know. That would allow it to act. It’s not a limitation of technology. Apple has some of the most powerful engineers on Earth. It’s not a question of resources. Its revenue for the first quarter of fiscal 2025 exceeded $119 billion. It’s not even a legal grey area. Apple has the same obligations under U.S. federal law as Meta and Snap to report known instances of child exploitation to NCMEC. The difference is will.
There is no transparency report that outlines how Apple handles abuse cases on iMessage. No moderation team made publicly accountable. No roadmap for future safety tools. Apple’s public communications celebrate encryption, privacy, control. But what happens when that control is handed to predators, and children have nowhere to turn?
Technology must be held to the same standards we demand of any public infrastructure. We would never allow a school to operate without doors that lock, without fire exits, without the ability to call for help. Yet we allow these massive tech platforms to be part of a child’s daily life without any of those safety mechanisms. When the most dominant platforms refuse to participate in safety design, the system breaks. And that’s what this is. A complete system failure. Because Apple doesn’t just control the device. It controls the ecosystem. The operating system. The default apps. The user experience. Which means it also bears the responsibility to protect the youngest, most vulnerable users within it. The company that changed the face of communication has refused to adapt it to a world where communication is weaponised.
You won’t hear Tim Cook talk about this on stage. You’ll hear about AI. You’ll see camera upgrades. You’ll watch slow pans of anodized aluminum and phrases like “our most powerful iPhone yet.” But you won’t hear the names of the teenagers who died after being hunted through Apple’s app. Their deaths won’t be included in the shareholder brief. There will be no ticker tape for the lives lost to an interface that chooses aesthetics over accountability.
We cannot accept this as the price of connection. Not when the tools to fix it are simple. Not when other companies, with all their flaws, have shown it’s possible. Report buttons. Human moderation. Escalation pathways. Crisis response teams. Mechanisms to alert, intercept, and intervene before a child makes a permanent decision in a moment of temporary despair.
The lack of these systems is not a glitch. It is a choice.
And until Apple makes a different one, every parent should know: the iMessage icon isn’t just a blue bubble. For some, it has become the last door a child walked through before taking their life. The least we can do is knock it down.
3 Ways to Start the Conversation About Sexual Extortion with Your Tweens and Teens — and Why It Matters Now More Than Ever
We no longer use the term sextortion because it dilutes the violence of what’s happening. Sexual extortion makes it clear. This is not a misstep or a teenage experiment gone wrong. It’s exploitation, plain and dangerous. By naming it for what it is, we strip away the shame that stops kids from asking for help.
Words matter. And so does timing.
If you or your child are navigating any of this, don’t wait. Go to the Australian Centre to Counter Child Exploitation (ACCCE) for official guidance and reporting tools. You can also visit SmackTalk, a peer-informed education platform that tackles these conversations head-on with straight talk, not sugar-coating. Because the digital world isn’t going to slow down. But we can get louder. Smarter. And much, much harder to manipulate.
The word sexting once seemed like the scariest thing a parent might have to explain to their child. But we're past that. The reality is starker now. Children are no longer just experimenting with risky images or impulsively sharing with people they trust. They are being targeted, manipulated, blackmailed — sometimes by strangers, sometimes by people they know.
Run through encrypted platforms, gaming chats, social media, and even school-group DMs, these schemes are often part of larger criminal networks that know how to groom a child in minutes. They collect compromising images or videos, then threaten to expose the victim unless they send more. Sometimes money is demanded. Sometimes the threats escalate into real-world harm. In all cases, the child is trapped. And deeply alone.
This is not hypothetical. In 2024, the Australian Centre to Counter Child Exploitation (ACCCE) recorded an unprecedented increase in reports of sexual extortion, especially targeting boys between 12 and 17 years old. Many cases involved international criminals posing as teens online. And while legislation scrambles to catch up, kids are being coerced into silence and shame. Some don’t survive the psychological fallout.
So if you’re a parent or caregiver, the most powerful thing you can do today is start the conversation not with fear, but with clarity and consistency.
Here’s how.
1. “What kind of images or videos do you think are okay to share with others?”
This isn't about lecturing. It’s about giving your child space to process what they already see online and what they think is normal. When you ask this question, you’re not just asking about behaviour. You’re helping them define personal boundaries, understand online consent, and grasp digital permanence.
Let them speak. No interruptions. Then share your values in plain, non-judgmental terms. The goal is to build trust, not compliance.
2. “What would you do if someone asked you to send a photo or video of yourself?”
Kids don’t make decisions well when they’re panicked. But if they’ve already imagined a scenario, they’re more likely to respond with confidence. This is your chance to help them pre-load strategies and scripts for high-pressure situations. Reinforce that it is never okay for anyone — friend, crush, stranger — to demand or guilt them into sharing images. And that they can always come to you, no matter what.
Make it normal to talk about awkward or frightening scenarios before they happen. That’s where the real protection begins.
3. “Do you know what can happen if someone shares your image without permission?”
Your child probably knows the images don’t disappear. But they may not know that their data — including photos — can be stolen, altered, and sold. Or that once an image circulates, even among peers, it can be used for bullying, impersonation, or long-term exploitation. This is where sexual extortion often begins: one image, shared under pressure, then used as blackmail. The offender might threaten to send it to family members, friends, or post it publicly unless more are sent. It's a trauma trap.
Let them know the law is on their side. That it's never their fault, even if they sent an image of themselves naked or nearly naked. And that there are real people who can help, right now.