The Children of 764

Trigger warning: This post mentions multiple crimes and abuse towards children.



It started with a boy in a bedroom. Alone, online, and invisible to the adults around him. He liked Minecraft. He watched gore. Somewhere between pixels and unseen pain, a transformation occurred. His name was Bradley Cadenhead. He was a teenager living in Texas, and he became the architect of a Discord server called 764.

In 2024, the National Center for Missing and Exploited Children’s CyberTipline received over 1,300 reports linked to 764 and similar networks. That’s a 200% increase in just one year. These aren't just numbers. They're warnings. Not abstract risks, but coordinates of human lives breaking in real time. The children, the teenagers, the women drawn into this gravity well of psychological terror don’t make headlines. But they should.

764 was more than a server. It was a digital dungeon, a theatre of cruelty, a place where abuse wasn’t hidden but celebrated. The name itself was a nod to the ZIP code where Cadenhead lived. Local violence turned global. It began with sharing images, then escalated. Members of the group would lure vulnerable girls, and sometimes boys, into video chats. Then they would extort them. Cut yourself. Undress. Show us pain. Perform, or else.

In one case, they told a girl to stab herself while on livestream. Another was pushed to provide names, personal information, even intel on her school. What followed was not just digital exploitation. It was real-world terror. Bomb threats. Warnings of school shootings. Towns evacuated. Teachers under siege. All triggered by a group that operated from bedrooms and basements, wielding nothing but screens, VPNs, and a complete lack of empathy.

Discord, the platform where 764 was born, eventually flagged and reported the group in 2021. Cadenhead was arrested and, in 2023, sentenced to 80 years in prison. But the network didn’t die. It scattered. Splinter groups with names like 764 Inferno emerged, each one more sadistic than the last. They were not just sharing illegal content. They were coordinating abuse: active, live, and escalating.


In April 2025, two of the group’s new leaders were arrested: Leonidas Varagiannis, a 21-year-old U.S. citizen living in Greece, and Prasan Nepal, a 20-year-old in North Carolina. According to the Department of Justice, they ordered minors to harm themselves on camera. This wasn’t a few rogue individuals. This was a network with intent. Psychological warfare carried out in real time against children.

And still, most adults know nothing about it.


We want to believe child predators live in shadows. That their depravity reveals itself in how they look or act in public. That we’d recognise it if it came near our families. But the reality is more banal, and more horrifying. The boys of 764 wore hoodies, not handcuffs. They played video games. They used the same platforms our kids use: Discord, TikTok, Twitch. They looked ordinary because they were.


What makes this worse, and far more complex, is how victimhood can twist. In Vernon, Connecticut, a local honour-roll student was manipulated into becoming an accomplice. She befriended one of the 764 members online. He convinced her to share explicit photos, then coerced her into handing over information about a teacher. That data was used to send threats of bombings and mass shootings. The digital and physical worlds collided. Fear pulsed through schools and neighbourhoods. Police initially believed she was behind the threats, and in a way, she was. But they also saw her as a victim. Because she was.


This is the psychological terrain we are now forced to navigate. Where young people are both targets and weapons, victims and enablers, abused and recruited.


And it's not just in America.


I have received multiple reports of young teenagers in Australia displaying the same patterns, patterns their parents have also taken to the police. Girls, mostly. Secretive chats with strangers online. Sudden disappearances. Running away. Self-harm. Refusing to believe it’s not love. Every textbook sign of coercive control, except the controller is behind a keyboard. There’s no “boyfriend”. Just an IP address. And behind it, someone who knows exactly how to make a teenager feel seen, wanted, dependent, and then destroy them piece by piece.


What these kids are going through is not melodrama. It is not a phase. It is abuse, scaled by algorithms, automated through platforms, and reinforced by silence. And if you're a parent, an educator, or just an adult paying attention, here’s what you need to understand: the threat isn’t coming. It's here. And we are dangerously underprepared.


What You Can Do

1. Don’t dismiss strange behaviour. Sudden secrecy, withdrawal, obsessive device use, erratic sleep, or unexplained injuries: these are not just growing pains. Don’t look away. Ask. Keep asking. With compassion, not interrogation.


2. Don’t tell them it isn’t love. Not yet. Not until you’ve listened. If you challenge the reality a teenager is clinging to, you don’t dismantle it. You drive them deeper into it. Ask questions. Let them speak. Then slowly, carefully, introduce the idea of what real care looks like and how coercion wears its mask.


3. Document everything. If a child confides in you, take notes. Screenshot messages. Save usernames. Record timelines. Do not assume the platforms will preserve evidence. They won’t. Preserve the proof, then report it to both www.accce.gov.au and eSafety.gov.au.


4. Get off the moral high horse. You are not here to shame them for what they sent. You are here to keep them alive. Many victims say they stayed because they were more afraid of parental anger than of the abuser’s threats. Fix that.

5. Learn the apps. Don’t rely on headlines. Create your own Discord account. Watch TikTok lives. Ask your kids to show you how Snap Map works. If you don’t know the terrain, you can’t help them navigate it.

6. Don’t go it alone. Reach out to experts. You don’t have to be a digital native to take action, just a present adult.

Researchers have already warned us. In 2022, a peer-reviewed study in JAMA Pediatrics found that exposure to violent or sexually exploitative content online is associated with increased risk of both victimisation and perpetration in adolescents. The study emphasised that platforms are not neutral environments: https://jamanetwork.com/journals/jamapediatrics/fullarticle/2789050

They are designed for engagement, and nothing engages faster than fear and sex.

TikTok knew. Discord knew. These companies sit on data that would make most of us weep. But their responsibility is diluted by shareholder interest. The safety of our children is not in their terms of service. It is in ours.

The future will judge us not by how we innovated, but by what we tolerated.

And right now, we are tolerating too much.

Start talking.
