Kirra Pendergast

Legal notices sent to social media companies demanding action to prevent the sharing of CSAM

There are big confronting conversations that desperately need to happen NOW.

We are grateful to see that Australia's eSafety Commissioner, Julie Inman Grant, has issued legal notices to five social media companies: Twitter, TikTok, Google, Twitch, and Discord, demanding they take action to prevent the sharing of child sexual abuse material on their platforms. Failure to comply with a notice within 35 days may result in daily fines of up to $700,000.

One of the many major concerns we share with the eSafety Commissioner is how social media giants monitor harmful content on their platforms.

Over the past decade, Safe on Social has been closely watching social media platforms' increasing use of artificial intelligence (AI) and algorithms, particularly the way recommendation systems can surface harmful content, including child sexual abuse material.

These technologies are designed to analyse user behaviour and make recommendations based on what the user will find engaging or interesting. However, in the case of child sexual abuse material, these algorithms can contribute to the proliferation of illegal content.

Algorithms used by social media platforms work by analysing user data to create profiles and make recommendations. When a user interacts with or searches for content related to child sexual abuse material, the algorithm may use this data to recommend similar content to them. This can create a feedback loop where users are continually exposed to increasingly extreme content, which can normalise and desensitise them to harmful material.

Moreover, algorithms can be manipulated by those who seek to distribute child sexual abuse material using techniques like search engine optimisation (SEO) and keyword stuffing, which can promote their illegal content to a broader audience. Keyword stuffing is the practice of inserting a large number of keywords into web content and meta tags in an attempt to artificially increase a page's ranking in search results and drive more traffic to the site. A keyword is a significant term that is relevant to the content in question. Keyword stuffing is considered an unethical SEO technique at best and an attack technique at worst, and it can make it more difficult for social media companies to detect and remove harmful material.
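To give a sense of why keyword stuffing is detectable in principle, here is a deliberately simple sketch in Python of one naive signal a moderation system might use: the density of any single repeated term in a piece of text. This is an illustrative toy only, not how any real platform's systems work; the function names, example strings, and the 0.25 threshold are assumptions made for the sketch.

```python
def keyword_density(text: str) -> dict:
    """Return each word's share of the total word count."""
    words = text.lower().split()
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return {w: c / len(words) for w, c in counts.items()}


def looks_stuffed(text: str, threshold: float = 0.25) -> bool:
    """Flag text in which any single word exceeds the density threshold."""
    return any(share > threshold for share in keyword_density(text).values())


normal = "tips for keeping children safe online and talking to parents"
stuffed = "free robux free robux free robux claim free robux now"
print(looks_stuffed(normal))   # False: no single word dominates
print(looks_stuffed(stuffed))  # True: repeated terms dominate the text
```

Real systems combine many signals and adversaries adapt constantly, which is exactly why simple checks like this are not enough on their own.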

Using AI and algorithms in ways that surface child sexual abuse material is a serious problem. It is crucial for social media companies to take responsibility for the content on their platforms and invest in better technologies and policies to detect and remove harmful material.

At Safe on Social, we would also like to see this scrutiny extended to online games such as Roblox and to "cartoon"-style content. This week we have delivered numerous presentations where kids as young as eight told us that complete strangers had offered them free "Robux" (the in-game currency) to "lay down" on top of another avatar, to undress their avatar, to be someone's boyfriend or girlfriend, to play the Mum or the Dad, to be "adopted", and numerous other concerning behaviours inside role-play games. All of this contributes to the desensitisation of children during what is often long-term online grooming. Parents need to completely understand what is happening on these platforms and be open to confronting, tough conversations about the horrendous amount of child sexual abuse content and what is actually happening.

Ultimately, it is the responsibility of all of us to ensure that the internet remains a safe and welcoming space for all. It is crucial for social media companies to actively combat child sexual abuse material and protect vulnerable populations like children.

On Monday the 20th of February, our expert presenter Madeleine West and I gave evidence at the Parliamentary Inquiry into law enforcement capabilities in relation to child exploitation.

Australia is leading the way in issuing legal demands to tech companies and taking a zero-tolerance approach to predatory behaviour. Similar legislation has been implemented in other countries, such as New Zealand's Films, Videos, and Publications Classification Act 1993 and Harmful Digital Communications Act 2015, which make it illegal to possess or distribute such material and require tech companies to remove harmful digital communications, including child sexual abuse material, upon receiving a complaint.

In the United Kingdom, the Online Harms White Paper proposes a new regulatory framework to tackle illegal and harmful online content, including child sexual exploitation and abuse material. The proposed framework would require tech companies to protect their users from unlawful and harmful content, and failure to comply could result in hefty fines and other penalties. In the European Union, the Electronic Commerce Directive requires online platforms to remove illegal content once they become aware of it, and the recently passed Digital Services Act introduces new rules to combat illegal content and increase transparency and accountability for online platforms. In the United States, the Child Protection and Obscenity Enforcement Act and the PROTECT Act make it illegal to produce, distribute, or possess child pornography, and online platforms are required to comply with these laws. However, Section 230(c) of the Communications Decency Act of 1996 protects online platforms from being held responsible for content posted by their users; this immunity does not extend to content the platform itself creates or supports.

Collectively, we need to do more. If your school or business would like Madeleine and me to speak to your parent community, business, or conference about our experience on this topic and how we can do more, please get in touch.

If you or someone else is in immediate danger please call Triple Zero: 000

To report criminal activity, please visit: Crime Stoppers or call 1800 333 000

To report suspected grooming, live streaming or consuming of child sex abuse material, an individual having an inappropriate conversation with a child, or the blackmailing of a child for sexual purposes, you can report abuse online and find more details on the Australian Centre to Counter Child Exploitation website.

If you need help or emotional support, please contact:

Lifeline - 13 11 14
Kids Helpline - 1800 55 1800
1800RESPECT - 1800 737 732
Bravehearts Foundation - 1800 421 468
PartnerSPEAK - 1300 590 589
Fighters Against Child Abuse Australia (FACAA)
Blue Knot Foundation and Redress Support Service - 1300 657 380
13YARN - 13 92 76 for Aboriginal & Torres Strait Islander crisis support
