What is happening at the international level with TikTok, Google and Twitter: the issue of Child Sexual Abuse Material (CSAM)

March 2023

Contribution – Annachiara Sarto

Two weeks ago, Australia’s eSafety Commissioner demanded clarification from Twitter, TikTok and Google on what they are currently doing to fight the commercial trade in child sexual abuse material that takes place daily.

For example, on the chat functions of the aforementioned platforms, abuse takes place every minute.1 As a consequence, around one week after the Australian eSafety Commissioner’s request, news outlets reported that Meta had helped fund a new tool to remove explicit images targeting minors, available directly on the website of the National Center for Missing and Exploited Children (NCMEC).2

In its 2021 report, NCMEC highlighted that almost 30 million reports of suspected online child sexual abuse material had been collected that year, a rise of 35% from the previous year.

This tool is called ‘Take It Down’ and is available at this link (https://takeitdown.ncmec.org/). CRO acknowledges the positive impact this tool may have, but we still have serious reservations about how it works in practice. One of the main critical issues we found is that ‘Take It Down’ only allows up to 10 photos/videos to be submitted at a time and, in our experience, the majority of victims have significantly more evidence of abuse. Having to go through the procedure numerous times to submit the remaining photos or videos may add to the harm caused by the so-called ‘re-victimisation’ of the victims involved.
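To make this concern concrete, the short sketch below (in Python, purely illustrative and not NCMEC’s actual implementation) shows how a hash-based reporting flow with a ten-item cap forces a victim holding a larger archive of evidence to repeat the submission step over and over. Tools of this kind are generally understood to match content by fingerprint rather than by sharing the images themselves, but the BATCH_LIMIT constant, the local “evidence” folder and the use of SHA-256 here are assumptions made only for illustration.

import hashlib
from pathlib import Path

BATCH_LIMIT = 10  # assumed per-submission cap, mirroring the ten-item limit described above

def file_fingerprint(path: Path) -> str:
    # Compute a SHA-256 fingerprint locally, so the file itself never has to be shared.
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def batched_fingerprints(paths):
    # Yield fingerprints in groups of BATCH_LIMIT: a victim with more material
    # than one batch allows has to repeat the submission step for every group.
    batch = []
    for path in paths:
        batch.append(file_fingerprint(path))
        if len(batch) == BATCH_LIMIT:
            yield batch
            batch = []
    if batch:
        yield batch

if __name__ == "__main__":
    evidence = sorted(Path("evidence").glob("*.jpg"))  # hypothetical local folder of material
    for number, batch in enumerate(batched_fingerprints(evidence), start=1):
        print(f"Submission {number}: {len(batch)} fingerprints")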

Moreover, CRO highlights that the tool’s guidelines further specify that ‘online platforms may have limited capabilities to remove content that has already been posted in the past.’ In this regard, we reiterate that the difficulty of completely eliminating content depicting CSAM victims is damaging to both their physical and psychological health. A study by the European Institute for Gender Equality reported that “51% of women who were victims of non-consensual distribution of intimate material, mistakenly known as revenge porn, considered suicide.”

At CRO we strongly underline the need to change and update the platforms’ current policies from the roots, so as to establish a safer and more secure environment for minors accessing the Internet. As reported by Europol, the increased circulation of CSAM during the COVID-19 pandemic also increased the need for law enforcement to identify the victims depicted in it. In fact, the rise in the detection and reporting of CSAM on the surface web (e.g. social media platforms, Google, YouTube) during lockdown draws attention to the potential harm of re-victimisation of children through the distribution of images and videos depicting them that are already online.

The issue of CSAM is much wider and broader, and to tackle it, it is necessary to understand the roots of this crime and act upon them. As mentioned in the study ‘Towards a Global Indicator on Unidentified Victims in Child Sexual Exploitation Material’, published by ECPAT International and INTERPOL, there are still numerous challenges related to the investigation of CSAM and exploitation at both the national and international levels. For example, not enough countries are currently connected to the ICSE Database, which would be fundamental for ‘enhanced victim identification capacity internationally’. It is important to be willing to cooperate in fighting this issue, as there should be no competitors in this field.

Preface on the Terminology adopted by CRO Cyber Rights Organization

Article 2 of EU Directive 2011/93 defines “child pornography” as “(i) any material that visually depicts a child engaged in real or simulated sexually explicit conduct; (ii) any depiction of the sexual organs of a child for primarily sexual purposes; (iii) any material that visually depicts any person appearing to be a child engaged in real or simulated sexually explicit conduct or any depiction of the sexual organs of any person appearing to be a child, for primarily sexual purposes; or (iv) realistic images of a child engaged in sexually explicit conduct or realistic images of the sexual organs of a child, for primarily sexual purposes”.

The resolution of the European Parliament on Child Sexual Abuse Online of 11 March 2015 stated that it “is essential to use the correct terminology for crimes against children, including the description of images of sexual abuse of children, and to use the appropriate term ‘child sexual abuse material’ rather than ‘child pornography’”. In fact, even though the term “child pornography” is still used in legal contexts, we, as well as numerous law enforcement bodies in many countries and, at the international level, Europol and INTERPOL, reject the term “child pornography” and use either “child sexual abuse material” or “child sexual exploitation material”.

___________________________________

1 https://www.esafety.gov.au/

2 NCMEC
