Child Sexual Abuse Material (CSAM)

Child Sexual Abuse Material (CSAM) is a growing international concern. Our reporting tool helps the Department of Internal Affairs (DIA) identify instances of CSAM, we sit on the Independent Reference Group for the Digital Child Exploitation Filter, and we advocate for those who create and disseminate CSAM to be prosecuted in line with international best practice.

Digital Child Exploitation Filter

ECPAT NZ works with the Aotearoa New Zealand Government’s Department of Internal Affairs (DIA), which leads the national response to prevent the spread of child sexual abuse material (CSAM). We sit on the Independent Reference Group for the DIA-operated Digital Child Exploitation Filter. This group was established by the DIA to maintain oversight of the operation of the Digital Child Exploitation Filtering System and to ensure it is operated with integrity and adheres to its Code of Practice principles. The filter is the Government’s primary preventative measure against CSAM: websites can be blocked, material can be followed up by investigators, potential offenders can be directed to help, and essential data can be gathered on users of this content.
More than 60% of victims of child sexual abuse imagery are prepubescent

CSAM Reporting Tool

ECPAT NZ runs a web-based tool for the anonymous reporting of child sexual abuse imagery and other digital crimes relating to child sexual exploitation (such as trade in child-like sex dolls). The tool allows members of the public to report content to a non-governmental organisation rather than directly to government agencies, which is a noted preference. Reports are forwarded to the DIA and followed up by investigators from the DIA and NZ Police as appropriate.

In 2022, ECPAT NZ received 590 reports of CSAM through our web-based reporting tool and provided data to an international investigation.

We are exploring partnership opportunities with ECPAT Sweden and other organisations to support the use of tools that can remove exploitative images and self-generated content used for revenge porn, abuse, and new forms of online bullying.

We assist other non-governmental organisations and government agencies with information on CSAM. This includes contributing to ongoing efforts to extend filter services across the Pacific Islands, as well as providing training and resources to organisations and members of the public on online and digital exploitation.

67% of CSAM survivors said the distribution of their images impacts them differently than the hands-on abuse they suffered because the distribution never ends and the images are permanent.

(Canadian Centre for Child Protection, 2017)​