21 Jul 2025
This submission is made by ECPAT Child Alert NZ (ECPAT NZ), based on the findings of “I’m just content to them: Living through sexual exploitation in Aotearoa New Zealand”, a qualitative study commissioned by the Ministry of Justice and due to be released later this month. The research is based on the experiences of victims who were sexually exploited as children and adolescents, including through online channels. The report centres the voices of victims themselves, who describe how digital environments facilitated, intensified, and sustained exploitation, and how institutional responses failed to prevent, detect, or interrupt these abuses. We welcome the Education and Workforce Select Committee’s decision to investigate the harm young New Zealanders encounter online and urge members not to overlook one of the most silenced and stigmatised forms of online harm – sexual exploitation of youth.
As our lives become increasingly digitally mediated, so do the tactics of predators, abusers, and people wishing to exploit children and youth for personal gain. Online sexual exploitation of youth refers to situations in which a child or young person is coerced, manipulated, deceived, or forced through digital means into engaging in sexual activities or having sexual material (images, videos, live-streaming, or other content) created, shared, or distributed, often for the benefit of another person or group.
Many victims are unaware that their images or online content are being commodified. Some are unaware the images or content even exist. Others engage in what appear to be mutual and developmentally normative ways, but the people they engage with are not who they claim to be and subsequently exploit them. Humiliation, shame, social exclusion, withdrawal from school and from peers, poorer mental health, and suicide are some of the potential consequences of these kinds of sexual exploitation.
Approximately five percent of New Zealanders experience image-based sexual abuse online.i In 2023, the Department of Internal Affairs’ Digital Child Exploitation Team (DCET) conducted 47 investigations into child sexual abuse material (CSAM), classified around 2.96 million items, seized 209 devices, and safeguarded 35 children.ii
In 2024, 1,032,683 attempts to access websites hosting child sexual exploitation material were blocked by the Digital Child Exploitation Filtering System, and the number of prosecutions grew. In addition, DCET identified a New Zealand-led trafficker network through an investigation that led to the seizure of 12,115 CSAM items,iii highlighting the insidious entrenchment of exploitative abusers within New Zealand who pose a risk to young people online. The number of reports forwarded to New Zealand authorities by the United States’ National Center for Missing & Exploited Children (NCMEC) has nearly quadrupled in the last few years.iv Finally, ECPAT NZ is increasingly made aware of young people impacted by sextortion, the “type of blackmail when someone threatens to share a nude image or sexually explicit video of you online, unless you pay them or provide more sexual content”,v often with devastating consequences.
Yet there is very little infrastructure developed to respond to youth who are sexually exploited online. None of New Zealand’s current government frameworks explicitly recognise online sexual exploitation of youth as a distinct phenomenon requiring a dedicated response.
There is ongoing confusion over what counts as online sexual exploitation of youth, especially whether it requires an immediate exchange or can include abuse that’s later commodified. However, we define online sexual exploitation as any sexual interaction online that is coercive or power-imbalanced and involves any transaction in which one party accrues (or has the potential to later accrue) material gain from the other’s participation.
There is also little knowledge about victims’ experiences to guide policymakers. International research on image-based sexual abuse illustrates how institutional responses are often fragmented, victim-blaming, and poorly aligned with victims’ needs.vi Domestic research, by comparison, is scarce. ECPAT NZ, however, recently undertook research exploring the experiences of people who were sexually exploited as youth; some in person, others online. Their experiences, and their perspectives on prevention, intervention, and recovery, should orient conversations about online harm – they know the most, and are the most impacted by what others do not know.
This submission therefore sets out the findings of ECPAT NZ’s report (titled “I’m just content to them”) on sexual exploitation in New Zealand, and links it to what must be prioritised within this inquiry. This inquiry provides an opportunity to correct these misunderstandings and to adopt a victim-informed definition of harm that reflects victims’ own experiences. We recommend that the Committee adopt the following framing for the purposes of its inquiry:
Harm in online environments is the result of predatory behaviour by perpetrators who exploit digital technologies to groom, manipulate, and coerce young people, and may be increased through institutional and platform failures to respond with justice and pathways to redress and recovery.
Framing harm in this way ensures that responsibility is placed on those who choose to perpetrate and enable exploitation, not on the young people targeted.
The findings of “I’m just content to them” show that what victims experienced was the result of deliberate actions by perpetrators who used digital technologies to groom, manipulate, and control them. These actions were often reinforced by institutional inaction, disbelief, or mischaracterisation of the exploitation as voluntary. The victims described experiencing pressure, coercion, and threats from much older individuals, yet also explained how institutional actors often failed to recognise the relational and psychological manipulation involved. For example, the report documents how some victims were treated as if they had “chosen” to engage, because their exploitation occurred in the context of ongoing communication rather than immediate physical force. This misreading of their experiences left them unprotected and undermined their ability to seek help.
The report makes clear that harm in online environments should not be framed as an unavoidable consequence of the existence of social media or young people’s presence online. The internet and social media are not the cause of harm but are tools used by perpetrators to gain access to young people and to maintain control over them. As the report shows, harm arises from the actions of people who choose to exploit and from the failures of systems to intervene. Any response must therefore start from this understanding: that responsibility sits with perpetrators and with the systems that enable them, not with young people’s use of technology.
Prevention strategies that focus narrowly on instructing young people to change their behaviour (for example, to avoid risky situations or disengage from social media) obscure the agency of perpetrators and neglect the conditions that allow exploitation to continue. The report demonstrates that when victims sought help, they were often told to “be more careful” or remove themselves from platforms, while little was done to hold perpetrators or platforms accountable – leaving them unprotected, and the underlying causes of harm unchanged.
The findings also showed that perpetrators used digital platforms to gain access to young people, to maintain secrecy, and to exercise control over them. Victims described how these environments allowed perpetrators to enter their lives without detection, using apparent anonymity and accessibility to initiate and escalate contact.
Perpetrators typically approached them through social media or other shared digital spaces, often posing as peers or otherwise trustworthy adults. They established contact privately, almost always without the awareness of family members or other protective adults. The contact was then escalated into grooming and coercion, as perpetrators sought to build trust and dependency. Victims felt increasingly pressured to send explicit images or videos, which were then used against them as tools of blackmail and humiliation. The report highlights that once victims shared images, perpetrators would use the threat of exposing those images publicly, or to family and peers, to compel continued compliance, silence, and/or further recruitment.
Barriers to disclosure and support included the risk of the exploitation escalating, internalised blame, the social silencing of online forms of abuse, a lack of awareness amongst the key professionals victims had contact with, and the perceived and actual lack of help that would result from disclosure.
Perpetrators often created dependency and secrecy by presenting the abuse as a private relationship and by threatening victims with consequences if they spoke. This included threats to share explicit images, to humiliate victims publicly, or to harm them or their families. Victims described feeling trapped, with no safe way to refuse further demands or to seek help. Meanwhile, many of them internalised blame for what happened, sometimes believing they had consented to the relationship or caused the situation by engaging online. This perception was reinforced by responses from adults and institutions that treated the situation as the victim’s fault or minimised its seriousness. Some reported being treated as though their exploitation was a result of poor choices or attention-seeking, rather than being recognised as abuse. Others described how disclosures to teachers, police, or family members led to further shame and isolation rather than protection.
These barriers delayed disclosure and prolonged the exploitation. In some cases, victims only sought help after the harm had escalated to an intolerable level or after the perpetrator moved on to other targets. Even then, the responses they received rarely addressed the underlying abuse or provided meaningful protection. Services were not always equipped to recognise or respond to the specific dynamics of online exploitation, and victims’ disclosures could result in punitive or alienating interventions rather than supportive ones.
Throughout the report are examples of serious, often potentially disabling, impacts of the harm on victims who experienced it. For most of those who participated, that harm extended far beyond the immediate abusive interactions and continues to affect victims’ lives long after the exploitation has ended. Victims described severe psychological impacts, including anxiety, depression, and a persistent sense of fear. Many reported ongoing shame and humiliation related to the exposure or circulation of images and the social stigma attached to the abuse. For some, these feelings led to withdrawal from peers, loss of confidence, and difficulties trusting others, and for others, the nature of the exploitation also damaged their family and social bonds. Some described feeling unable to pursue opportunities or maintain friendships because of the lingering consequences of the abuse and the way others perceived them.
The impacts were also felt in education, housing, and relationships. Victims experienced disrupted schooling and instability in living arrangements. Some became suicidal; some turned to substance use. None felt the trajectories of their lives were unaltered by the exploitation. The right help, specific to what they had experienced, did not exist; victims carried the burden of managing the consequences of the exploitation largely on their own. Institutional responses rarely addressed the full extent of harm, and in some cases, simply added to their experiences of adversity.
The “I’m just content to them” report documents how young people in Aotearoa have been deliberately targeted and exploited online, and what this does to their lives. This kind of harm is caused by perpetrators, who use digital platforms to manipulate, coerce, and control victims, and is compounded by institutions that fail to recognise, prevent, or stop abuse and exploitation because it looks different to the collective construction of ‘exploitation’, especially when enacted digitally. The result is that victims are left to carry the consequences of exploitation largely on their own, while those responsible (including individuals, institutions, and enabling systems) face little accountability.
We urge the Education and Workforce Select Committee to adopt a victim-informed approach to its inquiry, ensuring that recommendations focus on the accountability of those who perpetrate and enable harm, and on the provision of the right support (flexible, immediate support) to those impacted by it – without restricting their autonomy in the way the exploitation itself did. This inquiry presents an opportunity to address not only the symptoms of harm but the mechanisms that cause it (in this case, sexual exploitation), ensuring that young people are protected.
Best practice overseas and the evidence base for effective responses
International evidence shows that effective responses to online sexual exploitation of young people are victim-centred, systemic, and focused on stopping predatory behaviour rather than restricting young people’s access to digital spaces. Best practice approaches recognise that harm arises because of perpetrators’ deliberate actions and enabling environments, not from children’s use of technology itself. These approaches are rights-based, trauma-informed, and proactive, addressing both immediate risk and underlying vulnerabilities.
In the United Kingdom, the Child Exploitation and Online Protection (CEOP) Command combines investigation, prevention, and victim support under the National Crime Agency. Education about online risks and healthy relationships is mandatory in schools, and the new Online Safety Act 2023 creates statutory duties for platforms to prevent and remove illegal content promptly. CEOP also provides children with trusted reporting mechanisms and direct victim support.
Australia has established a dedicated eSafety Commissioner, who oversees a national response to online exploitation, including public education campaigns, victim support services, and regulatory enforcement under the Online Safety Act 2021. This legislation gives the Commissioner powers to require platforms to remove abusive material, including nonconsensual images and child sexual abuse material, within tight timeframes. Education resources are provided not only to children but also to parents, teachers, and other protective adults, emphasising how to recognise and respond to grooming and coercion.
In the European Union, a proposed Child Sexual Abuse Regulation would impose obligations on platforms to detect, report, and remove both known and newly discovered exploitative material. The draft regulation also requires platforms to assess and reduce risks proactively, while respecting children’s rights to participate safely online. Canada’s national strategy includes a centralised reporting and education service (Cybertip.ca), specialist law enforcement units, and sustained investment in research and prevention campaigns.
Above all, the research shows that education is most effective when it combines digital literacy with teaching about healthy relationships and the dynamics of coercion. This education should extend to adults, including parents, teachers, counsellors, police, and social workers, so that they can identify warning signs and respond appropriately. School-based programmes that focus only on stranger danger or individual behaviour have been shown to be much less effective than those that explain relational and systemic dynamics. This distinction is important; modernised, digitally-relevant, and specialist-informed consent education is crucial, but so is the equivalent upskilling of every person in young people’s ecologies, who often have more power to enact digital safety than young people themselves, and whose responses shape young people’s experiences of disclosure.
Legal and institutional responses work best when laws are enforced consistently and when police and prosecutors are trained to recognise grooming and coercion, even when initial communication appears voluntary. Specialist child protection units and victim-centred protocols improve outcomes for victims. Regulatory obligations on platforms, such as those in the UK and Australia, have been shown to reduce the circulation of abusive material and make it harder for perpetrators to operate. Voluntary industry codes, by contrast, have little evidence of effectiveness without statutory backing and oversight.
Finally, support for victims must be sustained and trauma-informed. The evidence demonstrates that specialist advocacy, long-term counselling, and stable, knowledgeable case management contribute to recovery and reduce long-term impacts of exploitation. Short-term, punitive, or judgemental interventions (which victims in our research reported) risk exacerbating harm and entrenching stigma.
Key recommendations
i and ii Pacheco, E., Melhuish, N., & Fiske, J. (2019, January 23). Image-based sexual abuse: A snapshot of New Zealand adults’ experiences. Netsafe.
iii Department of Internal Affairs. (2025, April 30). Over one million attempts to access child sexual exploitation material blocked in 2024 [Press release]. Department of Internal Affairs.
iv New Zealand Police. (2024, April 23). NZ agencies urge young people to be safe online as reports of online child exploitation continue to rise [Media release]. New Zealand Police.
v Netsafe. (2023). Sextortion. https://netsafe.org.nz/scams/sextortion
vi Henry, N., Powell, A., & Flynn, A. (2017). Not just ‘revenge pornography’: Australians’ experiences of image-based abuse: A summary report; McGlynn, C., Rackley, E., & Houghton, R. (2017). Beyond ‘revenge porn’: The continuum of image-based sexual abuse. Feminist Legal Studies, 25(1), 25–46.