Towards online child protection in the EU

How can we ensure child safety online—without giving up our privacy?

Reports show that the amount of child sexual abuse material found online increased from 1 million in 2014 to over 20 million in 2021. It is clear that the long-running practice of relying on tech platforms to self-regulate is not working.

The current approach of allowing big tech to opt in and self-regulate has failed. Effective, long-term solutions to counter the proliferation of child sexual abuse and exploitation material online lie in government regulation. With such regulations drafted or already in place in Australia, the United Kingdom, and the United States, all eyes now turn to the leadership of the European Union.

On 11 May 2022, the European Commission published its much-anticipated proposal for a regulation to prevent and combat child sexual abuse. 

The proposal will provide much-needed clarity on what technology companies must do to detect, remove, and report child sexual exploitation and abuse. 

It contains two key strands: 1) new obligations for ‘providers of information society services’ to protect children from sexual abuse and exploitation on their platforms and services; and 2) the creation of an EU Centre to coordinate and support action by companies and Member State Coordinating Authorities.

Law enforcement, technology companies, children’s organisations, and other stakeholders cannot prevent child sexual exploitation and abuse in a vacuum.

It is essential for all stakeholders to step up and play their part in identifying and appropriately sharing information when a crime occurs, when a child is at risk, or when they can contribute to preventing a child from being harmed.

With this new proposal, the European Commission is creating a rulebook on how and by whom that information must be gathered, and how it is to be used to protect children while respecting users’ privacy online.

These are complex issues, and there is no one-size-fits-all solution that offers comprehensive privacy while ensuring the protection of children online.

As tech companies continue to introduce end-to-end encryption onto their platforms for users’ privacy, it is critical that they also implement the necessary child safety tools and features.

Crucially, the EU public understands this. In a recent survey conducted by ECPAT International member organisations across the EU:

  • Nearly 7 out of 10 adults felt that at present, there is little to no privacy online.
  • There was an overarching concern for children’s safety online, with 73% of adults believing children cannot go online without being approached by adults looking to harm them.
  • Encouragingly, 76% of adults indicated a willingness to compromise some of their own personal privacy online to allow automated technology tools to scan for and detect images of child sexual abuse and other forms of sexual exploitation of children.

Presently, the vast majority of us have already consented to targeted measures, such as malware and spam filters, that store our data in order to keep us and our devices safe online.

In a similar way, technologies built to detect child sexual abuse material constitute a highly targeted and specific measure that efficiently enables evidence of child sexual abuse to be removed from the online environment.

This protects the dignity and privacy of survivors whose recorded abuse is circulating online, serves as an important deterrent to offenders, and ensures that online service providers are not facilitating the exchange of criminal content.
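
To illustrate what "highly targeted" means in practice, below is a minimal, simplified sketch of hash-based matching against a vetted list of known material. It is an assumption-laden illustration, not a description of any specific deployed system: real services use perceptual hashing technologies such as PhotoDNA and hash lists curated by vetted bodies, whereas this sketch substitutes a plain SHA-256 digest and a hypothetical hash list purely to show the principle that only exact matches against known, verified material are ever flagged.

```python
import hashlib

# Hypothetical, illustrative hash list of known, verified abuse material.
# In real deployments such lists come from vetted organisations and use
# perceptual hashes (e.g. PhotoDNA), not plain cryptographic digests.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def matches_known_material(file_bytes: bytes) -> bool:
    """Return True only if the file's digest appears in the vetted list.

    Content that does not match is never flagged, stored, or inspected
    further, which is what makes the measure targeted rather than a
    form of general surveillance.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES


if __name__ == "__main__":
    # Example: a file is checked at upload time; only a match triggers
    # removal and a report to the relevant authority.
    upload = b"example image bytes"
    if matches_known_material(upload):
        print("Match against vetted hash list: remove and report.")
    else:
        print("No match: the upload passes through untouched.")
```

The design point the sketch makes is the same one the paragraph above makes in prose: the filter compares uploads against a fixed list of already-identified material and does nothing with everything else.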

We need to ask ourselves: Do we consider measures such as spam and malware filters to be a violation of our privacy rights? Do we want child sexual abuse material to circulate online any more than we want to fall victim to malware or spam?

The European Commission’s proposal is, in no way, shape, or form, the wholesale abandonment or violation of privacy rights.

It means protecting children so they can benefit from safe and empowering digital lives, in the same way that we provide countless safeguards for children in the physical environment.

As negotiation and debate around the proposal begin, we need to remember that this is not only about data, resources, or policies. This is about real children and their right to live healthy, safe, and secure lives in digital and physical environments.
