“Every unreported image represents a child potentially in imminent danger of being sexually abused again”
On 20th December 2020, a new European Communications Code came into force. Microsoft, Google and other online businesses took the view that the Code had no meaningful impact on the steps they were already taking to detect, delete and report child sexual abuse material found on their platforms. They carried on as before.
Facebook took a different approach. They decided the new Code made it unlawful for them to continue trying to detect, delete and report child sexual abuse material. They had been doing this for over five years but, at a stroke, on that day they stopped.
The impact of this decision was immediate and has had catastrophic consequences for millions of children in the European Union. By the end of July 2021, reports of child sexual abuse material being found in EU Member States were down by 76%, and this at a time when the amount of such material being found globally was increasing.
On 14th July 2021, the European Union finally adopted the text of a measure called the “Interim Derogation”. This resolved all apparent or presumed legal difficulties there might have been with the Code. It had been clear for several months that this was going to be the outcome, yet Facebook insisted they would only recommence looking for child sexual abuse material on their platforms once the final text had been adopted and had appeared in the Official Journal of the European Union. That happened on 30th July 2021. Yet at the time of writing, Facebook has still not recommenced this vital child protection measure.
ECPAT International is now seeking clarification from Facebook, asking these key questions:
ECPAT International’s Acting Executive Director Dr. Dorothea Czarnecki recently shared ECPAT’s thoughts on Facebook’s position with POLITICO: “Every unreported image represents a child potentially in imminent danger of being sexually abused again and often it will be the image of a child who needs to be found and helped now. Since Facebook stopped scanning, we must assume that there have been millions of unreported images in the EU. It is outrageous that Facebook is being so slow in reverting to their previous practice.”