Today, on Safer Internet Day, ECPAT urges its members, supporters, policy-makers, and everyone else to join us in asking the EU to allow tech companies to continue using tools that help protect children.
For more than two decades, we have used technology platforms to connect with family and friends worldwide. Millions more people now connect through multiple devices, making them reachable at all times. Global digital adoption was already increasing year over year, and then we were hit by COVID-19.
COVID-19 has disrupted how we live, including how we spend our time. Forbes reported that online content consumption doubled in 2020, with average daily time spent consuming content reaching 6 hours and 59 minutes, up from just over 3 hours a day. From Zoom calls to online learning, the Internet has been a source of entertainment, education, and inspiration over the past year, but has it all been positive?
The increased use of the Internet has fuelled fear and concern about the harms of the online space, particularly for children and young people. There have been many horrifying stories of bullying and harassment, and increased reports of child sexual abuse and exploitation. Add to that the growing awareness and debate about users’ rights to online privacy, and anxieties about the impact of misinformation, and it is understandable why someone might feel it has not all been positive.
Children have the right to be safe online. This Safer Internet Day, ECPAT’s topline message is that online privacy is essential for everyone, but it cannot be used to jeopardize a child’s right to safety online. By ensuring children can enjoy the right to privacy and safety, we can build a better Internet!
With more and more smart technologies in our homes, the public has a right to express concerns about how their data are stored, used, and protected. But companies and governments alike can uphold this right without putting children at greater risk online.
The National Center for Missing & Exploited Children (NCMEC) has started a petition demanding that the EU find a solution to better protect children online. You can support the cause!
On 21 December 2020, the European Electronic Communications Code (EECC) came into force in the European Union, bringing various online messaging services (e.g. Facebook Messenger and Instagram Direct) within the scope of the ePrivacy Directive, which they had not been previously. However, the legislation did not factor in the potential risk to children.
In the name of legal compliance, technology companies that had been voluntarily scanning their platforms for child sexual abuse material were left in legal limbo. Prior to the EECC entering into force, at least 300 companies and organisations worldwide had been scanning for these materials using just one type of technology.
Overnight, Facebook, the company that accounted for the vast majority of child sexual exploitation reports to NCMEC, announced that it would stop scanning its platforms for known child sexual abuse material. A few companies publicly announced that they would continue despite the uncertainty. The rest remained silent, highlighting the limited transparency of technology company activities to protect children online.
Facebook’s decision has had enormous and immediate consequences: over 2,300,000 child sexual abuse images and videos reported to NCMEC in the first nine months of 2020 originated in the EU. Three weeks after the EECC entered into force, the number of reports dropped by 46%, making harmful content invisible to law enforcement and others who would otherwise be in a position to act on it. As tech companies’ reports remain one of the best detection, identification, and risk-management tools in the fight to end child sexual abuse and exploitation, this is an extremely urgent and deeply concerning issue.
Imagine this. A single report of sexual exploitation made by a technology company to the authorities can result in a child being protected anywhere in the world. If companies stop using tools that detect child sexual abuse materials, children in situations of abuse are effectively abandoned.
False Concerns About Data Privacy Are Putting Children at Risk
The false concern raised is that using these specialised tools will invade users’ privacy more broadly. We want to be clear and address the confusion and misleading information on this point: the tools are designed to look exclusively for child sexual abuse material (CSAM). Every image has a unique digital ‘fingerprint’, which can be expressed as a code commonly referred to as a ‘hash’. The tools compare the hashes of uploaded images against the hash values of confirmed CSAM and flag only the matches; they do not ‘look at’ any other images. These technologies enable tech companies to detect, report, and remove these materials, and they are a fundamental part of the child online protection ecosystem. Reports generated through these specialised and targeted tools are also one important way for law enforcement to investigate these crimes.
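To make the mechanism concrete, here is a minimal, hypothetical sketch of hash-based matching in Python. The hash list (KNOWN_CSAM_HASHES) and its contents are placeholders, and a simple cryptographic hash is used purely for illustration; production systems rely on robust perceptual hashing (such as PhotoDNA) and vetted hash lists maintained by organisations like NCMEC, neither of which is shown here.

```python
# Simplified, hypothetical sketch of hash-based matching, for illustration only.
# Real deployments use perceptual hashes that tolerate resizing and re-encoding,
# and hash lists curated by child-protection organisations.
import hashlib

# Placeholder set of fingerprints of already-confirmed abuse material.
# The value below is not a real hash.
KNOWN_CSAM_HASHES = {
    "placeholder_hash_of_confirmed_material",
}


def fingerprint(image_bytes: bytes) -> str:
    """Compute the 'fingerprint' (hash) of an uploaded file."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_flag(image_bytes: bytes) -> bool:
    """Flag an upload only if its fingerprint matches a known entry.

    The check never inspects the content of the image itself; uploads whose
    fingerprints are not in the list simply pass through unexamined.
    """
    return fingerprint(image_bytes) in KNOWN_CSAM_HASHES
```

The point the sketch illustrates is that matching happens against a fixed list of fingerprints of already-confirmed material, so the tool never needs to read, classify, or store the content of any other image.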
A temporary derogation (i.e., a temporary exemption) from the EECC’s new rules was approved by the European Parliament in December 2020. This would remove any doubt about the legality of continuing child protection scanning, but it has yet to be confirmed by the three EU Institutions (Parliament, Commission, and Council). This leaves great uncertainty around the duration of the derogation and the safeguards that have been proposed.
On this Safer Internet Day, we ask our partners, friends, supporters, policy-makers, and everyone in between to join us in asking the EU Institutions to enact a temporary derogation so that tech companies can resume scanning for child sexual abuse content and harmful sexual behaviour towards children. You can do this by supporting our campaigns below. But the work will not stop there.
During the suspension period, ECPAT International and its member organisations will work with the EU and others on changes to the policy and a new legal framework that puts children’s rights online at the forefront.
All children and young people have the right to benefit from the opportunities of the digital environment to learn, play, and develop. While all Internet users have the right to privacy in their interactions, children and young people also have the right to be protected from harm. In a COVID-19 world, even more of our day-to-day activities take place online, and as such we must prioritise safety alongside privacy. All children have the right to be safe online, and together we can help make this right a reality!