
Navigating the New Digital Reality: Meta’s Shift to End-to-End Encryption

Posted on Dec 14, 2023

Meta’s recent decision to implement default end-to-end encryption (E2EE) across all personal communications on Messenger and Facebook marks a pivotal moment in digital communication security.

This move towards E2EE, a technology already integral to WhatsApp, ensures that only the communicating parties can access the content of their messages. It prevents third parties, including Meta itself, from intercepting or deciphering these communications. As Loredana Crisan, Meta’s Head of Messenger, emphasises, “This means that nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.”  

While this approach champions user privacy, it also significantly limits external monitoring capabilities, including those crucial for safeguarding children against online sexual abuse and exploitation. 

Meta, a Goliath controlling platforms like Facebook, WhatsApp, and Instagram, disclosed a staggering 21 million suspected cases of child sexual abuse to the National Center for Missing and Exploited Children (NCMEC) in 2022. This underscores the profound impact and pivotal role the company has played in the realm of child online protection. It is crucial to remember that this figure is not merely a statistic; it represents an agonising reality where each case is a testament to the violation of a child, their exploitation captured and circulated, often used as a tool for further grooming and abuse. It highlights the distressing prevalence of child sexual abuse material (CSAM) in our digital environment and offers a window onto the scale of harm in our physical world. 

The implementation of E2EE, well-intentioned in its effort to protect user data, inadvertently shields perpetrators of criminal and harmful behaviour by obscuring their illicit activities from detection. This technological shield complicates the process of identifying and prosecuting those responsible for creating and distributing CSAM, thus potentially allowing the perpetuation of child sexual exploitation and abuse with reduced risk of apprehension. 

Meta has long held a crucial role in identifying and reporting CSAM on its platforms and has a history of notable achievements and commendations from governments and child protection agencies alike, which makes this shift in the landscape even more concerning.  

The repercussions are profound and worrying. With this change, the detection and interruption of CSAM and grooming practices on Messenger, Facebook, and WhatsApp will become markedly more difficult, and in many cases impossible. This scenario potentially creates a sanctuary for perpetrators who, emboldened by increased privacy, might exploit these platforms to harm children. 

Picture a bustling city, teeming with traffic and bound by the safety measures in place: traffic lights, pedestrian crossings, and speed limits. Now imagine a sudden, jarring change: all these safety measures are abruptly removed. Streetlights go dark, traffic signals fail, and speed limits are forgotten, plunging the city into danger. This scenario reflects what happens in the digital world. Just like children in this city, who suddenly find themselves navigating treacherous roads without the safety measures they once relied on, young internet users are often left vulnerable when protective mechanisms in the digital realm are unexpectedly stripped away or become ineffective. This leaves them exposed to a multitude of risks that were previously managed or mitigated. 

Meta’s introduction of end-to-end encryption on Facebook and Messenger is akin to removing these traffic lights and pedestrian crossings, dramatically amplifying the dangers. The digital environment, already a complex maze of risks, becomes even more hazardous. Legislative voids in child protection, ethical debates about privacy and data protection, limited digital literacy among guardians, and the captivating yet potentially perilous allure of popular digital platforms all now confront children. Meta’s move significantly exacerbates these dangers, crafting larger, obscured spaces where children can be directly harmed and continuously revictimised.   

Confronting this digital watershed moment, it is imperative to demand transparency and accountability from digital giants like Meta. While strides have been made in voluntary transparency reporting, a lack of consistent, enforceable standards persists. As Meta dims the light on critical data essential for safeguarding children, we must critically examine the measures they are implementing to mitigate this substantial gap. What strategies, tools, and educational initiatives are being introduced to counterbalance this pivotal shift? 

As a vanguard child protection organisation, ECPAT International remains resolute in tackling these challenges. Recognising the intricacies of the digital sphere and the myriad platforms operating outside public scrutiny, we acknowledge the overriding need to influence and mould the future of technology in a manner that safeguards and upholds the rights and safety of children. Meta, a powerful influence in the global domain, has set a precedent with far-reaching implications for the digital landscape. Our focus remains unwavering: advocating for a digital world where children are not mere users but are actively protected and their rights are respected. 

 

ON BACKGROUND 

A case of David and Goliath, or a case of the data holders leading the blind?  

Meta is the US company that owns and operates major platforms including Facebook, WhatsApp, Instagram and Threads, as well as a range of other products and services, on which billions of people communicate daily worldwide. It employs over 66,000 people and in 2022 held assets of US$185.73 billion.  

Meta is also the company that in 2022 alone submitted 21 million reports of suspected child sexual abuse on or via its platforms to the National Center for Missing and Exploited Children (NCMEC), the US-based global clearing house for reports of online child sexual abuse from electronic service providers. 

Meta is also the company that last week announced it had activated a plan, first announced in 2018, to roll out end-to-end encryption globally on its chat app Messenger.  


Background
 

Meta platforms account for such a large proportion of reports to NCMEC for several reasons:  

  1. The company is enormous, processing and storing historically unprecedented volumes of data. Where there is that much data, a proportionate share of it may be illegal data.  
  2. The company was one of the first to implement proactive measures to detect, remove and report child sexual abuse material on its platforms. It has consistently had bad press because it has stuck its head above the parapet.  
  3. The company's platforms are social platforms, specifically designed for people to gather and interact with others. That means that people, as in offline life, do good and bad things to each other.  


What this move means
 

  • Meta, the individual platforms, and therefore law enforcement and specialised organisations such as internet hotlines, will no longer be able to access the content of messages – whether text, images or videos – shared via end-to-end encrypted apps. 
  • This is already the case for messages shared via WhatsApp. That company claims it has other techniques, including the processing and analysis of metadata (signals associated with a message, such as timestamps, location, and frequency of contact between two numbers), to detect criminal behaviour.  
  • Therefore, it will overnight become significantly more difficult, and in most cases technically impossible, to detect and disrupt people sharing CSAM via Messenger, Facebook or WhatsApp (Instagram is not yet part of the transition to E2EE), or engaging in sexual conversations with a child for the purpose of sexual exploitation and abuse.  
  • It also means that intentional offenders and other types of perpetrators will include Messenger and Facebook among their technologies of choice, because they know they are unlikely to be detected.   


How children will be impacted
 

  • Digital environments are already fraught with a wide range of risks for children. These risks are evolving and becoming increasingly complex to mitigate, for reasons such as:  
    • A lack of legislation regulating the safety of children online  
    • Ethical and legal disagreement regarding privacy, data protection and security, and how they relate to safety. This includes issues such as how a platform determines the age profile and/or identity of its users, often referred to as "age assurance" or, in other spheres, as "know your customer".  
    • Limited caregiver digital literacy 
    • The cult of the cool is always more powerful than the story of safety 
  • So essentially, we now have one more huge space where children can be directly harmed and where children depicted in sexual abuse images can be repeatedly revictimised, far from the eyes of law enforcement. 
  • At worst, and many assume this will be the case, CSAM offenders will migrate en masse to these platforms because they are secure and easy to use. They will also lure children to these platforms from other, non-encrypted spaces where they first meet them, such as TikTok or online games.   


What we don’t know but need to know
 

In 2023, governments and the public still only know what a digital conglomerate like Meta wants them to know. There has been some significant progress in voluntary transparency reporting, including by Meta platforms. These reports typically tell us things like how many CSAM reports were sent to law enforcement in a given year, and how many requests for lawful access were received from law enforcement. And while there are a few frameworks for those reports, there is no international standard that enables comparability.  

Crucially, there are also no regulations that enforce transparency reporting in a consistent manner, although there are steps towards this. Legislation in the UK, Australia and the EU, to name a few of the more high-profile jurisdictions, does outline certain standards and criteria that must be included in a report.  

How is this relevant? If Meta is "turning off the light" on reports of child sexual abuse that may be pivotal in identifying and safeguarding children, we need to know what steps they have taken – what tools, processes, staffing, checks, training and educational outreach – to mitigate the gaping hole they will create in our capacity to fight child sexual abuse online.

 

Final note 

To play Devil’s Advocate, this move by Meta is not the only thing we need to worry about, and we should try not to get distracted to the point that we lose sight of the bigger picture.  

You could argue that the platforms we really need to worry about are those that never engage, never talk publicly about these issues, are not subject to regulatory controls, and operate in non-democratic countries where government interference is a significantly more likely possibility.  

But at the end of the day, Meta is one of the most powerful companies in the world, wielding power and influence far beyond those of many nation states. If they start to call the shots, then the floodgates are essentially open, and the question becomes not whether we can influence the future of tech, but how we can limit the power that it will wield.