Meta’s recent decision to implement default end-to-end encryption (E2EE) across all personal communications on Messenger and Facebook marks a pivotal moment in digital communication security.
This move towards E2EE, a technology already integral to WhatsApp, ensures that only the communicating parties can access the content of their messages. It prevents third parties, including Meta itself, from intercepting or deciphering these communications. As Loredana Crisan, Meta’s Head of Messenger, emphasises, “This means that nobody, including Meta, can see what’s sent or said, unless you choose to report a message to us.”
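To illustrate the property Crisan describes – that only the communicating parties hold the keys needed to read a message – the short Python sketch below uses the open-source PyNaCl library. It is a conceptual illustration only, not Meta's actual implementation; the party names and message are hypothetical.

# A minimal, conceptual sketch of the end-to-end encryption property using
# the PyNaCl library (pip install pynacl). Illustration only; not Meta's code.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message for Bob using her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"a hypothetical message")

# A server relaying `ciphertext` holds neither private key, so it sees only
# opaque bytes and cannot recover the plaintext.

# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"a hypothetical message"

In this model, the platform relaying the ciphertext never holds the keys; reporting a message, as Crisan notes, is the exception, because the recipient's device already has the decrypted content and can choose to forward it.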
While this approach champions user privacy, it also significantly limits external monitoring capabilities, including those crucial for safeguarding children against online sexual abuse and exploitation.
Meta, a Goliath controlling platforms like Facebook, WhatsApp, and Instagram, disclosed a staggering 21 million suspected cases of child sexual abuse to the National Center for Missing and Exploited Children (NCMEC) in 2022. This underscores the profound impact and pivotal role the company has played in the realm of child online protection. It is crucial to remember that this figure is not merely a statistic; it represents an agonising reality where each case is a testament to the violation of a child, their exploitation captured and circulated, often used as a tool for further grooming and abuse. It highlights the distressing prevalence of child sexual abuse material (CSAM) in our digital environment and offers a window onto the scale of harm in our physical world.
The implementation of E2EE, well-intentioned in its effort to protect user data, inadvertently shields perpetrators of criminal and harmful behaviour by obscuring their illicit activities from detection. This technological shield complicates the process of identifying and prosecuting those responsible for creating and distributing CSAM, thus potentially allowing the perpetuation of child sexual exploitation and abuse with reduced risk of apprehension.
Meta has long held a crucial role in identifying and reporting CSAM on its platforms and has a history of notable achievements and commendations from governments and child protection agencies alike, which makes this shift in the landscape even more concerning.
The repercussions are profound and worrying. The detection and interruption of CSAM and grooming practices on Messenger, Facebook, and WhatsApp will become markedly more challenging, potentially impossible, with this change. This scenario potentially creates a sanctuary for perpetrators who, emboldened by increased privacy, might exploit these platforms to harm children.
Picture a bustling city, teeming with traffic but kept safe by the measures in place: traffic lights, pedestrian crossings, and speed limits. Now imagine a sudden, jarring change: all these safety measures are abruptly removed. Streetlights go dark, traffic signals fail, and speed limits are forgotten, plunging the city into danger. This scenario reflects what happens in the digital world. Just like children in this city, who suddenly find themselves navigating treacherous roads without the safety measures they once relied on, young internet users are often left vulnerable when protective mechanisms in the digital realm are unexpectedly stripped away or become ineffective. This leaves them exposed to a multitude of risks that were previously managed or mitigated.
Meta’s introduction of end-to-end encryption on Facebook and Messenger is akin to removing these traffic lights and pedestrian crossings, dramatically amplifying the dangers. The digital environment, already a complex maze of risks, becomes even more hazardous. Legislative voids in child protection, ethical debates about privacy and data protection, limited digital literacy among guardians, and the captivating yet potentially perilous allure of popular digital platforms all now confront children. Meta’s move significantly exacerbates these dangers, crafting larger, obscured spaces where children can be directly harmed and continuously revictimised.
Confronting this digital watershed moment, it is imperative to demand transparency and accountability from digital giants like Meta. While strides have been made in voluntary transparency reporting, a lack of consistent, enforceable standards persists. As Meta dims the light on critical data essential for safeguarding children, we must critically examine the measures they are implementing to mitigate this substantial gap. What strategies, tools, and educational initiatives are being introduced to counterbalance this pivotal shift?
As a vanguard child protection organisation, ECPAT International remains resolute in tackling these challenges. Recognising the intricacies of the digital sphere and the myriad platforms operating outside public scrutiny, we acknowledge the overriding need to influence and mould the future of technology in a manner that safeguards and upholds the rights and safety of children. Meta's decision, made by one of the most powerful actors in the global digital domain, sets a precedent with far-reaching implications for the digital landscape. Our focus remains unwavering: advocating for a digital world where children are not mere users but are actively protected and their rights are respected.
ON BACKGROUND
A case of David and Goliath, or a case of the data holders leading the blind.
Meta is the US company that owns and operates major platforms including Facebook, WhatsApp, Instagram and Threads, as well as a range of other products and services, on which billions of people worldwide exchange messages daily. It employs over 66,000 people worldwide and in 2022 held assets of US$185.73 billion.
Meta is also the company that in 2022 alone submitted 21 million reports of suspected child sexual abuse on or via its platforms to the National Center for Missing and Exploited Children (NCMEC), the US-based global clearing house for reports of online child sexual abuse from electronic service providers.
Meta is also the company that last week announced it had activated a plan, first unveiled in 2018, to enable end-to-end encryption by default on its chat app Messenger globally.
Background
Meta platforms account for such a large proportion of reports to NCMEC for several reasons:
What this move means
How children will be impacted
What we don’t know but need to know
In 2023, governments and the public still only know what a digital conglomerate like Meta wants us to know. There has been some significant progress in voluntary transparency reporting, including by Meta platforms. These reports typically tell us things like how many CSAM reports were sent to law enforcement in a given year, and how many requests for lawful access were received from law enforcement. And while there are a few frameworks for those reports, there is no international standard that enables comparability.
Crucially, there are also no regulations that enforce transparency reporting in a consistent manner, although there are steps towards this. Legislation in the UK, Australia and the EU, to name a few of the more high-profile jurisdictions, does outline certain standards and criteria that must be included in a report.
How is this relevant? If Meta is 'turning off the light' on reports of child sexual abuse that may be pivotal in identifying and safeguarding children, we need to know what steps they have taken – what tools, processes, staffing, checks, training and educational outreach – to mitigate the gaping hole they will create in our capacity to fight child sexual abuse online.
Final note
To play Devil’s Advocate, this move by Meta is not the only thing we need to worry about, and we should try not to get distracted to the point that we lose sight of the bigger picture.
You could argue that the platforms we really need to worry about are those that never engage, never talk publicly about these issues, are not subject to regulatory controls, and operate in non-democratic countries where government interference is a significantly more likely possibility.
But at the end of the day, Meta is one of the most powerful companies in the world, wielding power and influence far beyond those of many nation states. If they start to call the shots, then the floodgates are essentially open, and the question remains not whether we can influence the future of tech, but how we can limit the power that it will wield.