
Children’s Groups Express Strong Support for Apple’s New Policy

Thank you, Apple, for showing what more is possible to protect children

The publication and exchange of child sexual abuse images and videos over the internet expands and compounds the grave harm already done to the victims of these crimes.

The testimony of the now adult survivors of child sexual abuse should leave nobody in any doubt about the enormity and long-lasting nature of the damage done by the continued publication of images and videos of them being abused.

The detection of child sexual abuse material leads not only to that material being taken down from the platform on which it is being shared, but also makes it possible to identify and safeguard children from ongoing or imminent abuse and to arrest offenders. For all these reasons, it is of paramount importance that every technology company, in all parts of the value chain and regardless of size, does everything in its power to detect, remove and report child sexual abuse material from its systems as quickly as possible.

For those same reasons we, as a coalition of child protection organisations and child rights advocates from many different parts of the globe, are writing to support, in the strongest possible terms, the recent policy announcement made by Apple.

In August 2021, Apple announced that the next update of its operating system would include privacy-respecting technology enabling the company to detect known child sexual abuse images on devices before they are uploaded to iCloud.

We applaud Apple’s decision not to ignore this issue, but rather to confront it.

Apple’s contribution is hugely significant and very welcome

For many years, other companies large and small have been taking steps to identify and delete child sexual abuse material and report it to the relevant authorities. In 2020, for example, nearly 300 companies submitted 21,751,085 such reports to the NCMEC CyberTipline. Apple accounted for only 265 of these, so the company is a late but most welcome actor.

Crucially, Apple is also the first major global technology company to propose the processing and detection of CSAM at device level. It is equally important to note that while the proposed CSAM detection technology is not new, Apple is proposing to add layers of encryption and other complex technologies to protect the privacy and security of iCloud users. This emphasis on privacy and security is highly noteworthy as a consistent feature of the Apple brand.

The issue may be less about technology or privacy than about trust

We note Apple’s decision to pause implementation of its announced policy towards child sexual abuse material. The company says it is doing so following “feedback from customers, advocacy groups, researchers and others”. Apple goes on to say: “we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”


While we are disappointed at the delay, we are greatly encouraged that the company gave no indication of an intention to reverse its decision or abandon the policy. On the contrary, Apple appears intent on making improvements, perhaps also taking more time to explain its methods and reassure critics they have nothing to fear in relation to privacy.

Apple’s declared intention is solely to create a capacity to identify verified child sexual abuse material. Nothing more, nothing less. The company’s licence agreement expressly forbids any user from using its systems to “plan or engage in any illegal activity”, and the company reserves the right to undertake inspections with a view to enforcing the licence agreement.

The rights of children to life, safety and wellbeing cannot be the casualties of policies that place the privacy of all users, including people who produce, share and consume child sexual abuse material, above the rights of a child in need of protection or a child whose privacy rights have been violated.

State pressure on technology companies to modify or repurpose their products for political aims is a wholly distinct issue. It requires a specific response and cannot be an argument against protecting children where the technology exists to do so. Rather, the solution lies in increased transparency and accountability globally.

We all look forward to a time when Apple’s products will be synonymous not only with the highest standards of privacy, as they are at present, but also with the highest standards of child protection. As things stand today, Apple’s stance is a hugely important step forward, both for Apple and for the sector as a whole. Having been told repeatedly that “nothing more can be done”, Apple has shown precisely what more can be done, if there is a will to do it.


Our collective call to action

  • We call on Apple to remain on course and to establish a clear timeline for implementing these vital steps to prevent the repeated sexual exploitation of children.
  • We strongly encourage Apple to go further, and we welcome dialogue with the company on how to do this.

Join us now to act

Join us in encouraging and working with Apple to ensure its announced policy stays on track and provides children with the protection they need and deserve. Together we can make a difference.
For more information or to add your name and organisation to the list please contact ECPAT International at amyc@ecpat.org