December 2025
Across the globe, many governments are considering or have already introduced legislative bans on social media use for children under a certain age. On 10 December 2025, Australia began enforcing a world-first social media ban restricting children under 16 from using platforms such as Instagram and TikTok. While an urgent and genuine concern to protect children from technology-facilitated child sexual abuse and exploitation often drives these proposals, such bans do not offer a complete solution.
As an organisation committed to ending the sexual exploitation of children, ECPAT International supports efforts to improve children’s safety in digital spaces. However, we are concerned that legislative bans risk misdiagnosing the problem, excluding children from critical connection and support spaces, and shifting responsibility away from the actors best placed to drive systemic change: governments and technology companies.
Instead of hastily implementing age-based bans as a one-size-fits-all solution, we urge policymakers to invest in rights-based and evidence-led solutions that address the complexity of the digital ecosystem.
Public support for regulation is high. Recent polling suggests strong backing for bans on social media use by children under 16¹. While such measures may offer temporary political or parental reassurance, they are simplistic responses to a complex issue.
The evidence base for these policies is limited. Bans may create a false sense of security while failing to reduce actual harm. They also risk overlooking other digital environments where risks persist—or even grow. For example, the rapid development of AI means that child sexual abuse material can now be created by anyone through generative tools and shared in digital spaces beyond social media platforms.
While rights-based age assurance mechanisms are a powerful piece of the puzzle in keeping children safe online², social media bans only restrict access to the platforms. They do not guarantee age-appropriate experiences. Such experiences must also allow children and adolescents room to practise online safety and safe decision-making as their autonomy and understanding of online risks gradually grow.
To truly protect children from technology-facilitated sexual abuse and exploitation, we must move beyond surface-level solutions. A robust child protection framework for the digital age should include:
These recommendations align with the rights-based approach outlined in the UN Convention on the Rights of the Child and its General Comment No. 25 on children’s rights in relation to the digital environment³.
Any discussions about children and digital spaces must centre on the rights of the child, including the rights to safety, participation, inclusion, access to information, and privacy. Bans risk undermining these rights by conflating protection with restriction.
Children have the right to be online safely. They should not be excluded as a consequence of failures to make digital spaces safe.
Research conducted with children and caregivers across 15 countries in Europe, Asia, and South America⁴ shows that children themselves want guidance, not exclusion. They consistently express a desire to understand technology better, such as how algorithms, cookies, AI, and safety measures work. Children emphasise the need for balanced support from caregivers while retaining their growing autonomy, underscoring that meaningful protection must respect and strengthen children’s agency.
ECPAT International supports efforts to protect children online and in person. However, we do not support blanket social media bans for children as a primary policy tool. Instead, we call on states to:
Blanket bans may look like action, but real protection comes from strengthening systems that recognise children’s rights, hold companies accountable, and prioritise age-appropriate and gender-sensitive safety at every level of the digital experience.
For enquiries, please get in touch with us at communications@ecpat.org.