Dublin Rape Crisis Centre: View on social media ban for under-16s
10 February 2026

We fully appreciate and support what we believe to be the Government’s intention behind the proposal to ban under-16s from social media: to protect young people from real harm online. However, based on international experience and evidence emerging from both Europe and Australia, we have concerns about whether a blanket ban can deliver the protection we all want for children.
Furthermore, Dublin Rape Crisis Centre is concerned that focusing solely on the risks to children does not address the harm people of all ages are already experiencing because of the largely fig-leaf measures social media platforms have put in place to date. Addressing the harm at source requires confronting platform practices, not simply restricting users.
Experiences from other jurisdictions already show that young people are quick to find workarounds. In Australia, where the world’s first such ban has been introduced, teenagers rapidly turned to alternative apps or VPNs to bypass restrictions, even in the earliest weeks of enforcement. In France and the UK, policymakers have acknowledged that enforcement remains one of the biggest unresolved challenges because there is still no reliable, rights‑respecting way to verify age at scale. Even with significant compliance efforts, Australia reported that 4.7 million under‑16 accounts were removed in the first month alone, yet by no means have all underage accounts been eliminated. This reinforces the risk that a ban may create the impression of safety without guaranteeing it.
Age‑verification technologies themselves remain deeply problematic. UK analysis warns that the systems required to enforce such laws could amount to mass surveillance, requiring age verification for all users, not just those under 16.
It is also important to consider what adolescence actually looks like today. Between the ages of 12 and 16, young people are forming relationships, exploring their identities, taking some risks and in doing so, becoming more independent. These developmental milestones have not changed for centuries, but the environment in which they play out increasingly includes online spaces. Removing teenagers entirely from social media risks cutting them off from places that are central to modern peer interaction. Australian wellbeing research shows that both very high use and no use at all are associated with poorer outcomes, and that moderate, supported use is linked to better wellbeing for many young people. This is especially true for boys in mid‑adolescence, for whom online communication often plays a key role in maintaining friendships.
Dublin Rape Crisis Centre has been one of the leading voices highlighting how social media platforms expose young people to harmful and distressing content, including material that is fuelling the epidemic of sexual violence and misogyny our clients are experiencing. However, a ban places the responsibility for avoiding harm squarely on teenagers rather than on the platforms that create, host, and amplify it. Across Europe and Australia, policymakers are increasingly recognising that platforms act as publishers. Their algorithms, recommender systems and design choices determine what content children encounter, and they must be held accountable for the harms that result.
There is also a real concern that bans may push young people into less visible or more secretive online behaviour. Australian experts warn that restricting access may lead some teenagers to engage online without any adult awareness or guidance, which only increases their vulnerability if something goes wrong. Instead of removing opportunities for learning, we believe young people need structured, guided practice in navigating online spaces safely — just as they do in the offline world.
For all of these reasons, we believe that a more effective response lies elsewhere. International evidence points toward much stronger platform regulation, including safe‑by‑design requirements, a total ban on harmful content, the elimination of harmful algorithms, and tighter controls on addictive features, combined with sustained investment in digital literacy, education, and gradual, supported access. This would make the online space safer for everyone, both before and after young people reach their 16th birthday. Australia’s experience also shows that regulatory pressure alone can push platforms to change even before enforcement fully takes hold. The UK, too, is moving toward targeted restrictions on algorithmic and design features rather than blanket bans, recognising both practical and developmental concerns.
It is critical that it is the platforms that are held to account, and this means that they must be treated as publishers. Further delays in recognising them as such only expose more people to the toxic and harmful content that they have so successfully commercialised.
We all want the same outcome: a digital world in which every person is protected from harm and supported to thrive. But based on the emerging evidence, a blanket ban on social media risks directing our attention away from the real source of harm — the systems and structures created by Big Tech — while also limiting the opportunities young people need to build confidence and resilience online.
It is vitally important that the conversation is not just about the online harm children are exposed to but the reality that people of all ages are currently suffering appalling tech-facilitated violence while platforms are free from the real accountability that the classification of ‘publisher’ would bring.
Dublin Rape Crisis Centre therefore urges that the conversation about eliminating the harm caused by social media be redirected towards the platforms themselves, and that real change be achieved through the classification of online platforms as publishers. A safer internet lies not in banning under-16s from social media, but in regulating platforms for what they are: publishers. This approach addresses the root causes of online harm and ensures meaningful protection for all.