From Harm to Accountability: All Party Parliamentary Group on Sexual Violence
14 April 2026

Imagine receiving an explicit image of yourself depicting an encounter that never happened. This is just one of thousands of instances of sexual violence playing out in the digital world. One woman has had private intimate images stolen from an online account and fears they have been shared online; another suspects that an ex-partner secretly recorded a sexual encounter without her consent.
These personal stories and others like them reveal a growing crisis at the intersection of technology and sexual violence, one that Irish law is not yet equipped to address. The All-Party Parliamentary Group (APPG) on Sexual Violence, convened by the Dublin Rape Crisis Centre, held its third session to confront that crisis.
The session, From Harm to Accountability: Delivering Meaningful Reform, was chaired by Dublin Rape Crisis Centre CEO Rachel Morrogh and featured expert contributions from Dr Lorraine Hayman, a doctoral researcher at the Centre for Global Women’s Studies at the University of Galway, and Professor Barry O’Sullivan, an internationally recognised expert in artificial intelligence from University College Cork.
Since the group last met in December, the Grok incident on X, in which AI-generated non-consensual sexual images of real women were created and shared at massive scale, has laid bare just how easily these tools can be weaponised. As Rachel Morrogh put it in her opening remarks: “the genie is out of the bottle.” The question now is how legislators can get in front of the problem.
The Scale of the Problem
Dr Hayman presented findings from her research into what she terms “cyber-located sexual violence”, a spectrum of non-contact, online forms of sexual violence that goes well beyond deepfakes. Her survey of 280 women found that among 18–24 year-olds, 88% had experienced cyberflashing, usually men sending images of their genitals through direct messaging on social media. A further 16% had encountered the non-consensual creation of intimate images, whether real images of them or AI-generated content; 11% had experienced intimate images being shared without consent; and 8% had been threatened with the sharing of a deepfake image made to look like them.
These figures point to a problem occurring at enormous scale in Ireland and beyond, yet there is still no national prevalence data on technology-facilitated sexual violence. The omission of such data from the 2022 CSO survey has left a major gap that needs to be addressed.
Perhaps most striking was the gap between how victims and those at risk perceive these harms and how the law treats them. Ninety-eight percent of the women surveyed considered online sexual abuse to be sexual violence, yet under Irish law these offences are not classified as sexual offences. This disconnect has real consequences: only 12 women reported what had happened to them, and only 3% sought support from a sexual violence service. Many were unsure whether what had happened was even illegal. Those who did go to An Garda Síochána were told there was nothing that could be done, even after the enactment of Coco’s Law.
Institutional Failure
One of the women who spoke to Dr Hayman described being cyberstalked: her perpetrator used technology to track and harass her. Reporting to An Garda Síochána compounded her harm, as she was told there was very little that could be done for her.
Another, the woman mentioned above who feared her intimate images had been stolen from an online account and shared online, was asked by An Garda Síochána to provide evidence that a crime had occurred.
Another’s ex-partner contacted her boss and accused her of being an abuser. Her workplace responded not with the care she needed but with legal disclaimers distancing itself from what it characterised as a private matter.
These are examples of systemic, institutional failure. The time and energy these women have had to pour into trying to feel safe and to secure some form of accountability is time and energy they cannot spend on their careers, their families, or their communities. This is the cascading, real-world cost of legislative gaps.
The New Landscape of Harm
Professor Barry O’Sullivan brought more than 35 years of expertise in artificial intelligence to the discussion. He was direct about the reality legislators are facing: generative AI now makes it possible to create fake intimate images that even experts cannot reliably identify as fake. Digital evidence has limited value when it is no longer possible to tell what is real.
He described how social media platforms are built around recommender systems: algorithms designed to keep users engaged for as long as possible, and never designed with these kinds of harms in mind. On X alone, a quarter of the platform’s 600 million accounts are anonymous, allowing perpetrators to act without consequence. The Grok scandal, he argued, amounted to automated sexual assault at an astonishing scale and with astonishing irresponsibility.
Professor O’Sullivan also flagged emerging threats that most people have not yet considered. Wearable devices like smart glasses are already being used to record people covertly, including intimate content. These devices can be linked to nudification technology, meaning someone could theoretically record an entire streetscape and generate non-consensual intimate images of everyone in it. The victims would never know. AI-powered tools can take a single photograph and identify a person’s home address, email, and other private data. The potential for AI-enabled stalking is vast.
He was equally blunt about the challenge of holding platforms to account. These companies are larger than many states. Fines, even substantial ones, mean little to companies for whom personal data is the core asset. Ireland’s position as a European headquarters for many of these firms adds a further complication: the country is financially dependent on companies it is being asked to regulate.
What Needs to Change
The discussion surfaced a number of concrete areas where legislative and regulatory action is urgently needed.
Close the creation gap in Coco’s Law. Currently, the non-consensual distribution of intimate images is criminalised, but their creation, including by AI, is not. The UK has recently approved legislation covering even the solicitation of deepfake creation, recognising that production can be outsourced to jurisdictions where it is not illegal.
Classify these offences as sexual offences. The current legal framework does not categorise technology-facilitated intimate image abuse as a sexual offence, despite the fact that 98% of affected women in Dr Hayman’s research identified it as such. This matters for how victims are treated, how Gardaí respond, and how the courts approach sentencing.
Treat platforms as publishers. Rachel Morrogh was clear: platforms must be recognised as publishers and held to the responsibilities that come with that designation. The current model, where platforms profit from content without accountability for harm, is unsustainable.
Introduce meaningful financial penalties. Current fines are paltry relative to the revenues of technology companies and do not function as a deterrent. Penalties need to be scaled to the level where they represent a genuine commercial consequence.
Require pre-release safety assessments. Professor O’Sullivan drew an intuitive comparison: pharmaceutical companies must demonstrate that their products are safe before they can sell them. No such requirement exists for AI technologies. The release of AI tools is, in effect, the test; the public bears the cost.
Improve regulatory coordination. Responsibility for AI-enabled sexual abuse is currently dispersed across Coimisiún na Meán, the European Commission, the Data Protection Commission, and An Garda Síochána. No single framework comprehensively addresses the risks, and the fragmentation creates real gaps in accountability and enforcement.
Legislate for harms, not just contexts. Dr Hayman argued that legislation needs to be drafted around the harms caused, rather than attempting to describe every possible technology or platform. The tools and spaces will always evolve faster than specific provisions. Technology-neutral, harm-focused legislation is the only way to get ahead of the problem.
Introduce traceability for digital content. Professor O’Sullivan noted that we can trace every piece of meat consumed in Ireland but have no equivalent for digital content. Digital provenance standards would help address the challenge of verifying the origin and authenticity of online material.
Looking Ahead
The evidence presented at today’s session underscored a central point: the harms being discussed are not new, but their scale, speed and accessibility have changed fundamentally. What was science fiction to most people five years ago is now a reality, and existing legislative and regulatory frameworks are not keeping pace.
Both Professor O’Sullivan and Rachel Morrogh stressed that this challenge needs to be tackled at EU level and beyond, given the scale and mobility of the technology companies involved. But there are concrete steps that Irish legislators can take now, starting with closing the creation gap in Coco’s Law and building toward a framework that treats platforms as accountable publishers.
Dublin Rape Crisis Centre will continue to support the work of this APPG and to advocate for legislative reform that centres the experiences and needs of survivors. We are committed to ensuring that as these technologies evolve, the protections available to those they harm evolve with them.