
Irish Internet Hotline fully supports a total ban on “nudify” apps and other AI-based tools capable of producing deepfake sexual images of children and adults.
Our longstanding position remains that there is no legitimate purpose for such technology. Nudify apps and similar AI tools serve only to undermine the safety and dignity of others, and as such are an affront to fundamental rights and personal autonomy. When children in particular are targeted, the consequences can be lifelong and devastating. It is absolutely crucial that the Government act to make it an offence to possess, create, or distribute AI tools capable of generating deepfake ‘nudification’ images.
Through our work at the Irish Internet Hotline, we continue to see the real harms caused by child sexual abuse material (CSAM), child sexual exploitation material (CSEM) and Intimate Image Abuse (IIA). AI-generated sexual imagery adds a new and dangerous dimension. Tools that can “nudify” images, simulate sexualised acts, or realistically manipulate someone’s likeness create serious risks of abuse, harassment, and coercion.
Irish law covering the production of such material is already robust. The possession and distribution of any sexual imagery depicting children (including AI-generated content) is prohibited in all circumstances under the Child Trafficking and Pornography Act 1998. Additionally, under the Harassment, Harmful Communications and Related Offences Act 2020 (“Coco’s Law”), it is an offence to take, publish, distribute, share, or threaten to share intimate images without a person’s consent, including images that purport to be of someone even if they are digitally altered. The law recognises two levels of offence: one where there is intent to cause harm, which can carry an unlimited fine and up to seven years’ imprisonment, and another where images are shared without consent but without specific intent to harm, which can carry a fine of up to €5,000 and/or up to 12 months’ imprisonment.
The emergence of these AI tools underscores a simple but urgent truth: technological innovation must never come at the cost of human dignity, consent, or the safety of children. Strong legal prohibitions and enforcement are essential to prevent AI from being misused to commit sexual offences. Today, generative AI engines can produce images from a text prompt, including naked images or sexualised depictions of children, and only the application of safety-by-design principles can prevent this. Some companies implement these safeguards; others do not. We call on all AI developers releasing tools to the public to build in robust safety guardrails from the outset to prevent foreseeable misuse and harm.
Anyone affected by online sexual abuse or non-consensual sexual imagery can report illegal content directly to the Irish Internet Hotline at hotline.ie/report or to An Garda Síochána, and can access specialist support through their local rape crisis centre. Even where AI has been used to create the images, producing or sharing them remains a serious offence under Irish law.