Self-Generated CSAM Content: What You Need to Know

In light of the publication of the Irish Internet Hotline 2024 Annual Report, I wanted to address a term that often causes confusion, and occasionally concern: Self-Generated Child Sexual Abuse Material (CSAM).

What is CSAM?

CSAM refers to any visual depiction of sexually explicit conduct involving a child. Under Irish law, specifically the Child Trafficking and Pornography Act 1998, as amended, it is illegal to possess, distribute, or produce such material. The definition, its terms, and its interpretation are set out very clearly in Section 2 of that Act.

What Does “Self-Generated” Mean?

The term self-generated simply means that the child appears to have taken the image or video themselves. This material is most often produced using a smartphone or webcam, alone or with a friend.

What is crucial to remember is that the term does not imply consent, intent, or awareness of the consequences. It is not a moral judgement. It does not imply blame.

This classification is used by our analysts to describe the apparent origin of the material, not the circumstances under which it was created. Our analysts cannot and do not extrapolate or make assumptions about whether the child was coerced, manipulated, or exploited. They assess only whether the content meets the legal threshold for CSAM.

As noted by M. Leary (2010), this is a complex problem that spans a spectrum: the material ranges from naively produced images, through coercion, to malicious viral distribution. The term self-generated reflects that complexity; it is categorically not a judgment. Prof. Mary Aiken has also done a great deal of work on this issue.

Why This Matters:

Misunderstanding this term can lead to victim-blaming, which can be deeply harmful. Children who appear in such material are victims, regardless of how the content was created. As such, providing them with protection, support, dignity, and justice is of utmost importance.

In many cases, the full context behind the creation of this material can only be uncovered through home interventions by police and social services, a process that is itself often traumatic for children and families.

Meanwhile, criminals and individuals with a sexual interest in children are actively collating and sharing this material, often in hidden deep-web forums hosted by so-called “bulletproof” providers that are increasingly difficult to take down. In 2024, the Irish Internet Hotline saw a very large increase in this material, a trend also reported by the INHOPE network and law enforcement agencies across the world.

The Role of the Irish Internet Hotline:

At the Irish Internet Hotline, we provide a safe, anonymous way for the public to report suspected CSAM. We analyse these reports and then work closely with law enforcement and international partners to remove harmful content and protect children.

We are proud to be part of the Irish Safer Internet Centre, alongside our partners.

Together, we are working to make the internet a safer place for children and young people.

What You Can Do:

  • Report suspected CSAM at hotline.ie; we will assess and decide what action is needed
  • Talk to young people about the risks of sharing intimate images
  • Support efforts to improve digital safety education and legislation
  • Challenge misconceptions that blame victims or downplay the harm

This is a conversation we need to have openly, honestly, and with compassion. If we want to build a safer internet for children, we must start by understanding the language we use and the realities behind it.

Let’s keep talking.

Mick Moran