We assessed 361,062 reports, and 7 in 10 of those (252,194 reports) led us to imagery online of children being sexually abused.
We were able to find 64% more of this criminal material in 2021 thanks to significant improvements within our Hotline – to our working practices and procedures, to the technology we use, and to how we draw on our hugely skilled and experienced Analyst team.
2021 was the year we saw sexual abuse imagery of girls being shared more widely than in any previous year. Girls were seen in 97% of the imagery we helped to remove.
That’s not to say we didn’t see imagery of boys; we did. And for the first time this year we took a more detailed look at what this imagery can tell us.
Almost 7 in 10 instances of child sexual abuse involved 11- to 13-year-olds. And when we see imagery of babies, toddlers and young children aged 6 and under, they are more likely to be suffering Category A child sexual abuse than Category B or Category C.
“Self-generated” child sexual abuse, where someone captures a recording via a phone or computer camera of children who are often alone in their bedrooms, is now the predominant type of child sexual abuse imagery we’re finding online – just over 7 in 10 reports included this type of content.
6 in 10 actioned reports specifically showed the sexual abuse of an 11- to 13-year-old girl who had been groomed, coerced or encouraged into sexual activities via a webcam. Sadly, we’ve seen instances of children aged 3-6 being contacted and abused in this way.
For the first time, we’ve looked at the prevalence of female offenders in the imagery we see. This imagery most often involves children aged 7-10 years old, and boys are most often the children seen being abused by a female offender.
We’ve published a deeper analysis into the abuse of domains in relation to child sexual abuse, as we believe a greater focus in this area could have a significant and positive impact on thwarting the distribution of child sexual abuse material on the internet.
We now have more than one million unique image hashes of child sexual abuse, and around a third of those include detailed metadata on the type of sexual activity seen. We’ve published this breakdown, which describes in plain terms the crimes being inflicted upon children.
We encourage you to use our data and information to inform your own work and your understanding of the prevalence and distribution of online images and videos of child sexual abuse, and of the fight to eliminate them.