Susie Hargreaves OBE
Marking 25 years of combatting child sexual abuse imagery on the internet is bittersweet; I’m so proud of this organisation – which I have led for more than 10 years now – and equally sad at the large volumes of criminal imagery we’re finding.
I asked Professor Hany Farid, who has dedicated his professional life to creating technology to help stop this imagery, for his reflections on the past 25 years, and he didn’t hold back. We’ve also captured stories from our analysts. We’re breaking new ground in this battle, and it’s important to capture the voices of those resilient individuals who work to stem the repeated victimisation of sexually abused children.
In 2021 the UK’s National Crime Agency revised its estimate of the number of people who pose a sexual threat to children in the UK, putting it at between 550,000 and 850,000. Whilst it’s a great achievement that, thanks to our work and that of UK hosting companies, so little child sexual abuse material is hosted in the UK, it means very little when so many people in this country want to view it, and when year-on-year our numbers show only that the situation is getting worse.
This year, we found more of this material than ever before. We could do this thanks to huge strides within our Hotline: working more efficiently, using technology better, and employing two additional analysts.
Additionally, we’ve developed a way to reduce the number of off-remit reports being presented to our analysts, through an improved reporting process for the public. This has saved our analysts from assessing nearly 10,000 off-remit reports, allowing them instead to focus on proactively searching for this material.
We launched IntelliGrade – a world first which allows our dedicated team of graders, funded by a Thorn grant, to quickly assess and ‘hash’ (create digital fingerprints of) child sexual abuse images from the UK Government’s Child Abuse Image Database (CAID). What’s new is that this grading process makes the hashes compatible with multiple legal jurisdictions around the world. At the same time, we’re adding large volumes of metadata, which allows us to understand more about the sexual abuse happening to the children pictured, and provides a way for technology companies to build and train the tech of the future.
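The ‘digital fingerprint’ idea can be illustrated with a cryptographic hash, its simplest form. Below is a minimal sketch using Python’s standard hashlib; note that systems in this field typically rely on perceptual hashes (such as Microsoft’s PhotoDNA) that can also match visually similar copies of an image, not only byte-identical files, and the function name here is purely illustrative.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest that identifies this exact byte sequence."""
    return hashlib.sha256(data).hexdigest()

# Identical content always yields the same fingerprint, so a known
# image can be recognised without storing or re-viewing the image itself.
a = fingerprint(b"example image bytes")
b = fingerprint(b"example image bytes")
assert a == b

# Any change to the content, however small, yields a different fingerprint.
c = fingerprint(b"example image bytes.")
assert a != c
```

Because only the fingerprint needs to be shared, lists of known material can be distributed to platforms for automated matching without ever redistributing the imagery.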
Tackling child sexual abuse material is ever challenging, but working collaboratively and forging strong partnerships with technology companies, governments around the world, law enforcement and the third sector makes it possible to do this effectively.
Finally, a word for our team of dedicated analysts: they spend each and every day assessing some of the most challenging content imaginable. They do this because they know that every image or video they remove stops a child being revictimised, and gives that child some hope.