To help young people remove sexual images of themselves from the internet, we developed Report Remove together with the NSPCC, in partnership with the age verification app Yoti.
In 2021, the IWF processed 110 reports submitted through Report Remove.
The tool launched in June after several years of development. By working with the NSPCC’s Childline service, it ensures that the young person is safeguarded throughout the process.
Meta has been a long-term partner of the IWF in tackling child sexual abuse imagery online, and also supported the technical development and piloting of Report Remove.
How does it work?
- Young people aged 13+ are first directed to Yoti to verify their age using ID.
- They are prompted to create a Childline account, which allows them to be safeguarded and supported throughout the process.
- Young people are then taken to a dedicated IWF portal where they can securely upload images, videos or URLs (website addresses).
- IWF analysts assess the reported content and take action if it meets the threshold of illegality*. The content is given a unique digital fingerprint (a hash) which is then shared with internet companies to help prevent the imagery from being uploaded or redistributed online.
- The outcome is passed to Childline, which contacts the young person via their Childline account to keep them updated and offer further support.
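The hashing step above can be sketched in code. This is a simplified, hypothetical illustration only: the function and variable names are our own, and it uses an exact-match SHA-256 hash, whereas real image-matching systems typically use robust perceptual hashes (such as Microsoft's PhotoDNA) that still match after resizing or re-encoding.

```python
import hashlib

def fingerprint(image_bytes: bytes, source: str = "Report Remove") -> dict:
    """Hash the image and tag where the hash originated.

    The source tag is what lets law enforcement recognise a
    self-referred image (see below).
    """
    return {
        "hash": hashlib.sha256(image_bytes).hexdigest(),
        "source": source,
    }

def is_known(upload: bytes, hash_list: set[str]) -> bool:
    """Check a new upload against the shared hash list before it goes live."""
    return hashlib.sha256(upload).hexdigest() in hash_list

# Usage sketch: an assessed image is fingerprinted, its hash joins the
# shared list, and an internet company can then block re-uploads of it.
entry = fingerprint(b"example image bytes")
shared_hashes = {entry["hash"]}
print(is_known(b"example image bytes", shared_hashes))  # True: blocked
print(is_known(b"some other upload", shared_hashes))    # False: allowed
```

The exact-match approach shown here only catches byte-identical copies; the point of the sketch is the workflow (hash once, distribute the hash, check uploads against it), not the hashing algorithm itself.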
This solution provides a child-centred approach to image removal that can be completed entirely online.
The young person does not need to tell anyone who they are (their ID is not linked to their report), they can make the report at any time, and further information and support is always available from the Childline website.
Each hash is tagged as originating from ‘Report Remove’. This ensures that law enforcement bodies know the image is self-referred, reducing the risk of children receiving an unnecessary police visit. As the Report Remove tool develops, we hope it may become possible to gather further information on self-generated images, so that this process and the laws surrounding it can be improved to better protect children.