As intimate image abuse reaches crisis levels and growing awareness of non-consensual intimate imagery (NCII) drives demand for stronger legislative action, the Trust and Safety industry must work together to strengthen and evolve the methods and technologies that protect users. Today, we’re proud to announce a partnership with StopNCII.org (hosted by the UK not-for-profit charity SWGfL) to integrate their hashing technology with our new image similarity matching solution to combat non-consensual intimate image abuse.
“We’re thrilled to be partnering with Resolver to further our mission of combating non-consensual intimate imagery online,” said David Wright, CEO of SWGfL. “By integrating our StopNCII.org hashing technology with Resolver’s advanced image matching capabilities, we can significantly expand our reach and prevent more adults across the world from having their privacy violated. This collaboration is a major step forward in the fight against this insidious form of abuse.”
Cameron Pearce, Principal Product Manager at Resolver, added: “Our partnership with StopNCII.org represents a powerful step forward in protecting online communities from the devastating impact of non-consensual intimate imagery. By integrating StopNCII.org’s hashing technology with our image-matching capabilities, we’re giving our partners the ability to quickly identify and act on its harmful distribution, helping to safeguard privacy and ensure safer digital spaces for everyone. This collaboration is an important part of our mission of making the digital world a safer place.”
Why This Matters Now
The UK government has classified the sharing of intimate images without consent as a ‘priority offence’ under the Online Safety Act. This means platforms must proactively prevent and remove this content or face penalties of up to 10% of global annual revenue. This comes as the human cost of online abuse continues to rise at an alarming rate. In 2023:
- Over 36.2 million reports of suspected child sexual exploitation were made to NCMEC’s CyberTipline
- The UK Revenge Porn Helpline (run by SWGfL, the charity behind StopNCII.org) received 19,000 reports, a 106% increase on 2022
- Sextortion became the predominant form of intimate image abuse reported to the Helpline, with cases up 54% on the previous year
- The emergence of AI-generated child sexual abuse material (CSAM) has added new layers of complexity to an already devastating problem
Keeping Pace with the Escalating Crisis
Our new technology represents a shift in how platforms can identify and remove harmful content on their sites. Unlike traditional systems that require exact matches, our AI-powered technology can detect content even when it has been modified, addressing a critical vulnerability in existing protection systems.
By partnering with StopNCII.org, a free global online tool already working with major social platforms, we gain access to their database of protected content hashes: unique digital fingerprints of intimate images. This integration allows us to review these hashes against our platform partners’ data and immediately alert them when protected content appears, helping prevent its spread across multiple sites.
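To make the matching step concrete, the sketch below shows one way similarity matching against a database of protected hashes could work in principle. It is a minimal, hypothetical illustration, not StopNCII.org’s or Resolver’s actual implementation: the 64-bit perceptual hashes, the Hamming-distance threshold, and the `find_matches` helper are all assumptions made for the example, and a production system would rely on indexed lookups rather than a linear scan.

```python
# Illustrative sketch only. Assumes 64-bit perceptual hashes encoded as hex
# strings and a Hamming-distance threshold; both are hypothetical parameters
# chosen for the example, not details of the StopNCII.org integration.

MATCH_THRESHOLD = 10  # max differing bits to still count as a similar image


def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two hex-encoded 64-bit hashes."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")


def find_matches(candidate_hash: str, protected_hashes: list[str]) -> list[str]:
    """Return protected hashes within the similarity threshold.

    Unlike an exact-match lookup, a small Hamming distance still matches,
    so lightly modified images (crops, filters, re-encodes) that shift a
    few bits of the hash can still be flagged.
    """
    return [
        protected
        for protected in protected_hashes
        if hamming_distance(candidate_hash, protected) <= MATCH_THRESHOLD
    ]


if __name__ == "__main__":
    protected = ["c3a1f09e5d2b7764", "ffee010203040506"]
    # A hash that differs from the first protected entry by only one bit:
    candidate = "c3a1f09e5d2b7760"
    print(find_matches(candidate, protected))  # -> ['c3a1f09e5d2b7764']
```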
Key Capabilities
- Advanced Pattern Recognition: Identifies nuanced similarities between images, catching content that has been altered to evade detection
- Privacy-First Design: Maintains user privacy while effectively identifying harmful content
- Regulatory Compliance: Fully aligned with the requirements of the EU Digital Services Act (DSA) and the UK Online Safety Act (OSA)
- Continuous Learning: Adapts to new threats and evasion tactics
What This Means for People at Risk
- Faster Protection: When someone reports their content to StopNCII.org, our technology can identify if that image appears on any of our partners’ sites, helping prevent it from spreading across multiple platforms
- Better Prevention: Even if abusers try to modify images to evade detection, we can still identify the content and report it back to our partners
- Lasting Impact: Once modified content is flagged for protection, it stays protected, providing long-term peace of mind
- Strategic Intelligence: Cross-platform pattern analysis helps identify and disrupt coordinated abuse networks before they can scale
Supporting Platforms to Support People
For online platforms, this isn’t just about compliance; it’s about taking meaningful action to protect their users. Our partnership enhances how our risk intelligence services help platforms of all sizes to:
- Respond more quickly to abuse reports
- Prevent the spread of harmful content
- Provide better support to users in crisis
- Create safer spaces for authentic connection
- Demonstrate proactive approaches to regulators globally
A Leap Forward in Online Safety
At Resolver, we are committed to creating safer online spaces. Our partnership with StopNCII.org extends the reach of their critical tool, enhancing it with our in-house image similarity matching. That combination acts as a force multiplier, bringing together stronger detection capabilities, adaptive intelligence, and scalability through our risk intelligence services. Only by acting together can we take meaningful action against intimate image abuse.