Google Partners with StopNCII to Address Non-Consensual Intimate Imagery
On September 18, 2025, Google announced a significant partnership with StopNCII.org to combat the spread of non-consensual intimate imagery (NCII), reports 24brussels. Over the coming months, Google plans to use StopNCII's hashes to proactively identify and remove non-consensual images from its search results.
Hashes are algorithmically generated digital fingerprints that allow services to detect and block flagged imagery without viewing or storing the original file. StopNCII says it uses the PDQ perceptual hashing system for images and MD5 for videos.
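To make the matching idea concrete, below is a minimal Python sketch of exact-match hashing of the kind used for videos (MD5): a participating service compares only digests against a blocklist, so the file itself is never shared or stored. The blocklist values and file paths here are placeholders, not real StopNCII data or APIs. Image matching with PDQ works differently, since PDQ is a perceptual hash (a 256-bit fingerprint compared by Hamming distance), which lets visually similar copies match even after resizing or re-encoding.

```python
import hashlib

# Placeholder blocklist of MD5 digests previously submitted by a victim.
# These values are illustrative only, not real StopNCII hashes.
flagged_hashes = {
    "00000000000000000000000000000000",
}

def md5_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 digest of a file without loading it all into memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_flagged(path: str) -> bool:
    """Exact-match check: only the digest is compared; the file never leaves the device."""
    return md5_of_file(path) in flagged_hashes
```

Because an exact cryptographic hash changes completely if the file is altered in any way, perceptual hashes like PDQ are needed for images that may be cropped, compressed, or re-uploaded in modified form.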
According to Bloomberg, Google has faced criticism for lagging behind other tech companies in adopting this method, an acknowledgment reflected in its recent blog post. “We have also heard from survivors and advocates that given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it,” the post states. Notably, platforms such as Facebook, Instagram, TikTok, and Bumble joined StopNCII as early as 2022, while Microsoft integrated it into Bing in September 2024.
In addition to this new strategy, Google offers tools for users to request the removal of non-consensual content and personal contact information. However, as with past initiatives aimed at tackling revenge porn, these measures place the onus on victims to identify and report the content. Advocates are calling on Google to flag and remove harmful material, particularly AI-generated content, without requiring victims to track down the images and submit hashes from their own devices, even though doing so at the scale of the open web remains a significant challenge.