Google has partnered with StopNCII, an organization that combats revenge porn, and pledged to hunt down and block non-consensual intimate images.
The tech giant already lets users request the removal of their images. However, that process requires victims to find the explicit images themselves and report them to Google. Under the new partnership, Google will use StopNCII’s hashes to proactively find and block the images on its own.
Hashes are digital fingerprints assigned to your intimate images or videos. You select an image on your phone, and StopNCII generates a hash value, a string of letters and numbers rather than the image itself, and shares that hash with participating platforms. The image never leaves your phone; only the hash does.
If an image matching the hash appears on a participating platform, it will be taken down. These checks run periodically, and you can use your StopNCII case number to track progress.
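To make the hash-and-match idea concrete, here is a simplified sketch in Python. It uses a plain cryptographic hash (SHA-256) purely for illustration; hash-matching systems like StopNCII’s typically rely on perceptual fingerprints that can also catch slightly altered copies, and the names used here (fingerprint, reported_hashes, is_blocked) are hypothetical, not part of any real StopNCII or platform API.

```python
# Illustrative sketch only: a plain file hash stands in for the
# perceptual fingerprint a real system would use.
import hashlib
from pathlib import Path

def fingerprint(image_path: str) -> str:
    """Return a hex digest of the image file's bytes.

    Only this string would ever be shared with participating
    platforms; the image itself stays on the device.
    """
    data = Path(image_path).read_bytes()
    return hashlib.sha256(data).hexdigest()

# Platform-side list of reported hashes (hypothetical example value).
reported_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_blocked(uploaded_image_path: str) -> bool:
    """Check a newly uploaded image against the reported-hash list."""
    return fingerprint(uploaded_image_path) in reported_hashes
```

The key design point, as described above, is that the fingerprint is computed on the user’s device and only the resulting string travels upstream, which is why the image itself never has to be uploaded anywhere.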
Participating platforms include Tinder, Bumble, Facebook, Instagram, and Microsoft Bing. Tinder and Bumble joined in 2022, Meta in 2023, and Bing in 2024. In 2024, Meta removed as many as 63,000 Instagram accounts linked to extortion scams.
As Bloomberg notes, Google is late to the game and has been criticized in the past for not adopting StopNCII’s tool sooner. Even now, Google says the rollout will take a “few months.”