Some Android users are starting to see blurred images on their devices while using Google Messages. It’s part of a Sensitive Content Warning system that obscures images containing suspected nudity. The feature was announced last year and is now rolling out on Android devices.
According to Google’s Help Center post, when the feature is turned on, the phone can detect and blur images that contain nudity. It can also display a warning when such an image is being received, sent or forwarded.
“All detection and blurring of nude images happens on the device. This feature doesn’t send detected nude images to Google,” the company says in its post. These warnings also offer resources on how to deal with nude images.
Google notes that images containing no nudity may occasionally be flagged by mistake.
The feature is off by default for adults. For unsupervised teens aged 13-17 it is on by default but can be disabled in Google Account settings. On supervised accounts it can’t be disabled by the teen, though parents can adjust the settings in the Google Family Link app.
How to enable or disable the feature
Adults who want to be warned about nude photos, or who want to turn the feature off, can find the toggle under Google Messages Settings > Protection & Safety > Manage sensitive content warnings > Warnings in Google Messages.
The sensitive content feature is part of SafetyCore, available on devices running Android 9 and later. SafetyCore also includes features Google has been working on to protect against scams and dangerous links sent via text, and to verify contacts.
Measuring the feature’s effectiveness
Filters that screen for objectionable images have become more sophisticated due to a better understanding of context through AI.
“Compared to older systems, today’s filters are far more adept at catching explicit or unwanted content, like nudity, with fewer mistakes,” said Patrick Moynihan, the co-founder and president of Tracer Labs. “But they’re not foolproof. Edge cases, like artistic nudity, culturally nuanced images or even memes, can still trip them up.”
Moynihan says that his company combines AI systems with Trust ID tools to flag content without compromising privacy.
“Combining AI with human oversight and continuous feedback loops is critical to minimizing blind spots and keeping users safe,” he said.
Compared to Apple’s iOS operating system, Android can offer more flexibility. However, its openness to third-party app stores, sideloading and customization creates more potential entry points for the kind of content Google is trying to protect people against.
“Android’s decentralized setup can make consistent enforcement trickier, especially for younger users who might stumble across unfiltered content outside curated spaces,” Moynihan said.
According to Moynihan, leaving the feature off by default for adults and on by default for minors is a practical way to start. But he said, “The trick is keeping things transparent. Minors and their guardians need clear, jargon-free info about what’s being filtered, how it works, and how their data is protected.”
