As someone who has spent months documenting, reporting, and repeatedly reaching out to VRChat’s moderation team, I can say with full certainty: VRChat is failing its community.
Worse, it’s failing the thousands of underage users on its platform by allowing sexually explicit avatars to remain accessible in public worlds. These avatars are not subtle. They’re not merely “suggestive.” They’re explicit, with full nudity toggles, simulated sexual parts (DPS/SPS), and hyper-sexualized models that can be used—and are being used—in public worlds accessed by children.
And VRChat knows it.
I’ve reported over 120 public avatars, each of which violated VRChat’s own Terms of Service and Community Guidelines. Each included blatant sexual content, from strip toggles to simulated genitalia. And while some of these avatars were later switched to “private,” which at least shows the reports were seen, this lazy moderation tactic accomplishes nothing.
Any user can simply reupload the same avatar, or flip the switch and make it public again. And no action is ever taken against the accounts that uploaded it in the first place. That means the same user who put a fully nude avatar into a public world where minors play can do it again without a single consequence.
Let That Sink In: No Consequences. No Accountability.
VRChat is a platform that boasts over 100,000 daily active users and a rapidly growing Quest user base, which includes children as young as 13. Yes, the platform allows minors. And yes, those minors can, and do, equip fully nude avatars, because those avatars are public and often marked Quest-compatible. This isn’t hypothetical. It’s happening right now.
This isn’t just poor moderation. It’s corporate negligence.
When I reached out through the official moderation ticketing system, I was met with silence. Weeks passed. Tickets went unanswered. In the rare cases where action was taken, it was insufficient, temporary, and left the underlying problem untouched.
Out of frustration and with deep concern for platform safety, I reached out to multiple senior staff at VRChat Inc., including:
- Casey (VP of Product & Production)
- Graham Gaylor (Co-Founder & CEO)
- Sam Luangkhot (Senior Community Manager)
- Travis Morrison (Chief of Staff)
- Thomas Hyde (VP of People)
Not a single response.
The VRC+ Illusion
I’ve been a paying member. I’ve supported the platform. But VRC+ membership doesn’t come with enhanced support, moderation tools, or even a voice in community health. You’re just another number feeding revenue to a company that cut its workforce by 30% even as its platform grew in daily users and in risk.
Meanwhile, the few remaining staff are too overwhelmed, or too indifferent, to address one of the platform’s biggest threats: unchecked exposure to sexual content.
Why It Matters
Because this isn’t about prudishness. This is about basic digital safety and platform ethics. When minors are allowed to equip and explore NSFW avatars that simulate sexual acts, it’s no longer just a community problem. It becomes a legal and reputational liability. And it’s something the entire VR industry should be watching closely.
It’s astonishing that in 2025, a leading VR social platform can look the other way while its underage users are exposed to—and actively using—avatars that could easily fall under definitions of sexually explicit material.
I’m Done Waiting
I’ve sent emails. I’ve submitted tickets. I’ve done the job of a community moderator for free, painstakingly documenting and flagging harmful content. And in return, I’ve been ignored.
No one should have to fight this hard just to get a basic safety violation acknowledged.
So now I’m bringing this to the public. If VRChat won’t listen to those of us within the community trying to uphold its own policies, then maybe it’s time the wider tech world took notice.
A Call to Action
If you’re a developer, creator, or someone who cares about the future of digital platforms: speak out. If you’re a parent, take a hard look at what your kids are accessing. And if you’re part of the VRChat team reading this: do your damn job.
You don’t get to hide behind automation and ticket systems anymore. Your users—especially your youngest ones—deserve a safe environment. And those of us who care about this community aren’t going anywhere.
Harry Varden