TL;DR
- YouTube has been fighting a battle against inappropriate content targeted at children for years now.
- The latest wave of problematic videos combines cutesy AI characters with nightmare scenarios.
- YouTube is trying to enforce its policies against such violators, but the low barrier to creating new content makes that a constant struggle.
YouTube may either be the most useful resource ever for parents looking to keep children entertained, or a dangerous cesspool callously exposing vulnerable young minds to content intended to traumatize. That’s going to depend not just on who you ask, but also on what’s going on in the latest headlines.
In a new exposé, Wired looks into the newest trend that seems to be targeting young viewers. Channels like “Go Cat” are plastered with attractive, smiling characters, easily created with today’s readily available AI tools. And while the channel starts off advertising itself as “a fun and exciting YouTube channel for kids,” we quickly get to the eyebrow-raising part, with promises of “beloved toys … reimagined in strange, funny, and sometimes spooky new forms.”
They are not kidding about the “spooky” bit there, and it doesn’t take long for the smiling cats to have their parents gunned down before them, or the Minions to be dissolved by nuclear waste and transformed into sewer-dwelling worm-beasts. So why is YouTube allowing this kind of disturbing slop to proliferate?
While YouTube leans on its Community Guidelines, pointing out how all content must abide by these rules — and that includes not targeting children with inappropriate material — enforcement is another story. The existence of powerful AI video generators means that even if a channel gets taken down or demonetized, spinning up a new one is trivial.
Even with hundreds of thousands of views attached to these clips, it’s unclear how popular they really are, with video comments raising suspicions of bot-inflated numbers. Traffic sources like those could be exactly why it’s so easy for the forces behind these kinds of channels to reinvent themselves over and over again.
So what’s to be done? Well, for starters, parents should get more involved in their children’s media consumption, spending time with them while they watch and making sure they’re making appropriate choices. And when you do come across videos that are clearly inappropriate, take advantage of the platform’s reporting tools to ensure that someone with the power to do something about it is reviewing them.
Or maybe just get your kid into books?