The Future of Vaping, BluCig, and YouTube’s Broken Moderation System
On November 7th, 2025, at 6:31 PM, I received a notification from YouTube that one of my oldest videos, “The Future of Vaping ~ The BluCig,” had been removed for violating community guidelines. Along with the removal came a formal warning against my channel. But here’s the thing: the video was from 2016. And it was satire.
Let me give you some context. In 2016, I uploaded a video titled “The Future of Vaping ~ The BluCig,” a satirical piece that raised a simple, uncomfortable question: what happens to vaping if we don’t push back against corporate influence and the FDA’s “epidemic” narrative? The answer, I posited, was a future dominated by sterile, lifeless devices from Big Tobacco, like the BluCig. It wasn’t an endorsement. It wasn’t a product placement. It was a warning.
Almost a decade later, YouTube flagged the video. It was removed. A strike was placed on my channel. And the appeal button? Non-functional. When I clicked to contest the decision, I was met with a gray screen and the words: “Review not available.”


I shared this experience publicly on X (formerly Twitter), tagging @YouTube and @TeamYouTube to bring awareness to the absurdity of the situation. The removal of a nearly 10-year-old satirical video, with no opportunity to appeal, speaks VOLUMES about the current state of content moderation on YouTube.
This isn’t just about my video. It’s about the precedent. It’s about creators, educators, and advocates whose work may be caught in the crosshairs of automated systems that fail to recognize nuance, satire, or historical context. Worse still, when those systems malfunction, creators are left without a voice or a path to resolution. Since being completely demonetized, I have also lost certain other features, including “Partner” status, which means I have no way of reaching YouTube other than through social media. That’s insane.
There must be a better way forward. Platforms like YouTube need to build systems that distinguish between malicious content and critical commentary. They need to build tools that let creators challenge wrongful decisions transparently and effectively. And they need to treat creators, especially long-time contributors who helped shape the platform, with more respect.
If a satirical critique of Big Tobacco from 2016 is now punishable, what’s next?
This article was mostly written by a robot and GrimmGreen.
Update: I’m not the only one. As highlighted in a post by @SmileySmilik on X, dozens of creators are reporting the instant rejection of appeals, some within seconds, suggesting that YouTube’s “manual review” process may be entirely AI-driven. This stands in direct contradiction to public statements from @TeamYouTube assuring users that appeals are manually reviewed. The growing gap between YouTube’s statements and creators’ experiences is impossible to ignore.
