I grew up in a country where censorship isn’t something you fight. It’s something you simply accept and adapt to, like an undercurrent running through everyday life.
The press says what it’s told. Voices that veer off course are quietly sued, discredited, or bankrupted. As voices moved online, so did censorship, and centralized platforms facilitated the process. Now, the government simply issues a “correction” order, and if the poster doesn’t edit the content or take it down, the platform blocks it.
As we usher in the Web3 age, a decentralized internet promises an alternate reality. With no single “gatekeeper” to online content, will we finally have freedom?
It has all the mechanisms in its favor. But the real question isn’t just about code or servers. It’s about humans and how badly we want that freedom.
You see, where I’m from, “censorship is a necessary trade-off” eventually became the dominant narrative. For harmony. For national growth. For safety.
To varying degrees, we see the same pattern being played out all around the world. In the name of protecting public health or combating misinformation, we allow our speech to be managed when we’re convinced it’s “for the greater good”. Remember back when toilet paper became a prized possession? Major platforms removed posts that strayed from official health advice (even when voiced by credentialed professionals), arguing it was necessary to prevent harm—and people welcomed it.
Therefore, the task for a decentralized internet isn’t just to dismantle institutional power. It has to undo the mental chains we place on ourselves that keep us censored.
Will it be up to the challenge?
First, we have to see what it’s up against.
Centralized censorship is so efficient, we start self-silencing
The first issue is that centralized systems make censorship so quietly efficient that you don’t even realize what you’re seeing has been filtered.
Most of our digital lives funnel through a few corporations: Meta, Google, Apple, and Amazon. When authorities want to shut something down, they don’t need to raid offices or jail editors. A well-placed legal threat or financial penalty to these few big players can make a post vanish, a user disappear, or an outlet collapse.
Corporations are in on it, too. Brands can quietly ask for unflattering content to be suppressed, and platforms preemptively downrank “borderline” speech to protect themselves.
But the power of the system doesn’t just lie in what is silenced. It lies in the narratives it amplifies and how this shapes behavior.
The cycle starts like this: the platform makes the safest opinions the most visible, the user is psychologically rewarded for that sanitized content and keeps producing more of the same. Others take the hint, and the cycle of self-censorship spreads.
The algorithm becomes both enforcer and echo chamber, shaping not just what we see, but how we think.
What current decentralization does well
Current decentralization efforts are already tackling one core problem: concentrated control. By spreading power across protocols, peers, and networks, they remove the need to trust any single entity to host, approve, or distribute content.
Projects like IPFS, Arweave, and Filecoin decentralize storage. Social platforms like Nostr and Farcaster decentralize identity and publishing. This makes it far harder for any one actor (be it government or business) to silence a voice entirely.
But while these tools succeed at distributing control, they still haven’t solved the biggest barrier to mainstream adoption: complexity.
We need a simpler, decentralized internet
Right now, using decentralized tools requires a high tolerance for friction. The onboarding is confusing: too much jargon, too many wallets, and too many steps.
Most people won’t adopt a tool unless it’s as easy as (or easier than) what they already use. And if they don’t see censorship as a problem, there’s no incentive to switch.
To scale, decentralized tools need to match or exceed the ease of centralized ones. Signal is more secure than WhatsApp, but WhatsApp won because it was smoother. Substack emails still land in Gmail inboxes because that’s where people already are. In practice, usability beats principle.
If decentralized platforms want to move beyond niche adoption, they need to make onboarding seamless, participation intuitive, and incentives obvious.
We need to make freedom matter more
But even if we solved the usability problem, we’d still hit the second, deeper psychological wall: most people don’t mind being censored. Or rather, they don’t feel censored. And if people don’t feel something, they won’t act.
We often come across platform bans, shadowbans, demonetization, and disappearing posts. But they don’t register as a threat. Most people simply dismiss them as a moderation error or a platform policy that was probably justified.
Decentralization needs to make the attack on freedom more tangible. That means showing people what they’re not seeing, what they’re losing, and how easily it could happen to them. Otherwise, the average user won’t care if their tech is censorship-resistant—they’ll just want it to work.
This is why preventing silence isn’t enough. To matter, decentralization has to make censorship loud.
What if decentralized tools didn’t just preserve speech, but punished attempts to suppress it? What if trying to remove a post simply made it spread even more? What if muting a voice unlocked more funding for it? What if censorship became high-cost, high-risk, and publicly embarrassing?
That’s the mindset shift decentralization needs: moving from merely evading control to actively eroding it.
When Spain tried to block Catalan referendum sites, activists mirrored them across IPFS, and every takedown just made the content spread faster. Imagine if that mirroring happened automatically, turning every censorship attempt into a built-in Streisand Effect.
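To make that concrete, here is a rough sketch of what automatic mirroring could look like. It is purely illustrative: it assumes a local IPFS daemon (the real ipfs command-line tool), while the takedown check and the announcement step are hypothetical stand-ins for whatever monitoring and broadcast channels a real system would use.

```python
# Illustrative sketch: every detected takedown re-publishes the content to IPFS.
# Assumes a local IPFS daemon with the "ipfs" CLI on PATH; announce() is a
# hypothetical placeholder for however the new mirror gets broadcast.
import subprocess
import time
import urllib.error
import urllib.request


def detect_takedown(url: str) -> bool:
    """Crude stand-in: treat HTTP 403/404/451 or an unreachable host as a takedown."""
    try:
        with urllib.request.urlopen(url, timeout=10):
            return False
    except urllib.error.HTTPError as e:
        return e.code in (403, 404, 451)
    except urllib.error.URLError:
        return True


def mirror_to_ipfs(local_copy: str) -> str:
    """Add a local copy of the content to IPFS and return its content identifier (CID)."""
    result = subprocess.run(
        ["ipfs", "add", "-r", "-Q", local_copy],  # -Q prints only the root CID
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


def announce(cid: str) -> None:
    """Hypothetical: broadcast the mirror so the takedown amplifies the content."""
    print(f"Blocked content is now mirrored at ipfs://{cid}")


def watch(url: str, local_copy: str, interval: int = 300) -> None:
    """Turn every censorship attempt into another public, permanent mirror."""
    while True:
        if detect_takedown(url):
            announce(mirror_to_ipfs(local_copy))
        time.sleep(interval)
```

The point isn’t the code; it’s that the response to a block becomes automatic instead of depending on activists noticing and scrambling to react.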
On Nostr, every user and relay already has a public key. What if censorship actions were logged and linked to those keys by default? Blocking content wouldn’t just be hard—it’d be public and embarrassing.
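Here, too, only a sketch: the event structure and id calculation below follow Nostr’s NIP-01, but the “censorship notice” kind and the reason tag are invented for illustration rather than part of any existing standard, and a real event would still need to be signed before relays would accept it.

```python
# Illustrative sketch of a public "censorship notice" on Nostr. The event shape
# and id computation follow NIP-01; the kind number (9001) and the "reason" tag
# are hypothetical, and the event still needs a Schnorr signature over its id.
import hashlib
import json
import time


def censorship_notice(relay_pubkey: str, blocked_event_id: str, reason: str) -> dict:
    """Build an unsigned Nostr event recording that a relay removed or refused a note."""
    created_at = int(time.time())
    kind = 9001  # hypothetical "censorship notice" kind
    tags = [
        ["e", blocked_event_id],  # standard tag pointing at the affected note
        ["reason", reason],       # hypothetical tag explaining the removal
    ]
    content = f"Relay {relay_pubkey} removed event {blocked_event_id}: {reason}"

    # NIP-01: id = sha256 over the compact JSON serialization of
    # [0, pubkey, created_at, kind, tags, content]
    serialized = json.dumps(
        [0, relay_pubkey, created_at, kind, tags, content],
        separators=(",", ":"), ensure_ascii=False,
    )
    event_id = hashlib.sha256(serialized.encode()).hexdigest()

    return {
        "id": event_id,
        "pubkey": relay_pubkey,
        "created_at": created_at,
        "kind": kind,
        "tags": tags,
        "content": content,
        # "sig": Schnorr signature of event_id with the relay's private key
    }
```

Because the notice is tied to the relay’s own public key, every block leaves a permanent, attributable record that anyone can query.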
This isn’t just about creating virality to keep content visible or shaming censors. It serves a deeper purpose: making the battle public and the stakes tangible, so that fighting censorship feels powerful, contagious, and worth being part of.
When censorship becomes visible, it also becomes relatable. People start asking: What if that was my voice? What if it could happen to me?
That’s when buy-in grows. Not out of abstract principle, but out of personal foresight and fear.
By making the threat more real and personal, offensive tools don’t just protect freedom—they make people want to move toward it.
Decentralization’s hardest problem isn’t infrastructure—it’s us
Censorship and decentralization will always be locked in a long war. But this fight isn’t just about better infrastructure or more resilient networks. It’s about how technology shapes psychology, and ultimately, culture.
We’ve seen how powerful tech can be in rewiring behavior. Design nudges, algorithms, and convenience have easily normalized silence. They’ve made control feel frictionless, even virtuous.
But if technology has trained us to accept censorship, it’s time to use that power to reconnect people to a deeper instinct: the human right to speak, question, and be heard.
That’s the real challenge of decentralization: not just building systems that resist control, but designing experiences that remind us of the importance of personal freedom.