Why Channels on Farcaster are doomed to devolve into Reddit & Mastodon

Or why moderation matters

The background here is that I've experienced my first slight hint of overzealous moderation on Farcaster.

For those who don't know, channels on Farcaster were launched as a feature of Warpcast (meaning not in-protocol) and are still in beta. DWR petitioned the community for channel ideas, and these channels were launched without moderators. Eventually, the option to 'create' a channel for 25 Warps/year was added, and the new channel 'hosts' gained the ability to ban, warn, or hide posts.

A menu of options to banhammer your patrons

For existing channels created before the pay-to-create model, DWR petitioned community members to become moderators.

Of course, the only reason I'm writing this is because one of my <trump voice>beautiful, perfect</trump voice> casts was 'Warned & Hidden'.

Image embeds on cast

Now, I'll leave it as an exercise to the reader to work out which 'channel norm' I violated that was so egregious that my post promoting an event from the Ethereum Foundation's Privacy & Scaling Explorations team needed to be warned & hidden. Was it low effort? I'm not claiming it's a ton of work to make a cast, but off-topic memes are allowed, and those are pretty low effort and not necessarily germane to the channel either. Was it shilling? Nope: I don't work for PSE, and it's literally an arm of the not-for-profit Ethereum Foundation. Overall, I don't really care which mod hid my post, and I'm not trying to start a witch-hunt here, but if this was salient enough for me, it will become salient for others soon enough.

I know this is reading quite 'whiny & aggrieved'. Semi-guilty. But a few points (and I'll be quick, since this isn't the main thrust of this article):

  1. The ZK channel existed prior to moderators

  2. These 'rules' came out of nowhere and, of course, I wasn't notified

  3. I wasn't told what rule I violated

  4. I don't know which moderator took action, so I can't appeal to the other moderators that one moderator among them has gone rogue and needs to be removed.

Also, let me steelman the case for hiding my post: subreddits often want posts to spark discussion rather than just being unilateral links to other content. I will admit my post was not really 'sparking discussion', but on Reddit an automoderator often replies to your post prompting you to add a comment on why you think it's relevant and what you would like to discuss. That said, none of this is explicitly in the channel's rules, so the "letter of the law" is on my side, if I do say so myself.

This isn't a knock on the people at Merkle Manufactory (the company behind Warpcast); they launched a killer feature and I've enjoyed using it. But it's started to dawn on me that moderation is 'nine-tenths of the law', so to speak, when it comes to channels/communities. There are entire departments at social media companies from Discord to Facebook dedicated to building moderation tools. It is a software category unto itself, and I don't expect a small team at Merkle to build all of it.

I think the fundamental problem here is a mismatch between channel citizens and channel moderators. I, as a channel citizen, want to reach other people interested in ZK. What I really want is a hashtag for all ZK subscribers; I am not trying to opt in to rules decided by moderators that I never consented to. But as DWR has pointed out on several podcasts, hashtags do become kinda spammy and ultimately became a bit of a joke (I'll try to redeem the idea of hashtags later on in this post).

OTOH, moderators want to curate an exclusive community and be adored for doing so. Becoming a moderator often brings out the 'petty dictator' demons that reside inside every one of us. Often the goal becomes having a 'mega downvote' or veto power over discourse, which is ultimately the ability to banhammer a community, or even becoming a gatekeeper around a certain topic and benefiting from said gatekeeping. I recall a friend of mine who ran a very popular public Spotify playlist and would get invited to exclusive parties by famous hip-hop artists in exchange for adding their latest tracks. That's a lot of sway for running a Spotify playlist!

This is the difference between a Community and a Topic. Topics are a p2p discovery between interested poasters and interested subscribers. Communities are often hierarchical and reach a stable equilibrium under democratic norms. (Of course, channels right now are not democratic).

Part of the whole schtick of Farcaster is to 'own your social graph'. However, if a channel user like myself builds up a following in a channel, and the channel moderator bans me, do I really have ownership? Probably not.

So where do we go from here? We don't want to end up in the Reddit/Mastodon land of petty dictators and wannabe Elons.

A couple ideas:

  1. Moderation via smart contract
    Trial by jury isn't perfect, but it's better than all the rest. Rather than leaving a moderation team to be judge, jury, and executioner, consider using a justice protocol like Kleros to assemble a jury that will judge the dispute based on the letter of the law and jurisprudence rather than what side of the bed the mods woke up on. At the very least, the actions taken by moderators should be public and auditable, not siloed inside Warpcast, so we know when they are acting with impunity.

  2. Redeem the Hashtag
    Hashtags can be spammy, but the kernel of the idea is still a good one: you want to notify people interested in a topic with no intermediaries. Using something like a Rate Limiting Nullifier (RLN) can provide an economic incentive to stop spam (ironically, a protocol developed by PSE, the org whose event I was promoting in the ZK channel). A further layer of simple-to-advanced AI classification can catch the spam that makes it through the RLN filter.

  3. Multiple communities with the same name.
    Yes, it's nice to have warpcast.com/zk refer to one and only one channel; however, there might be multiple communities with overlapping interests that all want that namespace. Think about an even more contentious topic like an Israel/Gaza/Palestine channel: should there be one singular hegemon channel for each keyword? If we are building a permissionless protocol, we have to allow for the other side. Currently, Status.app is working on a community curation protocol where whichever community stakes the most tokens (in this case SNT) is promoted to the top of the list for a given keyword search. There are probably other ways to solve this, but simply allowing multiple communities to have the same name would be a big unlock, and the prioritization of which community to show at the top of the list could be done in umpteen ways, from PageRank to economic staking.
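To make idea 1 a bit more concrete, here is a minimal Python sketch of a public, auditable moderation log with jury appeals. Every name here (`AuditableModerationLog`, `ModAction`, the jury-as-a-list-of-votes) is hypothetical; a real system would record actions on-chain and draw jurors from a protocol like Kleros rather than accepting votes directly.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ModAction:
    moderator: str
    cast_id: str
    action: str          # e.g. "hide", "warn", "ban"
    reason: str          # the rule cited -- no more silent hides
    timestamp: float = field(default_factory=time.time)
    overturned: bool = False

class AuditableModerationLog:
    """Append-only, publicly readable log of moderator actions,
    with a simple-majority jury appeal on any action."""

    def __init__(self, jury_size: int = 5):
        self.jury_size = jury_size
        self.actions: list[ModAction] = []

    def record(self, moderator: str, cast_id: str, action: str, reason: str) -> int:
        """Log an action and return its id, so anyone can audit it later."""
        self.actions.append(ModAction(moderator, cast_id, action, reason))
        return len(self.actions) - 1

    def appeal(self, action_id: int, votes: list[bool]) -> bool:
        """Jurors vote True to uphold, False to overturn; simple majority.
        Returns True if the action stands."""
        assert len(votes) == self.jury_size
        upheld = sum(votes) > self.jury_size // 2
        if not upheld:
            self.actions[action_id].overturned = True
        return upheld
```

The point is less the data structure than the property it enforces: the action, the mod who took it, and the rule cited are all on the record before any appeal happens.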
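The Rate Limiting Nullifier in idea 2 can be illustrated with toy Shamir-style secret sharing: with a limit of one message per epoch, each message reveals one point on a line whose intercept is the sender's secret key, so anyone who posts twice in the same epoch leaks their key (and, in the real protocol, their staked deposit). This is a simplified sketch over a toy prime field, not the actual RLN circuit, which wraps the same idea in zero-knowledge proofs.

```python
import hashlib

P = 2**255 - 19  # a large prime, used here only as a toy field

def H(*args) -> int:
    """Hash arbitrary values into the field."""
    data = "|".join(str(a) for a in args).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

def rln_share(sk: int, epoch: int, message: str) -> tuple[int, int]:
    """Each message reveals one point on the line y = sk + a1*x,
    where the slope a1 is fixed for the (sender, epoch) pair."""
    a1 = H(sk, epoch)
    x = H(message)
    y = (sk + a1 * x) % P
    return x, y

def recover_secret(p1: tuple[int, int], p2: tuple[int, int]) -> int:
    """Two shares from the SAME epoch are two points on the same line,
    so anyone can interpolate and recover the spammer's secret key."""
    (x1, y1), (x2, y2) = p1, p2
    a1 = ((y2 - y1) * pow(x2 - x1, -1, P)) % P
    return (y1 - a1 * x1) % P
```

One message per epoch reveals nothing (one point underdetermines the line); a second message in the same epoch makes the key, and hence the stake, recoverable by anyone. That's the economic incentive against spam.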
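And idea 3 is easy to sketch: a registry that lets many communities claim the same name and ranks search results by stake. `ChannelRegistry` is a hypothetical illustration of the Status.app-style staking signal, not anyone's actual protocol; swapping the sort key for PageRank or follower counts would give the other prioritization schemes mentioned above.

```python
from collections import defaultdict

class ChannelRegistry:
    """Many communities may share one name; search ranks the claimants
    by staked tokens, highest first."""

    def __init__(self):
        self._by_name: dict[str, dict[str, int]] = defaultdict(dict)

    def register(self, name: str, community_id: str, stake: int = 0) -> None:
        self._by_name[name][community_id] = stake

    def add_stake(self, name: str, community_id: str, amount: int) -> None:
        self._by_name[name][community_id] += amount

    def search(self, name: str) -> list[str]:
        """All communities claiming this name, best-staked first."""
        claims = self._by_name[name]
        return sorted(claims, key=claims.get, reverse=True)
```

Registration stays permissionless; only the ordering of search results is contested, and that contest is an open market rather than a first-come namespace grab.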

To conclude: I don't think Farcaster channels can be solidified in their current state. We have too many potential or actual petty dictators defending their petty fiefdoms. I created the /sanjose channel for San Jose (CA), but do I want to be its dictator for life? Probably not. Half of the city is more liberal than me and half is more conservative, and I don't want to be in the position of censoring anyone's reach when all they want to do is talk to other people in the same city.

We all need to work together to make Farcaster work rather than placing all the weight on Merkle Manufactory to solve our problems. If you're interested in working on this, please reach out, and we can create a working group, or a group chat at the very least. I don't have a lot of time on my hands, but "you have my axe."

I also asked Google Gemini for some ideas related to arbitrary & capricious moderation. Here's what I got; enjoy it for what it's worth (I liked the 'gaming the system' point):

When Power Goes Silent: The Problem With Arbitrary Moderation on Social Media

Social media platforms have become the public squares of the digital age. But unlike their physical counterparts, these online spaces are governed by a complex web of moderation policies and the often unseen hands of human moderators. While content moderation is essential for maintaining a civil online environment, the current system, reliant on often opaque and arbitrary decisions by moderators, faces significant problems.

The Issues with Unaccountable Moderation:

  • Arbitrary and Capricious Enforcement: Many platforms struggle with defining clear and consistent guidelines around what content is acceptable. This ambiguity allows for moderators to make subjective decisions, potentially silencing legitimate voices while letting offensive content slip through the cracks.

  • Lack of Transparency: The inner workings of moderation decisions are often shrouded in secrecy. Users rarely receive clear explanations for content removal or account suspensions, leaving them feeling powerless and frustrated.

  • Silencing Dissent: Under the pressure to maintain a sanitized online environment, moderators may inadvertently remove content that challenges the status quo or criticizes powerful entities. This can stifle healthy debate and free expression.

Towards a More Equitable System:

There's a need for a more robust and transparent content moderation architecture. Here are some potential solutions:

  • Community-Driven Standards: Platforms could involve users in developing and refining content moderation policies. This could be through surveys, focus groups, or elected user representatives.

  • Algorithmic Assistance: Machine learning can be trained to identify harmful content like hate speech and violent threats. However, such algorithms should be used in conjunction with human review to avoid unintended consequences.

  • Decentralized Moderation: Blockchain technology offers the potential for decentralized moderation systems where decisions are made by a distributed network of users. This could increase transparency and reduce the risk of bias.

The Path Forward:

A well-functioning social media landscape requires a balance between free expression and content moderation. By fostering a more transparent and accountable system, platforms can ensure a safer and more inclusive online environment for everyone. This can be achieved through a combination of community involvement, technological advancements, and a commitment to upholding the principles of free speech.
