From Casts to Knowledge

dataDAOs mining knowledge on Farcaster

Introduction

Imagine every time you share a thought, a picture, or a video on Farcaster, you're not just casting content—you're contributing to a vast, decentralized library of knowledge. Every cast carries potential insights about ourselves, the world, and everything in between. But how do we unlock this treasure trove of data for the greater good, and more importantly, how can we ensure that everyone benefits, not just the tech giants? Let's dive into the world of knowledge mining on Farcaster's Social Data Ledger.

Turning Casts into Knowledge with a Context

The magic starts with context. Without it, casts are just... data. But with context, these bits and pieces of data transform into knowledge. And who's better at understanding context than us humans? Yep, we humans are still superior to LLMs at understanding context. The first Farcaster feature that lets users provide context for their casts is channels: attaching a channel to a cast helps pin down its context. Moreover, other users interacting with the cast through replies or recasts can provide further context. All of this happens organically, yet most casts probably don't carry knowledge that others would seek out, or that's worth the computation needed to extract it.

So how can we create an incentivized, human-governed system that signals & curates the knowledge? And how can we implement it so that humans can do this seamlessly & natively in their Farcaster feed?

The Power of Human Curation

Imagine stumbling upon a cast discussing the latest in ERC404, revealing the new algorithm behind the $DEGEN airdrop, or catching a possible scam or trap in the act.

Now, imagine we see this as knowledge that can help people avoid getting scammed. To participate in knowledge mining, all you would need to do is cast in reply:
1 $knowledge - Scam with fake app

With a simple tip and a comment, you signal this as valuable knowledge & provide context. It's like saying, "Hey, pay attention here; there's something worth knowing." And the best part? You're doing it as part of your daily scroll through Farcaster, seamlessly weaving knowledge curation into your digital social life, while incentivizing both the knowledge publisher & yourself when the knowledge is added to the dataDAO's knowledge graph and consumed.
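To make the flow concrete, here's a minimal sketch of how a mining bot might recognize such a signal in a reply cast. The `<amount> $knowledge - <context>` tip syntax and the `parse_signal` helper are illustrative assumptions, not an existing Farcaster or dataDAO API:

```python
import re
from typing import NamedTuple, Optional

class KnowledgeSignal(NamedTuple):
    amount: int   # tip amount in $knowledge tokens
    context: str  # human-provided context label

# Hypothetical tip syntax: "<amount> $knowledge - <context>"
TIP_PATTERN = re.compile(r"^\s*(\d+)\s+\$knowledge\s*-\s*(.+)$", re.IGNORECASE)

def parse_signal(reply_text: str) -> Optional[KnowledgeSignal]:
    """Extract a knowledge-mining signal from a reply cast, if one is present."""
    match = TIP_PATTERN.match(reply_text)
    if not match:
        return None  # an ordinary reply, not a signal
    return KnowledgeSignal(amount=int(match.group(1)),
                           context=match.group(2).strip())

signal = parse_signal("1 $knowledge - Scam with fake app")
# -> KnowledgeSignal(amount=1, context='Scam with fake app')
```

Ordinary replies simply return `None`, so the bot only spends compute on casts that humans have explicitly flagged.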

The Knowledge Mining Ecosystem

Once your tip sends a signal, knowledge mining bots spring into action. They, along with human-supervised LLMs, sift through the content, connecting dots and filling in gaps within the decentralized knowledge graph. This ecosystem includes:

  • Cast publishers: The original sharers of knowledge.

  • Cast signallers: Those who highlight valuable knowledge.

  • Knowledge extractors: Bots and LLMs, with human supervision.

  • Knowledge graph nodes: The storers and publishers of our knowledge assets.

  • Knowledge graph query services: Making it easy to find what you need.

  • Knowledge graph consumers: Anyone seeking knowledge.

This cycle ensures that from creation to consumption, knowledge remains open, uncensored, and ethically accessible to all—truly democratizing data and AI.
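One way to picture the extractor's output is as subject-predicate-object triples, the basic unit of a knowledge graph. The `Triple` type and `extract_triples` helper below are a hypothetical sketch; a real pipeline would use human-supervised LLMs rather than fixed rules:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triple:
    """A single subject-predicate-object edge in the knowledge graph."""
    subject: str
    predicate: str
    obj: str

def extract_triples(cast_id: str, author: str, context: str) -> list[Triple]:
    """Turn a signalled cast into minimal knowledge-graph triples.
    (Illustrative only: real extraction would enrich these with
    LLM-derived facts reviewed by human supervisors.)"""
    return [
        Triple(cast_id, "publishedBy", author),
        Triple(cast_id, "hasContext", context),
    ]

triples = extract_triples("cast:0xabc", "fc:alice", "Scam with fake app")
```

Because triples are uniform, independent nodes can store, merge, and serve them without coordinating on any single schema up front.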

dataDAOs can form around different types of knowledge, each with its own mechanisms for tipping, tokenomics, governance over knowledge extraction filters & pipelines, knowledge storage infrastructure, knowledge query monetization, and more. There can be many flavours, and consumers can decide based on the quality of knowledge each provides.
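As a rough illustration, the parameters a dataDAO governs could be collected in a single config. Every name and field here (`DataDAOConfig`, `min_tip`, `query_fee`, and so on) is a made-up example, not part of any real protocol:

```python
from dataclasses import dataclass

@dataclass
class DataDAOConfig:
    """Hypothetical parameters a dataDAO's governance might vote on."""
    name: str
    tip_token: str                 # token used for signalling tips
    min_tip: int                   # minimum tip that triggers extraction
    extraction_filters: list[str]  # topic filters for the mining pipeline
    query_fee: float               # fee per knowledge-graph query

# A dataDAO specializing in scam warnings might configure itself like this:
scam_dao = DataDAOConfig(
    name="ScamWatchDAO",
    tip_token="$knowledge",
    min_tip=1,
    extraction_filters=["scam", "phishing"],
    query_fee=0.01,
)
```

Different DAOs would pick different values and even different fields, which is exactly what lets consumers compare flavours and choose by knowledge quality.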

Why a Knowledge Graph?

If you're curious about why we're focusing on a knowledge graph format, check out my previous post. It's all about creating a web of interconnected information that's as easy for machines to navigate as it is for humans.

Looking Ahead

Next time, we'll explore the pioneering protocols building the backbone of web3's knowledge graphs (storage & retrieval), long before the concept became mainstream. Stay tuned as we uncover the infrastructure enabling our journey from data to knowledge.

About me & datalatte

My name is Amir, and I am an engineer based in Berlin. After completing my PhD in chip design, I discovered my passion lies at the convergence of data, artificial intelligence (AI), and web3 technologies. Over the past three years, I have been engaged in research and development with datalatte, collaborating with partners such as Ocean Protocol (a decentralized data marketplace), OriginTrail (a decentralized knowledge graph), and Neuroweb.ai (Knowledge mining for AI).

Follow us:
datalatte Warpcast
dudeamir Warpcast

#farcaster #daos #ai