Before we dive into creating your own digital asset, let's cover a few fundamental concepts to shed some light on the technical shenanigans and build a mental framework to help guide your thinking about how to take advantage of this incredible opportunity.
What is tokenization?
Tokenization is a fancy way of saying: “creating a digital asset”.
In the context of blockchain and cryptocurrency, tokenization refers to the process of issuing tokens (digital units of account) that can represent any arbitrary form of value.
What can be tokenized?
Nearly anything you can imagine.
These range from familiar real-world assets, including stocks, bonds, real estate, and commodities like precious metals (gold, platinum, copper, etc.), to more novel, abstract things such as utilities (access to a community), services (data storage, content delivery), or functions within a network.
What are the benefits of tokenization?
At the most basic level, the idea of tokenization is to make assets more economically efficient. However, the exact benefits brought on by tokenization will depend on the specific use case/application of the tokens being created.
Blockchains are distributed databases that allow participants to share a single immutable record of ownership and activity. These digital ledgers can be used to track a wide range of transactions in a secure and transparent manner. Assets that are issued on a blockchain inherit the underlying mathematical and technological primitives of their mother chain, including:
- Granularity (fractionalization)
Digital assets are highly divisible, allowing for fractional ownership. Fractional ownership lowers the participation threshold and, in turn, introduces a new caliber of participants who have previously been left out of traditional markets due to capital constraints.
- Accessibility (borderlessness)
Public distributed ledgers are geographically neutral and do not discriminate by user provenance, meaning there is no resistance for new participants to join. This is best illustrated by the fact that tokenized assets can be transferred easily, within seconds, to and from anyone in the world. While many leading governments view this as a negative because it allows for regulatory arbitrage, the massive pool of potential investors and traders outside those governments' confines eclipses their minority interests.
- Interoperability (composability)
The programmable nature of a digital asset lends itself to a more fluid logical construct that can be seamlessly integrated across a variety of protocols, used in decentralized applications, and stored within smart contracts. This integration is especially prominent in the DeFi sector, as the interconnectivity between protocols expands potential liquidity.
- Transparency (openness)
Blockchains have permanently redefined the structure of data. Legacy computation was based on the C→R→U→D scheme (create, read, update, delete), but blockchains are append-only data structures that drop the U and the D, leaving us with a C→R→A (create, read, append) scheme. Given that nothing can be deleted, nothing can be hidden. This is the holy grail of DLTs and the future of all finance. Blockchain data is open data; all on-chain activity can optionally be publicly verified. (This does not mean that private data will be publicly available, just the metadata necessary to guarantee honest behavior.)
*** Another lesser-discussed benefit is the critical element of near-instant finality of settlement. Once a transaction is completed it cannot be reversed, which provides a higher level of confidence for users.
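To make the append-only (C→R→A) idea concrete, here is a minimal, purely illustrative sketch in Python. It is not any particular blockchain's implementation; it just shows the core property: records can be created and read, never updated or deleted, and the hash chain makes tampering detectable by anyone.

```python
import hashlib
import json


class AppendOnlyLedger:
    """A toy append-only ledger: create and read only, no update, no delete."""

    def __init__(self):
        self._entries = []

    def append(self, record: dict) -> str:
        """Create (append) a record, chained to the previous entry by hash."""
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self._entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def read(self, index: int) -> dict:
        """Read a record; history is open to anyone holding the ledger."""
        return self._entries[index]["record"]

    def verify(self) -> bool:
        """Recompute the hash chain: tampering with any past entry breaks it."""
        prev = "0" * 64
        for entry in self._entries:
            payload = json.dumps(entry["record"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True
```

Note that the class simply has no `update` or `delete` method: history can only grow, and because each entry commits to the one before it, any retroactive edit is publicly detectable.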
What types of tokens exist?
There are two archetypes of tokens, categorized by their fungibility profile: fungible and non-fungible. A key practical difference between the two is their liquidity profile; fungible tokens are typically highly liquid instruments, whereas non-fungible tokens tend to be highly illiquid. Liquidity refers to the ease with which these assets can be traded.
Fungible tokens are assets that are equally valued and easily interchangeable; 1 USDT = 1 USDT. The most well-known and widely adopted examples of fungible tokens are ERC-20s on the Ethereum blockchain, used to represent a broad range of assets, such as cryptocurrencies or voting rights within a decentralized autonomous organization (DAO).
Non-fungible tokens (NFTs) are a newer class of tokens that are related but not easily interchangeable; 1 BMW X5 ≠ 1 BMW X6. Made famous through the ERC-721 standard, this type of token has so far primarily been applied to niche use cases such as artwork, gating (community access), and DAO membership. However, in my opinion, as the industry continues to grow, the use cases of NFTs will at some point become far larger than those of their fungible counterparts.
* The difference in the DAO application between a fungible and a non-fungible token is hierarchical power separation. To increase prominence in a fungible-token DAO, users must acquire more tokens; to increase prominence in a non-fungible-token DAO, users must acquire specific tokens (the rarer or more highly valued versions).
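The fungible/non-fungible distinction maps directly onto how ownership is recorded. Here is an illustrative sketch of the core bookkeeping of each archetype; these are simplified stand-ins, not the actual ERC-20 or ERC-721 interfaces:

```python
class FungibleToken:
    """Fungible: one balance per account; any unit is interchangeable with any other."""

    def __init__(self, total_supply: int, issuer: str):
        self.balances = {issuer: total_supply}

    def transfer(self, sender: str, recipient: str, amount: int):
        """Move an arbitrary amount; which specific units move is meaningless."""
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount


class NonFungibleToken:
    """Non-fungible: one owner per token ID; each token is a distinct asset."""

    def __init__(self):
        self.owner_of = {}  # token_id -> owner

    def mint(self, token_id: int, owner: str):
        if token_id in self.owner_of:
            raise ValueError("token already exists")
        self.owner_of[token_id] = owner

    def transfer(self, sender: str, recipient: str, token_id: int):
        """Move one specific token; identity matters, amounts do not."""
        if self.owner_of.get(token_id) != sender:
            raise ValueError("sender does not own this token")
        self.owner_of[token_id] = recipient
```

A fungible transfer moves an amount, while an NFT transfer moves a specific ID, mirroring "1 USDT = 1 USDT" versus "1 BMW X5 ≠ 1 BMW X6".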
OK, so let's tokenize something!
The technical process of tokenization is rather simple: just deploy a token smart contract to any chain of your choice and voilà, you've made a token. There are tons of free tools available to help you do this within a few minutes; no developers or coding experience required! (You can literally google "create my own crypto token" and at least 20 different resources will pop up.)
It is always recommended to have somebody with experience do this for you, in order to set up all of the specifications correctly, deploy properly, manage the tokens, and make sure there are no vulnerabilities or backdoors in the service you use. There are no "re-dos" on the blockchain; once you deploy a contract, it's there forever.
Given that token value is generally ascribed to the perspective of free-market buyers, the most intensive task at hand when it comes to tokenization is the deep thinking required in order to design the tokens & their respective ecosystems well enough to imbue them with value.
The process of tokenization can be broken down into six (6) steps:
Conceptualization of application
Defining what exactly you want the tokens to represent and how you intend to have them represent it. Will your tokens be claims on underlying physical assets, or will they only be useful in cyberspace? How will people interact with the token? Will there be a native application interface? Where will the tokens be traded?
Designing the tokens to express their representation
Beyond the number of tokens and their emission schedule, token design covers a wide range of things, including fungibility, incentive structure, distribution plans, supply policy (deflationary/inflationary/fixed), and the reflexivity of any transaction tax, among other things.
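As one small illustration of supply-policy design, the sketch below projects circulating supply under the three policies mentioned above. The 2% rate and the horizon are hypothetical placeholders for exploration, not recommendations:

```python
def projected_supply(initial: int, years: int, policy: str, rate: float = 0.02) -> float:
    """Project token supply under a hypothetical supply policy.

    policy: 'fixed'        -> supply never changes
            'inflationary' -> supply grows by `rate` per year (e.g. emissions)
            'deflationary' -> supply shrinks by `rate` per year (e.g. burns)
    """
    if policy == "fixed":
        return float(initial)
    if policy == "inflationary":
        return initial * (1 + rate) ** years
    if policy == "deflationary":
        return initial * (1 - rate) ** years
    raise ValueError(f"unknown policy: {policy}")
```

Plugging in a few candidate rates and horizons like this is a cheap way to sanity-check how dilution or scarcity compounds before committing a policy to a contract.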
Choosing a mother chain
The future will be interconnected and multi-chain; however, making sure that your token's mother chain's philosophy aligns with the philosophy of your project will ultimately make or break your community. As a basic rule of thumb, remember that fee markets dictate adoption: higher fees mean higher security. If your goods are low-value and high-velocity, such as in-game objects, don't deploy to a chain with extremely high fees. On the other hand, if you intend to provide higher-value, lower-velocity objects, such as shares of real estate, then you should consider prioritizing security.
Documenting the design
I cannot begin to explain how important this step is. I have seen so many projects try to rush or skip it, and it is no surprise that those projects are no longer alive. Documenting the tokenization is the first layer of concept validation. Taking the time to write and draw everything out in a document will help you organize your ideas properly and give you a chance to review your thinking process from the outside.
Testing
The least sexy part of the whole process. Modestly time-consuming, but incredibly valuable. Testing takes your concept to the next level. It involves two sub-processes. First, take the documentation you made and share it with the public and potential community members; get feedback from your audience, because odds are they will find inconsistencies better than you will. Second, play on-chain: deploy to a testnet or create a dummy version of your token and experiment with it before committing to the final version. Empirical data is 1,000x more valuable than theory. By playing with the token, you will be able to track in real time how your system might actually respond to real-world activity.
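As a tiny example of the "dummy version" approach, the sketch below simulates a hypothetical burn-on-transfer tax off-chain, so you can watch circulating supply respond to a transfer pattern before writing any contract code. The 1% rate and the transfer pattern are made-up assumptions for illustration:

```python
def simulate_burn_tax(supply: int, transfers: list[int], burn_rate: float = 0.01) -> int:
    """Simulate a burn-on-transfer tax: each transfer destroys
    burn_rate of the amount moved. Returns the remaining supply."""
    for amount in transfers:
        burned = int(amount * burn_rate)
        supply -= burned
    return supply


# Hypothetical scenario: 1,000,000 tokens, ten transfers of 50,000 each at a 1% burn.
remaining = simulate_burn_tax(1_000_000, [50_000] * 10)
```

Running a few scenarios like this (heavy trading, dormancy, whale transfers) gives you exactly the kind of empirical evidence about your supply dynamics that pure theory cannot.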
Deployment
After all the necessary design adjustments are finalized, feedback from the public has been gathered, and operational evidence from testing has been aggregated and internalized, the moment of tokenization has arrived. For a high-fidelity implementation, it is best to hire experienced developers to join your team for the long haul. Outsourcing should only be done to firms with proven track records and deep ties to the industry; their reputation is their most valuable asset, and they will not risk tarnishing it. In my experience, offshore teams are best avoided; they will have little to no motivation for quality delivery.
Now you are off to the races, building your team/community, marketing, selling, and educating the public on your project.
You will be competing against thousands of other startups, but if you stick tightly to the six points above, you will be well-equipped with a kick-ass product and on your way to the top!
The future will be tokenized!
I hope this serves you well on your journey,
Thank you for reading!
See you all on the other side 🥂