The Explosive Potential of a Standard Digital Token Taxonomy

tl;dr: Token taxonomy sounds as exciting as watching moss grow. Without it, the promise of crypto-led innovation may not materialize. The good news is that it’s underway.

One of the big differences between Ethereum and Bitcoin was the innovation of a smart contract.

With Bitcoin, as revolutionary as it is, the idea of a token is relatively simple. It’s a digital representation of ownership rights secured by a decentralized ledger.

But a Bitcoin is relatively limited in what it can do. Aside from holding (er, hodling) it, sending it, receiving it, and selling it, there isn’t much more you can do with a Bitcoin.

RSK is trying to change this, but the basics of Bitcoin, when it comes to tokens, are relatively simple.

Ethereum changed this.

An Ethereum token can be programmed to behave according to a set of conditions.

It’s like If This, Then That, but for digital tokens.
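
To make that concrete, here’s a toy sketch in plain Python (not Solidity or real Ethereum contract code; all the names are made up) of a token whose value only moves when a programmed condition holds:

```python
# A toy illustration of "programmable value": a token balance that only moves
# when a programmed condition is met. Plain Python, not a real Ethereum
# contract; the names are invented for illustration.
from datetime import datetime

class ConditionalToken:
    def __init__(self, balances):
        self.balances = dict(balances)

    def transfer(self, sender, recipient, amount, condition):
        # "If this..." -- the programmed condition must hold...
        if not condition():
            raise ValueError("condition not met; transfer refused")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        # "...then that" -- value actually moves.
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

token = ConditionalToken({"alice": 100})
# Example condition: only allow the transfer after a release date.
release = datetime(2020, 1, 1)
token.transfer("alice", "bob", 25, condition=lambda: datetime.now() >= release)
print(token.balances)  # {'alice': 75, 'bob': 25}
```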

As Roham Gharegozlou, the CEO of Dapper Labs (and co-creator of CryptoKitties), said, Ether is programmable value.


It’s a pretty massive idea and we are only at the earliest stages of seeing the impact of this innovation.

Programmable Value May Sound Simple

The capability to create programmable value will generate entirely new business models.

We are already seeing it in the form of decentralized finance, crypto-collectibles, decentralized AI, and more.

Understanding that tokens are programmable value is one of the keys to grasping the impact of blockchains. It’s kind of a blue pill/red pill moment.

But creating programmable value is challenging.

There are many components (I’m only saying that because I am sure that I don’t fully understand all of them), but three jump out.

  • Crypto Economics
  • Token Engineering
  • Token Taxonomy

I’m not a world-class expert on any of them, but let’s do a walk-through so we all understand them a bit better.

Components of Programmable Value

Crypto-economics

Crypto-economics is the process of designing a set of complementary incentives among the various actors in a system so that some degree of balance is maintained.

You don’t want the rich getting richer to the point where they can control the entire network.

You want to make the network accessible enough that anyone can join, but not so accessible that there is a tragedy of the commons.

This discipline involves things like game theory and behavioral economics.

For example, when Satoshi set up the system of miners getting paid and users paying fees for transactions, it was a crypto-economic design.
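
Here’s a stripped-down sketch of that incentive in Python, just to show the shape of it (it ignores difficulty, validation, and everything else that makes the real system work): miners earn a protocol-defined subsidy that halves every 210,000 blocks, plus whatever fees users attach to transactions.

```python
# Simplified sketch of Bitcoin's miner incentive: the block reward is a
# protocol-defined subsidy (starting at 50 BTC and halving every 210,000
# blocks) plus the fees users attached to the included transactions.
INITIAL_SUBSIDY_BTC = 50.0
HALVING_INTERVAL = 210_000  # blocks

def block_subsidy(height):
    halvings = height // HALVING_INTERVAL
    return INITIAL_SUBSIDY_BTC / (2 ** halvings)

def miner_reward(height, tx_fees_btc):
    # Miners are paid to secure the network; users pay fees to get included.
    return block_subsidy(height) + sum(tx_fees_btc)

print(block_subsidy(0))        # 50.0
print(block_subsidy(630_000))  # 6.25 (after three halvings)
print(miner_reward(630_000, [0.0002, 0.0005]))
```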

There are many great write-ups about this, but I still go back to Aleksandr Bulkin’s series of posts from way back called “Crypto-economics is hard” as a great starting point.

The Token Utility Canvas that Outlier Ventures put out seems like a great tool to help think through ways of creating aligned incentives.

Token Engineering

Token Engineering is the actual build-out of the tokens.

Once you have the design for a bridge or a road, you need to figure out the materials you will use, what machines you need, etc. The digital equivalent is the same.

How will you construct it (programming language, for example)?

How will you audit it for security? After all, smart contracts can get hacked.

One day, PhDs will be awarded in this discipline. For now, a good reference is the Token Engineering Wiki.

Token Taxonomy

Token Taxonomy is the standard set of terms and definitions that describe the attributes of a token.

I didn’t really have much of an appreciation for the importance of this element until recently.

I ran into Marley Gray, Microsoft’s Principal Architect for Azure Blockchain, at the Blockchain Revolution Global event in Toronto.

We had chatted via Skype and emailed but never met in person, which was my loss.

Marley introduced me to his latest initiative in his role as Chair of the Token Taxonomy Initiative.

The TTI is an effort to create a global standard for token definitions that will allow for large-scale interoperability. It aims to do for tokens what the Gauge Act of 1846 did for railroads: get agreement on how things will be done.

The Token Taxonomy Framework that Marley wrote is a very accessible primer for anyone. For a tech-head, Marley puts on his best marketing hat and uses a powerful set of analogies (mostly Lego) to help you understand the various pieces that compose a token.

For example:

  • base token artifact: the choice between fungible (not unique) and non-fungible (unique). It’s the difference between a given Ether token and a one-of-a-kind crypto-kitty.
  • sub-division: can you cut it up? You can own a segment of a Picasso or a building, but not a seat on a flight to Milan.
  • behaviors: what is the token allowed to do? Is it transferable? Will it expire? Think about this with gift certificates or coupons.
  • behavior groups: what higher-order class does the token fall into? I’m less clear on this, so I will turn it over to Marley: “Think of this like a manual stick-shift vs. automatic in cars. When you say a car is manual, you know that it means the car will have a clutch and a gear shift requiring you to manually shift the gears. Calling it manual is a shortcut for that meaning, where automatic means you will not have a clutch and gear shift, just P/D/N/R. You still have to define a clutch and gear shift individually, but when I am specifying the features of the car, I don’t spell them out, I group them into a feature called manual.”
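
To make those building blocks a bit more tangible, here’s a hypothetical sketch of how they might be modeled in Python. It is not the TTF’s actual schema or notation, just an illustration of how the pieces fit together:

```python
# Hypothetical sketch of the four building blocks described above, modeled as
# plain Python types. This is NOT the Token Taxonomy Framework's real schema;
# it only illustrates how the pieces compose.
from dataclasses import dataclass, field
from enum import Enum

class BaseToken(Enum):
    FUNGIBLE = "fungible"          # interchangeable, like Ether or a dollar
    NON_FUNGIBLE = "non-fungible"  # unique, like a one-of-a-kind crypto-kitty

class Behavior(Enum):
    TRANSFERABLE = "transferable"
    EXPIRABLE = "expirable"        # e.g. gift certificates, coupons
    DIVISIBLE = "divisible"

# A behavior group is a named shorthand for a bundle of individual behaviors,
# like "manual" implying both a clutch and a gear shift. The grouping below is
# invented for illustration.
BEHAVIOR_GROUPS = {
    "liability": {Behavior.TRANSFERABLE, Behavior.EXPIRABLE},
}

@dataclass
class TokenSpec:
    base: BaseToken
    decimals: int                       # 0 means the token cannot be sub-divided
    behaviors: set = field(default_factory=set)
    behavior_group: str = ""

# Example: a one-of-a-kind collectible that can be transferred but not split.
kitty = TokenSpec(base=BaseToken.NON_FUNGIBLE, decimals=0,
                  behaviors={Behavior.TRANSFERABLE})
```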

Once you have these outlined, you can begin to compose a token.

So, for example, you could say….

I want to build a

  • fungible token (all are the same, like a dollar bill)
  • that can be sub-divided to two decimal places (like a dollar into cents),
  • that will expire in one year (like an airline credit),
  • which falls into a group of liabilities (owed to creditors or customers).
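
Written out as data, that spec might look something like the sketch below (the field names are illustrative, not official TTF terms):

```python
# The example token from the list above, written out as a plain data record.
# Field names are invented for illustration, not TTF-standard terms.
from datetime import datetime, timedelta

loyalty_credit = {
    "base": "fungible",            # all units are interchangeable
    "decimals": 2,                 # sub-divisible, like dollars into cents
    "expires": datetime.now() + timedelta(days=365),  # one-year expiry
    "behavior_group": "liability", # owed to creditors or customers
}

def smallest_unit(spec, amount):
    """Round an amount down to the finest sub-division the token allows."""
    factor = 10 ** spec["decimals"]
    return int(amount * factor) / factor

def is_expired(spec, at=None):
    return (at or datetime.now()) >= spec["expires"]

print(smallest_unit(loyalty_credit, 12.3456))  # 12.34
print(is_expired(loyalty_credit))              # False (for the next year)
```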

If you use that standard terminology and the standard composition of the TTF, you have made your life easier because you have made the lives of other people easier.

Now, a partner, a customer, a vendor can look at a token and immediately understand what it can and cannot do. Even better, they can start using it without knowing exactly how it works.

Back to Marley one more time.

“Continuing with the analogy, the goals for the TTF are to make using Tokens like driving cars. 

If you know how to drive an automatic, I don’t have to teach you how to drive a Tesla if you have only driven gas-powered cars. Manual takes some training and getting used to, but the concept is the same. Every token, regardless of how it is powered or how its gears are managed, has basic controls: steering wheel, gas and brake pedals.

What makes them different are the additional behaviors, power windows, remote start, Turbo, auto-park and non-behavioral properties like the Mercedes hood ornament, color, etc.”

The fact that you don’t have to learn new behaviors to use a token accelerates combinatorial innovation. What’s more, because the tokens (for the most part) are open-source, they only have to be created once.

Taxonomy isn’t Sexy, but…

Without it, enterprises will have a difficult time deriving huge value from token-based initiatives. Plus, developers will spend a ton of time trying to figure out the composition of tokens built by others.

Every day, we see more and more evidence of growth in the crypto/blockchain sphere. A lot of it is exciting. A lot of it appears boring and bland (taxonomy, ugh), but these are the critical frameworks that enable subsequent explosive growth.

The Industrial Revolution may not have even happened without railroad standardization.

The blockchain revolution requires token standardization.

The good news is that it’s happening.
