Q&A On TokenSpace: A New Conceptual Classification Framework For Cryptoassets

In search of fresh perspectives on the characteristics of cryptographic assets. This Q&A with Matt ฿ originally appeared in 21cryptos.com in December 2018. A comprehensive manuscript describing TokenSpace will be released soon; in the meantime, more TokenSpace information is available at pllel.com and on Twitter.

Q: Can you give a bit of background on yourself? What got you interested in cryptocurrency?

Sure, it’s been a winding road though so let’s not get too lost in details! I grew up in various towns and cities in the UK mostly reading maths and sci-fi books, stargazing, misusing home chemistry crystal growing kits, making music and playing way too many computer games. Spent a decade at universities studying, researching and managing scientific research in chemistry, physics and astronomy where I really got exposed to the idea of organising knowledge to further our understanding. My chemistry mentor (now YouTube-famous) Professor Sir Martyn Poliakoff is very likely the world’s leading connoisseur of the periodic table of the elements so I’ve had classification systems such as taxonomies on the brain for a while now.

After that I spent several years working with experimental music and arts, running a record label, organising educational activities, managing interesting projects and curating a festival. Whilst on a music tour around the US West Coast in 2012 we went to a friend-of-a-friend’s place in Silicon Valley; he opened his closet and said “check this out, I’m doing this thing called mining Bitcoin”. It took a while to be convinced: the idea sounded great but everything I could find online looked quite sketchy — Mt. Gox, Bitcoinica, BitInstant and all that — and it wasn’t until 2014/5, during what may have been Bitcoin’s darkest days, that I started to get really interested. The idea of natively digital money that isn’t controlled by anyone has obvious appeal, but surviving the Gox incident showed me that the technology had some serious resilience and could be a long-lived proposition. Since then it’s gradually taken over my life as I’ve worked my way through various activities as a hobbyist — watching the markets, running nodes and following on-chain activity, messing around with coloured coins and smart contracts, mining, and now research of various flavours through an independent research organisation, Parallel Industries.

My Bitcoin sunrise, after playing a gig in Stanislaus National Forest in Summer 2012

Q: ELI5 TokenSpace.

TokenSpace is an attempt to make a relatively simple and easy-to-use comparison system out of the sprawling and confusing mess of cryptocurrencies, tokens and suchlike that we find ourselves with today. Think of it as a 3D “space” to place different assets inside, with each of the axes representing a characteristic that we can use to visually compare and contrast different assets. The position of an asset along each axis is determined by a scoring system between 0 and 1 for that characteristic, so that a score of zero means the asset doesn’t have those properties at all, and a score of one means it’s a textbook case. Where the score comes from is up to the user: it can come from an intuitive ‘gut feel’ perspective, a weighted taxonomy of different properties, a consensus view from a panel of advisors and so on. It all depends on the intended application.
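The weighted-taxonomy route described above can be sketched in a few lines of Python. This is only an illustration of the general idea, not the actual TokenSpace methodology: the property names, weights and scores below are invented placeholders.

```python
# Illustrative sketch of a weighted-taxonomy score for ONE TokenSpace axis.
# Property names, weights and scores are hypothetical examples, not part
# of the actual TokenSpace framework.

def axis_score(scores, weights):
    """Weighted average of per-property scores, each in [0, 1].

    Returns a value in [0, 1]: 0 means 'no trace of this characteristic',
    1 means 'textbook case'.
    """
    total_weight = sum(weights.values())
    return sum(scores[name] * w for name, w in weights.items()) / total_weight

# A hypothetical "Securityness" taxonomy applied to some token:
weights = {"expectation_of_profit": 0.4,
           "common_enterprise": 0.3,
           "reliance_on_promoter": 0.3}
scores = {"expectation_of_profit": 0.8,
          "common_enterprise": 0.5,
          "reliance_on_promoter": 0.9}

print(round(axis_score(scores, weights), 2))  # prints 0.74
```

Any other scoring scheme — a panel consensus, a single analyst’s gut feel — would simply replace the function body, which is the point of keeping the axis score as a plain number between 0 and 1.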

TokenSpace visual impression

The primary application so far has been to look at the ongoing uncertainty as to the legal and regulatory status of cryptoassets and how similar or different they are to traditional asset types such as monies, securities or commodities. Obviously there is a lot of variation from asset to asset and it is becoming increasingly clear that government bodies are looking at these things closely.

It’s important to understand that the difference between concepts like TokenSpace and the periodic table of chemical elements is that we are still very much in a subjective realm with cryptoassets, and therefore any particular score should be taken with a pinch of salt. People are not going to have the same opinions on a lot of these things — if you follow the cryptocurrency and blockchain space then you will know that humans are VERY biased creatures! A future avenue for this work is to explore different perspectives to see where they come together and where they do not. You could say we are still in the occultist and alchemical phase of cryptocurrency…

Q: Tell us about the metrics you’re using to place the assets in this 3D space.

The axes I’ve chosen are for the properties Securityness, Moneyness and Commodityness — in other words, how much a coin or token embodies or exhibits the characteristics of a securitised asset, a money or a commodity. I’ve encountered the fruitless debate of “I think token X is a security but you think it is not” innumerable times, and these tokens and networks are hybrids of payment mechanisms, rights to on-chain property or “cashflows” like masternodes, value stores and consumable resources. It therefore seems reasonable to build a greater ability to distinguish the subtler differences between these assets.
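Once each axis has a score, an asset is just a point in the unit cube, and similarity between two assets can be compared geometrically. A minimal sketch, with entirely invented placeholder coordinates (the real TokenSpace scores are a separate, subjective exercise):

```python
import math

# Assets as points (securityness, moneyness, commodityness) in [0, 1]^3.
# These coordinates are invented placeholders for illustration only.
assets = {
    "AssetA": (0.1, 0.6, 0.8),   # low securityness, commodity-leaning
    "AssetB": (0.9, 0.2, 0.3),   # high securityness
}

def distance(a, b):
    """Euclidean distance between two TokenSpace positions."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Nearby points suggest assets with similar characteristics;
# distant points suggest they should probably not be treated alike.
d = distance(assets["AssetA"], assets["AssetB"])
print(round(d, 4))
```

Euclidean distance is just one obvious choice here; since the scores are subjective, any comparison metric inherits that caveat.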

One thing that’s nice about working with a conceptual framework like this is that it could easily be adapted for another purpose. For example, Parallel Industries has begun a collaboration with DAO specialists who want to apply a similar approach to characterising the organisational structures that exist around decentralised networks. Provided the right dimensions are found, there’s no reason why you can’t also build a set of taxonomies or scoring systems for that purpose. It does require careful thought and design choices to ensure you end up with a useful tool that can be meaningfully applied.

Q: How would you distinguish between, for instance, Bitcoin, Litecoin, Tether and Polymath with this framework?

Good question. I think it’s reasonable to say that as assets, bitcoin and litecoin are often thought of as having “commodity-like” characteristics. People often refer to the digital gold and silver memes, so they would place reasonably well on that axis, though bitcoin has much more liquidity and market depth, so it would be easy to make a case for it being the premier digital commodity. Neither has much in common with securities, though you could make a case that Litecoin’s founder and Foundation are somewhat relied upon for an expectation of profit. As much as it’d be nice to say otherwise, bitcoin and litecoin still aren’t great as monies compared to fiat currency, so they do still have some ground to cover there.

Tether functions primarily as a monetary substitute, although it’s hard to be confident about its supply or its ability to store value in the long term. By virtue of its stability against fiat currency relative to traditional cryptocurrencies, it does fulfil that purpose reasonably well given today’s high-friction on- and off-ramps with exchanges, for example. It doesn’t look much like a commodity or a security to me.

Polymath is not one I’m very familiar with, but as a security token platform they are at least being upfront about that. As an ERC20 token on Ethereum with a central administrative team, it does seem to have a lot of the hallmarks of a security. There does appear to be some “utility” in its use for issuing security tokens on their platform, so it could be argued that it has more commodityness than the typical Ethereum ICO vintage of 2017 or something quite useless such as XRP, but nowhere near as much as bitcoin or litecoin.

Placing selected assets in TokenSpace. Scores are assigned by author.

Q: You’ve taken on the seemingly insurmountable task of attempting to classify cryptoassets. What are regulators doing wrong? What sort of organisations would benefit from this?

It’s a tall order indeed, and perhaps not surprising that it’s taken a while to get to this stage. The hope is that tools like TokenSpace can help coin and token issuers, lawyers, regulators and exchange operators get a better grip on the characteristics of different assets and avoid making misinformed decisions such as blanket bans, listing or adopting assets which might cause them compliance headaches or issuing poorly designed tokens which might land them in hot water later.

I’ve met a few regulators, token issuers and exchange compliance officers, and it seems that a lot of the pitfalls seen so far (and many more to come) stem from a lack of understanding of how these assets and the underlying networks function and evolve over time. It’s virtually impossible to have a complete grasp on these things — even Satoshi didn’t have every angle covered! The biggest mistake I’ve seen officials make so far is the rush to issue sweeping pronouncements, unsupported by justifications, which make the situation even less clear.

One example is the comments made by US Securities and Exchange Commission officials that the ETH crowdsale was a securities offering but that the Ethereum network has since become “sufficiently decentralised” and therefore ETH is no longer a security. Taking that at face value, it suggests that at some point ETH passed through a “legal / not legal” boundary — but where and how? What made the difference and how was that decision arrived at? Node distribution? Concentration of tokens amongst insiders? Decentralisation of leadership? It’s not easy to reconcile that with existing securities law guidelines like the Howey test. And what about network forks and incidents such as The DAO exploit? These sorts of things are going to keep happening.

Example of an Arbitrary Regulatory Boundary Function

Q: What could regulators be doing better?

Make clearer statements, do your homework to understand the technology at play and be more upfront about decision-making processes! What are the metrics that regulators deem important? Why? Don’t build rigid legal frameworks that can’t cope with the breakneck pace of cryptocurrency developments. There will always be regulatory arbitrage with borderless technologies, just look at Malta and Puerto Rico. Which small nation will be next to reposition itself to attract jurisdiction-hoppers like Binance?

There is also the perennial issue of legions of “Blockchain Experts” who usually land influential advisory roles but seem to know very little about the ins and outs of applied cryptographic networks and assets associated with them. Having spent a very frustrating year in a business school environment having to deal with fakers and imbeciles claiming said proficiencies recently, I can confirm that this is a very real problem.

Q: What else is Parallel Industries working on? What are your future plans?

Currently Parallel Industries is very much in the bootstrap phase, limping along with very little income (thanks, bear market), so it’s a major priority to bring in resources through sponsorship, consulting and contract research to operate sustainably, so that we can expand our research activities and yours truly isn’t spread quite so thinly! The TokenSpace paper is finally approaching readiness, and our Forkonomy project, undertaking comparative analysis of network forks (such as BTC/BCH, ETC/ETH, BTCP/ZCL), has already had a number of outputs including a talk at the recent ETC Summit in Korea and a well-received paper. There’s also a project in progress named DAOs and Don’ts, looking at power imbalances in cryptocurrency networks, which has been on the sidelines a little too long. Keep an eye out for an article series on political and humanitarian hacks and use cases for cryptocurrencies in In The Mesh magazine under the title Reaching Everyone.

If any of that arouses curiosity, take a look at our website www.pllel.com or find us on Twitter @parallelind. If you’re a crypto-millionaire looking for a way to lighten your bags and fund some research in the process, we can help with that too!
