I’ve been invoking a layered stack model (after OSI and Buterin) to attempt finer grain characterisations, explanations and rationalisations of various epistemic and phenomenological happenings in the domain of cryptocurrencies and blockchain-architected P2P networks for a while. It’s a helpful lens with which to attempt definitions of many of the most loosely employed terms in regular use — decentralisation, permissionlessness, censorship-resistance, immutability. Take a look at Reaching Everyone Pt II for the low-down.
“Layering has both technical and social implications: it makes the technical complexity of the system more manageable, and it allows the system to be designed and built in a decentralized way.”
The reason for the above framing is that the acid test of a conceptual framework’s robustness when harnessed to build classification systems such as taxonomies and typologies is its ability to accept new objects or be generalisable into a meta-methodology. To this end, one can also readily frame the ontological quest (simply put, the pursuit of wisdom and meaning) as a layered stack of intellectual disciplines. I’m sure this notion of an ontological meta-stack isn’t a totally original approach, but having played with it for a while it has become a useful conceptual lens with which to (try to) understand how trans/multi/inter/anti/pan/omni/supra/quasi/para/post disciplinary research (choose your favoured poison) differs from steady-state scholarly work. Furthermore, layered stacks may be one of the most easily invoked mechanisms for achieving differential discretisation whilst grappling with linear domains. Apropos of nothing: a few years ago I was completely preoccupied with the wacky idea of starting a serious-and-satirical research disorganisation-superimposition that I was very tongue-in-cheekily calling The Institute of Non-Linear Phenomenology. Stacks of layers need not apply within.
Overly individuated, siloed or specialised knowledge domains — typically mature fields — tend to be hyper-focused on very small regions of the spectrum rather like the spectroscopic disciplines which I spent a decade playing supramolecular detective with. A photophysicist / spectroscopist could spend an entire lifetime “playing with” a narrowly bounded set of quantised energy levels in an atom, molecule, crystal or superstructure.
Likewise, a researcher could spend a fruitful and fulfilling career looking for the same signatures in wildly different systems. I studied the same ν(CO) vibrational signature in both exotic low-temperature astrophysical ices and in complex solution-based supramolecular assemblies. The same fingerprint can be exploited to provide rich information about its environment on both sub-nanosecond and celestial timescales!
It would be remiss to attempt an approach to this weighty subject without discussing taxonomy, epistemology and more generally the design science of information systems. Taxonomy is one of the more powerful intellectual tools at our disposal with which to progress towards ontologies — although a classification system may not have truth value in itself, it may be an intermediate development and therefore useful in its own right.
Tegmark’s notion of subjectivity at the Universe’s margins (Planck limits, complex adaptive systems) with empirical objectivist domains betwixt seems apropos here. Let’s call it a subjectivity sandwich. Feynman famously opined that “There’s Plenty of Room at the Bottom”, but there ostensibly exists even more room at the top!
Okay so here it is, the first iteration anyway. Let me know what you think by commenting or pinging me on Twitter. In reality, this framework may not be truly linear, granular and hierarchical but there is hopefully some value to it. Perhaps the next iteration of this half-baked idea will be an open form: woven meshes, interlinked gears — an ontological meta-DAG!?!?
As we move from bottom to top, the complexity of the agents in focus increases alongside subjectivity. But at sub-quantum scales, the Universe also appears subjective — at least based on observations through our current paradigmatic lenses. Interesting phenomena tend to emerge at the margins, between disciplines or even as a synthesis of several elements of the meta-stack. Perhaps it’s time to repurpose the wonderful bibliotechnical term marginalia to capture this essence.
Cryptocurrencies are a great example of systems drawing on a number of these components. Indeed, at the network level these protocols are very simple messaging systems, but they can exhibit extremely complex and unexpected emergent phenomena — Forkonomy is a good example of an attempt to apply unusual lenses to characterise unusual goings-on. This may also help to explain why researchers possessing deep subject-specific knowledge pertaining to one of the areas which cryptocurrency draws upon — cryptography, distributed systems, protocol engineering, law, economics, complex systems — often find it difficult to communicate with fellow scholars rooted in another traditional pastime. Perhaps the rise of novel meta-disciplines such as complexity science shows that one approach — in our parlance — to harnessing and further understanding non-linear domains is to capture as much of the stack as possible.
TLDR: Are we entering a new “Age of Techno-Dilettantism”?
Wassim recently moderated a lively panel discussion on “Economic and Social Risks” at the Breaking Bitcoin security conference in Amsterdam.
.@AaronvanW saying we like to think bitcoin is free of human biases but in reality it is a shared fiction (manifested in software) because money is essentially a shared fiction that helps people cooperate. #BreakingBitcoin
TokenSpace: A Conceptual Framework for Cryptographic Asset Taxonomies. Analytical Index As Compressed Manuscript
1 Introduction & Historical Review
Characterising the properties of physical and conceptual artifacts has never been trivial, and as the fruits of human labour proliferate in both number and complexity this task has only become more challenging. The quest of bringing order, process, context and structure to knowledge has been a critical but seriously underappreciated component of the success story of intellectual development in domains such as philosophy, mathematics, biology, chemistry and astronomy, from which the higher echelons of the ontological meta-stack which embody the richness of modern human experience have emerged. Despite the undeniable observation that we live in a society driven by finance and economics, these fields still exist in a relatively underdeveloped state with respect to wisdom traditions, and the very recent accelerationist emergence of cryptographic assets has completely outrun any scholarly approach to understanding this sui generis population of intangible, digitally scarce objects.
1.1 Necessity for the Work, Regulatory Opacity & Uncertainty
Over the time of writing and researching this work (2017–2019) the sentiment and perceived landscape inhabited by the field of cryptoassets has shifted significantly. This work was conceived in the midst of the 2017 bull market and Initial Coin Offering tokenised fundraising frenzy as a potential remedy to mitigate the naive conflation of cryptoassets as self-similar, with potential legal, compliance and regulatory ramifications. Over the intervening time, various national and international regulators and nation-state representatives have made comments, pronouncements and legislation taking a wide range of approaches to attempting to bring jurisdictional oversight to financial assets and products built upon borderless peer-to-peer protocols. This has set up a game of regulatory arbitrage in extremis as participants in the wild distributed boomtown surrounding cryptocurrency networks hop between countries faster than legal systems can react and adjust.
1.2 Hybrid Character of Cryptocurrency Networks & Assets, the Complex Provenance & Nature of Decentralisation
The protocols, networks and assets springing forth from cryptocurrency innovation are complex, intricate and exhibit a high degree of emergent behaviour and phenomena which prove very hard to predict or explain ex ante. There are also significant semantic challenges, with terms such as “blockchain”, “decentralisation”, “censorship-resistance”, “permissionlessness” and “immutability” being used without precise definition. For these reasons there are a great many challenges with legalistic approaches put forward by persons and entities either with vested interests or lacking a deep understanding of the nuanced factors at play within these complex systems and associated assets.
Before a rigorous regulatory approach can be taken with respect to cryptoassets, the nature of the objects themselves must be deconvoluted. This work takes the approach that — for regulatory purposes at least — these objects may be characterised with sufficient meaningfulness as hybrids between securities, commodities and moneys.
1.3.1 Nature of Money Throughout the Ages
The use of objects as money has accompanied human progress for at least the past several millennia. Aristotle wrote in the 4th century BC about the desirable properties of a money: fungibility, divisibility, portability and intrinsic value. Jevons framed this in terms of usage as stores of value, media of exchange and units of account. Chung took Baudrillard’s notion of the simulacrum to theorise that, as technological progress continues, the willingness and/or expectations of human societies to use more technologically advanced objects as money move in tandem. Economics provided the concept of stock-to-flow; the varying rarity of monetary objects was tested by advances in transportation, and geo-arbitrage exploited asymmetries in small-scale monetary systems. As technology widens the horizons of human society, even globally rare monetary objects such as precious metals become susceptible to supply inflation from seawater or asteroid recovery. Cryptographic assets provide the first instantiation of algorithmically enforced, provable, universal rarity. Unforgeable costliness of production may be the necessary counterpoint to nation-state controlled seigniorage, debasement and politically motivated manipulation of the monetary supply to serve the purposes of legacy institutions rather than the populace.
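To make the stock-to-flow idea concrete, here is a minimal sketch in Python using illustrative, approximate figures for gold and for Bitcoin circa 2019; the numbers are assumptions for demonstration only, not data from this work.

```python
# Minimal stock-to-flow sketch. All figures are illustrative assumptions,
# roughly in the right ballpark for 2019, not authoritative data.

def stock_to_flow(stock: float, annual_flow: float) -> float:
    """Years of current production needed to reproduce the existing stock."""
    return stock / annual_flow

# Gold: ~190,000 tonnes above ground, ~3,000 tonnes mined per year (assumed).
gold_sf = stock_to_flow(190_000, 3_000)

# Bitcoin: ~17.5M coins outstanding, ~657,000 new coins per year at a
# 12.5 BTC block subsidy (assumed, pre-2020 halving).
btc_sf = stock_to_flow(17_500_000, 657_000)

print(f"Gold stock-to-flow    ~ {gold_sf:.0f}")   # ~ 63
print(f"Bitcoin stock-to-flow ~ {btc_sf:.0f}")    # ~ 27
```

A higher ratio means new production dilutes the existing stock more slowly, which is the sense in which harder-to-produce objects have historically made better monetary candidates.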
1.3.2 What Makes a Good Become Commoditised?
The commodification of goods arguably dates back as far as the development of agrarian societies. Marxist economics defines a commodity good as the fruit of productivity, by human or machine. More generally, any sufficiently widely useful and in-demand object which becomes standardised can be regarded as a commodity.
1.3.3 Regulating Securitised Asset Issuance in a Post-Howey Paradigm
A security is simply an agreement to provide benefits from an underlying asset in return for investment. US legal precedent such as the WJ Howey, Silver Hills and Reves vs. Ernst & Young cases showed how orange groves, financial “notes” and golf courses could be securitised, not just obviously financialised assets such as company equity and debt. The critical features of a securitised asset: investment agreement, capital risked, expectation of profit, entities relied upon to deliver returns, fractional ownership of the underlying, governance rights, and returns such as cashflows in addition to capital appreciation.
Considering cryptoassets, many projects exhibit some or all of these characteristics but very few prior to 2018/9 were openly registered as securities. Some senior regulatory officials made comments which further muddied the waters by discussing certain assets as having been initially distributed in securities offerings but having become “sufficiently decentralised” to no longer bear those hallmarks, without providing definitions or heuristics. This leads to the notion that many projects are strongly incentivised to engage in decentralisation theatre: to carefully configure the outward appearance of their networks and/or tokens as decentralised in order to circumvent the compliance burden of securities regulation. Clearly a more sophisticated conceptual framework is required to rationalise the similarities and differences between the diverse constellation of cryptographic networks and assets existing now and in the future.
Marx wrote at great length in Das Kapital about the interrelationship between commodities and moneys and how a good might move from one to the other. Some short-dated government debt could be considered securitised, commoditised and a monetary equivalent. Precious metals have at times been considered commodities and commodity-moneys.
2 Classification Approaches & Design Science of Information Systems
Designing conceptual systems to systematise knowledge is an intellectually demanding pastime that requires deep knowledge of the subject material to produce meaningful and useful outputs. There is still merit in attempts which do not meet their goals, as they may be intermediate steps which require advances in theory, bodies of knowledge, empirical information or the maturation of a scholarly ecosystem.
2.1 Definitions of Terms Within Paradigms of Classification
Classification: Spatial, temporal or spatio-temporal segmentation of the world; ordering on the basis of similarity
Classification System: A construction for the abstract groupings into which objects can be put
Framework: A set of assumptions, concepts, values and practices that constitutes a way of understanding the research within a body of knowledge
Typology: A series of conceptually-derived groupings, can be multivariate and predominantly qualitative in nature
Taxonomy: Empirically or conceptually derived classifications for the elucidation of relationships between artifacts
Taxonomic System: A method or process from which a taxonomy may be derived
Cladistic Taxonomy: Historical, deductive or evolutionary relationships charting the genealogical inter-relationships of sets of objects
Phenetic Taxonomy: Empirically derived groupings of attribute similarity, arrived at using statistical methods
2.2 Philosophy of Design Science & Classification Approaches
For the most part typologies conceptually derive an ideal type (category) which exemplifies the apex (or maximum) of a proposed characteristic whereas taxonomies develop a constructed type with reference to empirically observed cases which may not necessarily be idealised but can be employed as canonical (or most typical) examples. Such a constructed type may subsequently be used to examine exceptions to the type.
“A researcher may conceive of a single type and then add dimensions until a satisfactory typology is reached, in a process known as substruction. Alternatively the researcher could conceptualise an extensive typology and then eliminate certain dimensions in a process of reduction.”
Popper’s Three Worlds vs Kuhn’s discontinuous paradigms
In contrast to Kuhn’s paradigmatic assessment of the evolution of concepts, Popper’s Three Worlds provides some philosophical bedrock from which to develop generalised and systematic ontological and / or epistemological approaches. The first world corresponds to material and corporeal nature, the second to consciousness and cognitive states and the third to emergent products and phenomena arising from human social action.
Niiniluoto applied this simple classification to the development of classifications themselves and commented:
“Most design science research in engineering adopts a realistic / materialistic ontology whereas action research accepts more nominalistic, idealistic and constructivistic ontology.”
Materialism attaches primacy to Popper’s first world, idealism to the second and anti-positivistic action research to the third. Design science and action research do not necessarily have to share ontological and epistemological bases. Three potential roles for application within information systems were identified: means-end oriented, interpretive and critical approaches. In terms of design science ethics Niiniluoto comments:
“Design science research itself implies an ethical change from describing and explaining the state of the existing world to shaping and changing it.”
Iivari considered the philosophy of design science research itself:
“Regarding epistemology of design science, artifacts of taxonomies without underlying axioms or theories do not have an intrinsic truth value. It could however be argued that design science is concerned with pragmatism as a philosophical orientation attempting to bridge science and practical action.”
Methodology of design science rigour is derived from the effective use of prior research (i.e. existing knowledge). Major sources of ideas originate from practical problems, existing artifacts, analogy, metaphor and theory.
2.3 Selected Examples of Taxonomy & Typology Approaches
Classification approaches can be traced back as far as Aristotle’s “Categories”, and classification was first applied thoroughly in the natural sciences by botanists and zoologists such as Linnaeus and Haeckel, who employed a hierarchical and categorical binomial paradigm.
Periodic table of the elements, from occultist elementalism and alchemy to empirical verification of atomic structure and electronic bonding. Periodic tables are snapshots in time of the taxonomic progression, new elements and isotopes continue to be discovered and synthesised. Coal & tar trees.
Nickerson’s generalised and systematised information systems artifact classification approach (meta-taxonomy!) provided the methodological foundations for TokenSpace taxonomies, upon which the multi-dimensional framework was constructed.
2.4 Recent Approaches to Classifying Monetary & Cryptographic Assets
Burniske / Tatar: naive categorisation, two dimensions deep. Non-exhaustive, not explanatory.
BNC / CryptoCompare: data libraries, descriptive, more a typology
Kirilenko: nothing here — arbitrary data, scatterplot, risk/reward
BIS / Stone money flower: non-exhaustive, overlapping assignations.
Cambridge regulatory typologies: simple, unassuming, good! The classification framework next door. Potential for development.
Fridgen: a true Nickerson taxonomy. Cluster analysis to determine anisotropy in parameter space. Great but rather specific and limited in scope.
3 Designing TokenSpace: A Conceptual Framework for Cryptographic Asset Taxonomies
For a classification system to be useful and have explanatory value, it should address the limitations of understanding the differences between artifact properties within the population of objects as it exists at present. TokenSpace is intended to be a conceptual framework addressing issues arising from a limited appreciation of the hybrid nature and time-dependence of the qualities of cryptographic assets.
3.1 Introduction & Problem Statement of Taxonomy Development
Taxonomies should be explanatory, exhaustive, robust to new objects and irreducible without losing meaningfulness.
3.2 Construction of the TokenSpace Framework: Components & Methodology
Conventional taxonomies are categorical and can be either flat or hierarchical. The classification approach should be built to discriminate for a meta-characteristic, with a series of dimensions asking questions of each object; for each dimension, two or more categorical characteristics provide the options, which together should encompass the property in question.
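As a rough illustration of that structure (a hypothetical encoding, not the actual TokenSpace schema), a taxonomy could be represented as a meta-characteristic holding a list of dimensions, each offering two or more characteristics as answer options; the dimension names below are invented:

```python
from dataclasses import dataclass, field
from typing import List

# A hypothetical, simplified encoding of the structure described above:
# one meta-characteristic, several dimensions, each with candidate
# characteristics. Names and dimensions are illustrative assumptions only.

@dataclass
class Dimension:
    question: str               # what is asked of each object
    characteristics: List[str]  # two or more options covering the property

@dataclass
class Taxonomy:
    meta_characteristic: str    # e.g. "Securityness"
    dimensions: List[Dimension] = field(default_factory=list)

securityness = Taxonomy(
    meta_characteristic="Securityness",
    dimensions=[
        Dimension("Was capital risked in an investment agreement?", ["yes", "no"]),
        Dimension("Is there an expectation of profit from others' efforts?",
                  ["strong", "weak", "none"]),
    ],
)
```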
3.2.1 Building Robust Taxonomies based on Information Systems Best Practices
Elicited, weighted score taxonomy, intuitive, deductive etc.
3.2.2 Three Conceptual-to-Empirical Approaches to Short-Listing Taxonomy Dimensions & Characteristics
For the meta-characteristics of Securityness, Moneyness and Commodityness a series of attributes were produced from which to iterate through the construction of taxonomies.
3.3 Design Choices & Justifications for TokenSpace
3.3.1 TokenSpace as a Conceptual Representation of Spatio-Temporal Reality
Making an n-dimensional space with meta-characteristics as axes and asset locations based upon scored taxonomies.
3.3.2 Defining Boundaries
Bounds of TokenSpace between zero and one, possible to extend beyond to consider notions of good and bad characteristics but this is non-trivial.
3.3.3 Dimensionality & Clarity
Three spatial dimensions enable the construction of a simple visual output which a non-technical user such as a lawmaker, politician or regulator may readily use to make informed subjective comparisons of asset characteristics. Orthogonality is chosen for visual clarity and is not necessarily real — the properties and hence taxonomies of Moneyness and Commodityness are clearly very similar.
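A minimal sketch of that visual output, using hypothetical asset coordinates rather than scores from the manuscript, might look like this with matplotlib:

```python
import matplotlib.pyplot as plt

# Hypothetical (Securityness, Moneyness, Commodityness) coordinates in [0, 1];
# placeholder values only, not the TS10 scores from the manuscript.
assets = {
    "AssetA": (0.8, 0.2, 0.3),
    "AssetB": (0.2, 0.7, 0.5),
    "AssetC": (0.4, 0.4, 0.6),
}

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
for name, (s, m, c) in assets.items():
    ax.scatter(s, m, c)
    ax.text(s, m, c, name)          # label each asset at its location
ax.set_xlabel("Securityness")
ax.set_ylabel("Moneyness")
ax.set_zlabel("Commodityness")
ax.set_xlim(0, 1); ax.set_ylim(0, 1); ax.set_zlim(0, 1)
plt.show()
```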
3.3.4 Categorical & Numerical Characteristics
Categories have limitations — boundary conditions, edge cases, Goodhart’s Law susceptibility. Ranged indexed dimensions attempt to address this by trading off complexity for subjectivity, however the effectiveness of this approach is limited by the domain expertise of the researcher.
3.3.5 Score Modifiers & Weightings for Taxonomy Characteristics & Dimensions
The relative significance of each dimension to the overall score output by a TokenSpace taxonomy can be adjusted through weighted scoring, and indeed the taxonomies of Commodityness and Moneyness vary mostly in their weightings rather than the choice of dimensions and characteristics.
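A toy version of such a weighted score, with made-up dimension scores and weights (not those used in the actual taxonomies), could be computed as follows:

```python
# Toy weighted-score calculation for one meta-characteristic.
# Dimension scores and weights below are invented for illustration;
# scores lie in [0, 1] and the result is normalised by total weight.

def meta_score(scores, weights):
    """Weighted average of dimension scores, normalised to [0, 1]."""
    total_weight = sum(weights.values())
    return sum(scores[d] * weights[d] for d in scores) / total_weight

moneyness = meta_score(
    scores={"fungibility": 0.9, "divisibility": 1.0, "store_of_value": 0.6},
    weights={"fungibility": 0.4, "divisibility": 0.2, "store_of_value": 0.4},
)
print(round(moneyness, 2))  # 0.8
```

Under this scheme, Moneyness and Commodityness could indeed share most dimensions and differ mainly in the weights dictionary, as the paragraph above describes.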
3.3.6 Time-Dependence
Time-dependence can be assessed by charting the evolution of asset scores and locations in TokenSpace over time.
3.3.7 Boundary Functions
Boundary surfaces may be used to delineate regions in TokenSpace for regulatory and/or compliance purposes. Taking the example of Ethereum and Hinman’s summer 2018 comments: ETH has moved through a boundary surface from “was a security offering” to “no longer is”. There is a lack of precision in language, definitions, characterisation metrics and visual conception; TokenSpace can address these to some extent.
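One very simple way to realise a boundary surface, sketched below with an arbitrary illustrative threshold rather than any regulator-endorsed value, is a function over TokenSpace coordinates that reports which side of the surface an asset sits on:

```python
# A trivially simple boundary "surface": a plane at fixed Securityness.
# The 0.6 threshold is an arbitrary illustrative choice, not a regulatory value.

def in_security_region(securityness: float, moneyness: float,
                       commodityness: float, threshold: float = 0.6) -> bool:
    """True if the asset location falls inside the hypothetical 'security' region."""
    return securityness >= threshold

# An asset whose Securityness decays over time can cross the boundary:
print(in_security_region(0.75, 0.3, 0.2))  # True  ("was a security offering")
print(in_security_region(0.45, 0.5, 0.4))  # False ("no longer is")
```

A more realistic boundary could be a curved surface depending on all three coordinates, but the decision logic is the same.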
3.3.8 Non-point & Anisotropic Asset Locations
The model can be extended beyond simple point asset locations in space through the use of error bars or probability density functions, as utilised in molecular orbital space-filling representations of chemical bonding. Anisotropic score densities could arise when functions representing some of the dimensions yield multiple, incompatible, quantised or inconsistent results, or when elicitation results in a lack of consensus among panel members.
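As a rough illustration of how elicitation disagreement might translate into a non-point location, panel scores for one axis could be summarised as a mean and spread (simple error bars rather than a full probability density function); the panel numbers here are invented:

```python
import statistics

# Invented panel elicitation results for one asset's Securityness score.
panel_scores = [0.55, 0.70, 0.40, 0.65, 0.60]

mean = statistics.mean(panel_scores)     # point estimate for the axis
spread = statistics.stdev(panel_scores)  # disagreement becomes an "error bar"

print(f"Securityness ~ {mean:.2f} +/- {spread:.2f}")
# A lack of consensus (large spread) yields a smeared-out, non-point location.
```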
4 Creating a TokenSpace: TS10
It is difficult to exemplify the TokenSpaces without wholesale reproducing the taxonomies, scoring outcomes and visual depictions themselves so discussion will be largely limited to the process and outcomes.
4.1 Iterative Construction of Taxonomies, Indices & Score Modifiers
TS10 Taxonomy Development Iterations:
1) Progression from intuitively reasoned shortlists to categorical & indexed dimensions
2) Assigned unstandardised characteristic score modifiers (weightings incorporated), reduced number of dimensions, some categorical dimensions consolidated into index form
3) Standardised characteristic score modifiers to separately apply weightings, further reduction of dimensions, collapsing some categorical dimensions further into indices for ease of application — at possible expense of increased subjectivity
THIS IS THE BIG TOKENSPACE TRADEOFF — MITIGATING “GOODHART MANIPULABILITY” AT THE EXPENSE OF SUBJECTIVE REASONING. TRADEOFF CAN BE TUNED BY BALANCING CATEGORICAL AND INDEXED DIMENSIONS.
4.2 Placing Assets in TS10
Taxonomies and weighting modifiers used to construct a TokenSpace with top 10 marketcap assets.
4.3 Cluster Analysis & Correlations
Clustering, statistical distributions and correlations were studied. Clusters map to sub-populations or “species” of assets such as PoW moneys, federated systems and ICO “smart contract” platforms, with ETH, USDT and BTC each occupying somewhat unique regions of the space.
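A sketch of this kind of clustering, run on hypothetical coordinates standing in for the TS10 scores (which are not reproduced here), might use k-means over the three meta-characteristic axes:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical (Securityness, Moneyness, Commodityness) coordinates standing in
# for a small asset set; placeholder values, not the TS10 scores.
coords = np.array([
    [0.15, 0.80, 0.55],   # PoW money-like
    [0.20, 0.75, 0.60],
    [0.70, 0.30, 0.25],   # ICO "smart contract" platform-like
    [0.65, 0.35, 0.30],
    [0.40, 0.60, 0.20],   # federated / pegged-asset-like
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(coords)
print(kmeans.labels_)           # cluster assignment per asset ("species" grouping)
print(kmeans.cluster_centers_)  # centroid of each cluster in TokenSpace
```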
4.4 TSL7: Using TokenSpace to Compare Cryptographic & Legacy Assets
This case study exemplifies the value of maximising the range of meta-characteristic scores within the population of objects: no cryptographic assets are particularly good moneys yet, especially in comparison to legacy assets such as USD and CHF. Benchmarking and normalisation are very valuable tools and have been used heavily in the construction of this TokenSpace.
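One obvious benchmarking approach, assumed here purely for illustration rather than reproducing the method used in the manuscript, is min-max normalisation against the best- and worst-scoring assets in the population:

```python
# Min-max rescaling of raw Moneyness scores so the best asset in the
# population benchmarks at 1.0 and the worst at 0.0.
# Values are illustrative placeholders, not scores from TSL7.

raw_moneyness = {"USD": 0.92, "CHF": 0.88, "BTC": 0.55, "ETH": 0.40}

lo, hi = min(raw_moneyness.values()), max(raw_moneyness.values())
normalised = {k: (v - lo) / (hi - lo) for k, v in raw_moneyness.items()}

print(normalised)  # USD -> 1.0, ETH -> 0.0, others in between
```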
4.5 TSTDX: Time-Dependence of Selected Assets
A number of interesting observations can be made from this case study: monetary metals are decreasing in Moneyness over time as Bitcoin’s increases — ostensibly because the digitalisation of human society favours similarly digital (“simulacrised”) money such as Bitcoin over specie. In this respect, silver is some way ahead of gold, being largely a commodity rather than a commodity-money in the present day. The loss of gold and silver backing of moneys such as the British Pound (GBP) and the US Dollar (USD), leading to a loss of Commodityness and Moneyness and an increase in Securityness, may also be rationalised as derealisation — a loss of mimetic gravitas in addition to simulacrum-related societal sentiment.
The time-dependence of cryptographic assets generally shows a trend of decreasing Securityness as the networks mature and the assets become more adopted, distributed, widely held, useful and used. In concert, Moneyness and Commodityness also tend to increase as more reasons to use, hold and transact with the assets emerge. Ethereum (ETH) is particularly remarkable: in tandem with Hinman’s summer 2018 sentiments, what started as a securities offering of a centralised asset reliant on the efforts of a team of others for speculative gain has become (to some extent) more widely used, useful, held and distributed, leading to a decrease in Securityness and increases in Moneyness and Commodityness. It could perhaps be said that Ethereum in particular is well on the path to desecuritisation, or indeed may have arrived at that destination depending on where boundaries are perceived to lie in TokenSpace. The US Dollar (USD) still possesses strong Moneyness as the de facto world reserve currency, though its Moneyness and Commodityness have been declining since the abandonment of gold-backing and the rise of the petrodollar system.
5 Discussion & Considerations Informing Further Development of TokenSpace
Decentralisation theatre upsides / downsides: Often token issuers engineer their assets to have a perceived value proposition by apportioning future cashflows to the holders via mechanisms such as token burns, staking or masternodes, which are coarsely analogous to share buybacks but with critical and significant differences to the downside. Likewise, masternodes, staking rewards or issuer-sanctioned airdrops map coarsely to dividends in legacy finance. However, if a token is deemed to be too security-like then exchanges may be reluctant to list it for fear of future liability or compliance issues.
Limitations of TokenSpace: It is important to explicitly discuss the limitations of TokenSpace. For the purposes of a subjective classification system such as TokenSpace, as many attributes of cryptographic networks and assets are continuous, exhibit subtle variations and / or edge cases, a mixture of categorical and numerical discrimination is most likely the optimal approach. Therefore, the instantiations of TokenSpace which will demonstrate the most explanatory power will be hybrids of traditional and phenetic taxonomy types. This design choice is justified by the desired outcome of numerical scores as the output of the classification execution in order to populate asset locations in the Euclidean 3D space that TokenSpace creates. Conversely, in the interests of pragmatism, a great deal of insight may still be derived from a primarily categorical classification approach with some range-bound indices, and if this meets the needs of the user then it is an acceptable and valid design choice. Further, it minimises over-reliance on measurable attributes which may be subject to manipulation for motivations related to decentralisation theatre.
Looking in the mirror: As with all information systems, the principle of GIGO (Garbage In, Garbage Out) applies. A number of potential pitfalls are as follows, and the author does not exclude himself from susceptibility to any or all of them. The use of misinformed judgement, lack of methodological rigour in taxonomy construction, over-estimation of the researcher’s knowledge of the field or competence in applying taxonomic methodology, latent biases, poor quality or misleading data sources or judgements, and a lack of appreciation of edge cases or category overlap may severely limit the usefulness of the TokenSpace produced and therefore its explanatory power. It must be reiterated that TokenSpace affords a subjective conceptual framework for the comparative analysis of assets. The meta-characteristic definitions and choices, the dimensions, categories and characteristics employed, and the score modifiers and/or weightings are all subjective and depend on the choices of the researcher, which derive from the intended purpose. It is entirely realistic that an asset issuer may tailor their taxonomies, score modifiers, regulatory boundary functions or a combination of the above to present a favourable assessment with respect to their biases or motivations.
Changing goalposts: Additionally, the changing nature of the regulatory and compliance landscape may have a large bearing on what can be considered acceptable asset characteristics in compliance terms, and may necessitate a re-evaluation of weightings and/or score modifiers. Some distinction between “good” and “bad” securities, moneys or commodities in an area of particular interest is worth exploring, as it potentially extends the explanatory power of the framework. A meaningfulness-preserving approach to occupying a region between -1 and +1 could provide a coarse mechanism to do this, though the way that dimension scores and weightings are determined would have to be adjusted; naive methods such as taking moduli do not sufficiently discriminate as to the quality of an asset, nor would they be easily integrated into the weighted taxonomy approach — though individual dimensions could be assigned negative weightings rather than inverted dimension scores.
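A crude way to see why negative dimension weightings are more tractable than modulus-based scores is to keep dimension scores in [0, 1] but let a “bad quality” dimension subtract from the total; the dimensions and weights below are invented for illustration:

```python
# Sketch: a "bad" dimension (e.g. degree of issuer control) given a negative
# weighting, so that quality degrades the meta-characteristic score instead of
# requiring inverted or modulus-based dimension scores.
# Dimensions and weights are invented for illustration only.

def signed_meta_score(scores, weights):
    return sum(scores[d] * weights[d] for d in scores)

score = signed_meta_score(
    scores={"salability": 0.8, "supply_integrity": 0.9, "issuer_control": 0.7},
    weights={"salability": 0.5, "supply_integrity": 0.5, "issuer_control": -0.4},
)
print(round(score, 2))  # 0.57: heavy issuer control drags the score down
```

Note that with signed weights the attainable score range is no longer [0, 1], which is exactly the kind of adjustment to score and weighting determination anticipated above.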
Future directions: Planned developments include the construction of TokenSpaces with higher dimensionality and alternative taxonomies for different meta-characteristics intended for purposes other than increasing regulatory clarification. Scoring mechanisms including categorical and indexed dimensions, score modifiers and weightings may also be further refined and extended. Other approaches to generating asset coordinates for TokenSpaces will also be explored, with plans in place to form “digital round tables” with broad subsets of stakeholders to arrive at asset scores or ranges. Work is underway with collaborators to extend TokenSpace into DAOSpace in order to characterise similarities and differences of “Decentralised Autonomous Organisations” as opposed to assets. One interesting nexus of DAOSpace and TokenSpace is attempting to disentangle the design choices and properties of decentralised organisations (and their native assets) with respect to Securityness in particular. The SEC has already made it clear that TheDAO tokens (DAO) would be classified as securities, and therefore profit-oriented tokenised DAOs must be designed carefully with this in mind should they intend to be compliant with existing regulations. Interestingly, Malta has passed laws giving DAOs legal personality, meaning that another cycle of jurisdictional arbitrage may be underway, this time with organisations as well as or instead of assets. Likewise, stablecoins with certain properties, especially those related to asset issuance, may also be problematic from a compliance perspective, so a potential extension of this work towards a StablecoinSpace classification framework for pegged assets is an avenue being explored currently.
A future goal of TokenSpace is the development of an environment which may be updated in real-time from various information feeds from market, semantic / linguistic and / or network data in order to provide dynamic information as to evolving asset characteristics as well as historical trends at varying points in time. This may facilitate the goal of descriptive, explanatory and even predictive techniques for understanding, rationalising or foreseeing trends, issues and opportunities relating to assets and networks before they become readily apparent from naive analysis.
Want moar? Take a look at various TokenSpace materials to go deeper: Primer, Q&A, Manuscript (available on request), Lecture video (soon!).
Wassim Alsindi previously directed research at independent laboratory Parallel Industries, analysing cryptocurrency networks from data-driven and human perspectives. Parallel Industries activities and resources are archived at www.pllel.com and @parallelind on Twitter.
Wassim is now working on a new journal and conference series for cryptocurrency & blockchain technology research in a collaborative endeavour between the Digital Currency Initiative at the MIT Media Lab and the MIT Press. Our goal is to bring together the technical fields of cryptography, protocol engineering and distributed systems research with epistemic, phenomenological and ontological insights from the domains of economics, law, complex systems and philosophy to crystallise a mature scholarly ecosystem. Find him at @wassimalsindi on Twitter.
In open source software, project codebase forks are commonplace and occur when existing software development paths diverge, creating separate and distinct pieces of software. Torvalds’ original Linux kernel from 1991 has been forked into countless descendant projects. In the case of blockchain-based cryptocurrency networks implementing ledgers, there exists the prospect of both codebase and ledger forks. A cryptocurrency codebase fork creates an independent project to be launched with a new genesis block which may share consensus rules but with an entirely different transaction history than its progenitor — e.g. BTC and LTC. A ledger fork creates a separate incompatible network, sharing its history with the progenitor network until the divergent event, commonly referred to as a chain split — e.g. BTC and BCH.
3/ Fork terminology is tricky and sometimes muddled. As the concept came to cryptocurrency from OSS development many of the terms have been co-opted but there are further nuances here: particularly to distinguish between codebase / ledger forks, and hard / soft / velvet forks. pic.twitter.com/fh8RLjkAIk
— Parallel Industries ꙮ (@parallelind) 8 October 2018
Consensus rule changes or alteration of the network transaction history may be the cause of such a fracture, deliberate or unplanned. Often when networks upgrade software or consensus rules or implement new features, a portion of the network participants may be left behind on a vestigial timeline that lacks developer, community, wallet or exchange support. In the summer of 2018 a fifth of nodes running Bitcoin Cash (BCH) — a minority ledger fork of BTC with significantly relaxed block size limitations — were separated from the BCH network, and a non-trivial number of would-be nodes remained disconnected from the BCH network weeks later.
Governance: Decision making process between multiple parties.
Blockchain governance: Decision-making process by mutually distrusting entities in a multi-party distributed system.
On/off-chain governance (aka governance by / of the network): Decisions are made either explicitly through the network’s ledger and the UTXOs/balances possessed therein, or via some other (in)formal mechanism such as rough or social consensus.
Read these articles collated by CleanApp for a detailed discussion of key considerations in formally governed blockchain networks.
Immutability: An attribute primarily observed at the protocol layer in the decentralised networking stack — upon which the monetary layer depends for persistence — ensuring the inability of stakeholders or adversaries to alter the transaction record and thereby balances. With this in mind, the oft-quoted concept of ‘code is law’, which refers to immutability in cryptocurrency networks, typically refers more to preserving the intended use and function of a system and its ledger than to blind adherence to a software implementation regardless of flaws or vulnerabilities.
What Maketh a Fork?
The distinction between what constitutes a vestigial network and a viable breakaway faction is unclear and difficult to objectively parameterise. There is a significant element of adversarial strategy, political gamesmanship and public signalling of (real or synthetic) intent and support via social media platforms. The notions of critical mass and stakeholder buy-in are ostensibly at play, since ecosystem fragmentations would be characterised as strongly negative-sum through the invocation of Metcalfe’s Law as regards network effects and hence value proposition. Any blockchain secured thermodynamically by Proof-of-Work (PoW) is susceptible to attack vectors such as so-called 51 % or majority attacks, leading to re-orgs (chain re-organisations) as multiple candidates satisfying chain selection rules emerge. These can result in the potential for double-spending funds against entities such as exchanges which do not require sufficient confirmations for transaction finality to be reliable in an adversarial context. Should a network fragment into multiple disconnected populations, adversaries with control of much less significant computational resources would be within reach of majority hashrate, using either permanent or rented computation from sources such as Nicehash or Amazon EC2.
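To illustrate how fragmentation lowers the attack bar, a back-of-envelope calculation with entirely invented hashrate figures shows how much less computation a majority attack on a minority fragment requires than an attack on the unified network:

```python
# Back-of-envelope majority-attack arithmetic with invented numbers.
total_hashrate = 100.0   # arbitrary units for the unified network
minority_share = 0.05    # fragment retains 5% of hashrate after a split

unified_attack = total_hashrate * 0.51                    # > half of everything
fragment_attack = total_hashrate * minority_share * 0.51  # > half of the fragment

print(f"Attack unified network: > {unified_attack:.1f} units")
print(f"Attack 5% fragment:     > {fragment_attack:.2f} units")
# The fragment can be majority-attacked with roughly 2.6% of the original
# network's hashrate, plausibly within reach of rented computation.
```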
4/ What ingredients are needed to sustain a “minority” network fragment such as $ETC? For a PoW network, it’s going to need significant hashrate to avoid replay attacks, re-orgs & wipeout risks – especially if #codeislaw & immutability is respected. Devs, users & businesses too. pic.twitter.com/G63TJHT4VP
— Parallel Industries ꙮ (@parallelind) 8 October 2018
A striking example of this was the divergence of the Ethereum developer and leadership cadre (ETH) from the canonical account-oriented Ethereum blockchain (ETC) due to the exploitation of a flawed smart contract project resembling a quasi-securitised decentralised investment fund known as The DAO (Decentralised Autonomous Organisation). In this case the Ethereum insiders decided to sacrifice immutability and by extension censorship-resistance in order to conduct an effective bailout of DAO participants, which came to exercise Too-Big-To-Fail influence over the overall Ethereum network, insider asset holdings, token supply and mindshare. A social media consultation process in conjunction with on-chain voting was employed to arrive at this conclusion, though both methods are known to be flawed and gameable. During the irregular state transition process akin to a rollback, a co-ordinated effort between miners, exchanges and developers took place on private channels, exposing the degree of centralisation inherent in the power structures of constituent network participants.
The key event which transformed the canonical Ethereum blockchain (where the DAO attacker kept their spoils) from a vestigial, wiped-out chain to a viable if contentious minority fork was the decision by the Bitsquare and Poloniex exchanges to list the attacker’s timeline as Ethereum Classic (ETC), alongside high-profile mining participants such as Chandler Guo, well-resourced financial organisations such as Grayscale Investments (a subsidiary of Digital Currency Group) and former development team members such as Charles Hoskinson publicly declaring and deploying support, developers and significant hashrate to defend the original Ethereum network. ETC now exists as an independent and sovereign network with priorities, characteristics and goals diverging from those of the larger Ethereum network ETH.
Forks and network governance: the case of Bitcoin and SegWit (excerpt from Forkonomy paper)
For a range of reasons, there is often strident resistance to hard forks — irreversible protocol upgrades or relaxing of the existing consensus ruleset — in “ungoverned” trust-minimised cryptocurrency networks such as BTC. The lack of controlling entities may lead to a chain split and lasting network partition if the delicate balance of stakeholder incentives fails in the presence of a divergent event. The implementation of SegWit (Segregated Witness) by the BTC network was eventually achieved in 2017 as a backward-compatible soft fork following several years of intense political and strategic maneuvering by the constituent stakeholders in the BTC network. This off-chain governance process of emergent consensus requiring supermajority or unanimity measured by miner signalling has proven to be an inefficient and gameable mechanism for administering the BTC network. Certain stakeholder constituencies such as the developers maintaining the reference Bitcoin Core software client implementation of BTC could not easily reach agreement with mining oligopolists and so-called big block advocates over the optimum technological trajectory for the BTC network.
Major stakeholders of the mining constituency strongly opposed SegWit as it would render a previously clandestine proprietary efficiency advantage known as covert ASICBoost ineffective on the canonical BTC chain. A grassroots BTC community movement campaigning for a User Activated Soft Fork (UASF) for SegWit implementation and a face-saving Bitcoin Improvement Proposal (BIP91) facilitated the eventual lock-in of the SegWit upgrade in the summer of 2017. A contentious network partition took place in August 2017, giving rise to the Bitcoin Cash (BCH) network which rejected SegWit and opted instead for linear on-chain scaling. By changing the block size and loosening the consensus ruleset without overwhelming agreement from all constituencies of the BTC network, it is difficult to find a basis for BCH proponents’ claims to be the canonical Bitcoin blockchain without invoking appeals to emotion, authority or other logical fallacies. The continuing presence of Craig S. Wright and his claims to be a progenitor of Bitcoin are an example of these attempts at legitimacy.
What the #forkgov? Fork-resistance and governance
Given the significant downside potential of real and perceived threats to the resilience and legitimacy of a fragmenting network and the loss of associated network effects, the ability of a blockchain-based protocol network to demonstrate fork resistance provides significant strength to its value proposition, and two notable examples of networks attempting to utilise such a mechanism are Tezos and Decred. Decred is an example of a hybrid PoW/PoS monetary network which is implementing a proposal and governance mechanism termed Politeia. Since coin-holders have voting rights based on stake weight, they have the ability to keep the miner and developer constituencies honest through a mechanism for reaching decisions by majority stakeholder consensus on matters including hard forks. These lessons were ostensibly learned through the developer team’s experiences in writing a BTC client which they felt was not appraised objectively by the Bitcoin Core developer ecosystem. Decred’s fork resistance is effectively achieved by the fact that most stakeholders would be non-voting on a minority chain; it would remain stalled as blocks would not be created or propagated across the upstart network.
Taking a high-level perspective, let’s address the most general question: are these two notions meaningfully compatible? If we think of any natural process in the Universe — from the celestial to the tribal — scalability challenges increase markedly as accretions and communities grow in size and complexity. Minimising accidental chain splits during protocol upgrades is a worthy goal. However, denying a mechanism to allow factions a graceful and orderly exit has upsides in preserving the moat of network effect, but at the cost of internal dissonance, which may grow over time and lead to second-order shenanigans. Sound familiar?
BrexitCoin combines Proof-of-Flag governance and historical obfuscation so that mistakes are guaranteed to be repeated. Unelected leaders of BrexitCoin committee must wait for a scandal to occur before the next “democracyblock” may be proposed. https://t.co/cNpICRxHCI pic.twitter.com/dsgYr09UG5
— Byzantine Business School (@BusinessSch001) 4 June 2018
One can look at ledger forks in a few different ways: as good, bad or neutral:
[Good] A/B/…/Z testing of different technical, economic or philosophical approaches aka “Let the market decide, fork freedom baby!”
[Neutral] An inevitability of entropy and/or finite social scalability; as these networks grow and mature it is not realistic to keep all stakeholders sufficiently aligned for optimal network health.
As such, protocol layer fork resistance and effective public fora with voting mechanisms can certainly be helpful tools, but there is a question as to whether democracy (the tyranny of the majority) should be exercised in all cases. If there was a “block size” style civil war in Tezos or Decred with no acceptable compromise in sight, would the status quo still be the best situation in all cases?
Worth noting that I predicted the first Tezos hard fork over a year ago (@tzlibre).
There is *nothing* that supercedes fork-based governance for cryptocurrencies, and that is a good thing. https://t.co/E0jRomnvqi
My perspective is that fork-resistance will largely redistribute the manifestations of discontent rather than provide a lasting cure to ills, and the native network governance mechanisms may be gamed by either incumbents or ousters. More time is needed to see how decision-making regarding technical evolution unfolds in both networks. Decred seems to be sitting pretty with a fairly attack-resilient hybrid PoW/PoS system, but there are some “exclusionary forces” in the network leading to the escalating DCR-denominated costs of the staking tickets necessary to receive PoS rewards and participate in proposal voting, denying access to the mechanism to smaller holders.
Demand for tickets and staking rewards naturally increases with ongoing issuance, as does the widening pool of coin holders wanting to mitigate dilution. As the ticket price is dynamic and demand-responsive, this creates upwards pressure which would make tickets inaccessible for a growing proportion of coin holders. At time of writing, “ticket splitting” allowing smaller holders to engage in PoS is available from some stake pools and self-organised collectives, but the process is not yet automated in reference clients. On the other hand, the ongoing bear market has seen the USD ticket price fall from ~$8–10k at the January and May 2018 peaks to ~$2k today in late January 2019, so those entering Decred with capital from outside the cryptocurrency domain would likely be undeterred. Data from dcrdata.org and coincap.io.
Further, as per Parallel Industries’ TokenSpace taxonomy research, staking rewards resemble dividends and token-based governance privileges resemble shareholder rights, which makes Decred appear a little closer to the traditional definition of a capital asset than pure PoW systems. This may or may not be an issue depending on how regulation unfolds. Tezos has those potential issues plus the regulatory risk from the token sale. Decred’s airdrop may not have distributed the coin as fairly as possible but will undoubtedly attract a lower compliance burden than a token sale or premine.
“Activist Forks” & “Unfounder Forks”
Can you appreciate the technology utility but dislike the economic, human and compliance issues packaged together with a project?
Taking this a step further, these dissonant groups may conduct a guerrilla campaign inside a network to focus attention on their cause. Last summer, a few anti-KYC factions of Tezos had appeared on social media outlets prior to network launch; since the mainnet launch things have quietened down somewhat. One faction which still apparently intends to create a fork of Tezos changed tack and became a delegated staker within the network whilst continuing to voice dissent — perhaps this “fork activism” can be interpreted as a response to the “fork-resistance” of Tezos.
So, what else could a fork activist do? Take a look around at the ongoing ICO bonfire of the vanities, which is largely due to poorly thought out sales of high-friction futility tokens infringing upon / attempting to circumvent various regulations around the world. The prospect of removing the token issuers and the tokens themselves once treasuries are liquidated (by themselves, or by lawmakers) and development ceases is quite attractive indeed — will we see a wave of “unfounder forks” as in this example? Perhaps operating in reverse to Simon de la Rouviere’s “Tokenised Forking”, where both tokens and founders are excised.
Been thinking about this, started already and will only happen more from here – “activist” or “unfounder” forks if you will.
Let the ICO vendors bear legal risk & develop, then remove the token friction & promoters upon launch/runway expiry. #forkonomy https://t.co/7LpZvpbQhc
Wassim will be speaking about TokenSpace at the upcoming UNCHAIN Convention in Berlin June 14-15th. More information and tickets are available here: https://www.unchain-convention.com/
Wassim’s “What The #ForkGov” article exploring fork-based governance of decentralised networks has been included in DAOStack‘s “Decentralised Thriving” Anthology. Download it from here.