Towards an “Ontological Meta-Stack”: who wants a Subjectivity Sandwich?

Taking the layered systems approach to its supra-logical endpoint. Chains of blocks ALL the way down.

For a while now I’ve been invoking a layered stack model (after OSI and Buterin) to attempt finer-grained characterisations, explanations and rationalisations of various epistemic and phenomenological happenings in the domain of cryptocurrencies and blockchain-architected P2P networks. It’s a helpful lens with which to attempt definitions of many of the most loosely employed terms in regular use — decentralisation, permissionlessness, censorship-resistance, immutability. Take a look at Reaching Everyone Pt II for the low-down.

By Kevin Durkin for In The Mesh
Excerpt from TokenSpace cryptographic asset taxonomy research manuscript.

The reason for the above framing is that the acid test of a conceptual framework’s robustness when harnessed to build classification systems such as taxonomies and typologies is its ability to accept new objects or be generalisable into a meta-methodology. To this end, one can also readily frame the ontological quest (simply put, the pursuit of wisdom and meaning) as a layered stack of intellectual disciplines. I’m sure this notion of an ontological meta-stack isn’t a totally original approach, but having played with it for a while it’s become a useful conceptual lens with which to (try to) understand how trans/multi/inter/anti/pan/omni/supra/quasi/para/post disciplinary research (choose your favoured poison) differs from steady-state scholarly work. Furthermore, layered stacks may be one of the most easily invoked mechanisms for achieving differential discretisation whilst grappling with linear domains. Apropos of nothing: a few years ago I was completely preoccupied with the wacky idea of starting a serious-and-satirical research disorganisation-superimposition that I was very tongue-in-cheekily calling The Institute of Non-Linear Phenomenology. Stacks of layers need not apply within.

Overly individuated, siloed or specialised knowledge domains — typically mature fields — tend to be hyper-focused on very small regions of the spectrum, rather like the spectroscopic disciplines in which I spent a decade playing supramolecular detective. A photophysicist / spectroscopist could spend an entire lifetime “playing with” a narrowly bounded set of quantised energy levels in an atom, molecule, crystal or superstructure.

The electromagnetic spectrum, much more than meets the eye — or at least that the eye can detect & process. Source: https://upload.wikimedia.org/wikipedia/commons/3/30/EM_spectrumrevised.png

Likewise, a researcher could spend a fruitful and fulfilling career looking for the same signatures in wildly different systems. I studied the same ν(CO) vibrational signature in both exotic low-temperature astrophysical ices and in complex solution-based supramolecular assemblies. The same fingerprint can be exploited to provide rich information with respect to its environment on both sub-nanosecond and celestial timescales!

Vibrational spectroscopy applied to astrophysically relevant ices. Conducted at Leiden Observatory, 2003.
Time-resolved IR spectroscopy of Re(I) complexes & their transient excited states on the picosecond timescale. Conducted at the University of Nottingham & Rutherford Appleton Laboratory, 2003–6.

It would be remiss to attempt an approach to this weighty subject without discussing taxonomy, epistemology and more generally the design science of information systems. Taxonomy is one of the more powerful intellectual tools at our disposal with which to progress towards ontologies — although a classification system may not have truth value in itself, it may be an intermediate development and therefore useful in its own right.

Excerpt from TokenSpace cryptographic asset taxonomy research manuscript.

Let’s not get too deep into the weeds here, instead take a look at various TokenSpace materials to go deeper: Primer, Q&A, Compressed Essay as Analytical Index, Manuscript (available on request), Lecture video (soon!).

Excerpt from TokenSpace cryptographic asset taxonomy research manuscript.

Tegmark’s notion of subjectivity at the Universe’s margins (Planck limits, complex adaptive systems) with empirical objectivist domains betwixt seems apropos here. Let’s call it a subjectivity sandwich. Feynman famously opined “There’s Plenty of Room at the Bottom”, but there ostensibly exists even more room at the top!

Okay so here it is, the first iteration anyway. Let me know what you think by commenting or pinging me on Twitter. In reality, this framework may not be truly linear, granular and hierarchical but there is hopefully some value to it. Perhaps the next iteration of this half-baked idea will be an open form: woven meshes, interlinked gears — an ontological meta-DAG!?!?

As we move from bottom to top, the complexity of the agents in focus increases alongside subjectivity. But at sub-quantum scales the Universe appears subjective too — at least based on observations through our current paradigmatic lenses. Interesting phenomena tend to emerge at the margins, between disciplines or even as a synthesis of several elements of the meta-stack. Perhaps it’s time to repurpose the wonderful bibliotechnical term marginalia to capture this essence.

Cryptocurrencies are a great example of systems drawing on a number of these components. Indeed, at the network level these protocols are very simple messaging systems but can exhibit extremely complex and unexpected emergent phenomena — Forkonomy is a good example of an attempt to apply unusual lenses to characterise unusual goings-on. This may also help to explain why researchers possessing deep subject-specific knowledge pertaining to one of the areas which cryptocurrency draws upon — cryptography, distributed systems, protocol engineering, law, economics, complex systems — often find it difficult to communicate with fellow scholars rooted in another traditional pastime. Perhaps the rise of novel meta-disciplines such as complexity science shows that one approach — in our parlance — to harnessing and further understanding non-linear domains is to capture as much of the stack as possible.

An Ontological Meta-DAG? “Civilisation Tech Tree” (Sid Meier, Firaxis Games)

TLDR: Are we entering a new “Age of Techno-Dilettantism”?

TokenSpace in a Nutshell

Compressed Essay as Analytical Index, from 15k+ to <4k words. Providing a sense of pretext, context, concept & rationale to the work.

[Ideally paired with the TokenSpace Primer]


TokenSpace: A Conceptual Framework for Cryptographic Asset Taxonomies. Analytical Index As Compressed Manuscript

1 Introduction & Historical Review

Characterising the properties of physical and conceptual artifacts has never been trivial, and as the fruits of human labour proliferate in both number and complexity this task has only become more challenging. The quest to bring order, process, context and structure to knowledge has been a critical but seriously underappreciated component of the success story of intellectual development in domains such as philosophy, mathematics, biology, chemistry and astronomy, from which the higher echelons of the ontological meta-stack which embody the richness of modern human experience have emerged. Despite the undeniable observation that we live in a society driven by finance and economics, these fields still exist in a relatively underdeveloped state with respect to wisdom traditions, and the very recent accelerationist emergence of cryptographic assets has completely outrun any scholarly approach to understanding this sui generis population of intangible, digitally scarce objects.

1.1 Necessity for the Work, Regulatory Opacity & Uncertainty

Over the time of writing and researching this work (2017–2019) the sentiment and perceived landscape inhabited by the field of cryptoassets have shifted significantly. This work was conceived in the midst of the 2017 bull market and Initial Coin Offering tokenised fundraising frenzy as a potential remedy to mitigate the naive conflation of cryptoassets as self-similar, with potential legal, compliance and regulatory ramifications. Over the intervening time, various national and international regulators and nation-state representatives have issued comments, pronouncements and legislation taking a wide range of approaches to bringing jurisdictional oversight to financial assets and products built upon borderless peer-to-peer protocols. This has set up a game of regulatory arbitrage in extremis, as participants in the wild distributed boomtown surrounding cryptocurrency networks hop between countries faster than legal systems can react and adjust.

1.2 Hybrid Character of Cryptocurrency Networks & Assets, the Complex Provenance & Nature of Decentralisation

The protocols, networks and assets springing forth from cryptocurrency innovation are complex, intricate and exhibit a high degree of emergent behaviour and phenomena which prove very hard to predict or explain ex ante. There are also significant semantic challenges, with terms such as “blockchain”, “decentralisation”, “censorship-resistance”, “permissionlessness” and “immutability” being used without precise definition. For these reasons there are significant challenges with legalistic approaches put forward by persons and entities either with vested interests or lacking a deep understanding of the nuanced factors at play within these complex systems and associated assets.

1.3 Legal, Economic & Regulatory Characteristics of Cryptographic & Legacy Assets

Before a rigorous regulatory approach can be taken with respect to cryptoassets, the nature of the objects themselves must be deconvoluted. This work takes the approach that — for regulatory purposes at least — these objects may be characterised with sufficient meaningfulness as hybrids between securities, commodities and moneys.

1.3.1 Nature of Money Throughout the Ages

The use of objects as money has accompanied human progress for at least the past several millennia. Aristotle wrote in the 4th century BC about the desirable properties of a money: fungibility, divisibility, portability and intrinsic value. Jevons framed this in terms of usage as stores of value, media of exchange and units of account. Chung took Baudrillard’s notion of the simulacrum to theorise that as technological progress continues, the willingness and / or expectation of human societies to use more technologically advanced objects as money moves in tandem. Economics provided the concept of stock-to-flow; the varying rarity of monetary objects was tested by advances in transportation, and geo-arbitrage exploited asymmetries in small-scale monetary systems. As technology widens the horizons of human society, even globally rare monetary objects such as precious metals become susceptible to supply inflation from seawater or asteroid recovery. Cryptographic assets provide the first instantiation of algorithmically enforced and provable, universal rarity. Unforgeable costliness of production may be the necessary counterpoint to nation-state-controlled seigniorage, debasement and politically motivated manipulation of monetary supply to serve the purposes of legacy institutions rather than the populace.
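(For readers unfamiliar with the term: the stock-to-flow ratio of a monetary good is simply its existing above-ground stock divided by its annual new supply, i.e. stock-to-flow = stock / flow. The higher the ratio, the less existing holders are diluted by new production.)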

1.3.2 What Makes a Good Become Commoditised?

The commodification of goods arguably dates back as far as the development of agrarian societies. Marxist economics defines a commodity good as the fruit of productivity, by human or machine. More generally, any sufficiently widely useful and in-demand object which becomes standardised can be regarded as a commodity.

1.3.3 Regulating Securitised Asset Issuance in a Post-Howey Paradigm

A security is simply an agreement to provide benefits from an underlying asset in return for investment. US legal precedent such as the WJ Howey, Silver Hills and Reves vs. Ernst & Young cases showed how orange groves, financial “notes” and golf courses could be securitised, not just obviously financialised assets such as company equity and debt. The critical features of a securitised asset: investment agreement, capital risked, expectation of profit, entities relied upon to deliver returns, fractional ownership of underlying, governance rights, returns such as cashflows in addition to capital appreciation.

Considering cryptoassets, many projects exhibit some or all of these characteristics but very few prior to 2018/9 were openly registered as securities. Some senior regulatory officials made comments which further muddied the waters by discussing certain assets as having been initially distributed in securities offerings but having become “sufficiently decentralised” to no longer bear those hallmarks, without providing definitions or heuristics. This leads to the notion that many projects are strongly incentivised to engage in decentralisation theatre: to carefully configure the outward appearance of their networks and/or tokens as decentralised in order to circumvent the compliance burden of securities regulation. Clearly a more sophisticated conceptual framework is required to rationalise the similarities and differences between the diverse constellation of cryptographic networks and assets existing now and in the future.

1.3.4 Legacy Assets Exhibiting Hybrid Characteristics

Marx wrote at great length in Das Kapital about the interrelationship between commodities and moneys and how a good might move from one to the other. Some short-dated government debt could be considered simultaneously securitised, commoditised and a monetary equivalent. Precious metals have at times been considered commodities and commodity-moneys.


2 Classification Approaches & Design Science of Information Systems

Designing conceptual systems to systematise knowledge is an intellectually demanding pastime that requires a deep knowledge of the subject material to produce meaningful and useful outputs. There is still merit in attempts which do not meet their goals, as they may be steps towards those goals which require advances in theory, bodies of knowledge, empirical information or maturation of a scholarly ecosystem.

2.1 Definitions of Terms Within Paradigms of Classification

Classification: Spatial, temporal or spatio-temporal segmentation of the world; ordering on the basis of similarity

Classification System: A construction for the abstract groupings into which objects can be put

Framework: A set of assumptions, concepts, values and practices that constitutes a way of understanding the research within a body of knowledge

Typology: A series of conceptually-derived groupings, can be multivariate and predominantly qualitative in nature

Taxonomy: Empirically or conceptually derived classifications for the elucidation of relationships between artifacts

Taxonomic System: A method or process from which a taxonomy may be derived

Cladistic Taxonomy: Historical, deductive or evolutionary relationships charting the genealogical inter-relationships of sets of objects

Phenetic Taxonomy: Empirically derived groupings of attribute similarity, arrived at using statistical methods

2.2 Philosophy of Design Science & Classification Approaches

Bailey — ideal type, constructed type, substruction, reduction

For the most part typologies conceptually derive an ideal type (category) which exemplifies the apex (or maximum) of a proposed characteristic whereas taxonomies develop a constructed type with reference to empirically observed cases which may not necessarily be idealised but can be employed as canonical (or most typical) examples. Such a constructed type may subsequently be used to examine exceptions to the type.

“A researcher may conceive of a single type and then add dimensions until a satisfactory typology is reached, in a process known as substruction. Alternatively the researcher could conceptualise an extensive typology and then eliminate certain dimensions in a process of reduction.”

Popper’s Three Worlds vs. Kuhn’s discontinuous paradigms

In contrast to Kuhn’s paradigmatic assessment of the evolution of concepts, Popper’s Three Worlds provides some philosophical bedrock from which to develop generalised and systematic ontological and / or epistemological approaches. The first world corresponds to material and corporeal nature, the second to consciousness and cognitive states and the third to emergent products and phenomena arising from human social action.

Niiniluoto applied this simple classification to the development of classifications themselves and commented:

“Most design science research in engineering adopts a realistic / materialistic ontology whereas action research accepts more nominalistic, idealistic and constructivistic ontology.”

Materialism attaches primacy to Popper’s first world, idealism to the second and anti-positivistic action research to the third. Design science and action research do not necessarily have to share ontological and epistemological bases. Three potential roles for application within information systems were identified: means-end oriented, interpretive and critical approaches. In terms of design science ethics Niiniluoto comments:

“Design science research itself implies an ethical change from describing and explaining the state of the existing world to shaping and changing it.”

Iivari considered the philosophy of design science research itself:

“Regarding epistemology of design science, artifacts of taxonomies without underlying axioms or theories do not have an intrinsic truth value. It could however be argued that design science is concerned with pragmatism as a philosophical orientation attempting to bridge science and practical action.”

Methodologically, the rigour of design science is derived from the effective use of prior research (i.e. existing knowledge). Major sources of ideas originate from practical problems, existing artifacts, analogy, metaphor and theory.

2.3 Selected Examples of Taxonomy & Typology Approaches

Classification approaches can be traced back as far as Aristotle’s “Categories” and were first applied thoroughly in the natural sciences by botanists and zoologists such as Linnaeus and Haeckel, who employed a hierarchical and categorical binomial paradigm.

Periodic table of the elements: from occultist elementalism and alchemy to empirical verification of atomic structure and electronic bonding. Periodic tables are snapshots in time of the taxonomic progression; new elements and isotopes continue to be discovered and synthesised. Coal & tar trees.

Nickerson’s generalised and systematised information systems artifact classification approach (meta-taxonomy!) provided the methodological foundations for TokenSpace taxonomies, upon which the multi-dimensional framework was constructed.

2.4 Recent Approaches to Classifying Monetary & Cryptographic Assets

Burniske / Tatar: naive categorisation, two dimensions deep. Non-exhaustive, not explanatory.

BNC / CryptoCompare: data libraries, descriptive, more a typology

Kirilenko: nothing here — arbitrary data, scatterplot, risk/reward

BIS / Stone money flower: non-exhaustive, overlapping assignations.

Cambridge regulatory typologies: simple, unassuming, good! The classification framework next door. Potential for development.

Fridgen: a true Nickerson taxonomy. Cluster analysis to determine anisotropy in parameter space. Great but rather specific and limited in scope.


3 Designing TokenSpace: A Conceptual Framework for Cryptographic Asset Taxonomies

For a classification system to be useful and have explanatory value, it should address the limitations in understanding the differences between artifact properties within the populations of objects as they exist at present. TokenSpace is intended to be a conceptual framework addressing issues arising from limited appreciation of the hybrid nature and time-dependence of the qualities of cryptographic assets.

3.1 Introduction & Problem Statement of Taxonomy Development

Taxonomies should be explanatory, exhaustive, robust to new objects and irreducible without losing meaningfulness.

3.2 Construction of the TokenSpace Framework: Components & Methodology

Conventional taxonomies are categorical and can be either flat or hierarchical. The classification approach should be built to discriminate for a meta-characteristic, with a series of dimensions asking questions of each object; for each dimension, two or more categorical characteristics provide the options, which should encompass the property in question.
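As a minimal sketch of what such a structure might look like in code (all dimension names, characteristics and scores below are hypothetical illustrations, not the actual TS10 taxonomy):

```python
# Minimal sketch of a categorical taxonomy: one meta-characteristic,
# several dimensions, each dimension offering categorical
# characteristics mapped to scores. Names and numbers are invented.
securityness_taxonomy = {
    "expectation_of_profit": {
        "none": 0.0,
        "implied_by_issuer": 0.5,
        "explicitly_marketed": 1.0,
    },
    "reliance_on_efforts_of_others": {
        "autonomous_network": 0.0,
        "foundation_influenced": 0.5,
        "single_controlling_entity": 1.0,
    },
}

# Classifying an object means answering each dimension's question
# with one of its permitted characteristics.
hypothetical_asset = {
    "expectation_of_profit": "implied_by_issuer",
    "reliance_on_efforts_of_others": "foundation_influenced",
}

dimension_scores = {
    dim: options[hypothetical_asset[dim]]
    for dim, options in securityness_taxonomy.items()
}
print(dimension_scores)
```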

3.2.1 Building Robust Taxonomies based on Information Systems Best Practices

Elicited, weighted score taxonomy, intuitive, deductive etc.

3.2.2 Three Conceptual-to-Empirical Approaches to Short-Listing Taxonomy Dimensions & Characteristics

For the meta-characteristics of Securityness, Moneyness and Commodityness a series of attributes were produced from which to iterate through the construction of taxonomies.

3.3 Design Choices & Justifications for TokenSpace

3.3.1 TokenSpace as a Conceptual Representation of Spatio-Temporal Reality

Making an n-dimensional space with meta-characteristics as axes and asset locations based upon scored taxonomies.
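A sketch of the idea, under the same caveat that all scores here are invented for illustration: once each meta-characteristic taxonomy yields a score, an asset becomes a point in a space with Securityness, Moneyness and Commodityness as axes.

```python
from dataclasses import dataclass

# Sketch: an asset's TokenSpace location is its tuple of
# meta-characteristic scores (values invented for illustration).
@dataclass
class TokenSpaceLocation:
    securityness: float   # S axis, bounded 0..1
    moneyness: float      # M axis, bounded 0..1
    commodityness: float  # C axis, bounded 0..1

btc = TokenSpaceLocation(securityness=0.1, moneyness=0.8, commodityness=0.7)
print(btc)
```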

3.3.2 Defining Boundaries

Bounds of TokenSpace between zero and one; possible to extend beyond to consider notions of good and bad characteristics, but this is non-trivial.

3.3.3 Dimensionality & Clarity

Three spatial dimensions enable the construction of a simple visual output which a non-technical user such as a lawmaker, politician or regulator may readily use to make informed subjective comparisons of asset characteristics. Orthogonality is for visual clarity, not necessarily real — the properties and hence taxonomies of Moneyness and Commodityness are clearly very similar.

3.3.4 Categorical & Numerical Characteristics

Categories have limitations — boundary conditions, edge cases, Goodhart’s Law susceptibility. Ranged indexed dimensions attempt to address this by trading off complexity for subjectivity; however, the effectiveness of this approach is limited by the domain expertise of the researcher.

3.3.5 Score Modifiers & Weightings for Taxonomy Characteristics & Dimensions

The relative significance of each dimension to the overall score output by a TokenSpace taxonomy can be adjusted through weighted scoring, and indeed the taxonomies of Commodityness and Moneyness vary mostly in their weightings rather than the choice of dimensions and characteristics.
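One plausible reading of this step, sketched in code (weights, dimensions and scores are invented for illustration; the manuscript contains the real ones):

```python
# Sketch of weighted aggregation: per-dimension scores are combined
# into a single meta-characteristic score via a normalised weighted
# mean, keeping the result within TokenSpace's 0..1 bounds.
def meta_score(dim_scores: dict, weights: dict) -> float:
    total_weight = sum(weights[d] for d in dim_scores)
    return sum(dim_scores[d] * weights[d] for d in dim_scores) / total_weight

dim_scores = {"scarcity": 0.9, "standardisation": 0.6}

# Moneyness and Commodityness could share dimensions and differ
# mostly in their weightings, as described above.
moneyness_weights     = {"scarcity": 2.0, "standardisation": 1.0}
commodityness_weights = {"scarcity": 1.0, "standardisation": 2.0}

print(meta_score(dim_scores, moneyness_weights))      # 0.8
print(meta_score(dim_scores, commodityness_weights))  # 0.7
```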

3.3.6 Time-Dependence

Time-dependence can be assessed by charting the evolution of asset scores and locations in TokenSpace over time.

3.3.7 Boundary Functions

Boundary surfaces may be used to delineate regions in TokenSpace for regulatory and / or compliance purposes. Take the example of Ethereum and Hinman’s summer 2018 comments: ETH has moved through a boundary surface from “was a security offering” to “no longer is”. Those comments suffered from a lack of precision in language, definitions, characterisation metrics and visual conception; TokenSpace can address these shortcomings to some extent.
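As a sketch, assuming the simplest possible boundary surface — a threshold plane perpendicular to the Securityness axis (the threshold and the ETH-like scores below are invented for illustration):

```python
# Sketch: the simplest boundary surface is the plane S = threshold;
# richer surfaces could be any function f(S, M, C) = 0.
# Threshold and scores below are invented for illustration.
SECURITY_THRESHOLD = 0.5

def is_security_like(securityness: float) -> bool:
    return securityness >= SECURITY_THRESHOLD

# An ETH-like trajectory crossing the boundary over time:
eth_securityness_by_year = {2015: 0.8, 2017: 0.6, 2019: 0.4}
for year, s in eth_securityness_by_year.items():
    print(year, "security-like:", is_security_like(s))
```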

3.3.8 Non-point & Anisotropic Asset Locations

The model can be extended beyond simple point asset locations through the use of error bars or probability density functions, as utilised in space-filling molecular orbital representations of chemical bonding. Anisotropic score densities could arise from functions representing some of the dimensions yielding multiple, incompatible, quantised or inconsistent results, or from elicitation producing a lack of consensus among panel members.
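A sketch of one way the elicitation case could be represented, treating a score as a distribution over panel judgements rather than a single point (all numbers invented):

```python
import statistics

# Sketch: represent one meta-characteristic score as a distribution
# over elicited panel judgements instead of a single point.
# Panel scores below are invented for illustration.
panel_securityness = [0.35, 0.50, 0.42, 0.60, 0.38]

mean = statistics.mean(panel_securityness)
spread = statistics.stdev(panel_securityness)

# The asset then occupies a fuzzy region along the S axis,
# e.g. mean +/- one standard deviation as a crude error bar.
print(f"Securityness ~ {mean:.2f} +/- {spread:.2f}")
```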


4 Creating a TokenSpace: TS10

It is difficult to exemplify the TokenSpaces without reproducing wholesale the taxonomies, scoring outcomes and visual depictions themselves, so discussion will be largely limited to the process and outcomes.

4.1 Iterative Construction of Taxonomies, Indices & Score Modifiers

TS10 Taxonomy Development Iterations:

1) Progression from intuitively reasoned shortlists to categorical & indexed dimensions

2) Assigned unstandardised characteristic score modifiers (weightings incorporated), reduced number of dimensions, some categorical dimensions consolidated into index form

3) Standardised characteristic score modifiers to separately apply weightings, further reduction of dimensions, collapsing some categorical dimensions further into indices for ease of application — at possible expense of increased subjectivity

THIS IS THE BIG TOKENSPACE TRADEOFF — MITIGATING “GOODHART MANIPULABILITY” AT THE EXPENSE OF SUBJECTIVE REASONING. TRADEOFF CAN BE TUNED BY BALANCING CATEGORICAL AND INDEXED DIMENSIONS.

4.2 Placing Assets in TS10

Taxonomies and weighting modifiers used to construct a TokenSpace with top 10 marketcap assets.

4.3 Cluster Analysis & Correlations

Clustering, statistical distributions and correlations were studied, with clusters mapping to sub-populations or “species” of assets such as PoW moneys, federated systems and ICO “smart contract” platforms, and with ETH, USDT and BTC each occupying somewhat unique regions of the space.
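A sketch of what such a clustering pass might look like, here using k-means over invented (Securityness, Moneyness, Commodityness) coordinates (the real TS10 scores are in the manuscript):

```python
import numpy as np
from sklearn.cluster import KMeans

# Invented (S, M, C) coordinates for a handful of assets;
# the real TS10 scores live in the manuscript.
assets = {
    "BTC":  (0.10, 0.80, 0.70),
    "LTC":  (0.15, 0.65, 0.60),
    "ETH":  (0.45, 0.50, 0.55),
    "EOS":  (0.70, 0.30, 0.35),
    "TRX":  (0.75, 0.25, 0.30),
    "USDT": (0.40, 0.70, 0.20),
}

X = np.array(list(assets.values()))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

for name, label in zip(assets, labels):
    print(name, "-> cluster", label)
```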

4.4 TSL7: Using TokenSpace to Compare Cryptographic & Legacy Assets

This case study exemplifies the value of maximising the range of meta-characteristic scores within the population of objects: no cryptographic assets are particularly good moneys yet, especially in comparison to legacy assets such as USD and CHF. Benchmarking and normalisation are very valuable tools and have been used heavily in the construction of these TokenSpaces.

4.5 TSTDX: Time-Dependence of Selected Assets

A number of interesting observations can be made from this case study: monetary metals are decreasing in Moneyness with time as Bitcoin’s increases, ostensibly because the digitalisation of human society corresponds to favouring similarly digital (“simulacrised”) money such as Bitcoin over specie. In this respect, silver is some way ahead of gold, being largely a commodity rather than a commodity-money in the present day. The loss of gold and silver backing of moneys such as the British Pound (GBP) and the US Dollar (USD), leading to a loss of Commodityness and Moneyness and an increase in Securityness, may also be rationalised as derealisation — a loss of mimetic gravitas in addition to simulacrum-related societal sentiment.

The time-dependence of cryptographic assets generally shows a trend of decreasing Securityness as the networks mature and assets become more adopted, distributed, widely held, useful and used. In concert, Moneyness and Commodityness also tend to increase as more reasons to use, hold and transact with the assets emerge. Ethereum (ETH) is particularly remarkable: in tandem with Hinman’s summer 2018 sentiments, what started as a securities offering of a centralised asset reliant on the efforts of a team of others for speculative gain has become (to some extent) more widely used, useful, held and distributed, hence leading to a decrease in Securityness and increases in Moneyness and Commodityness. It could perhaps be said that Ethereum in particular is well on the path to desecuritisation, or indeed may have arrived at that destination depending on where boundaries are perceived to lie in TokenSpace. The US Dollar (USD) still possesses a strong Moneyness, being the de facto world reserve currency, though its Moneyness and Commodityness have been declining since the abandonment of gold backing and the rise of the petrodollar system.

5 Discussion & Considerations Informing Further Development of TokenSpace

Decentralisation theatre upsides / downsides
Often token issuers engineer their assets to have a perceived value proposition by apportioning future cashflows to the holders, via mechanisms such as token burns, staking or masternodes which are coarsely analogous to share buybacks but with critical and significant differences to the downside. Likewise masternodes, staking rewards or issuer-sanctioned airdrops map coarsely to dividends in legacy finance. However, if a token is deemed to be too security-like then exchanges may be reluctant to list for fear of future liability or compliance issues.

Limitations of TokenSpace
It is important to explicitly discuss the limitations of TokenSpace. For the purposes of a subjective classification system such as TokenSpace, as many attributes of cryptographic networks and assets are continuous, exhibit subtle variations and / or edge cases, a mixture of categorical and numerical discrimination is most likely the optimal approach. Therefore, the instantiations of TokenSpace which will demonstrate the most explanatory power will be hybrids of traditional and phenetic taxonomy types. This design choice is justified by the desired outcome of numerical scores as the output of the classification execution in order to populate asset locations in the Euclidean 3D space that TokenSpace creates. Conversely in the interests of pragmatism, a great deal of insight may still be derived from a primarily categorical classification approach with some range-bound indices and if this meets the needs of the user then it is an acceptable and valid design choice. Further it minimises over-reliance on measurable attributes which may be subject to manipulation for motivations related to decentralisation theatre.

Looking in the mirror
As with all information systems, the principle of GIGO (Garbage In, Garbage Out) applies. A number of potential pitfalls follow, and the author does not exclude himself from susceptibility to any or all of them. The use of misinformed judgement, lack of methodological rigour in taxonomy construction, over-estimation of the researcher’s knowledge of the field or competence in applying taxonomic methodology, latent biases, poor quality / misleading data sources or judgements, and a lack of appreciation of edge cases or category overlap may severely limit the usefulness of the TokenSpace produced and therefore its explanatory power. It must be reiterated yet again that TokenSpace affords a subjective conceptual framework for the comparative analysis of assets. The meta-characteristic definitions and choices, dimensions, categories and characteristics employed, score modifiers and / or weightings are all subjective and depend on the choices of the researcher, which derive from intended purpose. It is entirely realistic that an asset issuer may tailor their taxonomies, score modifiers, regulatory boundary functions or a combination of the above to present a favourable assessment with respect to their biases or motivations.

Changing goalposts
Additionally, the changing nature of the regulatory and compliance landscape may have a large bearing on what can be considered acceptable asset characteristics in compliance terms, and may necessitate a re-evaluation of weightings and / or score modifiers. Some distinction between “good” and “bad” securities, moneys or commodities in an area of particular interest is a worthwhile avenue to explore, as it potentially extends the explanatory power of the framework. A meaningfulness-preserving approach to occupying a region between -1 and +1 could provide a coarse mechanism to do this, though the way that dimension scores and weightings are determined would have to be adjusted: naive methods such as taking moduli do not sufficiently discriminate as to the quality of an asset, nor would they easily be integrated into the weighted taxonomy approach, though individual dimensions could be assigned negative weightings rather than inverted dimension scores.

Future directions
Future planned developments include the construction of TokenSpaces with higher dimensionality and alternative taxonomies for different meta-characteristics intended for purposes other than increasing regulatory clarification. Scoring mechanisms including categorical and indexed dimensions, score modifiers and weightings may also be further refined and extended. Other approaches to generating asset coordinates for TokenSpaces will also be explored, with plans in place to form “digital round tables” with broad subsets of stakeholders to arrive at asset scores or ranges. Work is underway with collaborators to extend TokenSpace into DAOSpace in order to characterise similarities and differences of “Decentralised Autonomous Organisations” as opposed to assets. One interesting nexus of DAOSpace and TokenSpace is attempting to disentangle the design choices and properties of decentralised organisations (and their native assets) with respect to Securityness in particular. The SEC has already made it clear that TheDAO tokens (DAO) would be classified as securities, and therefore profit-oriented tokenised DAOs must be designed carefully with this in mind should they intend to be compliant with existing regulations. Interestingly, Malta has passed laws giving DAOs legal personality, meaning that another cycle of jurisdictional arbitrage may be underway, this time with organisations as well as or instead of assets. Likewise, stablecoins with certain properties, especially those related to asset issuance, may also be problematic from a compliance perspective, so a potential extension of this work towards a StablecoinSpace classification framework for pegged assets is an avenue currently being explored.

A future goal of TokenSpace is the development of an environment which may be updated in real-time from various information feeds from market, semantic / linguistic and / or network data in order to provide dynamic information as to evolving asset characteristics as well as historical trends at varying points in time. This may facilitate the goal of descriptive, explanatory and even predictive techniques for understanding, rationalising or foreseeing trends, issues and opportunities relating to assets and networks before they become readily apparent from naïve analysis.

Want moar? Take a look at various TokenSpace materials to go deeper: Primer, Q&A, Manuscript (available on request), Lecture video (soon!).

Wassim Alsindi previously directed research at independent laboratory Parallel Industries, analysing cryptocurrency networks from data-driven and human perspectives. Parallel Industries activities and resources are archived at www.pllel.com and @parallelind on Twitter.

Wassim is now working on a new journal and conference series for cryptocurrency & blockchain technology research in a collaborative endeavour between the Digital Currency Initiative at the MIT Media Lab and the MIT Press. Our goal is to bring together the technical fields of cryptography, protocol engineering and distributed systems research with epistemic, phenomenological and ontological insights from the domains of economics, law, complex systems and philosophy to crystallise a mature scholarly ecosystem. Find him at @wassimalsindi on Twitter.