Recommended for digestibility: web version on PubPub (190803-TokenSpaceManuscript).
For further information or to provide feedback, contact info [at] pllel [dot] com.
TokenSpace presentation at the Token Engineering Global Gathering in Berlin, August 2019.
“Welcome to my TEGG Talk”
“How can we bring insights from complex systems and philosophy, protocol engineering, cryptography, law, and policy together to build the mature ecosystem.” Meet @WassimAlsindi at the TEGG 2019! Get your ticket now! https://t.co/TCZyZbwAGH — TEGG 2019 (@tegg_io) July 10, 2019
Taking the layered systems approach to its supra-logical endpoint. Chains of blocks ALL the way down.
Welcome to Canberra’s latest premium residential high rise address. DM for PayPal details. #realestate #canberra #soundart #noise #3dprinting #mechanicalart #truesciencefact — Brian McNamara (@Rarebeasts) May 6, 2019
I’ve been invoking a layered stack model (after OSI and Buterin) for a while now to attempt finer-grained characterisations, explanations and rationalisations of various epistemic and phenomenological happenings in the domain of cryptocurrencies and blockchain-architected P2P networks. It’s a helpful lens with which to attempt definitions of many of the most loosely employed terms in regular use — decentralisation, permissionlessness, censorship-resistance, immutability. Take a look at Reaching Everyone Pt II for the low-down.
“Layering has both technical and social implications: it makes the technical complexity of the system more manageable, and it allows the system to be designed and built in a decentralized way.” How layering made building the ARPANet possible. — شتر دیدی؟ ندیدی (@arbedout) May 13, 2019
The reason for the above framing is that the acid test of a conceptual framework’s robustness, when harnessed to build classification systems such as taxonomies and typologies, is its ability to accept new objects or be generalised into a meta-methodology. To this end, one can also readily frame the ontological quest (simply put, the pursuit of wisdom and meaning) as a layered stack of intellectual disciplines. I’m sure this notion of an ontological meta-stack isn’t a totally original approach, but having played with it for a while it’s become a useful conceptual lens with which to (try to) understand how trans/multi/inter/anti/pan/omni/supra/quasi/para/post disciplinary research (choose your favoured poison) differs from steady-state scholarly work. Furthermore, layered stacks may be one of the most easily invoked mechanisms for achieving differential discretisation whilst grappling with linear domains.

Apropos of nothing: a few years ago I was completely preoccupied with the wacky idea of starting a serious-and-satirical research disorganisation-superimposition that I was very tongue-in-cheekily calling The Institute of Non-Linear Phenomenology. Stacks of layers need not apply within.
Overly individuated, siloed or specialised knowledge domains — typically mature fields — tend to be hyper-focused on very small regions of the spectrum rather like the spectroscopic disciplines which I spent a decade playing supramolecular detective with. A photophysicist / spectroscopist could spend an entire lifetime “playing with” a narrowly bounded set of quantised energy levels in an atom, molecule, crystal or superstructure.
Likewise, a researcher could spend a fruitful and fulfilling career looking for the same signatures in wildly different systems. I studied the same 𝛎(CO) vibrational signature in both exotic low-temperature astrophysical ices and in complex solution-based supramolecular assemblies. The same fingerprint can be exploited to provide rich information with respect to its environment on both sub-nanosecond and celestial timescales!
It would be remiss to attempt an approach to this weighty subject without discussing taxonomy, epistemology and more generally the design science of information systems. Taxonomy is one of the more powerful intellectual tools at our disposal with which to progress towards ontologies — although a classification system may not have truth value in itself, it may be an intermediate development and therefore useful in its own right.
Let’s not get too deep into the weeds here; instead, take a look at the various TokenSpace materials to go deeper: Primer, Q&A, Compressed Essay as Analytical Index, Manuscript (available on request), Lecture video (soon!).
Tegmark’s notion of subjectivity at the Universe’s margins (Planck limits, complex adaptive systems), with empirical objectivist domains betwixt, seems apropos here. Let’s call it a subjectivity sandwich. Feynman famously opined that “There’s Plenty of Room at the Bottom”, but there ostensibly exists even more room at the top!
Okay so here it is, the first iteration anyway. Let me know what you think by commenting or pinging me on Twitter. In reality, this framework may not be truly linear, granular and hierarchical but there is hopefully some value to it. Perhaps the next iteration of this half-baked idea will be an open form: woven meshes, interlinked gears — an ontological meta-DAG!?!?
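The open form gestured at above can be sketched as a small directed acyclic graph of disciplines, where a topological sort recovers one admissible bottom-to-top "reading" of the stack. This is a minimal illustrative sketch only — the discipline names and "builds on" edges are my own assumptions, not a claim about the actual meta-stack:

```python
# Sketch: the ontological meta-stack as a DAG rather than a strict linear
# hierarchy. Nodes are disciplines; each maps to the set of disciplines it
# "builds on". Names and edges are illustrative assumptions.
from graphlib import TopologicalSorter

meta_stack = {
    "mathematics": set(),
    "physics": {"mathematics"},
    "chemistry": {"physics"},
    "biology": {"chemistry"},
    # Cross-links like this are what make it a DAG, not a stack:
    "economics": {"biology", "mathematics"},
    "law": {"economics"},
}

# A topological order is one admissible bottom-to-top traversal of the DAG.
order = list(TopologicalSorter(meta_stack).static_order())
print(order)
```

A strictly linear stack is just the special case where every node has exactly one predecessor; once cross-disciplinary edges appear, many valid orderings exist, which is arguably the point.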
As we move from bottom to top, the complexity of the agents in focus increases alongside subjectivity. But at sub-quantum scales, the Universe also appears subjective — at least based on observations through our current paradigmatic lenses. Interesting phenomena tend to emerge at the margins, between disciplines or even as a synthesis of several elements of the meta-stack. Perhaps it’s time to repurpose the wonderful bibliotechnical term marginalia to capture this essence.
Cryptocurrencies are a great example of systems drawing on a number of these components. Indeed, at the network level these protocols are very simple messaging systems, yet they can exhibit extremely complex and unexpected emergent phenomena — Forkonomy is a good example of an attempt to apply unusual lenses to characterise unusual goings-on. This may also help to explain why researchers possessing deep subject-specific knowledge pertaining to one of the areas which cryptocurrency draws upon — cryptography, distributed systems, protocol engineering, law, economics, complex systems — often find it difficult to communicate with fellow scholars rooted in another traditional pastime. Perhaps the rise of novel meta-disciplines such as complexity science shows that one approach — in our parlance — to harnessing and further understanding non-linear domains is to capture as much of the stack as possible.
TLDR: Are we entering a new “Age of Techno-Dilettantism”?