This is a sneak peek into the book Token Economy, Shermin Voshmgir, 2019
The Bitcoin white paper didn’t come out of thin air, and P2P networks are not a new phenomenon. They are rooted in the early history of computing and the Internet, building on decades of research in computer networks, cryptography, and game theory.
The first computer networks were invented in the 1960s. ARPANET, a private network of American university computers, was introduced in 1969, initially funded by the Advanced Research Projects Agency of the United States Department of Defense. It became an international network in 1973 when it integrated the computers of research institutions in England and Norway, and it turned commercial in 1974 with the integration of the first commercial packet-switching provider, Telenet. That same year, a paper was published describing a working protocol for sharing resources among the nodes using packet switching. A central control component of this protocol was the Transmission Control Program (TCP). In 1982, TCP’s monolithic architecture was divided into a modular architecture consisting of a transport layer (TCP) and an Internet layer, also known as the “Internet Protocol” (IP). The model became informally known as TCP/IP and established itself as the standard networking protocol.
Another breakthrough was achieved in 1983 with the introduction of the Domain Name System (DNS), which made node addresses within the network human-readable. In these first-generation computer networks, however, the main focus was on connecting computers in a public network, addressing them, and transmitting data between them. The network architecture was still based on client-server logic, and secure communication was never a mainstream focus in the early days of the Internet. A select group of researchers, however, was intrigued by exactly this question.
Ralph Merkle’s cryptographic research in the early 1970s laid the foundation for secure communication over P2P networks. His work conceptualized how to resolve “secure communication over insecure channels” such as computer networks, and laid the groundwork for modern public-key cryptography. In his dissertation, he furthermore described a method for building collision-resistant cryptographic hash functions. He also filed a patent for a special type of hash tree, the Merkle tree, which allows efficient and secure verification of the contents of large data structures.
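The core idea of a Merkle tree can be sketched in a few lines: hashes of data blocks are repeatedly combined in pairs until a single root hash remains, so changing any block changes the root. The sketch below is illustrative (the rule of duplicating the last hash on odd levels follows Bitcoin’s later convention, not necessarily Merkle’s original formulation):

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Combine leaf hashes pairwise, level by level, into a single root."""
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:       # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]
```

Because every inner node commits to the hashes below it, verifying one leaf against the root only requires the sibling hashes along its path, not the whole data set.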
In 1976, Whitfield Diffie and Martin Hellman built on some of his ideas and created a mechanism for securely exchanging cryptographic keys over a public network. It was one of the earliest implemented examples of public-key exchange and also introduced the concept of digital signatures. Before public-key methods were invented, cryptographic keys had to be transmitted in physical form, so secure digital key exchange over public networks was groundbreaking work, without which Bitcoin and the technologies that followed would not work. In 1978, Ron Rivest, Adi Shamir, and Leonard Adleman found a way to create a trapdoor one-way function: easy to compute in one direction, but hard to invert without a secret. Their algorithm – now known as RSA – introduced the era of asymmetric cryptography, which later evolved into the use of elliptic curves in cryptography – suggested independently by Neal Koblitz and Victor S. Miller in 1985 – also a key technology in Bitcoin.
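The Diffie–Hellman mechanism can be illustrated with modular exponentiation: each party publishes g^secret mod p, and both sides derive the same shared key without ever transmitting a secret. A toy sketch with deliberately tiny, illustrative parameters (real deployments use primes of 2048 bits or more, or elliptic curves):

```python
import secrets

# Illustrative parameters only – far too small for real security.
p = 0xFFFFFFFB  # a prime modulus (the largest prime below 2^32)
g = 5           # public generator

def keypair():
    private = secrets.randbelow(p - 2) + 1  # secret exponent
    public = pow(g, private, p)             # g^private mod p, safe to publish
    return private, public

a_priv, a_pub = keypair()  # Alice
b_priv, b_pub = keypair()  # Bob

# Each side combines its own secret with the other's public value:
# (g^b)^a mod p == (g^a)^b mod p
shared_a = pow(b_pub, a_priv, p)
shared_b = pow(a_pub, b_priv, p)
```

An eavesdropper sees only p, g, and the two public values; recovering the shared key from those is the discrete-logarithm problem, which is what makes the exchange secure at realistic key sizes.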
In the meantime, with the rise of the personal computer and the introduction of the Internet Protocol Suite (TCP/IP), the Internet became more widespread. Usability, however, was still a problem: you had to navigate the Internet using command-line interfaces, which meant learning computer commands. Tim Berners-Lee resolved this problem with his vision for the World Wide Web. He introduced a standard for creating visual websites with a relatively simple markup language, and for navigating the Web with links that point to other websites with a simple click. While the WWW allowed everyone, from a publishing point of view, to be an equal contributor to the information available on the Internet, data was still stored and managed behind the walled gardens of servers.

Neither TCP/IP nor the protocols building on it resolved the questions of where to store data and how to manage it. In public computer networks, the structure of the system – network topology, network latency, number of computers – is not known in advance. The network can therefore consist of unknown and untrusted computers and network links, and its size and composition can change at any time during the execution of a distributed program. The ability to provide and maintain an acceptable level of service in the face of faulty processes is thus key to network resilience. Back then, however, the focus was on data transmission in a public network, which was already a hard problem to solve.
For economic reasons, centralized data storage and management became mainstream. The problem with centralized networks is that the system administrators, or the institutions controlling the servers, are the only entities controlling the availability of the information being shared. If an administrator decides to stop distributing a file, or to manipulate or censor data, they can simply do so on their servers, and the information will no longer be available to users.
In 1982, David Chaum introduced the concept of blind signatures, which guarantee the privacy of the sender of information and were designed for use in voting systems and digital cash systems. He introduced the idea of eCash, an anonymous cryptographic electronic money or electronic cash system, and commercialized it through his company DigiCash, whose technology was used as a micropayment system at one US bank from 1995 to 1998. The company was dissolved in 1998, possibly because Chaum was ahead of his time: e-commerce applications were not that widespread yet.
In 1991, Stuart Haber and W. Scott Stornetta introduced a system in which document timestamps could not be tampered with – the earliest academic work on a cryptographically secured chain of blocks. Their aim was to certify when a document was created or modified “in a world in which all text, audio, pictures, and video documents are in digital form and in easily modifiable media.” In their initial proposal, they used centralized timestamping services. They then tried to distribute trust by requiring several users – selected through pseudo-random number generators – to timestamp each hash, instead of relying on a centralized institution. A year later, in 1992, Bayer, Haber, and Stornetta incorporated Merkle trees into the mechanism, improving efficiency by allowing several document certificates to be collected into one block.
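The chaining idea behind Haber and Stornetta’s scheme can be sketched as follows: each timestamp certificate includes the hash of the previous certificate, so altering an earlier document invalidates every certificate that follows it. This is an illustrative sketch, not their exact protocol:

```python
import hashlib
import json
import time

def certificate(prev_hash: str, document: bytes, timestamp: float) -> str:
    """Hash a document together with the previous certificate's hash,
    linking the new certificate into a tamper-evident chain."""
    record = json.dumps({
        "prev": prev_hash,
        "doc": hashlib.sha256(document).hexdigest(),
        "ts": timestamp,
    }, sort_keys=True).encode()
    return hashlib.sha256(record).hexdigest()

# Build a small chain of timestamp certificates.
chain = ["0" * 64]  # genesis value
for doc in [b"contract v1", b"contract v2"]:
    chain.append(certificate(chain[-1], doc, time.time()))
```

Changing “contract v1” after the fact would change its certificate hash, which no longer matches the `prev` field baked into the certificate for “contract v2” – exactly the property later reused in a chain of blocks.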
The first “Proof-of-Work” function, Hashcash, was introduced in 1997 by Adam Back. The idea was to limit email spam and denial-of-service attacks by forcing computers to perform a certain amount of computational work. The underlying concept had been proposed by Cynthia Dwork and Moni Naor in their 1992 paper, “Pricing via Processing or Combatting Junk Mail.” Before the emergence of Bitcoin many years later, the mechanism introduced by Hashcash was also suggested as a minting mechanism in b-money, an early proposal published in 1998 by Wei Dai for an “anonymous, distributed electronic cash system.” It was proposed in the context of cypherpunks mailing-list discussions relating to possible applications of Hashcash, which had been published on the same mailing list. The cypherpunk mailing list – initiated by Eric Hughes, Timothy C. May, and John Gilmore – represented a group of activists advocating the use of strong cryptography and privacy-enhancing technologies in everyday life as a means to social and political change. Many of the above-mentioned individuals who contributed key technologies later used in Bitcoin were active cypherpunks.
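The Hashcash mechanism can be sketched as a brute-force search for a nonce that pushes a hash below a target: the proof is costly to produce but nearly free to verify. An illustrative sketch (the real Hashcash stamp format and header fields differ):

```python
import hashlib
from itertools import count

def hashcash(message: bytes, difficulty_bits: int = 16) -> int:
    """Search for a nonce so that sha256(message + nonce) has at least
    `difficulty_bits` leading zero bits – the costly half of the scheme."""
    target = 1 << (256 - difficulty_bits)
    for nonce in count():
        digest = hashlib.sha256(message + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # proof found

def verify(message: bytes, nonce: int, difficulty_bits: int = 16) -> bool:
    """Verification needs only a single hash computation."""
    digest = hashlib.sha256(message + str(nonce).encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

Each additional difficulty bit doubles the expected search time while leaving verification cost unchanged – the asymmetry that makes the scheme useful against spam and, later, as a mining mechanism.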
In 1998, Nick Szabo designed a mechanism for a decentralized digital currency he called “BitGold,” which implemented many of his ideas around smart contracts and digital agreements. Szabo’s thoughts on self-enforcing agreements were formulated roughly around the same time as the idea of Ricardian Contracts, introduced by Ian Grigg in 1996 (read more: Part 1 – Smart Contracts). While BitGold was never implemented, many refer to it as a direct precursor to the Bitcoin architecture. Just like Bitcoin later, BitGold envisioned a PoW-based consensus algorithm in which computing power is spent to solve cryptographic puzzles. However, the BitGold proposal could not resolve the problem of double-spending in a fully decentralized, Sybil-resistant way, which was probably the reason it was never implemented. Many have speculated that Szabo is Satoshi Nakamoto – Bitcoin’s anonymous creator – a rumor he has denied.
In 1999, the music-sharing application Napster popularized the modern-day notion of P2P networks and changed the way data was stored and distributed over the Internet. Napster created a virtual overlay network for file sharing that was independent of the physical network of the Internet; the computers involved formed a subset of the computers in the physical network. Data was still exchanged directly over the underlying TCP/IP network, but at the application layer, peers were able to communicate with each other directly. P2P networks increase robustness because they remove the single point of failure inherent in a client-server system: if one computer on the network fails, the whole network is not compromised or damaged. Such networks need to be designed to tolerate the failure of individual computers, whatever the source of failure. However, Napster relied on the operation of central indexing servers and was thus susceptible to shutdown after copyright infringement claims and a legal battle.
A new family of file-sharing protocols, spearheaded by Gnutella in 2000, eliminated such central points of failure. Gnutella allowed users to find each other and connect remotely, searching every node on the network, and was therefore more decentralized and censorship-resistant. While Gnutella resolved the decentralization problem, it did not resolve the privacy problem. Third-generation file-sharing networks like BitTorrent used distributed hash tables to store resource locations throughout the entire network in a cryptographically secure way. Distributed hash tables not only replaced indexing servers but also improved the anonymity of network actors and of the data being shared over the network. Such distributed hash tables are now also used by blockchains and other Web3 protocols like IPFS and Ethereum.
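The role of a distributed hash table can be illustrated with a toy model: every key is stored on the node whose ID is “closest” to the key’s hash (here measured with Kademlia-style XOR distance), so lookups need no central index – any peer can recompute which node is responsible. All names below are illustrative, and a real DHT would of course route between many machines rather than hold all buckets in one object:

```python
import hashlib

def node_id(name: str) -> int:
    """Derive a numeric ID from a name by hashing it."""
    return int.from_bytes(hashlib.sha256(name.encode()).digest(), "big")

class ToyDHT:
    """Keys live on the node whose ID is XOR-closest to the key's hash."""

    def __init__(self, node_names):
        self.nodes = {name: {} for name in node_names}

    def _responsible(self, key: str) -> str:
        kid = node_id(key)
        return min(self.nodes, key=lambda n: node_id(n) ^ kid)

    def put(self, key: str, value: str) -> None:
        self.nodes[self._responsible(key)][key] = value

    def get(self, key: str):
        return self.nodes[self._responsible(key)].get(key)
```

Because responsibility is a pure function of hashes, adding or removing nodes only reshuffles a fraction of the keys, which is what lets such networks grow and shrink without any coordinator.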
Even though modern P2P networks, ever since the emergence of Napster, resolved the problem of efficiently distributing data within a network, they did not resolve the decentralized validation or verification of data. Another problem these file-sharing networks failed to resolve was the “free-rider problem”: large numbers of users utilized resources shared by other nodes without sharing anything themselves, which can cause a community to collapse. Free-riding resulted from the fact that users had no incentive to cooperate; cooperation consumed their own resources and degraded their own performance.
In 2004, before the emergence of Bitcoin, Hal Finney introduced the first reusable Proof-of-Work system (RPoW). He introduced the idea that the value of an RPoW token is guaranteed by the value of the real-world resources required to “mint” it. Finney also received the first Bitcoin transaction from Bitcoin’s creator – Satoshi Nakamoto – in 2009. Finney apparently lived in the same town as a Japanese-American man named “Dorian Satoshi Nakamoto,” who has repeatedly denied any involvement in Bitcoin. This fact added to speculation that Finney may have been Bitcoin’s creator – a claim he, too, always denied.
In 2008, the Bitcoin white paper was published under the pseudonym Satoshi Nakamoto, shortly after the peak of the financial crisis and the collapse of major banks like Lehman Brothers. The aim was to provide a system for P2P electronic cash without banks. While the first specifications were implemented by Satoshi, a group of dedicated individuals gradually took over further development of the code. A few months later, the Bitcoin network went live: the first block was created and the first bitcoin minted. Interestingly, the white paper did not talk about a blockchain, only about a “chain of blocks.” The term “blockchain” became widespread years later, when people started to replicate the Bitcoin codebase to develop alternative protocols.
While modern P2P networks such as Napster suffered from a lack of incentives for network actors, early e-cash ideas were not able to defend against Sybil attacks. The Bitcoin white paper was a game changer, as it proposed a protocol for the collective validation of data. It introduced a consensus mechanism (“Proof-of-Work”) that allowed a growing transaction record – the blockchain – to be stored on each node of the network. Bitcoin thereby resolved the free-rider problem of previous P2P networks by introducing tokenized incentives that motivate all actors to contribute to the system in a truthful manner. Even though Bitcoin was never designed with file sharing in mind, it eventually inspired a new class of P2P storage frameworks that will be a crucial building block of a decentralized Web. These decentralized storage networks can now use the power of tokenized incentives to build on the legacy of previous file-sharing protocols, incentivizing their network actors with a native token and using a blockchain as a universal state layer. Examples include “Swarm,” “Storj,” “Sia,” and “IPFS.”
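The “chain of blocks” described above can be sketched minimally: each block references the hash of its predecessor, and a valid block requires a Proof-of-Work nonce, so rewriting history means redoing all the accumulated work. This is an illustrative toy, not Bitcoin’s actual block format or difficulty rules:

```python
import hashlib
import json

DIFFICULTY = 12  # required leading zero bits; tiny, for illustration only

def block_hash(block: dict) -> int:
    """Hash a block's canonical JSON serialization to an integer."""
    payload = json.dumps(block, sort_keys=True).encode()
    return int.from_bytes(hashlib.sha256(payload).digest(), "big")

def mine(prev_hash: int, transactions: list) -> dict:
    """Search for a nonce that pushes the block hash below the target –
    the Proof-of-Work step that makes each block costly to produce."""
    target = 1 << (256 - DIFFICULTY)
    block = {"prev": prev_hash, "txs": transactions, "nonce": 0}
    while block_hash(block) >= target:
        block["nonce"] += 1
    return block

# Each block commits to the hash of the one before it.
chain = [mine(0, ["genesis"])]
chain.append(mine(block_hash(chain[0]), ["alice -> bob: 1"]))
```

Tampering with the first block changes its hash, which breaks the `prev` link in every later block; an attacker would have to re-mine the entire chain faster than the honest network extends it.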
Full text and high-resolution graphics available as paperback & ebook: Token Economy, by Shermin Voshmgir, 2019
About the Author: Shermin Voshmgir is the author of the book “Token Economy“. She is the director of the Research Institute for Cryptoeconomics at the Vienna University of Economics, and the founder of BlockchainHub Berlin. In the past, she was a curator of TheDAO, and an advisor to various startups like Jolocom, Wunder and the Estonian E-residency program. In addition to her studies at the Vienna University of Economics, she studied film and drama in Madrid. Her past work experience spans Internet startups, research, and art. She is Austrian, with Iranian roots, and lives between Vienna and Berlin.
About the Book: Blockchains & smart contracts have made it easy for anyone to create a token with just a few lines of code. They can represent anything from an asset to an access right, like gold, diamonds, a fraction of a Picasso painting or an entry ticket to a concert. Tokens could also be used to reward social media contributions, incentivize the reduction of CO2 emissions, or even one’s attention for watching an ad. While it has become easy to create a token, which is collectively managed by a public infrastructure like a blockchain, the understanding of how to apply these tokens is still vague. The book refers to tokens, instead of cryptocurrencies, and explains why “token” is the more accurate term, as many tokens were never designed with the purpose of representing a currency. However, since tokens do have similarities to fiat currencies, the role of money as a medium of exchange is analyzed at length in this book. This book gives an overview of the mechanisms and state of blockchain, the socio-economic implications of tokens, and deep dives into selected token use cases: Basic Attention Token, Steemit, Token Curated Registries (TCRs), purpose-driven tokens, stable tokens, asset tokens, fractional ownership tokens, Libra & Calibra (Facebook), and many more.