Abstract

As intelligence moves from arbitrary and erratic patterns of human discretionary knowledge-building toward a more systematic and organic AI, there is a need for a new market mechanism to validate, distribute, and reward intelligent processes. Such an intelligent market is built on a systematic, scientific, replicable (SSR) process that is objective, accountable, and can be validated and used by the community. This general intelligence or “alpha” should be content-agnostic and context-focused - an alpha process reconfiguring the block of the blockchain into ‘AlphaBlock’, an intelligent market mechanism. Alpha prediction has conventionally been associated with domain-specific content and with predictive systems that are non-replicable and mostly non-scientific. The author defines a General AI predictive process that can be fused into the blockchain block, transforming the blockchain into a multi-purpose predictive tool which self-builds, self-protects, and self-validates. AlphaBlock becomes the essence of everything linked with data predictability, evolving into an intelligence layer on the blockchain and the web. It is a predictive ecosystem which blurs the distinction between financial and non-financial data - ultimately removing barriers between financial and services markets. The blockchain can achieve this evolved state and become an intelligent market if it crosses three key hurdles. First, it must securitize blockchain assets and create new alternative assets and asset classes. Second, it must resolve the incapability of conventional finance to understand risk effectively and enhance return per unit of risk (outperform the market) using a General AI process. Third, it must offer a better mechanism to address currency risk than what is offered by the existing fiat currencies and cryptocurrencies.

The author would like to thank Andrei Nagy and Skot Kortje for constructive criticism of the manuscript.

Content

1.     Introduction
2.     Price Discovery
3.     General Pricing Model
4.     Smart Contract
5.     General AI
6.     Data Universality
7.     Architecture of Data
8.     Interdisciplinary Intermediaries
9.     Predictive Transaction
10.  Securitization
11.  Asset Backed Validation
12.  Proof of Alpha
13.  AlphaBots
14.  Complex Network
15.  AlphaBlock
16.  Currency Agnostic
17.  Conclusion


1.   Introduction

Our society is highly prediction-dependent. Although mankind’s experience has always hung in the balance of forces outside its control, the measure of an advanced civilization is its capacity to understand risk and mitigate the extent to which natural forces threaten its well-being. The need for more efficient predictive capacity permeates our society. Of course, modern finance is rooted in risk management, but so too are many other industries. There’s a reason that quantitative science has taken over management processes across the economy. But these predictive domains are connected, and by virtue of the blockchain we are now able to leverage this cross-domain intelligence by connecting it in an AlphaBlock, creating a new marketplace of intelligent assets and supporting our society with a highly efficient architecture for mastering risk and optimizing returns across the economy.

The demand for this re-thinking of predictive domains should not be understated. Advanced economies across the globe, most acutely, are confronted with financial crisis thanks to an entitlement system dependent on the legacy risk management processes and institutions that govern the trillions of dollars of assets under management. The pensions and well-being of our society rest on the predictive capacity of the investment industry that provides the architecture for our current financial domain. The financial crisis overhang that has crippled, and continues to threaten, our global economy clearly shows that legacy prediction processes and frameworks are destabilizing. Systematic intelligence - alpha - is lost in our markets. Instead, alpha is at the whim of discretionary management that perpetuates crisis.

In non-financial domains, e.g. weather, trillions of dollars of real assets and productive capacity are at risk from natural forces. The predictive processes that translate this risk into transactions of widely-dispersed securitized assets are not optimally created and managed in their current framework. Weather predictive value does not have an exchange to drive efficiency and cross-pollinate knowledge. Linking this intelligence to the predictive intelligence of other domains would create a much more efficient and successful system for managing risks in all spheres of economic activity. All industries would benefit from the chain of alpha that exists in alternate domains. The future is an unknown - it is not the purview of man. However, there is a technology that allows us to estimate a more precise trail to its vision. That technology is the blockchain.

The innovation of the blockchain is transformational. It is far more than simply a secure digital process of transactions. It is a network of decentralized applications, an architecture with inherent trust built in its protocol that allows the community to interact without concern for the trustworthiness of participating members or a need for institutionalized control. The blockchain architecture is self-authenticating, and in combination with smart contracts allows for an unencumbered, free exchange of ideas and values that favors only the most efficient outcomes. For intelligent processes, this is liberating and precipitates exponential growth in value creation. The blockchain is a paradigm shift in the modern economy and will deliver a new mechanism for predictive assets to be created, validated, and exchanged. This is the AlphaBlock.

2.    Price Discovery

Price discovery is the primary function of the market today. But it’s the perception of this price that determines its growth and decay. In the legacy market, this perception is subjective and lacks objective metrics. But why should something that is bought or sold be driven by a thematic interpretation - a story - rather than a systematic, scientific, measurable approach? Storytelling guides the market because the market is fragmented and controlled by agents based on region, asset, domain etc., and also because there is little initiative to unify markets by bringing accountability to predictive models (referred to as pricing models). This lack of competition and the absence of a level playing field is the primary reason why market inefficiencies cannot be arbitraged away.

The current blockchain technology is not immune from this legacy world challenge as it continues to focus on building an infrastructure that is transactional in nature - and thereby amplifying the market inefficiency. Notably, the rise in the value of cryptocurrencies is just like any other legacy world asset bubble - and it will suffer a similar fate! However, unlike the legacy world structures, the existing blockchain has many advantages in creating a new market which does not operate like the existing self-feeding, emotional, and irrational Ponzi scheme of the legacy market - where the higher asset prices go, the more insanely attractive assets become.

The blockchain characteristics of an interconnected market with no artificial credit, no fractional reserve banking, disintermediation, and a democratizing trust architecture give it an advantage in establishing a new market mechanism that creates new asset classes and dissolves the boundaries between financial and nonfinancial assets. It does this by redefining predictive pricing models leading to a more robust price discovery approach. This crucially clears the way for creating a better risk managing market mechanism that diffuses irrationality by offering objective intelligence metrics to the community. Such metrics would allow investors to think before they jump in with the herd and thereby extract their destiny from the madness of crowds.

Better price discovery allows the blockchain to offer an alternative market mechanism which could resolve many of the legacy world problems. These problems result from human discretionary, non-systematic, non-scientific and non-replicable systems which rely on stories (narratives) to invest. To transform the blockchain into an intelligent market mechanism, referred to simply as AlphaBlock, it must overcome three hurdles. First, it must securitize the smart contract and other elements of the blockchain into new alternative assets, hence making them valuable and consequently transactable for exchange and transfer.

Second, it must embrace a General AI process that can resolve the alpha problem in financial and nonfinancial markets. Third, it must offer a better mechanism to address currency risk than what is offered by the existing fiat and cryptocurrencies. Price discovery is at the heart of the problem set.

3.     General Pricing Model

Think about it: at the heart of all market bubbles and crises is a price. The price is driven by demand and supply. Demand and supply are driven by stories. Invariably, what starts with the wisdom of crowds ends with the madness of crowds. These are the three stages of Dow Theory, in which information flows from investors to speculators and, finally, to the proverbial shoeshine boys. This subjective and psychological story is repetitive, causing cycles to repeat.

Why does this happen? Reasons can be found in our cognitive biases: human memory is myopic, and history is too long and tedious to be assimilated by the human mind, which is more tuned for short-term gains and instant gratification. The few who master delayed gratification succeed, as pointed out by the Stanford marshmallow experiment [1]. Those who can’t master it become the experiment, as pointed out by the Solomon Asch three lines experiment [2] and the Stanley Milgram obedience experiments [3], which explain why human beings are designed to herd. The endless repetitive oscillation between control and the lack of it is why social systems propagate bubbles and crises.

Although economics and psychology can explain these bubbles well [4], they cannot solve the crises. In fact, the legacy world is a natural self-feeding Ponzi scheme that is married to the current functioning of the market mechanism. The only way to solve bubbles and crises is to change the way price is discovered [5]. Price discovery must be disconnected from the demand and supply tension that relies on storytelling [6]. The subjective psychological feedback loop has to be broken and replaced by a more objective metric. High price combined with high demand leads to bubbles, while low price combined with high supply leads to crashes. These two extremities are counterintuitive to how investors should behave. In the goods market the same consumers show a different sensitivity, cutting down on demand as prices rise and vice versa. It’s the stories - the feedback and the cluster around these narratives - that cause the herd to ignore rationality and break down the price elasticity explained by economic models. [7]

How should we design a new, robust economic model which can ease bubbles and crises? How do we create a market where demand is not increasing merely because of increasing price, and supply is not increasing merely because of a falling price? If, instead of storytelling, price were a function of intelligence, and intelligence came with an objective metric called alpha, then price would increase because of intelligence - not because of demand-inducing storytelling. Price would decrease because of lack of intelligence, and not because of supply-inducing storytelling. Such a price would not see bulls charging to buy or bears charging to sell.

This new price would give both irrational bulls and bears pause to think - intelligently - and consequently dissipate the speculative pressure whenever it pushes to an extreme, positive or negative. The intelligent market, with its measurable alpha embedded in price, is not going to provide an instant alternative to the current market mechanism, but it will definitely be the first step when demand and supply are influenced by an objective metric - an objective alpha price - and not by tips, gossip, and rumors leading to immeasurable fear or greed.

This would be the cornerstone of the new market, a new way to discover price. Intelligent price is a measurable, scientific, systematic, and replicable process. If there is alpha, it should be the true metric. The challenge is to find it, validate it, secure it, and enhance it. The Intelligent Market is a place which nurtures alpha. But alpha can only be nurtured if there is an incentive mechanism. Until then alpha will sit on the fringe and let the market destroy itself, repeatedly.

We can break this vicious cycle if we disconnect the pricing model from an emotional function - one driven by subjective stories that feed back into social systems and influence demand and supply - and replace it with a general pricing model. This general pricing model is an objective function based on an alpha process that is measurable, scientific, systematic, and replicable. This process is verifiable and foundational - a better metric influencing demand and supply.

Price ~ emotional function (subjective stories inducing demand and supply) - (i) 

Price = function (measurable, scientific, systematic, replicable, validated alpha driving demand and supply) - (ii)
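
As a minimal illustration of the contrast between (i) and (ii), the sketch below (Python, purely illustrative; the alpha metric, its validation flag, and the weighting scheme are assumptions added here, not a specification from this paper) prices an asset from a validated alpha score instead of from demand-and-supply pressure:

# Illustrative sketch of equations (i) and (ii): price driven by a
# validated alpha metric rather than by story-driven demand and supply.
# All names and the weighting scheme are hypothetical assumptions.

def legacy_price(last_price: float, demand: float, supply: float) -> float:
    # (i) emotional function: price simply follows the demand/supply imbalance
    imbalance = (demand - supply) / max(demand + supply, 1e-9)
    return last_price * (1.0 + imbalance)

def general_pricing_model(base_value: float, alpha: float, validated: bool) -> float:
    # (ii) price as a function of a measurable, validated alpha;
    # an unvalidated alpha contributes nothing to the price
    if not validated:
        alpha = 0.0
    return base_value * (1.0 + alpha)

if __name__ == "__main__":
    print(legacy_price(100.0, demand=80.0, supply=20.0))              # bubble-prone
    print(general_pricing_model(100.0, alpha=0.04, validated=True))   # alpha-driven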

4.    Smart Contract

The smart contract is the transactional foundation for the blockchain. The idea that a contract needs mental clarity between parties before entering an agreement is the discretionary challenge which the blockchain needs to overcome. The blockchain has to aspire beyond Nick Szabo’s “meeting of the minds”. [8]

Apart from the “meeting of the minds”, the prohibitive cost (computational, auditing, unforeseeability, hacking, etc.) incurred to honor the agreement is another limiting feature of the smart contract. We need a dramatic rethinking to continue the innovation from paper to digital to intelligence. It is not just about embedding the smart contract into hardware and software and keeping the data secure, but also about keeping the data dynamic, valuable, reusable - and hence, intelligent. Redefining smart contract protocols cannot depend on piecemeal process enhancement. The logic has to move beyond the current scope if we are to manage imperfect information [9] and asymmetric formalizations [10]. Third-party validation has to be intrinsically woven into the smart contract. The answer lies in Nick Szabo’s definition of a transaction design. [11]

Transaction_decision = f(preferences, budget, environment, prices, price model) - (iii)

Excluding the pricing model, the remaining variables in the function (preferences, budget, environment) are expressions of the discretionary choices of agents [12], with each of the three bound to change as the price variable changes. The changing price feeds into human discretion, influencing preferences and budgets, as well as how we measure the environment. The pricing model is the only element that can be systematized in the transaction design. It is key to how we design a transaction and how we understand the other variables.
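
A compact sketch of equation (iii) follows (Python, illustrative only; the preference, budget, and environment inputs are hypothetical stand-ins for discretionary agent choices). It shows the pricing model as the one systematizable, swappable component of the transaction design:

# Sketch of the transaction design in eq. (iii). Only price_model is a
# systematic component an SSR process could replace; the rest are
# placeholders for discretionary choices. All names are assumptions.

from typing import Callable, Dict

def transaction_decision(preferences: Dict[str, float],
                         budget: float,
                         environment: Dict[str, float],
                         prices: Dict[str, float],
                         price_model: Callable[[str, Dict[str, float]], float]) -> Dict[str, bool]:
    decisions = {}
    for item, quoted in prices.items():
        fair_value = price_model(item, environment)    # systematic component
        wants_it = preferences.get(item, 0.0) > 0.5    # discretionary component
        affordable = quoted <= budget
        decisions[item] = wants_it and affordable and quoted <= fair_value
    return decisions

decision = transaction_decision(
    preferences={"coffee": 0.9}, budget=5.0,
    environment={"time_of_day": 9.0}, prices={"coffee": 3.0},
    price_model=lambda item, env: 3.5)    # placeholder fair-value model
print(decision)   # {'coffee': True}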

5. General AI

A generic pricing model embedded in the smart contract should have a General AI [13] capability, which understands intelligence at the data level and can anticipate the mathematics of language with the same ease as the mathematics of finance or of quantum physics. Human intelligence, after all, is a subset of natural intelligence. A General AI that understands nature and is universal is superior to the human brain. Such a General AI should be able to learn and price assets across domains. It is an algorithmic framework that learns as it is trained. It has a global understanding but functions locally. It understands components and groups, inter-domain and intra-domain, convergence and divergence, signal and noise, etc. Intelligent design can, in the end, only be driven by intelligence, as only intelligence can improve and enhance itself.

The model can also be seen as a predictive system that is designed for disruption. It has to compete, win, and eventually lose. It is designed to continuously adapt, learn at an increasing speed, or bow out. Its objective is to enhance transaction design in its essence and hence discover the intrinsic value of everything data. It is on a constant quest to offer the optimal transaction given a set of conditions. It’s a model which is simple, organically resets, handles the ambiguous, and nurtures robust protocols. It’s the soul of the smart contract.

The absence of such a predictive system, or the lack of a General AI framework to price assets, is why economic pricing models have failed to offer a solution to the bubble and crisis problems. The myopic focus on fundamental variables, whether idiosyncratic or systematic, narrowed the problem set and took the industry away from the general problem of extracting intelligence out of data. This was the primary reason pricing models could not scale, and it is the primary reason why, despite a more than fifty-year history of factor models [14], the industry is unable to deliver alpha [15] and continues to stay married to a dysfunctional pricing system which has proved to be ineffective and redundant [16].

Blockchain has a choice. It must either rethink pricing or thoughtlessly adopt legacy pricing models. Rethinking and questioning is the more difficult path. This is why blockchain’s affinity for embracing old models also brings along the same legacy world problems - poor pricing, excessive speculation, booms and busts - which would all have been fine if there were no alternative and the only way was five steps forward, three steps back. But as history tells us, in 1929 it was ten steps ahead and nine steps back, as markets collapsed by 90%.

Hence the answer to a superior market mechanism lies in a General AI pricing model that is a function of objectively verifiable alpha and not a creative, non-replicable story. If 70% of scientific papers are not replicable [17], and if a Peter Lynch style of discretion [18] or systematic quant methodologies [19] cannot be replicated in a secure, non-transparent environment, then most of the portfolio management offered by legacy asset managers is worse than a roll of the dice. Putting your money with a multi-billion- or trillion-dollar asset manager might give herding investor flocks comfort, as it’s easy to assume that the billions or trillions know what they are doing, but there is a probability that this feeling of intelligence is a subjective illusion which cannot protect us from the unforeseen risk of negative outliers like 1929, 1937, 1973, 1987, 1998, 2000, and 2008. This risk is immeasurable and potentially destabilizing.

And even if you assume a large drawdown risk, like the risk of an earthquake when living on the fault lines in San Francisco, the risk does not simply vanish. In a long-only market heavily overexposed to ETFs and indices, the risk is more real than what is perceptible [20]. It took the market 25 years to reclaim the 1929 highs [21]. The blockchain can allow such General AI AlphaBots to train, learn, and compete, and hence prepare not only to enhance alpha but also to protect investors from probable and possible risk.

Scientific processes are essential for a General AI methodology because, to be effective across domains, across preferences, and potentially across durations, the methodology has to be robust, which is a hallmark of a scientific process. Such a General AI process may take some time to beat human intelligence, but it will eventually get there. The best human discretionary systems are not capable of adopting a General AI framework, not only because of the non-scientific aspect but also because of issues of replicability. Warren Buffett’s systematic approach is hard to replicate. The S&P 500’s 1923 [22] systematic process is replicable, but it is a non-scientific methodology which weights everything by size.

Though the S&P’s simple rule-based systems have beaten everything else human, the strategy is incapable of scaling up in a domain-agnostic blockchain environment, primarily because of its non-scientific character. The methodology is systematic but not capable of learning, enhancing, adapting, and cross-pollinating itself across domains. Any General AI process has an initial advantage over the S&P 500 methodology, although it will experience challenges as the problem sets become more complex.
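
To make the “weights everything by size” point concrete, here is a toy sketch (hypothetical numbers) of a capitalization-weighted rule next to a naive signal-driven rule that re-weights as new data arrives. It illustrates the systematic-but-static character of size weighting, not the S&P’s actual procedure:

# Toy contrast: static size weighting (systematic, non-learning) versus a
# signal-driven re-weighting that adapts as the signal changes.
# Market caps and signals below are made-up illustrative numbers.

def size_weights(market_caps: dict) -> dict:
    total = sum(market_caps.values())
    return {name: cap / total for name, cap in market_caps.items()}

def signal_weights(signals: dict) -> dict:
    # re-weight by a (hypothetical) alpha signal instead of by size
    positive = {name: max(s, 0.0) for name, s in signals.items()}
    total = sum(positive.values()) or 1.0
    return {name: s / total for name, s in positive.items()}

caps = {"A": 800.0, "B": 150.0, "C": 50.0}
signals = {"A": 0.1, "B": 0.6, "C": 0.3}
print(size_weights(caps))       # dominated by the largest component
print(signal_weights(signals))  # shifts as the signal changes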

6. Data Universality

Data Universality [23] is a prerequisite for a General AI process to deliver cross-domain alpha. As the world invests resources and time making sense of information, the data passes through stages of relevance and irrelevance. For more than half a century researchers have debated about intelligent systems that can make sense of this information.

Nature is an example of an intelligent system that selects relevant information and ignores the noise. Data is the key channel for simulating nature, which is a complex ecosystem of interconnected domains. And since nature does not draw lines between rainforests and rivers, rivers and mountains, oceans and continents, our data systems should also be allowed to work in a seamless world where everything data is connected. You might say this is obviously the case, but that’s not how our society works with data.

Typically, our data sets are fragmented. Financial data is different from non-financial data; consumption data is different from health data; social data is different from energy consumption data. All this data may sit in the cloud together, but it does not understand or relate to each other. We process this data for intelligence on a piecemeal basis. And even if we could see interconnections and anticipate snippets of trends, we don’t see the complete picture or the commonality of behavior. We are somehow lost in data and its content and we don’t see its context.

This is a problem with AI today: it is so much about the content that AI processes are functioning like the blind men figuring out the elephant [24]. Each blind man has his own interpretation. AI is biased today and in many cases can’t explain how it does what it does. General AI, on the other hand, is about discovering the elephant even if it means adopting a totally different approach to the more important data context over its content.

Data Universality is a commonality that governs a dataset irrespective of its source of generation or derivation. It is an intelligence that supersedes domains and comprehends complexity as intrinsic to data - be it intra-domain or inter-domain. Data Universality also defines behavior for a selection at a group level as it confirms and validates across a string of industries and domains in the network. Data Universality allows any General AI process to tap into any data point on the network.

This allows a user to define new problems and solve them. The process does not require a lot of data and is not computationally heavy. The solutions are robust and don’t change as the sample set changes. Simply put, the process simplifies the complex, understands the mechanism that generates uncertainty, and hence it does a better job of managing the risk inherent in a data point.

Blockchain’s general-purpose architecture is built around domains and their users, and thus lends itself well to Data Universality, allowing it to capitalize on a unique opportunity to create disruptive value for users and the market at large without falling back on domain-specific frameworks that rely on non-scientific, non-systematic, and non-replicable processes. The blockchain is designed to transact any perceptive value - be it financial or non-financial in nature. Blockchain in its new avatar acts like a global computer that digitizes the data collection on the world wide web, then navigates it to create data assets, followed by the assimilation of such assets for anticipation and intelligence using a General AI process.

Within the framework of Data Universality, every data point is connected and is a part of the same ecosystem, and intelligence is a dynamic feature of this network of data points. This ecosystem makes the world wide web intelligent, and the only way to extract that intelligence is to overlay a General AI system that can use the data to anticipate both the intelligence signal and the measure of its statistical error. Eventually, it’s the trio of General AI, Data Universality, and blockchain that can transform static data records into an intelligence-enhancing system. This enhanced system will create more opportunities for the users who create data to secure and distribute these personal assets. Instead of lending data to intermediaries like Facebook, Twitter or Google for third-party monetization, it will become a personal asset that can be administered, distributed, and monetized independently. The blockchain shifts power back to the user, allowing us to take control of our data, bundle it, repackage it and benefit from it.

Combining and processing different kinds of data sets requires a framework with an architecture that is dynamic and modular. The framework should also be probabilistic [25] to allow for training and simulations. It must take a mathematical form that lends itself to benchmarking, which is paramount for ranking and comparing solutions. Benchmarking leads to transparency, which leads to transactability, thereby creating a market of intelligence assets. Data Universality is intelligent, as you can only enhance intelligence from what is already intelligence - in this case, the data.

Fig. 1: Unique features of Data Universality


Fig. 2: Data Universality Domains


For the Data Universal process there is no difference between financial and nonfinancial data. There is just one cohesive market covering all regions, all asset classes, and everything data. There is no segmentation. It is one unified data. The blockchain is the missing link that can take the Web from 2.0 to 3.0 followed by 4.0. And it can achieve this with a Data Universal General AI process.

7.    Architecture of Data

“Data in this world is infrastructure: a long-lived asset, general in purpose, capital intensive, and supporting multiple activities. Inference, by contrast, is short-lived, real time, trivially cheap, specific to a problem or task, continuously adapting, and perpetually self-correcting…The asymptote is where sensing, connectivity, and data merge into a single system. Every person and object of interest are connected to every other: the traffic readout on a mobile phone becomes the aggregation of all the data provided by all the mobile devices in the area reading the traffic. The world becomes self-describing and self-interpreting. At its outer limit, the digital map becomes the world itself. The world and our picture of the world are becoming the same thing: an immense, self-referential document. We are living in Borges’ map.” 

Navigating a world of Digital Disruption, BCG

If data had nature in it - in other words, intelligence - and if data indeed became the code (von Neumann) [26], or we, the new society, were Borges’ [27] map, the architecture of data becomes paramount. The structure is more important than the inferences that come out of it. According to Herbert Simon [28], the architecture of complexity [29] is intrinsically simple and hierarchical. The structure is always more important than the content. There is a commonality across various types of natural systems - including market systems. The whole remains more than the sum, suggesting complexity generated by a definable structure.

Fig. 3: Web is a bow tie


The bow tie [30] architecture was first introduced in 2000. It is an architecture at every layer of the web, however you may slice or dice it. There will always be isolated elements, aspiring to associate with the larger structure but failing to do so - a few tendrils and tubes and pages coming in and out of the core.

Blockchain’s success lies in its aspiration to be natural like the web. To be natural we need an architecture that can persist. Though the weather is unpredictable, it is not random - it persists and subsides. Consumption patterns are like the weather: they persist and subside. Even sentiment goes through the same regularity; it persists and subsides. Nature is predictable, even if aspects of it seem totally random. Random systems are systems in which no deterministic relationship exists [31]. Data interdependence is deterministic and, on occasion, non-intuitive. This is why the researcher should not be fooled by non-intuitive relationships. Even if there is an inference challenge, the indistinguishable nature is based on a structure.

This is why the complexity brought in by the exponential increase can neither be resolved by human discretion nor by quantum computing. It needs a constantly evolving General AI which navigates the web, is self-describing and self-interpreting, and is already an expression of nature. It is an intelligent web.

This architecture should have a structure. There should be blocks, modules, components, and layers. It should resemble a building or look like a bow tie, maybe be stacked. The architecture of data, just like its stacked counterpart, is about collaboration, strategic thinking, ubiquitous connectivity, lower computing costs, seeking invisible patterns, software replacing hardware, the single universally accessible document, looking at data as infrastructure, quickly shared breakthroughs, self-correcting inferences, data sharing, etc.

This architecture is built on the assumption that similar challenges and strategies apply to many businesses, and all are required to anticipate not only immediate short-term risks but also the same temporal risks in the more distant future. Domains are not only interconnected; they are interdependent.

The Web 2.0 [32] is going through its data science revolution. It has community data and it starts to understand user behavior through user choices. We have crossed the stage from domain community to domain data.

The new architecture will be about the interaction between data from various domains. The standardization, integration, and interpretation across domains of data will be an integral part of the Web 3.0 [33]. This leads to data universality when data commonalities are explored for an individual (component) and domain (group) benefit, for inferences and to anticipate immediate and intermediate evolution.

Then the Web will be a thriving organism. This architecture could potentially drive Web 4.0, the ultra-smart agent which caters to various user needs. Web 4.0 won’t be simply artificial. It will be an intelligent web that does not distinguish between domains. When the web evolves to this architecture, its structure becomes more important than its content and the value of a General AI is unleashed on a universe of data commonalities.

The historical context of our information age helps us understand why the architecture of data precedes intelligence. With Web 2.0 going through its data science revolution, the focus is on understanding consumer behavior through social actions. There is a long history of interdomain data traversing boundaries. A few hundred years ago, contesting studies debated the effects of sunspots on the timing of economic activity. Now we could have studies contesting whether sentiment around bitcoin, as observed by Google, leads or lags bitcoin prices. [34] Indeed, alternative data has become the new buzzword in quantitative science.

However, the architecture of data does not assume static lead and lag relationships or temporary negative and positive correlations. Instead, it constantly adjusts and adapts to information, keeping the relevant data and ignoring the irrelevant data points. It standardizes and integrates data sources.

The cognitive Web 3.0 will not just be about machines tagging datasets to form a common syntax, which enhances readability and interpretability, but also about data universality, where data commonalities are explored for an individual (component) and domain (group) benefit, for inferences, and to anticipate the immediate and intermediate evolution of the living organism, the World Wide Web. This architecture of data could potentially drive the transition to Web 4.0, the ultra-smart agent which caters to community needs.

Fig. 4: Architecture of Data


8.    Interdisciplinary Intermediaries

Despite the statistical fact that correlation does not imply causation, correlations are used to make important decisions. It was not long ago that the Bangladeshi butter production theory of asset prices was an indicator followed by the world, until it stopped working in 1993 [35]. George Taylor [36] talked about the hemline index and bull markets. William Stanley Jevons [37] studied the relationship of sunspot cycles to economic cycles. There are research reports that show how to forecast using Twitter, and there are reports suggesting that the Dow Jones Industrial Average influences Twitter sentiment. [38]

The list of anomalies is long, starting from calendar-month flavors like the January effect [39], “Sell in May and go away” [40], and the November-to-April Yale Hirsch [41] cycles, or it can be linked with mispricings like the equity premium puzzle [42], the size and value premiums [43], etc. Nobel prize winners have admitted that many of the anomalies are unexplained. We chose to casually look at our data, seeking confirmation across domains without robustness in results and replicability. This is why the industry is suffering from a replicability problem. We can backtest, but we cannot replicate. We chose to work with naive hypotheses without seeking more reliable solutions. The research today is based on heuristics, which have no place in the new world of blockchain and AI.

We don’t just need to solve the alpha problem; we also need to extend our models to solve bigger predictive challenges like weather and societal risk at large. The blockchain does not need to postpone the interdisciplinary interaction of its data to a later stage.

This interdomain knowledge can become the first step for the blockchain. Data Universality brings the inter-industry intermediaries upfront by looking at the idea of intrinsic value in data irrespective of its domain. Securitization allows a community to build not only predictive services for different segments and domains, but also convert them into measurable performances that can be transacted as assets.

The real science happens between disciplines. Blockchain has a choice to either be shepherded by legacy world biases or to design interdisciplinary solutions today. This would mean creating intermediaries that seek the intrinsic value of data irrespective of its domain. Imagine search engines delivering robo-advisory, weather bots forecasting sentiment trends, and fashion trends dictating auto sales!

This interdisciplinary architecture is possible because of Data Universality. The blockchain is at a crossroads. It can either stay a ledger, or it can datafy itself, bundle interdisciplinary data, securitize those data sets as new financial assets, offer a mechanism to exchange these assets, and eventually become the incumbent - the new market mechanism.

Fig. 5: Blockchain Evolution into a Data Universal Ecosystem


9.    Predictive Transaction

Fig. 6: Predictive Transaction


Prediction in its essence is a transaction with value. As the significance of the prediction increases, the transaction becomes more valuable. Society has not defined prediction in its most generic form as a systematic, scientific and replicable (SSR) [44] process which can anticipate. This is why a transaction assumes the core value of all our market systems. The moment we define prediction in its generality and start building on its significance by testing, validating, enhancing, we create an ecosystem where a predictive transaction subsumes all transactions.

The value of a non-predictive transaction is static. It rises or falls based on the demand and supply pressure. In contrast, the value of a predictive transaction is dynamic in nature. This dissipates the demand and supply influences, forcing the predictive assets to adapt - aligning and realigning constantly along a real metric.

Alice drinks coffee and pays for it = transaction - (iv)

Let’s take an example of Alice drinking coffee at John’s cafe and paying with bitcoin. This is a simple transaction which is validated by the blockchain, throwing wasteful energy at a transaction that has no predictive value. The transaction could instead be framed as an SSR process that predicts whether Alice is going to drink coffee on a certain day.

Such an SSR prediction would be based on the purchase habits of Alice, her demographic profile, her social media habits, and information that she has shared in an open data set. Or perhaps the bot is just using proxy data of people paying for coffee in bitcoin. This binary addition (1: Alice drinks coffee; 0: she doesn’t) adds an anticipatory variable to the transaction.

Will Alice drink coffee today? = Predictive transaction [1,0] - (v)

The model is an SSR process which is verifiable, systematic, scientific, and replicable. It could also be transparent (choosing to show the algorithmic process) or non-transparent. In both cases, it’s a non-manipulatable transaction embedded in the smart contract. And if the transaction cannot be tampered with, a simple addition of a prediction [1,0] factor makes the need for validation of each transaction redundant. The non-manipulatable predictive model embedded in the transaction transforms a static low-value record into something of value. And what is of value can either enhance in value or fall in value. The transaction becomes a meaningful, non-manipulatable metric. A predictive transaction, in essence, challenges societal herding, which is at the heart of risk in social systems. It also replaces storytelling, a predominant process currently used by society to sponsor non-SSR processes, and the feedback loop wired into today’s transaction of value.

In the wake of this transformational shift, storytelling will be forced to align itself around the dynamic value of a predictive transaction. Instead of a narrative defining the value, it will be the value defining the story. And since the predictive value is open to disruption and challenge, it will come at the cost of enhancing the SSR process, i.e. the accuracy of prediction must rise above 50% for the model to anticipate Alice’s coffee-drinking days.
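
The Alice example in (v) can be sketched as a small, replicable rule. The sketch below (Python, illustrative only) assumes a toy weekday-frequency model over a fabricated purchase history; the data, the weekday rule, and the 50% threshold are illustrative assumptions, not a prescribed model:

# Toy SSR-style predictor for the Alice example: a replicable rule that
# outputs 1 (Alice drinks coffee today) or 0, from her purchase history.
# History, weekday rule, and threshold are hypothetical assumptions.

from collections import defaultdict

def fit_weekday_rates(history):
    """history: list of (weekday, bought_coffee) pairs, weekday in 0..6."""
    counts, buys = defaultdict(int), defaultdict(int)
    for weekday, bought in history:
        counts[weekday] += 1
        buys[weekday] += int(bought)
    return {d: buys[d] / counts[d] for d in counts}

def predictive_transaction(rates, weekday, threshold=0.5):
    # returns the binary anticipatory variable added to the transaction
    return 1 if rates.get(weekday, 0.0) > threshold else 0

history = [(0, True), (0, True), (1, False), (2, True), (3, False), (4, True)]
rates = fit_weekday_rates(history)
print(predictive_transaction(rates, weekday=0))   # 1: predict Alice buys coffee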

Just like the legacy world, the blockchain is transactional in nature. In its current form, the blockchain cannot become an alternative market mechanism which resolves the three hurdles described earlier. Replacing the transaction with a predictive transaction transforms the blockchain into a predictive ecosystem. This new self-validating ecosystem induces a rethink of the current nature of consensus and of the need for excessive energy-centric computational mining. Building the blockchain around a predictive transaction also creates a level playing field for all human innovation.

This is the SSR process. Being generic in nature, the predictive transaction will also involve a standardized set of metrics, benchmark-ability, and re-usability across domains, in addition to a predictive purpose. The predictive purpose involves defining a predictive problem and addressing it with an SSR process. Predictive problems are everywhere and do not necessarily involve capital markets. An SSR process which extends beyond a financial context to market sentiment, electricity, weather, etc. makes for a robust predictive transaction which is of real value - a critical tenet for a novel market mechanism.

The key characteristics of a predictive transaction (tp) are validation, dynamic value, and duration. A tp can be validated at any time, and its longevity is defined by two periods, i and j, where i could be the inception point for the predictive transaction or simply a validation period. The tp is a fixed-duration (d) agreement between the generator of the predictive process and the agent validating it. As long as the validation date (j) is less than the expiration date (d), the tp remains alive. On the duration date (d), the tp can expire.

Expiration is an essential element as market mechanisms should have a way of expunging non-meaningful predictive processes. Once the validation period expires there is no need to validate, no need to waste resources on dead transactions.
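
The lifecycle described above can be sketched as a small data structure (Python, illustrative; the field names i, j, and d follow the text, while the value-update rule and thresholds are assumptions, not a specification):

# Sketch of a predictive transaction tp with the three characteristics
# named above: validation, dynamic value, and a fixed duration d.
# The value-update rule is a placeholder, not a defined protocol.

from dataclasses import dataclass, field

@dataclass
class PredictiveTransaction:
    inception: int          # i: inception point or first validation period
    duration: int           # d: agreed fixed duration
    value: float = 1.0      # dynamic value, revised at each validation
    validations: list = field(default_factory=list)

    def alive(self, j: int) -> bool:
        # tp stays alive while the validation date j is before expiry
        return j < self.inception + self.duration

    def validate(self, j: int, prediction: int, outcome: int) -> bool:
        if not self.alive(j):
            return False                           # expired: no resources wasted
        hit = prediction == outcome
        self.validations.append((j, hit))
        self.value *= 1.05 if hit else 0.95        # placeholder value dynamics
        return True

tp = PredictiveTransaction(inception=0, duration=30)
tp.validate(j=7, prediction=1, outcome=1)
print(tp.value, tp.alive(j=31))   # value updated; expired after d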

10.    Securitization

The fragmentation within financial markets and the distinction between financial and non-financial markets should not be seen as a consequence of our research systems. Rather, it should be seen as a problem arising from a lack of infrastructure. Now that we have the infrastructure, it’s time we resolve the bottlenecks linked to the segregation imposed by society on markets. Moreover, with financial markets suffering from a lack of accountability and replete with the sponsorship of non-SSR stories, removing boundaries is the only way ahead for a single market. It establishes a level playing field which allows for comparison between SSR processes and enables a domain-agnostic accountability. The truth of alpha is brought to bear!

Once the predictive transaction replaces every transaction inside the blockchain, the next step is for the blockchain to securitize its assets. Securitization creates liquidity, allowing for pooling and transfer of risk and assets. Securitization of the blockchain assets also makes the distinction between financial and non-financial assets redundant as everything gets a data universal distinction.

Well-functioning markets need transactable vehicles which can embed value and allow for exchange. The blockchain has only one vehicle today: the cryptocurrency, which suffers from a legacy world fate. The other blockchain asset, the smart contract, is not precisely an asset - the transaction value it carries is a mere record of data. To put energy-centric computational work behind such a smart contract is like defending something which has little value. Smart is a misnomer in this case.

Fig. 7: Dynamic Value of a Smart Contract


However, a smart contract based on a tp transforms into an asset as it acquires all the characteristics of the tp: the metrics, the standardization, the dynamic value. The tp also addresses the asset’s constant need for validation because it can be revalidated and reused. A smart contract on a tp creates an alternative asset class that allows the blockchain community to participate in a domain-agnostic, generic predictive process, breaking barriers between all asset classes and all kinds of markets. The tp brings out the embedded intrinsic value in the smart contract and opens up new opportunities to price it. It strengthens the General AI process as new asset pricing models drive the markets away from conventional causality.

The tp smart contract gives the blockchain the capability to create assets, securitize smart contracts, and work as a true financing mechanism which has universal usage and predictive purpose. Such a blockchain will allow participants and incumbents to reduce their reliance on legacy world market mechanisms.

Since the tp has a large scope, deriving its strength from a broad predictive purpose, the specifications of the tp add structure to that scope. This makes the tp amenable as a business service, and hence it can be articulated in a smart contract. Any kind of predictive transaction can be embedded into a smart contract which changes in value and hence can be securitized.

11.     Asset Backed Validation

Fig. 8: Assets backed Smart Contract


The primary tp validation is followed by a dynamic value, which transforms the smart contract and facilitates the securitization process on the blockchain. The smart contract value (price) is disseminated on the network. The alpha metrics gather interest and attention from assets seeking investment in the securitized smart contracts. Since assets have a tendency to cluster around other assets of increasing value, the smart contract validation consequently attracts financial assets backing smart contract assets. Different cryptocurrencies and different preferences will see smart contracts attracting assets.

Regulation and settlement reside in the legacy world. Blockchain enables the legacy world with a validation of the SSR (Systematic, Scientific and Replicable) process. Alpha generators face a lot of resistance proving their process. A blockchain predictive transaction validated in a secure environment offers an alternative approach to validating processes on an ongoing basis. There is no standardized approach for the validation of a predictive process in capital markets. And we are speaking not just about financial market SSR processes but also about processes that are non-financial and simply predictive in nature.

Critical mass is a problem for every industry. In the financial investment management industry it’s a bigger problem, as the fees are low and the breakeven capital requirement is high. Blockchain-based asset validation can assist legacy investment managers not only by validating their SSR process but also by backing them with crypto assets. Consequently, the sales lifecycle connected to bringing more assets under management is shortened.

12.     Proof of Alpha

In an intelligent market system there is no proof of work, there is only the proof of alpha - whether the transaction is solving a predictive problem, whether it is systematic and scientific and whether it is replicable. Such a proof is essential because there is no place for subjectivity on AlphaBlock. Only SSR processes can be disseminated, validated and distributed on the AlphaBlock.

Stage 1: SSR has a clear use case in financial markets. Any financial model has to go through an SSR validation process. It has to demonstrate that it is scientific, systematic and replicable. If it can’t demonstrate that, it fails the validation test and can’t be disseminated on the AlphaBlock platform.

Stage 2: In this stage, the SSR process goes through a rigorous validation test where the process demonstrates a historical backtest for a pre-specified period, e.g. 20 years.

Stage 3: In this stage, the SSR process is listed when it is considered live for dissemination and potential backing of crypto or fiat currency.
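
The three stages can be read as a simple validation pipeline. The sketch below is one possible encoding (Python, illustrative; the pass/fail predicates, the 20-year window, and the function names are assumptions consistent with the stages above, not a defined protocol):

# Illustrative three-stage Proof of Alpha pipeline. Each stage gate is a
# placeholder predicate; the thresholds are assumptions for illustration.

def stage1_ssr_declaration(process) -> bool:
    # Stage 1: the process must be systematic, scientific, and replicable
    return all(process.get(k, False) for k in ("systematic", "scientific", "replicable"))

def stage2_backtest(process, years_required=20) -> bool:
    # Stage 2: demonstrate a historical backtest over a pre-specified period
    return process.get("backtest_years", 0) >= years_required

def stage3_listing(process) -> bool:
    # Stage 3: go live for dissemination and potential crypto/fiat backing
    return process.get("live", False)

def proof_of_alpha(process) -> bool:
    return stage1_ssr_declaration(process) and stage2_backtest(process) and stage3_listing(process)

candidate = {"systematic": True, "scientific": True, "replicable": True,
             "backtest_years": 22, "live": True}
print(proof_of_alpha(candidate))   # True: eligible for the AlphaBlock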

An SSR process in the financial investment management industry has a high likelihood of working across global equities and even other assets like fixed income, commodities, currencies, and alternatives, and hence qualifies as a General AI process which does not depend on causality, themes, factors, etc.

Despite two Nobel prizes [45] awarded on the topic, the financial investment management industry is confused about what creates alpha. The alpha problem is why models don’t transcend asset classes and are disproportionately skewed towards equity rather than across asset classes. Most solutions are simply for a rising market.

Historically there have been limited alpha innovations, but more innovation at the vehicle, instrument, and market structure level. Moreover, it’s only now that the legacy world is seeking quantitative systematic solutions for alpha. This is why simply extending legacy world tools to the blockchain will not solve the alpha problem. The proof of alpha will involve a framework for testing various algorithmic scenarios.

A simple example of proof of alpha for investment management solutions is a framework which defines any portfolio by a universe data set to select from, weights to allocate to these selections (components of the data set), and an algorithmically specified rebalancing schedule. As the complexity increases, the portfolio can be enhanced by preset filters and conditions. This General AI framework should pass through all preset stages.

13.     AlphaBots

Fig. 9: AlphaBots compete and collaborate on the AlphaBlock.


Portfolio Value = function (Data, Weight, Rebalancing) - (vi)

Portfolio Value = function (Data, Weight, Rebalancing, Filters) - (vii)
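
Equations (vi) and (vii) can be sketched directly. The rebalancing and filter logic below is a toy illustration of the framework’s shape (universe, weights, rebalancing, filters), with fabricated price paths and an arbitrary filter; it is not a working alpha strategy:

# Toy rendering of Portfolio Value = f(Data, Weight, Rebalancing, Filters).
# Prices, weights, schedule, and the filter are illustrative assumptions.

def portfolio_value(data, weights, rebalance_every, filters=None, capital=100.0):
    """data: {asset: [price path]}; weights: {asset: target weight}."""
    assets = [a for a in weights if filters is None or filters(a, data[a])]
    total_w = sum(weights[a] for a in assets) or 1.0
    holdings = {a: capital * weights[a] / total_w / data[a][0] for a in assets}
    n = len(next(iter(data.values())))
    for t in range(1, n):
        value = sum(holdings[a] * data[a][t] for a in assets)
        if t % rebalance_every == 0:                  # rebalancing schedule
            holdings = {a: value * weights[a] / total_w / data[a][t] for a in assets}
    return sum(holdings[a] * data[a][-1] for a in assets)

prices = {"X": [10, 11, 12, 11, 13], "Y": [20, 19, 21, 22, 23]}
print(portfolio_value(prices, {"X": 0.5, "Y": 0.5}, rebalance_every=2))      # (vi)
print(portfolio_value(prices, {"X": 0.5, "Y": 0.5}, rebalance_every=2,
                      filters=lambda a, p: p[-1] / p[0] > 1.2))              # (vii)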

Proof of alpha is domain agnostic and can address SSR processes outside financial markets. The dynamic SSR value driving the story is technology calling the bluff of the social system models that have caused confusion and a sequence of crises. The legacy system must articulate a coherent predictive process that is redefined as an SSR process.

Transaction validation is backed by consensus on the blockchain. When the transaction validates itself, inefficient mining becomes an alpha problem. Mining is a ritual conceived for a transaction network and has reached more than a few bottlenecks regarding mining costs and purposefulness. Only a fraction of mining is used for block creation today. The transactions are data structures that have no dynamic value and lack of dynamism means lack of life in the AlphaBlock context.

The predictive transaction tp lays down a new architecture where trust clusters around value - a predictive transaction which is real, standalone, verifiable, and hence more efficient. There will be many winners and many losers in the process, but in the end the purpose is to make better market mechanisms - not a cryptocurrency speculative bubble where owners perceive the value but have limited and destabilizing mechanisms to anticipate that value.

The future of blockchain will be a competing marketplace for AlphaBots - i.e., robots which engage in predictive SSR processes. Bots could be domain-specific or engage in interdomain predictive solutions. AlphaBots create benchmarkable performance - objective and verifiable - that is validated by validation bots, with the alpha assets distributed by the distribution bots. The distribution bots are the accountants for the assets backing the AlphaBot performances. A legacy world problem is now entering the crypto funds space, where some experts assume the role of fund managers, confusing intelligence with information content. Unlike human discretionary processes, which are connected to a person (a guru), AlphaBots are backed by innovating corporates or individuals. There is a need to secure such alpha processes and to have a reward mechanism for the multiple participants engaging in the market mechanism.

An AlphaBot is any SSR process that is open to validation on the AlphaBlock. The process can come from alpha agents like prediction markets or data scientists, or simply from machines open to competing on the AlphaBlock. Competition is important for the accountability of an SSR process. Without validation accountability, any alpha process is simply a subjective claim of alpha without objective proof.

For example, prediction markets may or may not be systematic processes, and when there are a few of them out there, there needs to be an objective, risk-weighted measurement of their performance. The alpha-generating industry is not obliged to standardize and measure risk.

According to a 2009 publication by Harvard Law [46], Prediction Markets and Law, “Uncertainty is a painful part of reality and hence the performance of prediction markets is witnessed to be inversely correlated with how valuable their predictions would be. Idiosyncratic action and predictability of prediction markets create systematic biases in all information markets. Prediction markets don’t tell more than what participants can figure out themselves considering the underlying materials.”

Surprisingly, this research was published well before prediction markets started hitting ICOs [47]. The core notion of the wisdom of crowds not being subsumed by the madness of masses is why prediction markets, and every alpha process, must go through validation scrutiny. Human ability is overrated, and both the history of performance and behavioral finance have shown that humans are overconfident about their abilities [48]. Fund managers, investment clubs, and individual investors fail as a group to predict the markets. This is why prediction markets and their experiment with the wisdom of crowds are destined to fail.

AlphaBots can also be seen as corporates with a clear purpose of alpha generation, taking society forward through objectivity, away from the road of unverifiable claims and gatekeepers. An ecosystem focused on consistent innovation and purposeful computation, and offering a better risk transfer mechanism, understands how to stabilize systems without the intervention of a central bank. A system composed of smart agents which appreciate and understand complexity will flourish. There will be no more references to it as the invisible hand.

14.     Complex Network

For every 100 people who know about blockchain, 10 may have heard of Nick Szabo, and of those 10, maybe 1 understands the flawed Directed Acyclic Graph (DAG) [49] architecture of the blockchain. I don’t want to speculate how many of that one understand the relationship of preferential attachment [50] with complex networks.

The key mechanism for blockchain technology to move beyond its distributed ledger status is the dynamic nature of the smart contract. When the smart contract becomes a dynamic predictive transaction based asset, it becomes reusable and can be combined with other smart contracts to become a new asset class. It also is forced to compete with other smart contracts to stay relevant and grow or simply decay.

Fig. 10: Smart Contract Reusability.


P(k) = B(k, γ) / B(k₀, γ − 1) - (viii)

Here B denotes the Beta function, k the node degree, k₀ the minimum degree, and γ the power-law exponent of the preferential-attachment degree distribution.

Fig. 11: Directed Acyclic Graph




We must transform blockchain into a robust complex network before it can function as a vibrant market mechanism. Complex networks and complexity are about power-law behavior. Physicists explain the power law through the mathematics of preferential attachment, which is an incomplete science because it does not explain the second-mover advantage, where companies like Google take over first movers like Yahoo. This simultaneous growth and decay is an essential behavior of a complex network. Moreover, the blockchain’s transactional sequence was not designed for feedback. This is a consequence of its Directed Acyclic Graph (DAG) architecture, which adds to the blockchain’s incapability to behave like a vibrant market mechanism.
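
A small simulation makes the preferential-attachment claim concrete. The sketch below (pure Python, no external libraries, illustrative parameters) grows a network by degree-proportional attachment and prints the heavy-tailed degree counts behind the power-law form in (viii):

# Minimal preferential-attachment growth model (Barabasi-Albert style).
# Each new node attaches to one existing node chosen with probability
# proportional to its degree; node count and seed are illustrative.

import random
from collections import Counter

def grow_network(n_nodes=2000, seed=7):
    random.seed(seed)
    degrees = [1, 1]          # two seed nodes joined by one edge
    endpoints = [0, 1]        # list of edge endpoints: sampling from it is
                              # equivalent to degree-proportional selection
    for new in range(2, n_nodes):
        target = random.choice(endpoints)
        degrees.append(1)
        degrees[target] += 1
        endpoints.extend([new, target])
    return degrees

degrees = grow_network()
hist = Counter(degrees)
for k in sorted(hist)[:8]:
    print(k, hist[k])         # counts fall off roughly as a power law in k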

15.     AlphaBlock

“Change only comes when it is dramatic” Hammer and Champy (1993)

Society is information dependent - even more so now that information has become datafiable and AI has started extracting knowledge from the data. When knowledge is enhanced with anticipation, it becomes intelligence. Hence the most likely candidate to become intelligent is the web, as it is not only a store of data but is organic and allows for the interaction of data, leading to new knowledge and layers of increasing intelligence.

AlphaBlock facilitates the transformation of a transaction data record into a store of intrinsic intelligence which is inexhaustible and hence mineable. AlphaBlock is a unit of intelligence, which starts with building predictive solutions and leads to an intelligent agent that can perform multiple human intelligence tasks, like managing your pension fund. It’s essential to build such an alpha process as a market mechanism because, just like price discovery, alpha enhancement needs filters and screening. The components of such a data universal structure have to compete for assets, be they community resources or transaction value. Meritocracy without legacy world challenges can only happen through a new market mechanism where domain-specific solutions are exchanged and re-exchanged, seeking broader intelligence and driving society ahead.

Fig. 12: AlphaBlock’s Smart Contract


AlphaBlock first participates by creating the cognitive web where machines read data, interpret it for intelligence, and mine it for tradable value. In the second stage, AlphaBlock allows for exchanging that value. In the third stage, the process becomes virtuous as repeated cycles of knowledge are extracted from information as the web gets datafied, creating the ultra smart agent on the web 4.0.

AlphaBlock also assists in the decentralized storage of data, an alpha problem. It serves as the microchip for the web: intelligence replacing the smart. Society’s quest begins at alpha. We should get ready to embark on the journey that is a must for evolution.

AlphaBlock’s smart contract redesign, based on predictive transaction logic and general pricing, offers an out-of-the-box approach to transforming the blockchain into a complex, vibrant network. The smart contract becomes the building block which dynamically grows and decays as the crypto and fiat assets backing it increase and decrease. This redesign of the smart contract also removes the need for successive validation, reducing computational requirements to a fraction. The new architecture has the utility of the predictive transaction and the robustness of a complex network, and it connects seamlessly to the world wide web, transforming the combination into a singular total market.
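
A minimal sketch of this idea follows; the class and field names (PredictiveContract, backing, decay_rate) are hypothetical paraphrases of the concept, not an existing AlphaBlock API. It models a smart contract as an asset-backed unit that grows with inflows, decays when neglected, and can be combined with another contract into a new composite asset:

from dataclasses import dataclass

@dataclass
class PredictiveContract:
    contract_id: str
    backing: float = 0.0      # crypto and fiat assets currently backing the contract
    decay_rate: float = 0.05  # relevance lost per period when no new assets arrive

    def rebalance(self, net_flow: float) -> float:
        """Apply one period of growth (inflows) or decay (outflows or neglect)."""
        self.backing = max(0.0, self.backing + net_flow)
        if net_flow <= 0:
            self.backing *= (1 - self.decay_rate)      # unbacked contracts decay
        return self.backing

    def combine(self, other: "PredictiveContract") -> "PredictiveContract":
        """Reusability: two contracts pooled into a new composite asset."""
        return PredictiveContract(
            contract_id=f"{self.contract_id}+{other.contract_id}",
            backing=self.backing + other.backing,
        )

a = PredictiveContract("alphabot-1", backing=100.0)
b = PredictiveContract("alphabot-2", backing=40.0)
a.rebalance(10.0)                  # grows because it attracts new asset backing
b.rebalance(0.0)                   # decays because it attracts nothing
print(a.combine(b).backing)        # a reusable composite of the two contracts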

Fig. 14: Smart Contract-based transformation of the blockchain into an integrated alternative market mechanism on the world wide web.

16.     Currency Agnostic

A market mechanism needs a currency, or a mechanism for exchange. With the majority of ICO tokens losing value after launch as the promise of delivery falls apart, it is essential to understand the risk in crypto assets. The lack of SSR (Systematic, Scientific and Replicable) processes and the discretionary claims of performance are what create the risk in the first place. Investors, speculators, and arbitrageurs must understand this risk and be aware that they are seeking value, not an asset per se. Behavioral finance has cited poor portfolio diversification as one of the key sources of risk.

The future of blockchain currency depends on the predictive model, like everything else. The players have a choice: use a discretionary, subjective model or an SSR (Systematic, Scientific and Replicable) process. Just like legacy-world assets, crypto assets can be enhanced with a general pricing model and by competing, validated SSR processes on the AlphaBlock. If the pricing model can solve the alpha problem in finance, it can also create a portfolio of cryptocurrencies that is more stable than any other combination of its fiat and crypto peers.
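
As a sketch of what "selection, weights, and rebalancing" means for a crypto basket (the tickers and returns below are hypothetical, and this is not the author's pricing model), compare a periodically rebalanced, diversified portfolio with single-coin hoarding:

import numpy as np

universe = ["BTC", "ETH", "XRP", "LTC"]     # selection: the investable universe
period_returns = np.array([                 # hypothetical per-period returns
    [0.30,  0.10,  0.05,  0.08],
    [-0.40, -0.10, -0.20, -0.15],
    [0.25,  0.05,  0.10,  0.07],
])

def rebalanced_growth(weights, returns):
    """Growth of 1 unit when target weights are restored each period."""
    return float(np.prod(1 + returns @ weights))

equal_weight = np.full(len(universe), 1 / len(universe))
btc_only = np.array([1.0, 0.0, 0.0, 0.0])   # single-asset "hoarding"
print(rebalanced_growth(equal_weight, period_returns))  # diversified, rebalanced basket
print(rebalanced_growth(btc_only, period_returns))      # concentrated position

With these illustrative numbers the rebalanced basket preserves more value through the drawdown than the concentrated position, which is the stability claim above expressed in portfolio terms.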

Active managers in the legacy world have witnessed fee compression because non-scientific but passive, systematic, and replicable methodologies (such as the S&P 500) have started competing with mutual funds and other active managers. Investors can now measure performance net of fees, leading assets to move into low-cost passive methodologies. The same trend will happen in crypto assets when AlphaBlock offers zero-fee, validated crypto portfolios to the general public. Such transparent SSR processes can only be offered through a market mechanism that is currency agnostic and driven by value retention rather than asset hoarding.
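
The net-of-fees arithmetic behind this shift is simple compounding; the figures below are purely illustrative assumptions, not measured performance:

def terminal_value(gross_return, fee, years, start=1.0):
    """Value of `start` compounded at a gross return minus an annual fee."""
    return start * (1 + gross_return - fee) ** years

gross = 0.07                              # assumed annual gross return
print(terminal_value(gross, 0.02, 20))    # ~2.65x after a 2% active management fee
print(terminal_value(gross, 0.00, 20))    # ~3.87x with a zero-fee validated process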

17.     Conclusion

Data is not considered an intrinsic asset, so financial markets use a multitude of asset-specific pricing models. A generic pricing model would require systems thinking, in which every portfolio (financial or non-financial) is treated as a selection-from-a-Universe, weights, and rebalancing problem. Blockchain suffers from this lack of generality because Nick Szabo’s Smart Contract pricing function does not offer a pricing solution.

Since asset creation requires standardized metrics that can evoke community consensus to invest and exchange, the absence of such general pricing renders the blockchain unscalable, straitjacketed, and incapable of creating investible assets. This makes alternative cryptocurrency assets similar to investment-management assets, which do not use a generic pricing mechanism across assets.

The lack of generic transaction design is a common problem for both the legacy world and the blockchain. The financial industry suffers from negative alpha, limited product differentiation, and near-zero fees as assets herd after existing assets. Above this, the belief that low-frequency investing is driven by manager skill which cannot be automated keeps the markets inefficient, prone to bubbles and busts, and lost in a circular argument about whether one should invest in a market-capitalization-weighted benchmark or a factor-weighted portfolio. The same problem also exists in the blockchain: cryptocurrencies are not asset-backed and have limited relative performance metrics between them, and their limited supply and momentum-based demand keep stretching prices into a bubble.

A good transaction design drives alpha; a poor design pushes alpha into a post-facto parameter rather than a pre-asset-allocation validation process. Markets today lack a mechanism that can evaluate Systematic, Scientific and Replicable (SSR) processes, and even if such a mechanism existed, the system would have to be inverted so that assets follow SSR alpha processes rather than herd after other assets.

Because the blockchain community lacks interdisciplinary experience across finance, science, and technology, it unquestioningly embraces the legacy world’s domain-specific pricing models and hence carries the existing alpha problems forward to the blockchain network. The community needs to focus on scientific ideas like preferential attachment and the power law, and to understand how the blockchain’s Directed Acyclic Graph (DAG) architecture is one of the core reasons for its inability to scale as a complex network and hence to exhibit the natural growth and decay of its assets. The current system works like a pressure cooker without a steam whistle; the burst could impair the incipient blockchain infrastructure.

The blockchain needs to be made interdependent with the cryptocurrency, not dependent on it. The distinction between what is financial and what is not has to go. A generic pricing model can price the data that sits inside the Smart Contract, transforming the blockchain’s transactional vehicle into an investible asset of interest. A good transaction design allows for validation and hence motivation for asset backing. This inverts the existing asset-herding model and converts smart contracts into AlphaBots that compete on AlphaBlock - an ecosystem which enables participants to offer validation, distribution, and services around SSR alpha processes. Such a system allows the blockchain to become a vibrant network which can grow, decay, and whistle at the same time.

An asset- and domain-agnostic general pricing framework allows us to hack the blockchain architecture by pricing, and hence converting, Smart Contracts into unique assets. This not only blurs the difference between financial and non-financial assets but also transforms the blockchain into a complex network, and hence into a robust, single intelligent market mechanism, one that does not grow only at its frontiers but evolves into a vibrant entity like the web: the Intelligent Web.

References

[1] Mischel, W; Ebbesen, E. B. "Attention in delay of gratification". Journal of Personality and Social Psychology. 16 (2): 329–337. 1970.
[2] Asch, S. E. "Studies in the principles of judgments and attitudes: II. Determination of judgments by group and by ego-standards". Journal of Social Psychology, 1940.
[3] Milgram, S. "Behavioral Study of Obedience", Journal of Abnormal and Social Psychology, 1963.
[4] Shiller, R. "Irrational Exuberance", ISBN 978-0691050621, 2000.
[5] Bossaerts, P. Kleiman, D. Plott, C. R., “Price Discovery in Financial Markets: The Case of the CAPM”. Social Science Working Paper No. 1032, SSRN, 2000.
[6] Shiller R. “Economics and the human instinct for storytelling”, Chicago Booth Review, May 2017.
[7] Prechter, R. R. Parker, W D. “The Financial/Economic Dichotomy in Social Behavioral Dynamics: The Socionomic Perspective”, Journal of Behavioral Finance, Vol. 8, No. 2, pp. 84-108, 2007.
[8] Szabo, N. “Micropayments and Mental Transaction Costs”, Nakamotoinstitute, 1999.
[9] Osborne, M. J. Rubinstein, A. "Chapter 11: Extensive Games with Imperfect Information". A Course in Game Theory. Cambridge M.A.: The MIT Press, 1994.
[10] Stiglitz, J. E. “Information and the change in the paradigm in economics”, Nobel Prize Lecture, 2001.
[11] Szabo. N. “Smart contracts reduce mental transaction costs”, 2006.
[12] Parunak et al. “Universality in Multi-Agent Systems”, 2004.
[13] Pal, M. "Human AI", SSRN, 2017.
[14] Cazalet, Z. Roncalli, T. "Facts and Fantasies About Factor Investing", 2014.
[15] SPIVA Statistics & Reports, SPIndices
[16] Montier, J. "CAPM is CRAP (or, the Dead Parrot Lives!)", Behavioural Investing: A Practitioner's Guide to Applying Behavioural Finance, John Wiley & Sons Ltd, Oxford, UK, 2007.
[17] “1,500 scientists lift the lid on reproducibility”, Nature, 2016.
[18] Shefrin, H. “Beyond Greed and Fear”, Oxford University Press, 2007.
[19] Journal of Portfolio Management, Quantitative Equity Strategies, Special Issue, 2017.
[20] “The Silent Road to Serfdom: Why Passive Investing is Worse Than Marxism”, Bernstein, 2016.
[21] “Complete Elliott Wave Writings of A. Hamilton Bolton”, ISBN 0932750222. 1994.
[22] "Key dates and milestones in the S&P 500's history", Reuters, May 2013.
[23] Pal, M. “Data Universality”, Princeton UChicago Quant Conference, 2014.
[24] “Blind Men and the elephant”, Verse in Rigveda, 1500 BC.
[25] A. A. Markov. "Spreading the law of large numbers by quantities that depend on each other." "Izvestiya of the Physico-Mathematical Society at the Kazan University", Volume 15, art. 135-156, 1906.
[26] Neumann, V. J. Burks, A.W. “Theory of Self-Reproducing Automata”. Urbana and London: University of Illinois Press. ISBN 0-598-37798-0. 1966.
[27] Borges, J. L. “On Exactitude in Science”. 1946.
[28] Simon, H. A. "Rational Decision-Making in Business Organizations", Nobel Prize Lecture, 1978.
[29] Simon. H.A. "The Architecture of Complexity”, American Philosophical Society. 106 (6): 467–482, 1962.
[30] Donato, D. et al. "Mining the Inner Structure of the Web Graph", Journal of Physics A: Mathematical and Theoretical, 2005.
[31] Bak. P. “How Nature Works: The Science of Self-Organized Criticality”, Copernicus, New York, U.S. 1996.
[32] DiNucci, D. "Fragmented Future", Print, 53 (4): 32, 1999.
[33] Berners-Lee, T. Hendler, J. Lassila, O. "The Semantic Web", Scientific American, 2001.
[34] Osterrieder et al. “Bitcoin and Cryptocurrencies - Not for the Faint-Hearted”. Advanced Risk & Portfolio Management Paper. SSRN. 2016.
[35] Leinweber, D. J. "Stupid Data Miner Tricks: Overfitting the S&P 500", 2007.
[36] Taylor, G. "Hemline Index Theory", 1926.
[37] Jevons, W. S. "The Principles of Economics", 1905.
[38] Bollen, J. Mao, H. Zeng, X. "Twitter mood predicts the stock market", 2010.
[39] Keim, D. B. “Size-Related Anomalies and Stock Return Seasonality: Further Empirical Evidence”, Journal of Financial Economics 12. 1983.
[40] Maberly, E. D. Pierce, R. M. "Stock Market Efficiency Withstands another Challenge: Solving the 'Sell in May/Buy after Halloween' Puzzle", 2004.
[41] Hirsch, Y. “Don’t Sell Stocks on Monday”, ISBN-10: 0140103759. 1987.
[42] Mehra, R. Prescott, E. C."The Equity Premium: A Puzzle" (PDF). Journal of Monetary Economics. 15 (2): 145–161. 1985.
[43] Fama, E. F. French, K. R. "The Cross-Section of Expected Stock Returns". The Journal of Finance. 47 (2): 427. 1992.
[44] Systematic, Scientific and Replicable (SSR) - A term coined by the author.
[45] (a) Markowitz, H. Miller M. Sharpe, W. Nobel Prize 1990. (b) Eugene, F. Shiller, R. Nobel Prize 2013.
[46] “Prediction Markets and Law: A Skeptical Account.” Harvard Law Review, vol. 122, no. 4, pp. 1217–1238. JSTOR. 2009.
[47] Percy, V. “Initial Coin Offering (ICO) Risk, Value and Cost in Blockchain Trustless Crypto Markets” SSRN. 2017.
[48] Ricciardi, V. Simon, H. K. “What is Behavioral Finance?”, Business, Education & Technology Journal, Vol. 2, No. 2, pp. 1-9, Fall 2000.
[49] Reid, F. Harrigan, M. "An Analysis of Anonymity in the Bitcoin System", arXiv:1107.4524v2, 2012.
[50] Simkin, M. V. Roychowdhury, V. P. "Re-inventing Willis", 2006.