Over the post-event dinner at the Princeton – UChicago quant conference, the conversation veered to the definition of an intelligent system. My co-speaker, a physicist, defined an intelligent system as one that reduces entropy.

Science and Markets

Connecting entropy with intelligent systems was brilliant, not only because we need a scientific approach to building and understanding such systems, but also because entropy is about change, a constant fixture of complex systems, and markets are one of them.

Characteristics

Intelligent systems (IS) are self-aware and logical; they have memory and are able to perceive, retain knowledge (information), and create models based on implicit or explicit instructions. IS can comprehend complex ideas, learn quickly, learn from experience, make sense of things, adapt, reason, recognize patterns, and apply general laws efficiently.

Natural Systems

Natural systems (NS) are conserving and self-replicating. Maybe they are intelligent because they understand the limits of anticipation: the limits of anticipating uncertainty and of predicting fluctuations, echoing the question Werner Heisenberg reportedly wanted to ask God, “Why turbulence?”

The second law of thermodynamics (the change in entropy is positive) and the creation of life (the failure of the Boltzmann brain argument) are also proof that NS can create and flourish under entropy flashes (expansion and reduction).

It does not seem that NS are designed to reduce entropy but to conserve until they reach the entropy flash, a kind of entropy window, when the transformation happens (a phase change), a new big bang (multiverse), an opportunity to replicate. NS don’t seem to time the phase change; they just wait for it, patiently. Maybe that is because NS understand that they can reproduce (replicate) and thrive by creating self-similar structures. This is a good analogy for investment and trading systems, which need to conserve more than they burn. The objective is to stay in the game and not get kicked out of it.
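As a rough sketch of that idea (my own illustration, with hypothetical numbers: repeated even-money bets with a 55% win rate), the Python simulation below compares a conserving policy with a burning one. The aggressive bettor produces more dramatic swings but rarely survives; the conserving one stays in the game.

import random

def survival_rate(bet_fraction, trials=10_000, rounds=200,
                  win_prob=0.55, ruin_level=0.01):
    """Fraction of simulated bankrolls that avoid ruin.

    Hypothetical setup: repeated even-money bets with a small edge;
    'ruin' means the bankroll falls below 1% of its starting value.
    """
    survived = 0
    for _ in range(trials):
        bankroll = 1.0
        for _ in range(rounds):
            stake = bankroll * bet_fraction
            bankroll += stake if random.random() < win_prob else -stake
            if bankroll < ruin_level:
                break  # kicked out of the game
        else:
            survived += 1
    return survived / trials

# A conserving policy stays in the game; a burning one gets kicked out.
print("conservative (5% stakes):", survival_rate(0.05))
print("aggressive  (50% stakes):", survival_rate(0.5))

Even with an identical edge, the burn rate alone decides who remains at the table.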

Signal and Noise

According to Nate Silver’s The Signal and the Noise, “The closest approximation to a solution is to achieve a state of equanimity with the noise and the signal, recognizing that both are an irreducible part of our universe, and devote ourselves to appreciating each for what it is.”

We need noise before we can see or extract the signal. Doesn’t noise help us differentiate the signal? If noise were a necessary evil, why all the effort to reduce it? Since entropy is the driver of both noise and signal, embracing noise and fluctuation becomes essential for any natural (intelligent) system.
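To make that concrete, here is a small sketch of my own (not from Silver’s book): a sine “signal” buried in Gaussian “noise”, partially recovered by a crude moving average. The signal emerges only relative to the noise around it, and some irreducible noise always remains.

import math
import random

# Hypothetical setup: one slow sine cycle buried in Gaussian noise.
random.seed(42)
n = 500
signal = [math.sin(2 * math.pi * t / n) for t in range(n)]
observed = [s + random.gauss(0.0, 1.0) for s in signal]

def moving_average(xs, window=25):
    """Crude denoiser: average each point with its neighbours."""
    half = window // 2
    out = []
    for i in range(len(xs)):
        neighbourhood = xs[max(0, i - half):i + half + 1]
        out.append(sum(neighbourhood) / len(neighbourhood))
    return out

def rmse(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

smoothed = moving_average(observed)

# The signal becomes visible only by working through the noise,
# and it is never recovered perfectly.
print("error before smoothing:", round(rmse(observed, signal), 3))
print("error after smoothing: ", round(rmse(smoothed, signal), 3))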

Complexity and Geometry

Naturality and commonality bring us to Herbert Simon’s view of complexity: that complexity is intrinsically simple and hierarchical. It was not just Simon; Thomas Hobbes held that geometry was the branch of knowledge that best approximates the reasoning that should form the basis of a true philosophy, and intelligent systems call for exactly that kind of reasoning. Francis Bacon, too, insisted on reaching for forms while observing nature.

Counter-Intuitive Systems

So now that we have a case for NS being intelligent, how could we show that they could be smart enough to reduce entropy in an ecosystem of increasing entropy? Though the Princeton – UChicago event team members were supposed to change places (increase entropy) to mingle with the guests, the debate had become too engaging to leave. We had to accommodate another chair (reduced entropy) to conclude and reach some kind of quantum conclusion (an entropy flash).

And then it hit me: wasn’t Galton’s mean reversion an entropy-reducing system? It allowed for divergence and convergence, a classic case of expansion (increase) and reduction (decrease) in entropy. The only reason it was prone to failure was that Paretian extremities were making sure the divergences were strong enough to make the convergences meaningful. Mixing Pareto with Galton simplified the working of all natural systems, allowing for divergence and convergence; intuition and counter-intuition; momentum and reversion; expansion and reduction in entropy; fluctuations and seasonality in the same framework; a geometrical foundation for intelligence.
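One way to see the mixed framework at work is a toy simulation (a sketch under parameters of my own choosing, not a model proposed at the dinner): a Galtonian pull toward the mean, driven by heavy-tailed, Paretian shocks. The shocks supply divergence (entropy expansion); the pull supplies convergence (entropy reduction).

import random

random.seed(7)

def pareto_shock(alpha=1.5):
    """Heavy-tailed (Paretian) innovation with a random sign."""
    magnitude = random.paretovariate(alpha) - 1.0  # shift so small moves dominate
    return magnitude if random.random() < 0.5 else -magnitude

def galton_pareto_path(n=1000, reversion=0.1, mean=0.0):
    """Mean-reverting (Galtonian) process driven by Paretian shocks.

    Each step: a pull toward the mean (convergence, entropy reduction)
    plus a fat-tailed shock (divergence, entropy expansion).
    """
    x, path = 0.0, []
    for _ in range(n):
        x += reversion * (mean - x) + pareto_shock()
        path.append(x)
    return path

path = galton_pareto_path()
print("largest excursion from the mean:", round(max(abs(x) for x in path), 2))
print("final value:", round(path[-1], 2))

The Paretian term produces occasional violent excursions that a purely Gaussian model would almost never generate, while the reversion term keeps dragging the path back toward the mean; divergence and convergence coexist in one process.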