Insights

"The art of knowing is knowing what to ignore" - Rumi

Over the last few years we have witnessed major political events, Brexit and the US elections in 2016 and 2020 to name a few, where the outcome predictions and the consensus market implications were seriously off. Worse, from an investment point of view, even for those who did manage to predict the correct result, the subsequent market moves in some asset classes were not in line with what was originally expected. The amount of resources spent trying to predict the outcome of these events and their investment implications was considerable, yet only in hindsight were "rational" explanations and logical market implications constructed. In this piece we go over a number of challenges related to information and its interpretation from an investment perspective, and we then look at how we can deal with these issues in practice.

Information Bottleneck:

As investors we consume large datasets and streams of information in an attempt to extract more signal and less noise. We assume that the more information we consume, the more signal we will extract, but the human mind doesn't really work like that. As the volume of information we consume increases, our ability to distinguish the relevant from the irrelevant becomes compromised. We become prone to placing too much emphasis on irrelevant data and losing sight of what is really important, in effect increasing the noise rather than the signal. In his book Antifragile [1], Nassim Taleb calls this paradox the "noise bottleneck". A good example of how human judgement can get clouded by irrelevant information is the often-cited Linda problem, which originated in the work of Amos Tversky and Daniel Kahneman [2]. In one of their questionnaires the researchers used the following:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more probable?

  1. Linda is a bank teller.
  2. Linda is a bank teller and is active in the feminist movement.

The majority of those asked chose option 2, even though mathematically the probability of two events occurring together is always less than or equal to the probability of either one occurring alone. Tversky and Kahneman argue that most people get this problem wrong because the human brain uses a simplifying procedure called representativeness to make this kind of judgment. As a result, option 2 seems more "representative" of Linda based on the description of her, even though it is clearly less likely mathematically.
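The conjunction rule is easy to verify numerically. The minimal sketch below uses purely hypothetical probabilities (the figures are our own illustration, not from the study) to show that P(A and B) can never exceed P(A):

```python
# Conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A), since P(B | A) <= 1.
# The probabilities below are hypothetical, chosen only for illustration.

p_bank_teller = 0.05            # P(A): Linda is a bank teller
p_feminist_given_teller = 0.90  # P(B | A): active feminist, given she is a teller

p_both = p_bank_teller * p_feminist_given_teller  # P(A and B)

print(f"P(bank teller)              = {p_bank_teller:.3f}")
print(f"P(bank teller AND feminist) = {p_both:.3f}")
assert p_both <= p_bank_teller  # the conjunction can never be the more probable event
```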

To process information we rely more and more on quantitative models, but this paradox presents itself in model building too: adding more data and complexity to a model does not necessarily give us a better model. In fact, this complexity can simply introduce a false sense of confidence and security that inhibits our ability to see a model's blind spots and to act promptly in circumstances where the model can be fallible.

Model Limitations:

As quants, we find it useful to always remind ourselves that models are like maps, which should not be confused with the reality they are trying to depict. The map is not the territory and the theory is not what it describes; they are simply ways we choose to interpret a certain set of information [3]. It is very easy to fall into the fallacy of allowing the model to become its own reality, as if our code comes to life and we forget that reality is a lot messier. For starters, maps can be wrong or ill-prepared to guide us in uncharted territories, but even where they are correct one needs to remember that by construction maps are an abstraction in which information is inherently lost to save space.

When constructing a good model it is important to strike the right balance between the model's complexity and its ability to abstract reality. Models need to be complex enough to closely depict reality, yet simple enough that users can easily monitor their behaviour and are not blind to regimes and scenarios where they might be less effective. To illustrate this point, consider the following simple exercise where we use the least squares method to approximate points distributed along a sinusoid (with a small amount of noise) using polynomial functions of various degrees n. At n=1 this corresponds to a simple linear regression and does not capture the true shape of the function. At n=15 we introduce too much complexity and the polynomial starts to fit the noise. At n=3 we already have a good approximation.
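A minimal Python sketch of this exercise (the noise level, sample size and random seed are our own assumptions, not prescriptions) typically shows the pattern: in-sample error keeps falling as the degree rises, while out-of-sample error eventually deteriorates:

```python
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)

# Noisy samples along a sinusoid (noise level and sample size are assumptions)
x = np.linspace(0, 2 * np.pi, 30)
y = np.sin(x) + rng.normal(scale=0.1, size=x.size)

# Held-out points from the clean curve, to expose overfitting
x_test = np.linspace(0, 2 * np.pi, 101)
y_test = np.sin(x_test)

for n in (1, 3, 15):
    p = Polynomial.fit(x, y, deg=n)  # least-squares polynomial fit of degree n
    in_err = np.sqrt(np.mean((p(x) - y) ** 2))
    out_err = np.sqrt(np.mean((p(x_test) - y_test) ** 2))
    print(f"degree {n:2d}: in-sample RMSE {in_err:.3f}, out-of-sample RMSE {out_err:.3f}")
```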

Obviously there are various statistical techniques to deal with over- and underfitting, but in our experience the marginal gains from additional complexity often come at costs that far outweigh them.

Constructing Narratives:

In financial markets, prices often move before the narrative. We seem to have a need to find daily explanations for why the market moved a certain way, when often the better answer to the question of why the market moved by X% on a given day is randomness [4]. The daily stories we hear to explain the moves should very often be seen as entertainment designed to grab attention at best, and as a barrage of distractions from what is really important at worst.
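A small simulation helps make this concrete. Even for a hypothetical asset with a genuinely positive drift (the 7% drift and 15% volatility figures below are illustrative assumptions), the direction of any single daily move remains close to a coin flip, which is why day-by-day explanations are mostly explaining noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical asset: 7% annual drift, 15% annual volatility (assumed figures)
mu, sigma, days = 0.07, 0.15, 252
daily_mu = mu / days
daily_sigma = sigma / np.sqrt(days)

# Simulate a large number of daily returns and count the "up" days
returns = rng.normal(daily_mu, daily_sigma, size=1_000_000)
print(f"share of up days: {np.mean(returns > 0):.3f}")  # roughly 0.51, barely better than a coin flip
```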

These facts are well known to most market practitioners, yet very often we find ourselves reading and repeating such stories. So why does this irrationality persist? Research in behavioural economics helps shed light on a variety of cognitive biases that affect the judgement and decision making of experts and amateurs alike. As it turns out, many contentious and erroneous beliefs are related to our cognitive biases and can be traced to imperfections in our capacity to process information and draw conclusions. In other words, as humans we are wired to find association and explanation where there is none, and we hold such dubious beliefs not because they satisfy some important psychological need, but because they seem to be the most sensible conclusions consistent with the available evidence [5]. People seek such stories and explanations because they seem, in the words of Robert Merton, to be the "irresistible products of their own observation" [6]. That is to say, they are the products of flawed rationality rather than irrationality. Research in psychology has confirmed similar tendencies when studying illusions to understand the principles of perceptual processing [7]. When presented with the figure below (Figure 1), most people see bright rays coming from the center. These shimmering rays are entirely illusory; they are not actually there, yet our brains still interpret the information presented to us in the most plausible fashion by "connecting the dots" and producing the rays.

Figure 1. The "scintillating starburst" stimulus

The growth of the internet, 24-hour television, mobile devices, the proliferation of data-harvesting devices and robo-journalists automatically generating news stories means that we are living in a world of chronic information overload. Data creation has been exploding for the past two decades according to projections from the International Data Corporation (IDC) (Chart 1). Hal Varian, the chief economist at Google, has quantified this explosion in created data: "Between the dawn of civilization and 2003, we only created five exabytes; now we're creating that amount every two days. By 2020, that figure is estimated to sit at 53 zettabytes (53 trillion gigabytes) – an increase of 50 times." The implication is that going forward the "noise bottleneck" will only get worse, and so as investors it is fair to ask how we should deal with these vast amounts of information in a way that enhances our edge rather than eroding it.

Information Fasting Or Less Is More:

The tendency to be more confident about being right when we have more information has echoes of the "drunkard's search" or the "streetlight effect", which refers to a well-known joke: a policeman sees a drunk man searching for something under a streetlight and asks what he has lost. The drunk says he lost his wallet, and they both look under the streetlight. After a few minutes the policeman asks if he is sure he lost it here, and the drunk replies no, he lost it in the park. The policeman asks why he is searching here, and the drunk replies, "this is where the light is". We may laugh at this, but as investors we need to avoid falling into a similar trap of poring over tons of data simply because it is easily and readily available.

We are naturally inclined to assume that more data always leads to better decision making. In reality we need to resist that temptation, because such preoccupations run the risk of ignoring what really matters to the investment case. Once we recognize that more information does not necessarily lead to better decisions, we are freed up to spend more time thinking and less time processing. The goal, rather than subconsciously adopting the common narrative, is to cultivate an independent ability to analyse information, imagine non-consensus market scenarios, and reach one's own investment conclusions, in order to avoid lazy consensus positions and portfolio blind spots.

Filtering Noise And Dimensionality Reduction:

Just as important is avoiding the places, and the information, specifically designed to inhibit one's capacity for critical thinking and for building a proper analytical model. The task of deciding when and what information is relevant gets more complicated in a world where many information providers and platforms now deploy armies of behavioural scientists to harness techniques that do not appeal to our reason or seek to persuade us consciously with information and argument. Rather, these techniques change behaviour by appealing to our non-rational motivations, our emotional triggers and unconscious biases. One of the most interesting pursuits in this area is the concept of priming [8], which explores how human actions and emotions can be primed by events and stimuli that one is not even aware of. In the famous "Florida effect" experiment, a group of young students who were exposed to words associated with the elderly (Florida, forgetful, grey, etc.) subsequently walked at a significantly slower pace, all the while unaware of their altered behaviour [9].

The implication is that to retain any chance of objectively analysing the information we are bombarded with daily, one almost has to fight one's first instincts and recognize that much of the data we consume is biased and incomplete by design and for a reason. In addition, focusing on the medium- to long-term investment outlook rather than on day-to-day noise, and finding ways to explain the high-level macro drivers of one's portfolio, is very helpful in filtering noise. In practical terms, not all information is equal: even if a portfolio holds a large number of trading strategies at any given time, it is very important to establish a hierarchy of the big macro themes that can have a significant impact on performance, as sketched below.
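As an illustration of what such a hierarchy might look like in code, the sketch below groups a book of strategies by macro theme and aggregates a notional risk figure per theme; every name and number is hypothetical:

```python
from collections import defaultdict

# A hypothetical book of strategies: (name, macro theme, risk contribution in bps).
# Every name and figure here is made up purely for illustration.
strategies = [
    ("us_2s10s_steepener", "Fed policy",    35),
    ("eurusd_short",       "Fed policy",    20),
    ("brent_call_spread",  "Energy supply", 15),
    ("emfx_carry_basket",  "Global growth", 25),
    ("copper_long",        "Global growth", 10),
]

# Aggregate risk by theme to surface the hierarchy of what drives performance
risk_by_theme = defaultdict(int)
for _, theme, risk in strategies:
    risk_by_theme[theme] += risk

for theme, risk in sorted(risk_by_theme.items(), key=lambda kv: -kv[1]):
    print(f"{theme:15s} {risk:3d} bps")
```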

Understanding Model Limitations:

The first step to truly understanding a model is to understand and respect its limitations, by being vigilant and stepping back to understand the context in which the model is useful and where the cliffs might lie. The advantage of using simple models with few parameters is that their limitations and assumptions are easier to see and analyse, so the user is hopefully better prepared to step in and guide the model in regimes and situations it was not built to handle. In analysing model performance it is helpful to deploy a systematic process that constantly records any deviations for out-of-sample analysis and for comparison against initial expectations. While it may be very hard to completely eliminate the impact of cognitive biases, one should at least strive to reduce them by striking the right balance between quantitative techniques and one's own judgment.
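A minimal sketch of such a deviation log, assuming a simple tolerance-based flag (the structure and threshold are illustrative assumptions, not a prescribed methodology):

```python
from dataclasses import dataclass, field

@dataclass
class ModelMonitor:
    """Records model predictions against realized outcomes for out-of-sample review."""
    tolerance: float                      # deviation size that warrants a review (assumed)
    records: list = field(default_factory=list)

    def log(self, date: str, predicted: float, realized: float) -> None:
        deviation = realized - predicted
        self.records.append((date, predicted, realized, deviation))
        if abs(deviation) > self.tolerance:
            print(f"{date}: deviation {deviation:+.2f} exceeds tolerance, flag for review")

    def mean_deviation(self) -> float:
        # A persistent non-zero mean hints at bias relative to initial expectations
        if not self.records:
            return 0.0
        return sum(r[3] for r in self.records) / len(self.records)

# Hypothetical usage: compare daily forecasts with realized values
monitor = ModelMonitor(tolerance=0.5)
monitor.log("2021-03-01", predicted=1.2, realized=1.1)
monitor.log("2021-03-02", predicted=0.8, realized=1.6)  # flagged
print(f"mean deviation so far: {monitor.mean_deviation():+.3f}")
```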

References:

[1] "Antifragile: Things That Gain from Disorder: 3 (Incerto) Paperback – 28 Jan. 2014" by Nassim Nicholas Taleb

[2] "Extension versus intuitive reasoning: The conjunction fallacy in probability judgment (October 1983)" by Amos Tversky and Daniel Kahneman

[3] "Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics Hardcover – 1 Jan. 1995" by Alfred Korzybski

[4] "Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets Paperback – 1 Jan. 2007" by Nassim Nicholas Taleb

[5] "How We Know What Isn't So- The Fallibility of Human Reason in Everyday Life" by Thomas Gilovich

[6] "The self-fulfilling prophecy, Antioch Review, 8,193-210 (1948)" by R. K. Merton

[7] "Introducing the scintillating starburst: Illusory ray patterns from spatial coincidence detection" by Michael W. Karlovich and Pascal Wallisch

[8] "Thinking, Fast and Slow Paperback – 10 May 2012" by Daniel Kahneman

[9] "Automaticity of Social Behavior: Direct Effects of Trait Construct and Stereotype Activation on Action, New York University" by John A. Bargh, Mark Chen, and Lara Burrows
