In the Rethinking Treasury series, we have discussed at length how behavioural biases can impact our decision making, and hence the importance of supplementing our intuition with more systematic and quantitative approaches to analysing risk.

    In this sub-series, we take a step back and look at various sources of cognitive errors.  Last month, we examined some common cognitive errors and what they can tell us about how our brains work, where we need to be wary of our intuition, and where we should perhaps look to other tools and technology to assist our decision making.

    This month, we discuss two other well-known cognitive errors relating to “narratives”, and how they can affect our risk management decisions.

    Conjunction fallacy a.k.a. the “Linda problem”

    Linda is 31 years old, single, outspoken, and very bright.  She majored in philosophy.  As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

    Which alternative is more likely?

    1. Linda is a bank teller.
    2. Linda is a bank teller and is active in the feminist movement.

    Many people choose option 2, contrary to the rules of logic: the probability of two events occurring together can never exceed the probability of either one occurring on its own.

    To see why, let’s assume there is only a 5 per cent chance that Linda is a bank teller and a 90 per cent chance that she is a feminist.  If the two traits were independent, Linda would have only a 5 per cent * 90 per cent = 4.5 per cent chance of being a feminist bank teller; even without that assumption, the joint probability can never exceed 5 per cent.  However, in our heads, the addition of this fitting description actually increases the perceived likelihood, which is illogical.
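    To make the arithmetic concrete, the short Python sketch below (purely illustrative, not part of the original study) restates the conjunction rule using the assumed 5 per cent and 90 per cent figures above.

```python
# Illustrative sketch of the conjunction rule.  The probabilities are the
# assumed figures from the text, not data from the article.

p_bank_teller = 0.05   # assumed chance that Linda is a bank teller
p_feminist = 0.90      # assumed chance that Linda is active in the feminist movement

# If the two traits were independent, the joint probability would be their product.
p_both_if_independent = p_bank_teller * p_feminist   # 0.045, i.e. 4.5 per cent

# Even without assuming independence, the joint probability can never exceed
# the smaller of the two individual probabilities.
p_both_upper_bound = min(p_bank_teller, p_feminist)  # 0.05, i.e. 5 per cent

print(f"Joint probability if independent: {p_both_if_independent:.1%}")   # 4.5%
print(f"Upper bound on the joint probability: {p_both_upper_bound:.1%}")  # 5.0%
```

    Whatever the true relationship between the two traits, option 2 can never be more likely than option 1.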

    Daniel Kahneman1, who devised this experiment together with Amos Tversky, postulated that our brains operate under a “representativeness heuristic.”  In other words, we make statistical judgments in our heads based on how well an event fits the narrative or story, rather than by using logical tools such as Venn diagrams or mathematics.

    In this information age, narratives have become a driving force in markets.  People make decisions based on how well a narrative fits their world view.  In turn, people are known to selectively accept narratives that support their initial arguments and ignore those that don’t, a tendency known as “confirmation bias” (which we will discuss in more detail in Part 6 of this series).

    Therefore, the danger with narratives is that, whilst they may be expedient for our brains, they can also blind us to the broader picture and even to logic, and lead to overconfidence in the face of imperfect information or uncertainty.

    Availability bias

    “List 2 ways in which you feel the course could be improved.”  Professor Craig Fox2 at Duke University asked his students to complete a one-page mid-course evaluation form.  The final question asked students to rate the course on a scale of 1 to 7.

    This may seem run-of-the-mill, except that half of the students, selected at random, received a slightly different questionnaire.  The only difference was that these students were asked to list 10 ideas for improvement, rather than just 2.

    Unsurprisingly, students who were asked to list 10 ideas for improvement did offer more suggestions than those who were asked to list only 2, but not by much (an average of 2.1 versus 1.6).

    More surprising, however, was that the average course rating was more than half a point higher amongst students who were asked to list 10 ways the course could be improved than amongst those asked for only 2 (5.5 versus 4.9).

    Paradoxically, soliciting a larger number of critical comments appeared to improve perceptions of the course.  How was this possible?

    This is because our brains form perceptions based on how easily examples and counterexamples come to mind.  In other words, our brains operate under the “availability heuristic,” a mental shortcut that equates ease of retrieval with importance.

    Because they were asked for more critiques than they could readily recall, students had to work harder, and think much longer, to come up with additional suggestions.  Having struggled so hard to produce negative comments about the course, their brains concluded that the course must not have been too bad.

    Key Takeaways for CFOs and Treasurers

    The “Linda problem” highlights the fact that our brains often process information in the form of stories rather than numbers.  The perceived likelihood of an event is primarily influenced by how vivid or plausible the story is in our heads.  As we saw, this “representativeness heuristic” opens the door to logical inconsistencies.

    The “availability heuristic” shows us that when we evaluate multiple stories to form an overall opinion, we take another shortcut by equating importance with ease of retrieval.  The most easily recalled stories are often the most recent, vivid or sensational, and they wield an unduly large influence.

    Perhaps more concerning, the “course evaluation problem” indicates that our brains may even be susceptible to manipulation by others: our perceptions can be altered simply by tweaking the mental effort required for certain tasks.

    In risk management, it is easy to see why humans are not always reliable in assessing probabilities, as we are easily influenced by narratives and by how readily those narratives come to mind.  This problem is exacerbated by constant information overload, to which our brains’ adaptive response is to selectively seek out confirming narratives and ignore contradicting ones.

    Narratives can become closely intertwined with our emotions, and as we have explored in previous articles, allowing emotions to influence hedging decisions can be far from optimal.

    Being aware of these biases is the first step towards de-biasing our treasury decision-making processes.  Systematic, logical processes can also help guide us in the right direction.

    In the next article, we will explore the fallacies that arise when we confront randomness, which is inherently incompatible with “stories.”
