In the Rethinking Treasury series, we have discussed at length how behavioural biases can impact our decision making, and hence the importance of relying less on our instincts and more on systematic, quantitative approaches to analysing risk.

    In this sub-series, we will take a step back and look at various sources of cognitive errors. In part 1, we saw how our brains are prone to missing the big picture as we prefer to make relative decisions rather than absolute ones. In part 2, we saw how narratives act as mental shortcuts and can sometimes lead to logical inconsistencies. Last month, we saw how our brains, conditioned on narratives, do not deal very well with randomness. Here, we will look at some cognitive errors related to our emotional defences.

    Self-serving bias: the need to feel good amidst uncertainty

    As we saw in the last article, with random payoffs, dopamine can leave us “hooked” on the very source of those payoffs, e.g. stock markets, roulette wheels, social media, etc.

    This seems to be at odds with another important trait of human nature that we discussed in the first article of Rethinking Treasury, where we highlighted studies showing that the pain of losing is felt roughly twice as strongly as the joy of winning the same amount. This raises a question: why do compulsive gamblers seem unaffected by “risk aversion”?
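    As a brief aside, the “twice as strongly” figure comes from prospect theory (Kahneman and Tversky), in which the subjective value of a gain or loss x is commonly modelled with a loss-aversion coefficient λ of roughly 2. The functional form and parameter estimates below are the commonly cited ones, shown purely as an illustration rather than as part of this series’ methodology.

$$
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \\
-\lambda\,(-x)^{\alpha}, & x < 0
\end{cases}
\qquad \alpha \approx 0.88,\ \lambda \approx 2.25
$$

    In other words, a loss is weighted more than twice as heavily as a gain of the same size, which makes the gambler’s persistence all the more puzzling.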

    One explanation can be found in another well-studied cognitive tendency called “self-serving bias.” In essence, we are conditioned to ascribe success to our own abilities and efforts, but ascribe failure to external factors. We selectively outsource our failures to “randomness” ex post facto.

    In evolutionary terms, such an “illusion” may be necessary to fortify our mental resolve amidst all the uncertainty we face. It is a way to help us persevere during hard times, and to bask in the glory of our efforts once those hard times have been overcome.

    This may have served our ancestors well, but in highly random environments such as modern financial markets, the same self-serving bias could hardly backfire more badly. It is easy to become overconfident when we attribute trading gains to our skills and trading losses to pure bad luck. (For more on overconfidence, see the fifth article of the Rethinking Treasury series.)

    The “Blame the Victim” fallacy: the need to distance ourselves from unfortunate events

    Moving even closer to our inner psychology, Professor William Ryan observed that when an unfortunate event befalls others, we tend, at least partially, to blame the victim rather than the event itself or the perpetrator.

    For example, say someone has just had their wallet stolen by a pickpocket. Rather than accusing the pickpocket, a part of us might secretly question whether the victim had taken enough care of their own wallet. The same psychology can be observed in far more serious situations, such as sexual assault, violence and racism.

    It is hypothesised that this mental defence is needed to maintain our general belief that bad outcomes do not just happen to good people, and that if they do, there must be a reason: those people must have done something wrong. This fallacy blinds us to the fact that randomness might be at play.

    In financial markets, it is easy to read news about large unhedged losses, or large trading losses, and blame those companies for “not having done better.” Granted, companies that make such headlines often have room to improve their risk management policies. The truth is that even companies with robust risk management policies can at times be hit hard by large, unexpected events.

    Therefore, rather than emotionally distancing ourselves from the “victims,” companies would be better advised to learn from their experiences and use them to play “devil’s advocate” against their own risk management policies.

    Sunk cost fallacy: we get emotionally attached to things, especially our own ideas

    Probably one of the most widely known cognitive biases, the “sunk cost fallacy” occurs when we base decisions on sunk costs: costs that have been incurred in the past and cannot be recovered. This is irrational because, logically, only future costs and benefits should be taken into account. If this sounds counterintuitive, that is entirely understandable: we all form emotional attachments to past actions, decisions and personal investments.

    According to McKinsey & Company, an estimated 80 per cent of unprofitable businesses still survive after 10 years because of the sunk cost fallacy.

    In risk management, we often see a reluctance to close out or hedge positions that are already deep in loss, because doing so would mean “locking in” those losses and giving up the chance for them to bounce back. This is the classic “sunk cost fallacy” at play.
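    To make this concrete, below is a minimal, purely hypothetical sketch (in Python, not drawn from this series) of how the same close-out decision looks when framed on sunk losses versus forward-looking outcomes. All figures and scenario probabilities are invented for illustration.

```python
# Hypothetical hedging decision, framed two ways. All numbers are invented.

sunk_loss = -5.0    # mark-to-market loss already incurred (millions), unrecoverable
hedge_cost = 0.3    # cost of closing out / hedging the remaining exposure (millions)

# Assumed forward-looking scenarios for the unhedged position from today onwards:
# (future P&L in millions, probability)
scenarios = [(-4.0, 0.5), (+3.0, 0.5)]

expected_if_unhedged = sum(pnl * prob for pnl, prob in scenarios)  # -0.5
expected_if_hedged = -hedge_cost                                   # -0.3 (future loss capped)

# A forward-looking comparison only considers what can still change:
print(f"Expected future P&L if left unhedged: {expected_if_unhedged:+.1f}m")
print(f"Expected future P&L if hedged now:    {expected_if_hedged:+.1f}m")

# The sunk-cost framing instead asks, "do we really want to lock in the -5.0m?",
# anchoring the decision on a number that neither choice can alter.
print(f"Loss already incurred under either choice: {sunk_loss:+.1f}m")
```

    Under either choice, the loss already incurred is identical; only the hedge cost and the distribution of future outcomes differ, which is all a rational, forward-looking comparison needs.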

    Also at play here is the fact that people can become “risk seeking” and less sensitive to further losses once they are already sitting on large losses. (See the first article of the Rethinking Treasury series for more on “risk seeking” behaviour.)

    In markets, the sunk cost fallacy can also affect our judgment. Fact-based rationality should mean that our decisions change as the evidence changes. In reality, however, because we are often emotionally invested in our own ideas, especially those we have articulated or publicised, we are much less willing to change our positions.

    In evolutionary terms, we may be pre-programmed with a built-in loyalty to ideas in which we have invested time or with which we identify. Consider our relationships, careers, political beliefs and even our children: we cannot abruptly part ways with them on some fresh evidence without incurring significant “emotional costs.”

    Key takeaways for CFOs and treasurers

    In this article, we have seen how our emotional need to feel good can affect our decision making.

    • In random markets, “self-serving” bias can make us overconfident about our ability to predict the markets, because we ascribe good results to personal skill and bad results to randomness.
    • The “blame the victim” fallacy also plays a part, since we instinctively want to emotionally distance ourselves from the “victims.” This in turn leads us to underestimate the likelihood that we might one day become one of them.
    • The “sunk cost fallacy” reveals why we can be stubborn in our ways: we easily become emotionally attached to our own ideas and to losing positions.

    Of all the types of cognitive error discussed in this sub-series, these “feel good” fallacies can be the most difficult to eradicate, since they connect with us at a much deeper emotional level.

    It is therefore vital to critically evaluate and challenge our own decisions. As mentioned earlier, playing devil’s advocate can be an effective way to provide an alternative lens through which to view the same situation, to detach ourselves a little from our personal positions, and to create a more robust decision-making environment.

    Next month, we will talk about other cognitive errors related to our own social instincts, which are another set of powerful emotional drivers.
