This article discusses the influence of psychological errors and biases in investing: what they are, how they manifest themselves, and how we can reduce their impact.
Are humans excellent and industrious problem-solvers? Do we process all information in the same way? Does our cognition function like an algorithm, always making the most logically sound decisions, devoid of emotion? If we look at psychological research aiming to answer these questions, particularly in the fields of behavioural finance and economics, we find that the general answer is: not really. The scientific consensus is that humans seek to conserve mental effort, often treat incoming information in a biased and unequal manner, and exhibit predictable patterns of irrational decision-making that result in suboptimal choices – all interconnected tendencies. As much as we love to frame our species as the pinnacle of intelligence in the animal kingdom, our natural limitations remind us that we are easily fooled. It could be argued that we are victims of our own success: the rapid pace of human innovation has left us stranded in a modern environment which brings complex problems – such as making calculated investment decisions – that we are not biologically adapted for, leading to pervasive cognitive pitfalls and biases. Consider the following problem and only continue reading when you have a solution:
“A bat and ball cost €1,10. The bat costs one euro more than the ball. How much does the ball cost?”
What was your answer? Unless you are a particularly careful and critical thinker, it was probably wrong (see the end of the article for the answer). Don’t worry, you’re not alone – Daniel Kahneman, the renowned Nobel Prize-winning psychologist who discussed the puzzle in his book Thinking, Fast and Slow, revealed that more than 50% of university students at Harvard, Princeton, and MIT (whom we would presume to be relatively smart) gave the incorrect answer. The puzzle is designed to fool you: it lures your mind with the deceptively obvious answer of 10 cents, which is wrong. Had you checked your logic and reasoning before settling on 10 cents, you would have realised that it cannot be true: if the ball costs 10 cents and the bat costs one euro more, i.e. €1,10, the two together sum to €1,20, which contradicts the stated total.
This puzzle ties in with the cognitive miser theory of human cognition, which suggests that we are mentally ‘lazy’ and always seek the path of least resistance (i.e., the path which induces the least cognitive strain). Although it may seem surprising to some, our use of heuristics – ‘lazy’ mental strategies that emphasise efficiency over the occasional error – is actually crucial. That’s because our brain has a limited capacity (e.g., we can only hold about 7 items in our short-term memory) and heuristics allow us to conserve mental effort for when it is truly required, even though determining the ‘go time’ for the abandonment of such heuristics may not always be easy (as we saw in the above puzzle). Kahneman split human thought processes (also known as cognition) into two categories. System 1 is automatic, rapid, almost or entirely effortless, and not subject to voluntary control – the intuitive system that would have led you to the wrong answer in the puzzle. Abilities which fall under the command of this system include detecting the relative distance of objects or reacting with disgust. In contrast, System 2 requires voluntary control, significant effort, and the allocation of attention. This system is employed for more complex tasks, such as spotting a man with red hair in a busy subway or checking the validity of a complex argument.
How can an investor apply the knowledge that we always seek cognitive ease? The most obvious application is to double or triple-check one’s logic in order to prevent errors like the one made by most people in solving the bat-and-ball problem. After arriving at a conclusion, such as a per-share value for a company calculated using a discounted cash flow analysis, re-check your assumptions to ensure that you have directed sufficient attention and effort to the things that matter (e.g., your projected future cash flows and the discount rate). In a similar vein, avoid directing attention and giving weight to information of minimal value, such as a stock price chart, which could incite an automatic (i.e., System 1) reaction that affects your investment decision. To illustrate what I mean, see whether you can relate to the emotional milestones labelled on the fictional stock price chart below:
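To make the re-checking step concrete, here is a minimal sketch of a discounted cash flow calculation. All inputs – the cash flows, discount rate, terminal growth rate, and share count – are hypothetical assumptions for illustration, not figures for any real company:

```python
# Minimal DCF sketch: discount projected free cash flows plus a
# Gordon-growth terminal value, then divide by shares outstanding.
# All inputs are hypothetical assumptions for illustration.

def dcf_per_share(cash_flows, discount_rate, terminal_growth, shares_outstanding):
    # Present value of each explicitly projected cash flow
    pv_cash_flows = sum(
        cf / (1 + discount_rate) ** t
        for t, cf in enumerate(cash_flows, start=1)
    )
    # Terminal value: last cash flow grown one period, capitalised in perpetuity
    terminal_value = (
        cash_flows[-1] * (1 + terminal_growth)
        / (discount_rate - terminal_growth)
    )
    pv_terminal = terminal_value / (1 + discount_rate) ** len(cash_flows)
    return (pv_cash_flows + pv_terminal) / shares_outstanding

# Hypothetical inputs: three years of projected cash flows (in €m),
# a 10% discount rate, 2% perpetual growth, 50m shares outstanding.
value = dcf_per_share([100, 110, 121], 0.10, 0.02, 50)
print(f"Estimated value per share: €{value:.2f}")  # ≈ €28.64
```

Re-checking your assumptions here amounts to varying `discount_rate` and the projected `cash_flows` and observing how sensitive the resulting per-share value is – a small change in the discount rate can move the result considerably.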
An investor who has thoroughly studied the stock market should know that the historic stock price chart should be assigned little or no weight in an investment decision. The market value is distinct from the intrinsic value of a company – the former fluctuates, whereas the latter does not change to nearly the same degree, giving the investor the chance to profit from ‘mispricings’. In other words, the only thing that matters is the here and now: what do you think the company is worth, what can you buy it for at this exact point in time, and how large is this discrepancy? The same way that an investor must separate an abundance of daily noise from valuable information, he must also be cautious when analysing deceptive variables such as historic stock price movements, on which System 1 preys – we are quick to see illusory trends in random and unpredictable data. Even with a full suite of due diligence in hand, a final look at the stock price chart could lead an investor to balk at what otherwise could have turned out to be a profitable investment.
Another important realisation is that we are biased in our information processing. These biases are well-researched and I will aim to show you how they could be related to investment decisions. Let’s start with confirmation bias, which is a tendency to assign greater weight to (i.e., take more seriously) information that aligns with our beliefs than information which is in conflict with them. This also means that we actively seek out information that matches our beliefs. In a classic psychology study, participants were presented with evidence that both supported and opposed their view concerning whether capital punishment was an effective deterrent. Although one might expect that participants’ attitudes would become more diluted and moderate after such an exercise, this was found not to be the case: participants’ attitudes actually became more polarised (i.e., more extreme on either side), and evidence in favour of participants’ views was evaluated as more persuasive and valid than opposing material. Arguably, the study also illustrates the problem of starting with a conclusion and gathering evidence to support it, which entrenches confirmation bias.
Confirmation bias is equally active when evaluating a company or other security as an investment. For example, we may believe that a company is solid and that the market has unjustly applied a discount to its stock (it is cheaper than it should be). In turn, we actively look for evidence (e.g., investment journal articles) which confirms this belief and disregard or assign less weight to opposing evidence (e.g., a weak balance sheet and poor solvency). Such a biased process could lead us to make the wrong decision. How do we overcome this bias? Besides remaining conscious of it, we can use a due diligence process that is augmented by scientific principles. For instance, we could start out with a falsifiable theory (this stock is undervalued) and aim to amass evidence both for and against it. In other words, instead of simply trying to confirm our beliefs, we should also try to falsify (disprove) them: it is easy to gather evidence which supports a theory, but all it takes is a few conflicting discoveries to refine or disqualify it. This principle of analysing a problem through different perspectives is addressed by Charlie Munger, Warren Buffett’s long-time friend and business partner, who counselled the following:
“Invert, always invert. Turn a situation or problem upside down. Look at it backward.”
— Charlie Munger, Vice-chairman of Berkshire Hathaway
Another psychological pitfall that investors could fall victim to is the framing effect. Behavioural economists are generally enamoured with utility theory, which posits that humans consider two factors when making choices: the desirability/attractiveness of an outcome and the probability of achieving it. However, utility theory, in its implicit assumption that humans are rational, does not address the variable of how decisions are framed (i.e., how they are presented to the decision-maker). Crucially, it turns out that framing can have a dramatic impact on decision-making, which was best illustrated by Tversky & Kahneman (1981).
Consider ‘Problem 1’, taken from Tversky & Kahneman’s (1981) study. Participants were told that the country is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people, and were asked to choose between two programs to combat it: under Program A, 200 people will be saved; under Program B, there is a 1/3 probability that all 600 people will be saved and a 2/3 probability that no one will be saved. In the study, 72% of participants chose Program A whereas 28% chose Program B. Framed this way, the choice seems obvious: Program A guarantees that 200 of the 600 people will be saved, whereas Program B is highly risky – there is a 2/3 chance that no one survives.
Now consider ‘Problem 2’. The dilemma and the associated programs are exactly the same as in ‘Problem 1’ – they are just framed differently: under Program C, 400 people will die; under Program D, there is a 1/3 probability that nobody will die and a 2/3 probability that all 600 people will die. Program C is Program A and Program D is Program B. This change of framing has a dramatic effect: only 22% of participants chose Program C/Program A whereas 78% chose Program D/Program B – the results from ‘Problem 1’ were practically flipped due to a simple change in presentation.
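In expectation, the competing programs are numerically identical, which is what makes the reversal so striking. A quick calculation (a sketch of the arithmetic, not part of the original study) makes this explicit; `Fraction` is used to keep the 1/3 probability exact:

```python
# Expected number of lives saved under each program (600 people at risk).
# Fraction keeps the 1/3 probability exact, avoiding floating-point noise.
from fractions import Fraction

program_a = 200                                    # 200 saved with certainty
p_all_saved = Fraction(1, 3)
program_b = p_all_saved * 600 + (1 - p_all_saved) * 0  # 1/3 chance all 600 saved

print(program_a, program_b)  # both equal 200
```

The choice between the programs therefore turns entirely on how the identical outcomes are framed – as lives saved or as lives lost.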
Relating all this back to an investor’s activities, the framing effect may manifest itself when reading company reports. Public companies feel the heat of analysts’ expectations and the short-term pressure of quarterly reporting, which may lead them to frame poor results as positive results. This is often done through altering the wording of reports. For example, let’s assume that Monster Inc.’s most recent quarter produced earnings-per-share (EPS) of €0,78, compared to last year’s equivalent result for that quarter of €0,86. Let’s also say that analysts’ expectations for the quarter were €0,76. Management A states the following: “Monster Inc. produced a strong result this quarter of €0,78 EPS, beating analyst expectations of €0,76”. In contrast, Management B states that: “Monster Inc. generated €0,78 in EPS this quarter compared to €0,86 in last year’s equivalent quarter”. Evidently, Management A is being deceptive and avoidant in its reporting, which should incite suspicion on the part of the investor. The obvious way to avoid the framing effect in this context is to start with the financials of a report and assign less weight to management’s comments. Unlike the occasional management team, the numbers don’t lie.
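Both statements are built from the same three numbers; computing the two deltas side by side (using the hypothetical figures from the example above) shows how each framing selects the comparison that flatters it:

```python
# Hypothetical figures from the Monster Inc. example.
eps_current = 0.78     # most recent quarter
eps_prior_year = 0.86  # same quarter last year
eps_expected = 0.76    # analysts' consensus estimate

# Management A's framing: the beat versus expectations.
beat_vs_expectations = (eps_current - eps_expected) / eps_expected

# Management B's framing: the year-on-year decline.
change_vs_prior_year = (eps_current - eps_prior_year) / eps_prior_year

print(f"Beat vs expectations: {beat_vs_expectations:+.1%}")  # +2.6%
print(f"Change vs prior year: {change_vs_prior_year:+.1%}")  # -9.3%
```

A modest 2,6% beat against a soft consensus coexists with a 9,3% year-on-year decline – the underlying number is the same, only the reference point changes.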
The anchoring effect refers to our tendency to use an initial piece of information as a type of benchmark to make ensuing judgements, even when this information is essentially irrelevant. Tversky & Kahneman (1974) conducted an experiment to demonstrate this. The authors spun a wheel of fortune which was rigged to land at either 10 or 65 in front of participants, who were then asked to write down the number. Subsequently, participants were asked to answer the following two questions:
“Is the percentage of African nations among UN members larger or smaller than the number you just wrote?”
“What is your best guess of the percentage of African nations in the UN?”
The experimenters found that participants who had been exposed to the number 10 made average estimates of 25%, whereas those who had been exposed to 65 made average estimates of 45%, therefore demonstrating that participants based their judgements on the arbitrary number that was produced by the wheel of fortune. The anchoring effect is reliable, reproducible, and prevalent in the investment world. Consider the following example. An investor purchased an undervalued stock in the midst of the March 2020 COVID-19-induced stock market crash for €10 per share, calculating that it was worth at least €30. By December 2020, the market had recovered and the investment had produced a 50% profit on paper – the share price now stood at €15. Anxious to secure the profit, the investor sold the shares without re-checking his due diligence, unaware that the shares could still double before reaching his estimate of fair value. In this example, the investor used the €10 entry point as an anchor for the sell decision, netting him a 50% return instead of a possible 200% profit. Anchoring could also manifest itself when deciding to add to an investment (e.g., by purchasing more shares in a company). The entry price of €10 could be used as an anchor point to purchase more shares when the price drops to €6. The shares suddenly appear cheap, although they could in actual fact be expensive because the company’s fundamentals have changed (e.g., the business outlook has turned grim). How do you avoid the anchoring effect? Always rely on your due diligence and reassess a company’s fair value before either buying more shares or selling them – don’t let your psychology govern this decision.
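The arithmetic of the missed opportunity is straightforward; this sketch simply restates the example's figures in code:

```python
def simple_return(entry_price, exit_price):
    """Return on a position, as a fraction of the entry price."""
    return (exit_price - entry_price) / entry_price

entry = 10.0          # purchase price during the March 2020 crash
anchored_exit = 15.0  # sale triggered by anchoring on the entry price
fair_value = 30.0     # the investor's own estimate of intrinsic value

print(f"Realised: {simple_return(entry, anchored_exit):.0%}")  # 50%
print(f"Forgone:  {simple_return(entry, fair_value):.0%}")     # 200%
```

Note that the relevant comparison for the sell decision is the current price against fair value (€15 versus €30), not against the irrelevant entry price of €10.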
There are many more biases out there, including myopic loss aversion, availability bias, and representativeness bias, which are beyond the scope of this article. Ultimately, to become a successful investor you must be acutely aware of your own cognitive shortcomings and control for these as well as you can, which is best done through educating yourself thoroughly on the subject. Doing so skews the odds in your favour, which is crucial in an endeavour where the odds are stacked against you.
The answer to the bat-and-ball problem is 5 cents.
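For readers who want the reasoning spelled out, the answer follows from a single linear equation; the sketch below verifies it:

```python
# Let b be the ball's price. The bat costs b + 1.00, and together they
# cost 1.10:  b + (b + 1.00) = 1.10  =>  2b = 0.10  =>  b = 0.05
total = 1.10
bat_premium = 1.00  # the bat costs this much more than the ball

ball = (total - bat_premium) / 2
bat = ball + bat_premium

assert abs((ball + bat) - total) < 1e-9        # prices sum to €1,10
assert abs((bat - ball) - bat_premium) < 1e-9  # bat costs €1,00 more
print(f"Ball: €{ball:.2f}, bat: €{bat:.2f}")   # Ball: €0.05, bat: €1.05
```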
Johan Lunau – 09/06/21