In Memoriam: Daniel Kahneman


Dear investors,

Last week, Daniel Kahneman passed away at the age of 90. A professor of psychology at Princeton and one of the pioneers of behavioral economics, he questioned the classical economists' premise that people always make economic decisions rationally. For his academic work, he received the Nobel Prize in Economics in 2002, and he remained an influential thinker until the end of his life. In 2011, he published “Thinking, Fast and Slow”, a worldwide best seller.

Kahneman's work gained great prominence among investors for mapping so-called cognitive biases: tendencies of the human mind to deviate from pure logic in certain situations and make bad economic decisions. Every experienced investor knows how the emotional side can affect investment decisions and lead to undesirable results, so understanding how human psychology relates to the investment process, and striving to minimize the influence of cognitive biases on one's decisions, is essential.

We have already talked about this topic on several occasions, but it is so extensive and relevant that there is always something more to say. We seek to bring here some of the knowledge developed by Kahneman, who contributed enormously to our investment analysis processes.

Two systems in the same mind

Our brains have two very distinct modes of operation. The first, called by psychologists simply System 1, is responsible for our quick, intuitive reasoning. It is automatically active all the time, requires no special effort on our part, and is beyond our voluntary control. It is this system that recognizes familiar faces, coordinates our daily movements, perceives dangerous situations and decides in fractions of a second how to react to them.

The other is System 2, responsible for the sophistication of the human mind that we properly understand as rationality: the capacity to process information and abstract concepts in a methodical, logical way. The price of this sophistication is that System 2 is slow and requires a conscious effort to stay focused on a specific problem for as long as it takes to solve it. It is this system that performs mathematical operations, analyzes complex situations and carries out academic work in general.

Each of these systems has its practical usefulness. System 1 takes care of most of our everyday decisions, which would be impossible to make through direct analytical methods. Imagine driving to work while having to explicitly calculate the best steering wheel angle for each curve along the way. System 1 is the best system for all cases where it makes sense to sacrifice precision to gain agility. It is enough that the act of turning the steering wheel is approximately correct and that the speed allows corrections throughout the movement, so the intuition of an experienced driver works perfectly well in this situation.

System 2 is what allows us to transcend the level of animal intelligence and reach uniquely human achievements. Despite being slower and more laborious to use, it is what allows calculations elaborate enough to launch a satellite and place it in Earth's orbit with the necessary precision, something virtually impossible to do intuitively.

The problem with the existence of these two distinct modes of operation is that, sometimes, they act simultaneously and in conflict. In other words, they reach different conclusions. This is the primary source of cognitive biases. The easiest way to understand the nature of a cognitive bias is through an analogy with optical illusions. In some situations, your brain misinterprets images. An example is the image below, where two lines of exactly the same length appear to have different sizes because of the shape of the arrowheads at their ends. If in doubt, feel free to measure both lines.
[Figure: the Müller-Lyer illusion — two lines of equal length, one with inward-pointing and one with outward-pointing arrowheads]


System 1 is what perceives the lines as having different lengths. System 2 is what understands the error and accepts the proof that the lengths are equal after measuring both. Note that, even after verifying that the lengths are equal, you continue to see the lines as apparently different in size. This is the nature of the conflicts that arise from cognitive biases. Sometimes your instincts point in a different direction from your own rational analyses, and proving that the analyses are correct does not change your instinctive perception. Making decisions based exclusively on the results of your analyses is therefore not as easy as it may seem: it requires a conscious effort and a “vote of confidence” in the superiority of your System 2. This is where many investors fail. They follow their System 1, which is always psychologically more comfortable, and end up making bad decisions, even though they have the intelligence and knowledge needed to develop a better plan of action.

Action of biases in investments


In theory, investment decisions should be made according to the probabilities of gain and loss implicit in each investment opportunity. Whenever the statistics favor the chance of gains, an investment opportunity would be advantageous. However, pure mathematics rarely predicts the decisions people actually make in real situations involving probabilities of gains and losses. Consider the following situations:

Which of the alternatives would you prefer?

A. Win R$ 40 thousand for sure

B. Flip a coin. If it lands heads, you win R$ 100 thousand. If it comes up tails, you win nothing

The vast majority of people prefer to be certain of earning R$ 40 thousand, even though it is the mathematically less advantageous option, since the expected value of alternative B is R$ 50 thousand (50% x R$ 100 thousand).

Now consider the reverse situation. Which would you prefer?

A. Lose R$ 40 thousand for sure

B. Flip a coin. If it comes up heads, you lose R$ 100 thousand. If it comes up tails, you lose nothing


Now would you rather flip the coin and count on luck? Most people prefer alternative B in this situation. Note that the mathematics involved is symmetrical, so there is a logical inconsistency in preferring alternative A when the problem involves gains and alternative B when the same problem involves losses. Yet people consistently make decisions this way.
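For readers who want to check the arithmetic, here is a minimal sketch (not part of the original letter) of the expected-value calculation behind the two gambles, with amounts in R$ thousand:

```python
def expected_value(outcomes):
    """Probability-weighted average of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# Gain framing: a sure R$ 40 thousand vs. a coin flip for R$ 100 thousand or nothing.
sure_gain = expected_value([(1.0, 40)])               # 40.0
risky_gain = expected_value([(0.5, 100), (0.5, 0)])   # 50.0

# Loss framing: a sure loss of R$ 40 thousand vs. a coin flip for -R$ 100 thousand or nothing.
sure_loss = expected_value([(1.0, -40)])              # -40.0
risky_loss = expected_value([(0.5, -100), (0.5, 0)])  # -50.0

print(sure_gain, risky_gain)   # 40.0 50.0
print(sure_loss, risky_loss)   # -40.0 -50.0
```

In both framings, the sure option differs from the gamble's expected value by the same R$ 10 thousand margin, yet typical choices reverse between the two framings.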

It was to study this type of inconsistency in probabilistic decisions that Daniel Kahneman and Amos Tversky, his research partner, created Prospect Theory. Its main finding is that people feel the impact of gains and losses differently: losses are felt more intensely than gains of the same value. Because of this asymmetry, people are instinctively risk averse in situations involving potential gains and accept greater risk when the situation involves potential losses.

Anyone who has invested directly in shares has probably felt the effect of this bias in the following situation: when you buy a share and it rises a lot, the instinctive response is to want to sell it to crystallize the profit obtained so far. When a stock drops, your instinct is to want to hold on to it until the price returns to at least what you paid for it.

Another bias mapped by Kahneman and Tversky is that the perception of economic value is associated not only with the absolute amount itself, but with the variation that this amount represents in the total gain or loss. The larger the accumulated gain or loss, the less intensely you will feel an additional gain or loss of the same size. For example, if the profit on an investment goes from zero to R$ 50 thousand, the feeling of gain will be much greater than if you already have a profit of R$ 500 thousand and it increases to R$ 550 thousand, even though the additional amounts are exactly the same.
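Both findings can be illustrated numerically with the value function Kahneman and Tversky proposed in Prospect Theory. The sketch below (our illustration, not from the letter) uses the parameter estimates they published in 1992; the specific numbers matter less than the shape, which is concave for gains, convex for losses, and steeper on the loss side:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function with Tversky & Kahneman's 1992
    parameter estimates: gains are felt with diminishing sensitivity,
    and losses are weighted about 2.25 times as heavily as gains."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# Loss aversion: a R$ 40 thousand loss hurts more than an equal gain pleases.
print(abs(value(-40)) > value(40))                    # True

# Diminishing sensitivity: the first R$ 50 thousand of profit feels
# bigger than the step from R$ 500 to R$ 550 thousand.
print(value(50) - value(0) > value(550) - value(500))  # True
```

This is only an illustrative functional form; the qualitative pattern, not the exact parameters, is what the letter's examples rely on.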

In investments, this second finding compounds the harm of the first. When a stock rises, you tend to undervalue the chance of it rising a little more and prefer to sell quickly. When it falls, you tend to see little additional harm in it falling a little further and prefer to hold on in the hope that it will rise again. Thus, investors sometimes sell good investment theses too early (and commonly watch the stock rise well beyond the price of the hasty sale) and hold stocks that no longer have good prospects for longer than would be reasonable, in the hope of avoiding the psychological pain of crystallizing the loss.

These are just two specific examples from the long list of cognitive biases mapped by Kahneman. For those who want to read more about the subject, we discuss some other biases relevant to investment decisions in our letter “The flaws of the human mind”, from April 2022. However, the best source is the book “Thinking, Fast and Slow” itself, a read that is not only useful but also quite enjoyable.

How to deal with biases in investments

We have already commented that, just as optical illusions are inevitable, it is impossible to temporarily switch off your System 1 so that the cognitive biases it causes disappear. Even Kahneman, after decades dedicated to studying the field, admitted that he had not managed to eliminate his own biases. So what can be done to avoid cognitive errors in investment analysis? Here, we mix Kahneman's recommendations with our own experience.

Identifying the action of cognitive biases in real life is less obvious than it may seem at first glance. When there is an important decision to be made, it is normal for those involved to be focused on the decision itself, not on the cognitive failures their own minds may be subject to while thinking about it. Mitigating the chances of cognitive failure therefore requires a series of measures.

The first step is to study the cognitive biases that can interfere with investment decisions and the situations in which they typically manifest themselves, as it is very unlikely that you will notice their manifestation without a clear picture of the behavioral pattern you are trying to identify.

The second step is to make a conscious effort to remain vigilant for possible biases during real decision-making processes. Your memory will not always be complete and reliable enough for you to identify, from past situations, the biases that tend to interfere with your own decisions. This exercise is easier when working as a team, as biases are often identified more readily by an external observer than by yourself. Each person is more sensitive to certain biases, depending on their personality, and knowing which ones helps a lot to mitigate their effects.

The third step is to structure a formal investment analysis process that helps maintain the rigor and discipline needed for each decision to be as rational as possible. An excellent practice is to record all relevant points of each analysis in writing. Not only does this keep a history of analyses for evaluating mistakes and successes in retrospect; writing also enhances the capacity for structured reasoning. Just as it is much easier to perform mathematical operations on paper than in your head, it is easier and more reliable to carry out complex qualitative analyses by writing down each step of reasoning than by working purely from memory.

Even with a well-structured analysis process, constant attention is required to execute it with discipline. Since the predominance of System 2 depends on conscious effort, a moment of carelessness is enough for System 1 to take over and introduce some biased shortcut of reasoning that, in the moment, will probably seem completely appropriate.

The target that no one saw

Kahneman's contribution was so important because it brought clarity to a field full of subjectivity and conflict with classical economic doctrine. The premise that intelligent, qualified people make economic decisions in a purely rational manner survived so long in the academic world because it is, in fact, quite reasonable. The surprising realization is that, in reality, things are not like that.

The very notion that our instincts can lead us to bad decisions is not obvious. Popular culture goes in the opposite direction, presenting intuition as something that surpasses rationality and points us in the right direction at the most difficult moments. Indeed, the stereotype of the great investor in cinema is one who follows his instincts with courage and boldness, making big bets that quickly bring him a fortune. The real great investor is usually almost the opposite: someone who doubts their own instincts, analyzes each opportunity methodically and avoids, as much as possible, letting their emotional side influence their decisions.

“We all think we are much more rational than we really are. And we think we make our decisions because we have good reasons for doing so. Even when it's the opposite. We believe in the reasons why we have already made the decision.”

—Daniel Kahneman

Check out the comments from Ivan Barboza, manager of Ártica Long Term FIA, about this month's letter on YouTube or Spotify.

Would you like to sign up to receive our next letters?

Sign up for our newsletter


Invest with us

Before you leave, would you like to sign up to receive our upcoming letters?