When our financial decisions don't fit what we think is rational
In the last post, we introduced a learning journey on emotions and cognitive biases in financial decision making. There we focused mainly on the question of whether financial decisions should, or even can, be made entirely without emotional input. We pointed to findings of contemporary neuroscience indicating that it is not very meaningful to consider decision making processes of any kind as totally detached from the brain's emotional processes. In this post we dig deeper into such findings.
Difficulty in understanding emotions
Understanding our own emotions does not always come as easily as we might expect. Although we all know how it feels to be sad, afraid, or joyful, it is often quite difficult to describe what the feeling "inside us" is actually like. Interestingly, the descriptions we find easiest to make often tell more about what our body is feeling than about our internal mental state, and an important part of modern theories of emotion concerns itself precisely with what our bodies are doing when we experience emotions. We will return to this later, when we look more closely at some of the field work in our research funded by the European Commission - part of which involved using sensors to observe traders' physiology during financial decision making. I would also mention that parts of the course we have developed focus on labeling emotions, a useful exercise when learning about emotion regulation, which is central both to our research and to the course.
Screenshot from feedback video in Part 1 of the course.
Decision making biases
Since observing and understanding our emotions is not always as straightforward as we might like, another primary approach to studying decision making is to look at how it deviates from what we would expect if asked to describe, coolly and analytically, how our decisions should be made. Such deviations are often called cognitive biases, a concept most successfully introduced by the late Amos Tversky and his close collaborator, Nobel Prize winner Daniel Kahneman (author of the best-selling book Thinking, Fast and Slow). In the course, we have also chosen to call these decision biases (or decision making biases) at points where we wish to emphasise biasing in a particular decision making context rather than a bias in cognition per se. This is not a taxonomical distinction, however, but a rhetorical one - the two terms may be considered synonymous in our usage.
As mentioned in the last post, in certain settings much of our decision making must occur under severe constraints on cognitive effort, both in time and in "space" (for instance, our ability to perform complex computations or recall facts from memory). Under such circumstances, researchers have identified and explored a host of mental shortcuts through many interesting and rigorous experiments. They have shown that when these so-called heuristics are deployed, the outcomes in many settings do not conform to the norms of rational decision making. It is in such settings that we often come to regard our decisions as "irrational". However, Kahneman and many other important theorists use a different term for a model of decision making of this kind: bounded rationality, coined by another Nobel laureate, Herbert Simon. The adjective "bounded" refers to the limitations on the use of information described above.
Although there is important and illuminating debate about how we should think about bounded rationality, the key takeaway here is this: while our decision making may not look rational from the perspective of theories of rational choice as a kind of optimisation, it is often part of a process deployed precisely where such optimisation cannot be accomplished, owing to constraints on our mental processing. Better understanding some of the shortcuts we use can shed light on the biases that arise from them.
Screenshot from feedback video in Part 1 of the course.
Checking ourselves during decision making?
In the course we have developed, we present one key bias that has been shown to be highly prevalent in decision making at large: the disposition effect, the propensity to realise profits more readily than losses. Kahneman and others consider the disposition effect an example of narrow framing, the phenomenon of evaluating a risky prospect in isolation rather than in relation to the other risky prospects we face. Understanding this mechanism is potentially useful, since there are approaches for reappraising the importance of a prospect in the greater scheme of things - but how do we do that? How do we check ourselves in the middle of a perspective that seems to arise spontaneously in the course of decision making under risk? If it were as easy as simply knowing about such biases, shouldn't we be able just to use that knowledge?
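To make narrow framing concrete, here is a small sketch of Tversky and Kahneman's classic demonstration (the numbers are from their well-known concurrent-decisions experiment, not from our course materials): people asked to choose within each pair in isolation tend to pick A over B and D over C, yet the combination B+C strictly dominates A+D - something visible only when the prospects are evaluated together.

```python
def combined_outcomes(sure_amount, gamble):
    """Combine a sure payoff with a gamble given as (probability, payoff) pairs."""
    return [(p, sure_amount + payoff) for p, payoff in gamble]

# Pair 1: A = sure gain of 240; B = 25% chance of 1000, else nothing.
# Pair 2: C = sure loss of 750; D = 75% chance of losing 1000, else nothing.
A, C = 240, -750
B = [(0.75, 0), (0.25, 1000)]
D = [(0.75, -1000), (0.25, 0)]

a_and_d = combined_outcomes(A, D)   # the popular choice, taken together
b_and_c = combined_outcomes(C, B)   # the unpopular choice, taken together

print("A+D:", a_and_d)   # → A+D: [(0.75, -760), (0.25, 240)]
print("B+C:", b_and_c)   # → B+C: [(0.75, -750), (0.25, 250)]

# B+C yields a better payoff in every state of the world, so the
# popular pairing is dominated - a consequence of framing each
# prospect narrowly instead of considering them jointly.
```

Each choice looks reasonable on its own; only the joint view reveals the inconsistency, which is the reappraisal that narrow framing prevents.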
Or isn't there a straightforward mechanical way to manage this?
I remember, years ago, hearing a market maker on the trading floor refer facetiously to a new kind of order that was not yet known to me. We all know about day orders and orders like "trailing stops" (usually used to protect profits) and "good-till-canceled", but in a steadily falling market he laughingly described the changes he was seeing in the order book as "move-when-near" orders: stop orders that traders pushed further and further down ahead of the approaching fall in the asset's price - in short, losses they refused to realise even though they had originally placed the order for exactly that purpose, as a "stop loss"!
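The anecdote above can be sketched as a toy simulation. Everything here is illustrative - the prices, the `buffer` and `step` parameters, and the `run_stop` helper are all made up to show the behaviour, not taken from any real order book:

```python
def run_stop(prices, stop, move_when_near=False, buffer=2.0, step=5.0):
    """Walk through a price path and return the price at which the
    stop order fires, or None if it never does."""
    for price in prices:
        # The "move-when-near" trader lowers the stop whenever the
        # price gets within `buffer` of it, dodging the exit.
        if move_when_near and price <= stop + buffer:
            stop -= step
        if price <= stop:
            return price          # the loss is finally realised
    return None                   # position still open, loss unrealised

falling = [100, 98, 96, 94, 92, 90, 88, 86]

print(run_stop(falling, stop=95))                       # → 94 (disciplined exit)
print(run_stop(falling, stop=95, move_when_near=True))  # → None (never exits)
```

The disciplined trader takes the small loss at 94; the "move-when-near" trader rides the whole decline without ever realising the loss - precisely the disposition effect defeating the instrument designed to guard against it.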
Indeed, in the introduction to the publishable summary of our final report on our research, we state the following:
Much financial training has, to date, focused primarily on imparting propositional knowledge and increasing
people’s understanding. However, investors may have appropriate knowledge, but despite this go on to be ruled
by their attitudes, habits, or emotional states. Emotions mediate both rapid expert situation recognition and the
application of expert intuition but also important persistent biases in decision-making such as framing effects
and the disposition effect in particular.
And yet there are many well-adapted traders, the aforementioned deployers of rapid expert situation recognition and expert intuition, who seem to have found their way in the face of the many biases that loom large in financial decision making. Our research focused on this kind of expertise and on possible approaches to supporting it and imparting it to more novice decision makers.
In the next post, we will look a little closer at how our research investigated this question of expertise and what we found.