I have always considered myself a rational and logically grounded person. My views of the world are always unbiased, and my decisions are consistently based on reason. And when I make mistakes, I can comfort myself by reminding myself that my intentions were good.
The truth is, as a business leader, I have made hundreds of irrational decisions in my career, some benign, some not so. Most of the time, these decisions felt logical to me. They didn’t feel as if emotions drove them. I was very confident in my thought process and convinced I had all the elements I needed in hand to make the right call every time.
So how could I make so many irrational decisions while being convinced of their rationality? That is the theme I would like to explore in this piece.
We live in a society that expects us to be rational
To make sure we are talking about the same thing, let’s agree on what “being rational” means. According to the Oxford dictionary, the adjective rational means:
Based on or in accordance with reason or logic
What is reason, you may ask?
the power of the mind to think, understand, and form judgements logically
For ancient Greek philosophers, reason was considered a virtue. It was regarded as an ideal to strive for, not as a given of our human condition. Philosophers would dedicate their lives to its pursuit, knowing that they would likely never quite master it.
But apparently, we came to implicitly agree as a society that we should expect each other to be rational and act logically every day. Letting one's emotions be the primary driver of behavior is perceived as juvenile. We expect adults to control their emotions and use reason when making decisions.
It’s even more striking in the world of business. Businesses used to be human enterprises, but they have evolved into cold, logical money-making machines. Many companies expect all their employees to have developed Greek philosopher levels of rational thinking. For these companies, there is no place for emotions in the workplace.
I find it quite paradoxical that we expect ordinary people to excel at things that even erudite scholars found hard to do.
Have we discovered the secret sauce to rational thinking in the last two thousand years? I have serious doubts.
But more importantly, why is rational thinking so hard in the first place? We are cognitively advanced sentient beings capable of inventing quantum physics, bitcoin, and curling. How hard could that be in comparison?
Who’s in charge? Emotions, intuition, and System 1
The world makes much less sense than you think. The coherence comes mostly from the way your mind works — Daniel Kahneman, Thinking, Fast and Slow
The world is very complex, and doing a complete analysis of a situation before making any decision would overwhelm our brains and paralyze us. To alleviate this, we have developed over time powerful cognitive shortcuts that allow us to make quick decisions so we can at least feed ourselves and maintain a bare minimum of social interactions without using too much brainpower. We call them emotions.
The part of our brain dealing with emotion is called the limbic system. It includes the hippocampus, the hypothalamus, the amygdala, and the thalamus (although experts disagree on exactly which structures belong to it). You have probably heard about the amygdala and its role in processing emotional responses such as the fight-or-flight response.
We usually associate it with more primal emotions such as fear, anxiety, or aggression, and truth be told, it has kept us at arm’s length from dangerous situations.
But emotions can be much more complex, and so can their corresponding responses. Unfortunately, our amygdala won't give us any explanation when it triggers such a response. It's not supposed to. It doesn't have the time for that. When there is a lion in front of you, you run, and you'll thank your amygdala for the instant adrenaline kick.
Now, not every decision is a life or death situation. How can our emotions influence our decisions when there is no urgency?
In the Harvard Business Review article “Why Good Leaders Make Bad Decisions,” the authors mention a concept called emotional tagging.
Emotional tagging is the process by which emotional information attaches itself to the thoughts and experiences stored in our memories.
It is an extremely efficient decision processing mechanism. We recognize a particular pattern of information within our memory, and we tag that pattern with an emotion. Any time we encounter that same pattern, the corresponding tagged emotion gets triggered.
For example, I eat raw shrimp and get sick. An association gets created between raw shrimp and disgust. Next time I see a raw shrimp, I’ll probably feel nauseated.
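As a toy illustration only (nothing like how the brain actually stores anything), emotional tagging behaves like an associative lookup: an experience writes an emotion against a pattern, and later encounters retrieve it instantly, with no reasoning step in between. The function names here are hypothetical, purely for the sketch:

```python
# Toy model of emotional tagging: experiences tag patterns with emotions,
# and later encounters with the same pattern retrieve the tag instantly.
emotional_tags = {}

def experience(pattern, emotion):
    """Store an association between a recognized pattern and an emotion."""
    emotional_tags[pattern] = emotion

def encounter(pattern):
    """Return the tagged emotion, if any, with no deliberation involved."""
    return emotional_tags.get(pattern)

experience("raw shrimp", "disgust")
print(encounter("raw shrimp"))     # disgust
print(encounter("cooked shrimp"))  # None: no tag, so slower thinking can engage
```

The point of the sketch is the retrieval step: the tagged emotion comes back before any analysis happens, which is exactly what makes the mechanism both efficient and easy to distort.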
As powerful as this mechanism is, it is susceptible to distortion. The article describes three common red flags:
- self-interest: we naturally place more emotional importance on information that interests us, making it easier for us to see the patterns we want to see
- emotional attachments: our judgment can be affected by our attachment to people, places, things, etc. And we’ll tend to favor the things to which we are attached
- misleading memories: memories of a particular event tend to change over time, sometimes quite significantly. We are, however, not conscious of such alteration
Nothing is more responsible for the good old days than a bad memory – Steven Pinker, Enlightenment Now
So emotions are in the driver's seat, but we are also capable of logical reasoning. How does it all tie together?
System 1 and System 2
In his book “Thinking, Fast and Slow,” Dr. Daniel Kahneman, recipient of the Nobel Memorial Prize in Economic Sciences for his work on behavioral economics, proposes a compelling model to explain the duality of intuition and logical reasoning.
He suggests that we have two cognitive systems that he calls System 1 and System 2. These systems are not actual parts of our brains. They could be pictured as two independent circuits, working in parallel. System 1 is responsible for intuition and fast decision-making, and System 2 is in charge of more involved cognitive operations that rely on complex logic. For example, adding 2 + 2 or brushing your teeth can be managed by System 1, but multiplying 12 x 17 or reciting the alphabet backward requires System 2.
System 1 is fast, automatic, and effortless. System 2 is slow, deliberate, and physically demanding. Studies of subjects performing complex tasks that engage System 2 have consistently observed the same physical signs of effort: pupil dilation, elevated blood pressure, and increased heart rate.
So, as you can imagine, we tend to rely on System 1 whenever possible. The vast majority of the decisions we make every day are handled by our beloved System 1. We popularly call that being on autopilot.
What we don’t realize is that because of its automatic nature, our System 1 is making a lot of the calls for us.
System 1 excels at pattern recognition, categorization, and simplification. Left unattended, it will happily convert complex problems into much simpler and more predictable patterns. Once System 1 identifies a specific pattern, it will instantly retrieve a solution from its database and call it a day. The better the match, the stronger the conviction. And when System 1 is convinced it has solved the problem, it will bypass System 2. “No need to spend energy on this, I know what I’m doing!”
That all sounds very convenient. What could possibly go wrong?
Heuristics and other cognitive illusions
Ultimately, decision-making is about evaluating the likelihood of something good or bad happening. We need mechanisms to assess the probability of uncertain events. Dr. Kahneman’s research highlighted how we tend to rely on a minimal number of heuristics to reduce such assessments’ complexity.
Without listing them all, below are some of my favorites.
Our System 1 likes to convert complex problems into simpler ones for which it has a ready-made solution.
This is the essence of what Kahneman calls substitution: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.
You probably find planes scarier than cars, and you are probably more worried about terrorism than food poisoning. In both cases, you would be statistically wrong. This is the availability heuristic: we tend to assign more likelihood to events for which more information is available to us.
People tend to assess the relative importance of issues by the ease with which they are retrieved from memory – and this is largely determined by the extent of coverage in the media — Daniel Kahneman, Thinking, Fast and Slow
Are you hungry? What do you want to eat for dinner? How about SO_P?
If I asked you to complete the word SO_P, you would most likely put a U in the empty slot. But if I had instead asked you when you last took a shower and what you use to wash your body, you would probably have put an A. This fascinating phenomenon is called priming, and it takes place in many different forms and situations.
A related effect is anchoring. It occurs when we consider a value for an unknown quantity before estimating that quantity: our estimate will be influenced (anchored) by that first value. That's why you always want to give your number first in a negotiation!
Hindsight and why we believe we are logical
Overconfidence is fed by the illusory certainty of hindsight — Daniel Kahneman, Thinking, Fast and Slow
We were not designed with a change log. Every time we change our views on a particular topic or belief, we lose the ability to recall the views we held before changing our minds.
Because we can’t faithfully reconstruct our past beliefs, we underestimate how surprised or uncomfortable past events may have made us. This creates a sense of inevitability: we knew it all along.
This phenomenon is called hindsight bias and explains why many things seem to make sense to us. It may also explain why we are convinced that some of our decisions were completely logical, even if they were purely emotionally driven.
Nah, I’m pretty sure I wouldn’t fall for this.
You do not believe that these results apply to you because they correspond to nothing in your subjective experience. But your subjective experience consists largely of the story that your System 2 tells itself about what is going on — Daniel Kahneman, Thinking, Fast and Slow
What can we do about it? Waking up our System 2
As we’ve seen, when it comes to rational thinking, we are not really set for success. Is there anything we can do to improve our odds?
When it comes to decision-making and reducing the influence of emotions, biases, heuristics, and other mischief of System 1, Kahneman offers a simple but powerful recommendation: delay decisions. It sounds overly simplistic, but postponing a decision is the most reliable way to make sure emotions don't dictate it.
Not expecting “simple” solutions
Looking for simple solutions and shortcuts is the playground of our System 1, and it will be happy to oblige on any occasion. If you find a very simple (and often elegant) solution to a problem, take a moment to play devil's advocate and try to find situations where your solution wouldn't work. You would be surprised how hard it is to argue with your System 1 once it has made up its mind!
A corollary of not expecting simple solutions is to make problems harder (or at least “look” harder) in order to forcefully engage our System 2. One study showed that printing a problem in a hard-to-read font significantly reduced the chances of falling into its trap; problems such as “if five machines can make five items in 5 minutes, how many items can 100 machines make in 100 minutes?”
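The trap in that puzzle is substituting the pattern (“five, five, five… so one hundred”) for the arithmetic: five machines making five items in 5 minutes means each machine makes one item every 5 minutes. Spelling it out, with a hypothetical `items_produced` helper of my own:

```python
# Each machine produces 1 item per 5 minutes (from "5 machines,
# 5 items, 5 minutes"), and machines work independently in parallel.
def items_produced(machines, minutes, rate_per_machine_minute=1 / 5):
    """Total output at a fixed per-machine production rate."""
    return int(machines * minutes * rate_per_machine_minute)

print(items_produced(5, 5))      # 5: reproduces the setup
print(items_produced(100, 100))  # 2000, not the intuitive 100
```

Working the numbers explicitly is precisely the kind of System 2 engagement the harder-looking presentation is meant to force.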
Reflecting on our actions and how we feel about them is a challenging but powerful way of increasing our self-awareness. More specifically, we want to reflect on:
- whether we may have made a decision unconsciously
- how we feel before and after a decision
- how our feelings could have influenced said decision
Yes, the process is as laborious as it sounds.
It is much easier to spot inconsistencies in others' judgment than in our own. The best way to get a full picture of a situation, including our blind spots, is to ask for feedback.
… two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness — Daniel Kahneman, Thinking, Fast and Slow
Creating an environment of safety
Our amygdala can't tell the difference between physical and psychological aggression. The more stressful our environment, the more likely our emotions will take full control of our decision-making.
There’s no room for facts when our minds are occupied by fear — Steven Pinker, Enlightenment Now
Promoting safety in our working environments can reduce perceived aggression and keep our amygdala at bay. Such settings are therefore more conducive to rational thinking.
But is it really what we want?
As we’ve seen, rational thinking is not an innate skill and requires both an acute awareness of how our emotions influence our decisions and a will to do something about it.
I agree that developing a capability to read our emotions and feelings is beneficial and can help us understand who we are and what is important to us. But I also think that there are times when letting our emotions decide for us may be preferable.
Creativity is one thing that comes to mind. Many creative decisions are inspired by intuition and driven by emotions. There is nothing rational about going somewhere you’ve never been before. Sure, expanding our horizons can be rationally explained as a survival mechanism, but why choose a specific direction over another?
Another one is empathy. We are social animals and need to feel that we belong to a group. We have developed a complex intuitive framework that allows us to gauge in real time how well integrated we are into our group. These intuitions shape many of our daily behaviors and judgments. For example, we can't really explain it, but we definitely “feel” it when someone is socially awkward.
Lastly, decisions purely driven by rational thinking are usually not the most humane. No need to look further to see why many companies alienate their people. Ironically, these businesses also expect their employees to be creative and innovative without letting them be human. Unfortunately, that's not how we work. We are only human.
What I’ve learned
- Emotions have served us well for hundreds of thousands of years, but they haven’t had the time to adapt to the complexity of our social interactions and norms. The cognitive shortcuts that we call emotions, biases, and heuristics are now working against our sociological interests
- I’m severely biased, and simply being aware of my biases is not enough to protect myself from them
- Our brains haven’t developed any incentive mechanism for rational thinking. Being rigorously rational is a tiresome endeavor, and we’re not even hormonally rewarded for it
- Stressful environments are not conducive to rational thinking, let alone rational decision-making. We need to feel safe if we want to think straight
- Emotions make us human. Maybe our societies should be more emotional and less rational
- TL;DR: It’s not rational to expect others to be rational
References
- “Thinking, Fast and Slow”, Daniel Kahneman: https://www.amazon.com/Thinking-Fast-Slow-Daniel-Kahneman/dp/0374275637/
- “Why Good Leaders Make Bad Decisions”, Harvard Business Review: https://hbr.org/2009/02/why-good-leaders-make-bad-decisions
- “Enlightenment Now”, Steven Pinker: https://www.amazon.com/Enlightenment-Now-Science-Humanism-Progress/dp/0143111388/
- “The Leading Brain”, Friederike Fabritius: https://www.amazon.com/Leading-Brain-Neuroscience-Smarter-Happier/dp/0143129368/
- “Heuristics and Cognitive Biases”: https://www.verywellmind.com/what-is-a-heuristic-2795235