How Kahneman, Tversky, Taleb and others changed the way we think about rationality and what it actually means in light of modern cognitive psychology.
What comes to mind when you hear rationality? Who comes to mind when you hear rational? For most of us, the rational is the counterpart of the emotional; the rational is the profit-oriented businessman, the defender of cold, logical reason, like Spock in Star Trek.
Continue reading if you are willing to embark on a quick trip into rationality and to learn what it really means in light of the modern cognitive sciences. It is, surprisingly, not what you expect, and there is much we can learn for our practical lives from a new understanding of rationality.
Why this matters
I believe we can agree that in today’s world, many things are run by rationality. Competitive markets reward reason and logical analysis over compassion and emotional inference. More than that, though, rationality matters for each and every one of us, because it is the scalpel that allows us to dissect an interwoven and sophisticated world into supposedly meaningful chunks. Rationality allows us to navigate the sea of future possibilities, which is to say our life. As Ray Dalio, founder and CEO of the most successful hedge fund to date, puts it: a successful life is the result of good decisions. Thus we ought to be interested in how to make the best decisions, which is the art of rationality, I suppose.
Any process of evaluation or analysis that may be called rational is expected to be highly objective, logical and “mechanical”.
The classical view of rationality, especially among its critics, is one of cold analysis and decision-making devoid of emotion. This is true for the perfect definition of rationality as in mathematical logic, but flawed when we try to apply it to human beings. After all, human cognition cannot be split into two clean parts, one rational and the other emotional. Emotions are the result of bodily functions and circumstances, just as our ability to think in abstract concepts is.
In the following article I want to introduce you to the notion that rationality in the clean textbook definition does not exist in humans.
What, then, is rationality in humans?
Herbert Simon, a Turing Award laureate, introduced the term bounded rationality, which limits the demand to be rational as a human to what is computationally feasible given our cognitive limitations. That is to say, we must replace the ultimate, all-considering version of rationality, the homo economicus, with bounded rationality, because a human being is never able to consider all variables and every possible scenario. Thus our rationality is bounded by our intellectual abilities, according to this school.
Nassim Nicholas Taleb, author of the INCERTO book series, takes this further and puts it in a more relative manner:
“The only definition of rationality that I found that is practically, empirically, and mathematically rigorous is that of survival” – Nassim Nicholas Taleb
Thus everything that promotes the survival of the human, the species or the planet can be understood as rational. Note that this possibly also incorporates religion, mysticism or spiritual beliefs, for they might offer the framework for an individual to do well. Take religion, which through its clear rules and social customs taught us to avoid many dangers that lie hidden in a world without tradition, long before the laws of nation states took hold.
It is essential to remark that rationality does not imply virtues; rather, it is the way by which we achieve those virtues. The often mistaken substitute for rationality in economics is rent-seeking or profit-seeking. This, however, is too narrow-minded, for “There is nothing irrational, according to economic theory, in giving your money to a stranger, if that’s what makes you tick. And don’t try to invoke Adam Smith: he was a philosopher not an accountant; he never equated human interests and aims to narrow accounting book entries.”, as Taleb puts it.
In this sense, to be rational is to be congruent with your goals. Rationality does not imply good or bad, for only evolution and time will tell what is ‘good’ and ‘bad’ with respect to our survival. Also, the notion of rationality I want to elaborate on only applies to deeds, never to thought. This is a practical distinction, for we are never able to analyze what is in someone’s head. Even that which is conveyed by language is possibly skewed, so we rely on the actions taken under real circumstances. That is, as Taleb puts it, actions taken while there is skin in the game. Skin in the game, formally, means that he who takes an action has to face the consequences, positive and negative. Rationality, Taleb submits, can only be studied under real circumstances, where there is something at stake for the subject of discussion; otherwise the human mechanisms of decision are altered beyond recognition, as we will see.
We have now set the terms of rationality by discussing different viewpoints of it. In the following, I want to look at bounded rationality, in particular the systematic errors that arise from our limited cognitive abilities. There are interesting phenomena to be found.
The Rationality Mafia – Kahneman, Taleb, Yudkowsky and the fallacies of thought
Enter the Rationality Mafia, a term I coined for the students of rationality who started to rip rationality in the old sense apart. Tracing the roots of these thoughts is beyond the scope of this article, which is why I want to focus on some of the most prominent contributors. This summary is neither sufficient nor complete, but it gives an idea.
“The systematic experimental study of reproducible errors of human reasoning, and what these errors reveal about underlying mental processes, is known as the heuristics and biases program in cognitive psychology.” – Eliezer Yudkowsky
What are heuristics? And biases? A heuristic, simply stated, is an approximate solution to a sophisticated problem; it is basically a more or less informed guess. It is closely related to bounded rationality, as we will see: using heuristics is the art of making good decisions based on limited information, when the full information is either incomprehensible or simply not available.
A bias is a statistical term which roughly tells us that when making a decision, we consistently miss the target because we tend in a specific direction. We are biased. Put another way, a bias towards A in a decision between A and B tells us that, without any further information or background, we would decide for A.
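To make the statistical sense of “bias” concrete, here is a small Python sketch. The dice setup and the two estimators are a toy illustration of my own, not anything from the studies discussed in this article: the naive variance formula (dividing by n) is a classic biased estimator, one that consistently misses the true value in the same direction.

```python
import random

random.seed(42)  # reproducible toy experiment

def biased_variance(xs):
    """Naive sample variance: divides by n. Systematically too small."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def unbiased_variance(xs):
    """Bessel-corrected sample variance: divides by n - 1."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# True variance of a fair six-sided die: 35/12, roughly 2.917.
trials = [[random.randint(1, 6) for _ in range(5)] for _ in range(20000)]

avg_biased = sum(biased_variance(t) for t in trials) / len(trials)
avg_unbiased = sum(unbiased_variance(t) for t in trials) / len(trials)

# The biased estimator hovers around 2.33, not 2.92: it doesn't just
# scatter around the target, it consistently "tends into a direction".
print(avg_biased, avg_unbiased)
```

The point is the direction of the miss: more data does not cure a bias, it only makes the estimator miss the same way more reliably, which is exactly the worry with biased human judgement.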
Let me introduce you to little Danny. When little Danny, a Jewish refugee of the Second World War, arrived in Palestine in 1946, shortly before the founding of the Jewish state, it was already clear that he had a special relationship to people. His thinking was analytical when he tried to understand what makes them do the things they do. The young boy went on to study psychology, and to change the world. For the theory he developed with his close friend and soulmate Amos Tversky affects every single human being on earth, whether we like it or not. Most, as little Danny a.k.a. Daniel Kahneman had to learn over time, don’t like it. Essentially, the two psychologists proved in a rigorous scientific manner that even – or rather especially – experts are prone to systematic errors in decision making under uncertainty.
Now let’s get concrete! What did Kahneman and Tversky find that is so shocking?
The breadth of their research is enormous, covering a multitude of heuristics and biases, of which I will only cover a few for you to get a sense of the matter. You can find a comprehensive list here.
Also, the book Thinking, Fast and Slow by Kahneman gives a great overview of the subject with elaborate explanations, studies and anecdotes.
Anchors, Priming and our Relationships to People
Kahneman and others showed the effects of something that became known as anchoring. A psychological anchor is pretty much what you think it is: an anchor. Imagine a negotiation where Jon and Paul are trying to find a price for, say, a pro laptop. Jon knows about anchoring, Paul doesn’t. So Jon throws an anchor by stating that he’s only willing to pay $300, which is ridiculously low and shocking for Paul. Yet the psychological anchor works, in that Paul is now much more likely to settle for $1,000, which, in comparison, is much more than $300, but still much less than the laptop is probably worth. An anchor can, surprisingly, be totally independent of the subject matter.
Studies on this were conducted in which respondents were asked to estimate the percentage of African countries in the United Nations. They had to turn a lottery wheel before they answered, which was manipulated to always yield either 10 or 65. Although the respondents were made aware that the two tasks of turning the wheel and answering the question had, obviously, nothing to do with each other, the ones who saw the 10 guessed around 25% while the ones who saw the 65 guessed around 45%.
That, my friend, is anchoring. We are, literally, drawn to it.
Closely related to this effect is priming, of which anchoring is actually a special case. Priming is basically about setting the subconscious frame, or lens, through which we see the world. Studies were performed in which people looked at a screen where the researchers presented a word for a fraction of a second – so short that none of the respondents could consciously read anything or even grasp that a word in their language had been shown. However, the subconscious was faster, and it knew more.
The words that flickered on the screen, unreadable for the conscious mind of the participants, differed between two groups. One set carried positive emotion (love, compassion, peace) while the other carried negative emotion (hate, disgust, insult). The respondents were then shown ambiguous sentences and had to interpret their meaning. Those who were subliminally primed for negativity interpreted the sentences, who would have guessed it, negatively. All that, remember, without being aware of what words they had seen previously.
Another funny example is that people primed for old age (using words like old, senior, care home, etc.) started to move more slowly. And the effect works the other way around as well: groups told to move deliberately slowly for a few minutes later recognized words associated with old age more easily than those who were not. It sounds like a mind trick, but it is the result of our mind working by association, almost all the time.
The study of priming and anchoring tells us that our subconscious often knows more than we do, and that our decisions aren’t all that rational all the time. Just how irrational, in an economic sense, it can get is the content of Prospect Theory, the paper that later earned Kahneman the Nobel prize in economics.
Prospect Theory – Market Decisions aren’t rational
One of the pillars of classical economic theory is the concept of the rational agent. The theory assumes that human actors in the market decide and act rationally. Kahneman and Tversky have shown that this is plainly wrong – not only in the sense of bounded rationality, where we say that humans are only as rational as their ‘capacity’ allows, but in the sense that humans actively contradict themselves in specific cases.
Studies were conducted in which the respondents had to decide between two financial options:
Problem 1 – A: (4,000, .80) or B: (3,000, 1.0)
Problem 2 – C: (4,000, .20) or D: (3,000, .25)
The notation means that in Problem 1 you can decide between A) $4,000 with 80% probability or B) a sure $3,000; Problem 2 reads the same way. Think about it for a second: how would you decide in Problem 1 and in Problem 2?
My guess: Problem 1: B, Problem 2: C. Right?
Viewed rationally, however, you can prove that by deciding for B and C respectively, you contradict yourself: in both problems the second option is 25% more likely than the first (1.0/0.8 = 0.25/0.20 = 1.25), just on a different scale. So essentially the two problems are the same, with the probabilities merely rescaled, which should not affect our rational judgement.
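The arithmetic is easy to check for yourself. Here is a minimal Python sketch, using only the payoffs and probabilities from the two problems above:

```python
# The four gambles as (payoff, probability) pairs, as in the problems above.
A = (4000, 0.80)
B = (3000, 1.00)
C = (4000, 0.20)
D = (3000, 0.25)

def expected_value(gamble):
    payoff, probability = gamble
    return payoff * probability

# A pure expected-value maximizer prefers A over B and C over D:
# A yields 3200 vs. 3000 for B; C yields 800 vs. 750 for D.
print(expected_value(A), expected_value(B))
print(expected_value(C), expected_value(D))

# Problem 2 is Problem 1 with all probabilities scaled by 1/4, so the
# likelihood ratio between the options is identical in both problems:
print(B[1] / A[1], D[1] / C[1])  # both 1.25
```

Whatever your risk preferences, consistency demands the same choice pattern in both problems (A with C, or B with D); the typical B-with-C pattern is the contradiction.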
What Kahneman and Tversky have shown in Prospect Theory is that the value we attribute to money, in gains as well as in losses, is not a linear function of the actual amount. Which is to say that we dread losses more severely than we appreciate gains, and that we value certainty more highly than a mere increase in probability.
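The shape of that value function can be sketched in a few lines. The functional form and the parameters below (alpha for diminishing sensitivity, lambda for loss aversion) are the estimates commonly cited from Tversky and Kahneman’s 1992 follow-up work, not exact figures from the original paper:

```python
ALPHA = 0.88    # diminishing sensitivity: gains and losses flatten out
LAMBDA = 2.25   # loss aversion: losses weigh about 2.25x as much as gains

def value(x):
    """Subjective value of a gain or loss x, measured from a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A $100 gain feels like roughly 57.5 units of value...
print(value(100))
# ...but a $100 loss feels like roughly -129.5: more than twice as painful.
print(value(-100))
```

The kink at zero is what makes losses loom larger than gains; the certainty effect comes from a separate probability-weighting function in the full theory, which this sketch omits.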
Why that is
After seeing that we are in fact rather poor rational decision makers, and that many of our supposedly rational decisions are not rational at all but rather biased guesswork of our subconscious, we are left with the question: why?
Kahneman, in Thinking, Fast and Slow, develops a compelling model that helps to understand our mode of operation: System 1 and System 2. System 1 is the ‘subconscious’ that makes quick, intuitive decisions based on association (which is why priming works), while System 2 is the conscious, deliberate mind, which can think logically and draw reasonable conclusions. Now the point Kahneman develops is this: System 2 is costly and hard to activate, while System 1 is our default mode, our idle state so to speak. And humans are essentially energy savers, for evolutionary reasons. Having a brain as big as ours is already a burden for an animal living in the wild. Thus using it as little as possible, or rather only using it when required, is essential. We use System 1 about 90% of the time. Put differently, only 10% of our decisions are actually made by our conscious, ‘rational’ mind.
Studying and reading about the effects and severity of biases in human cognition makes one ask: how can it be that we operate on a system so skewed and prone to failure? The thing is, we don’t. Using heuristics and relying on our intuition works perfectly fine, most of the time. Imagine you had to deliberately consider every single decision you make during your day, from tying your shoes, to the way to work, to the distance you have to push the brake pedal to stop your car at the stop light. Your head would explode. Thus, having an automatic System 1 is crucial. It is just that we are moving into a world that is more and more alienated from the natural environment our organism is used to – a world of spreadsheets and statistical probabilities for which our hunter-gatherer brain is not tuned.
It’s for a good reason that Daniel Kahneman always introduces himself as a pessimist.
Nassim Nicholas Taleb
With his background in quantitative trading and statistics, Nassim Nicholas Taleb is well equipped to write about decision making under uncertainty, and he does so in his very peculiar, controversial manner. He made it his passion to stick his finger into the wounds of those who fall for unconscious biases, and after reading his books one is likely to come away with the impression that everyone besides Taleb is just too stupid to act rationally.
But Taleb has a point, a good one, for which it is crucial to fight, and to fight hard. As we discussed above, knowing about biases and the flaws of heuristics in theory isn’t enough. In fact, it can make you even more prone to erroneous thinking. That is why Taleb is ‘raging’ on and on against those who still don’t understand the implications of his work.
Unlike Kahneman, Taleb has focused on decision making in light of the very unlikely, a.k.a. black swans. Black swans are the unknown unknowns: events with enormous impact that cannot be predicted or anticipated, only prepared for. In his book of the same name, Taleb elaborates on a few biases that essentially mirror our inability to deal with the highly unlikely in a rational manner. In his critique he basically tears apart large parts of the economic profession, especially traders like himself, for they fall for the hindsight bias and the narrative fallacy – our tendency to confabulate stories around events so that they make sense, and the belief that history can tell us something meaningful about the future.
Taleb’s writing is more philosophical essay than scientific argument, but surely worth reading. In The Black Swan you can also learn more about Mediocristan and Extremistan, other creations of Taleb’s for classifying the world.
Eliezer Yudkowsky
Yudkowsky has written hundreds of blog posts on the topic of rationality that are now collected in his book Rationality: From AI to Zombies, whose title hints at his own profession: artificial intelligence. As a decision theorist he discusses the consequences of AI in light of possible cognitive biases. That is to say, he tries to make sure we don’t stumble into the worst-case scenario of machine intelligence because our thinking is biased. A lot of his ideas on rationality are also conveyed in the beautifully written and entertaining fan-fiction version of Harry Potter: Harry Potter and the Methods of Rationality.
After having read Kahneman, the concepts Yudkowsky elaborates on aren’t new. In fact, they are pretty much directly imported from Kahneman’s work Judgment Under Uncertainty: Heuristics and Biases, written together with Tversky and Slovic. Yudkowsky, however, adds many considerations to the discussion and gives meaningful real-life examples beyond the constructed case studies.
What to make of it
It is unlikely that our brains will adapt quickly enough to handle the requirements of the world we are moving into – a world where probabilistic information, handled and generated by computers, controls almost everything.
We are not intuitive statisticians
This is the message of this article. We are not as rational as we think, and we are even worse when dealing with statistics. Our brain is just not made to think in these abstract concepts.
So for us at ProzessBasis, practically oriented as we are, the question arises: what can we do about it?
The answer that management psychologists came up with is this: systems. Our System 1 is bad at guessing and judging based on probabilities, but our System 2 can learn the formulas and rules. The only way to avoid making critical mistakes, then, is to activate System 2.
This is easier said than done, and a lot of research is still required to provide frameworks for ‘deliberate thinking’, but it is a path worth going down. The consequences are too fatal, the stakes too high and the opportunities too promising not to follow this track.
Be rational about rationality
So as you go ahead, stay on the lookout for biases and heuristics. You will be quick to find them in others if you’ve read Kahneman or Yudkowsky, but you’ll hesitate, naturally, to find them in yourself. Read Taleb to stumble over more of your own little flaws, and yet finish the book thinking that you are now better off than those who haven’t read it.
The only things that help are consistent training and the awareness that there is likely always some form of irrational judgement involved. Thus, as Taleb has said, at least try to be rational about your rationality.
Further reading
- The Undoing Project by Michael Lewis
- Thinking, Fast and Slow by Daniel Kahneman
- The Black Swan by Nassim Nicholas Taleb
- Rationality: From AI to Zombies by Eliezer Yudkowsky
- Nassim Nicholas Taleb’s blog on Medium
- Eliezer Yudkowsky on Cognitive Biases and AI