Human thinking is complicated. I further find it ironic that we find it so difficult to think about our own thinking. The reason for this is that we are not aware of all of the processes that go into the workings of our own minds. When you think about it, that makes sense. If we had to monitor the mechanisms by which we process information, and then monitor that monitoring, we would use a lot of mental energy in a potentially endless loop of self-inspection. This could easily become paralyzing.
So mostly we engage in automatic subconscious problem solving, which uses simplified algorithms to produce decisions that are fast and good enough, absent any awareness of what those algorithms are. When we have the luxury of more introspection, we can indulge in some analytical thinking as a check on our intuitions.
Added to this, we have made our own world incredibly complex. In a way we have overwhelmed our own intuitions (what psychologists call system 1 thinking). We are fumbling through complex technological and scientific questions involving a world-spanning civilization with a brain evolved for a much smaller and simpler world. This means we need to rely much more heavily on system 2 thinking – careful analytical thinking. This involves metacognition, or thinking about how we think.
Psychologists Kahneman and Tversky have arguably had the most dramatic effect on the study of decision making, starting in 1979. They put forward the whole notion of cognitive biases and heuristics – mental shortcuts we substitute for careful analytical thought.
The Mechanism of Substitution
I have written about many such biases and heuristics here over the years, and here is a new one – the mechanism of substitution. Actually, it is not very new, but rather a different way of looking at existing biases. It is more of a mechanism for biases and heuristics than one itself. At its core the idea is simple: when confronted with a complex problem, we substitute a simpler problem we can answer, and then go with that answer.
A great example of this is the availability heuristic. The question – how common is this phenomenon? – is complex to answer. We would have to survey a lot of objective information in order to really answer it, with proper controls and representative samples. This is too much work, so we intuitively substitute a far simpler question – how easy is it for me to think of an example of this phenomenon? If an example readily comes to mind, we conclude the phenomenon is common. If we can’t think of an example, we conclude it is rare.
This shortcut is reasonable in many circumstances – ease of recall often does track how common something is. You can also see how much more effective this heuristic would be if your entire world consisted of a tribe of 300 people who never traveled more than 20 miles from their village. In our modern world, not so much.
You can also see in this example how substituting a simpler question is a mechanism for a specific heuristic. Further, it demonstrates how our brains prioritize efficiency, which is primarily gained through simplifying processes.
You could go through many of the biases and heuristics and see this mechanism at work. We have a left-most digit bias. When estimating how large a number is, we focus on the left-most digit. That is why items are often priced at $19.95 – we really do look at the “1” and are partly fooled by this.
We see this mechanism at work in marketing and consumer thinking about complex technologies. For a time, computers were marketed largely on their processor speed, as if this one number were a good representation of overall power. Digital cameras are often marketed with megapixels as the one measure of quality.
When asked to predict how a person or system will behave in the future (including ourselves), we tend to replace the complex problem of weighing many possible variables with a simple question – how do we feel, or how is the system behaving, right now? We have a huge bias to extrapolate current trends into the future, because that is a far simpler question to answer than one in which we consider all the factors that can disrupt current trends.
Analogies often work by this mechanism. When asked a question about a complex phenomenon, we search for an analogy to a known phenomenon and then shift the question to the known analogy, even if it doesn’t quite work.
The problem isn’t that we use analogies or heuristics to inform our decision making. That is perfectly reasonable and likely effective. The problem comes from substituting our analogies and heuristics for analytical thinking about the actual question we are confronting.
Substituting easy questions for hard ones is also just one part of a more general phenomenon of oversimplification. Again – I think vertebrate brains are fundamentally organized to prioritize efficiency, making quick judgments that are mostly true. What matters is the resulting behavior and how adaptive it is, and it’s easy to see how the trade-off between accuracy and speed would be optimized. Simplifying problems to manageable analogies is not a bad strategy overall, as long as we don’t confuse our simplistic analogies for reality.
Another example of oversimplification is reliance on simple narratives to explain the world. We not only substitute simple questions for more difficult ones, we substitute simple answers for more complex ones. We tend to develop narratives that have apparent explanatory power, and then overuse those few simple narratives to explain a complex, messy world.
For example, “natural is better” is one such narrative. Navigating all the complex and shifting scientific evidence, government regulations, and marketing claims in order to figure out what is best for our health can be overwhelming. It is far simpler to go with an easy narrative – anything natural is better.
Conspiracy thinking also serves this role – all the complex details of reality can be explained by invoking a grand conspiracy, which can accommodate any evidence.
Political parties are essentially institutionalized strategies for oversimplifying the world. Political parties have ideologies, which are a limited set of principles that guide decision-making and priorities. If you are a libertarian, then liberty is everything. If you are a progressive liberal, then equality is everything. Applying this one filter to all political questions is clarifying. Forget about balancing various legitimate principles that are at odds with each other – that results in the “mushy middle” or “accommodationist” thinking. Stay true to your one principle.
Of course, even that is an oversimplification of human behavior. People are multifarious beings with a host of complex motivations and beliefs. But there is a clear tendency to oversimplify, and this tendency leads to biased thinking that diverges from reality.
This is why people can so vehemently disagree about what should be factual questions. They are applying different filters, heuristics, and narratives.