Post No.: 0148
Personal self-monitoring and self-control of our biases shouldn’t be relied upon (well, there’s the bias of believing that one has gotten one’s own biases under control or that biases mainly affect other people(!)). ‘System two’ (our slow, deliberate and effortful conscious analysis) can be ignorant of, or even completely in the dark about, what ‘system one’ (our fast, automatic and effortless subconscious or unconscious instincts or intuitions) is really thinking, and it’s certainly inherently lazy (or, on an alternative interpretation, ‘economical’) because thinking is hard work. It’s also not spectacularly alert but tries to maintain control as best it can with its limited mental resources.
It’s not the case that system two always gets it right (e.g. try calculating the product of 3,279,347 and 253,168 in your head right now! People also commit conscious logical fallacies all the time, such as ad hominem or ad populum arguments). And it’s not that system one always gets it wrong – but when system one gets things wrong, we likely won’t know how it did, or even question whether it did, whereas we’re conscious of, and more vigilant about, our system two mistakes and will therefore more likely admit to them (e.g. questioning our answer to the calculation above).
We can learn about our innate biases and heuristics, but it’s still incredibly difficult to stop acting on our instincts (to stop being slaves to system one thinking). For instance, people can blind-taste products and learn that the most expensive is not necessarily always the best, but then blind-taste again with new products and will again assume that their favourite must be the most expensive or artisan! (Read about how our expectations and preconceptions shape our sensory perceptions too in Post No.: 0121.) These intuitions are really hard to shift, even if we’ve been educated via firsthand experiences to question them. These firsthand lessons sometimes get rationalised away, e.g. people who hold strong mental stereotypes of a certain ethnicity, who then meet people of that ethnicity who don’t fit that stereotype (they could even be their friends), often use the rationalisation ‘but they’re unusual and don’t count’, and so their mental stereotype of that ethnicity persists.
Even otherwise highly intelligent people frequently fall for tricks, illusions, biases and other systematic errors of intuition. (General intelligence doesn’t mean specific intelligence in every field or area of knowledge and understanding; knowing a lot of things doesn’t mean one knows everything.) So arguably the answer is that we all need other people (who are educated in the subject of biases, intuition and heuristics) to point out our own errors of intuition and biases.
Most people think that their subconscious (below conscious) or unconscious (beyond conscious) intuitive self just does exactly what their furry conscious self does except faster, when this isn’t true – this is an intuitive belief in itself, and an erroneous one! Your intuition tells you to trust in your intuition, so what hope does one have of realising that one’s intuition cannot always be trusted(!) It’s like, metaphorically (or maybe even literally), a mind-control program telling you to trust in the mind-control program and to not question it.
If it weren’t for scientific experiments and hard data showing us that our intuitions are frequently fallible, we might never have known that they are, particularly in certain contexts. Probably the best way to understand how prevalent cognitive biases and errors are is by personally trying out various cognitive illusions, judgement tests and decisions under uncertainty, such as: would you accept a bet on a fair coin toss where you’ll win $110 if it comes up heads but lose $100 if it comes up tails? (Unless losing $100 would catastrophically impact your lifestyle, or you just hate money, it’d be rational to accept the bet – but most people won’t.)
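The arithmetic behind that verdict can be sketched as a quick expected-value check – a minimal illustration, using only the stakes and the 50/50 odds from the example above:

```python
# Expected value of a fair coin-toss bet: win $110 on heads, lose $100 on tails.
p_heads = 0.5
win, loss = 110, -100

expected_value = p_heads * win + (1 - p_heads) * loss
print(expected_value)  # 5.0 dollars per toss, on average
```

A positive expected value means that, over many repetitions, accepting such bets leaves you better off on average – the reluctance most people feel reflects how losses loom larger than equivalent gains, not the arithmetic.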
The mental work that produces the impressions, intuitions and many decisions we receive and make in our day-to-day lives goes on below and beyond our conscious awareness. You may be able to answer what you’re consciously thinking about but not what’s going on beyond this conscious level. Consciousness is only the tip of the iceberg of total cognitive perception and processing, and it typically comes late to the game too. You may believe you know what goes on in your own mind, thinking that one conscious thought leads in an orderly fashion to another, but this isn’t even typically how the mind works. Most impressions and thoughts arise in your conscious experience without you knowing how they got there – you cannot trace the origin of your beliefs or thoughts before you finally become consciously aware of them.
Yet even consciously, people often don’t understand themselves enough to understand their own true preferences (e.g. wanting a less born-privileged/posh/elitist political leader but then deeming such a person too common for leadership if one arrives), or to anticipate their own behaviours and feelings in the very near future (e.g. the rapid adaptation to the novelty/amazement of a new purchase, such as a new kitchen gadget that’ll barely get used again even though one genuinely thought that one would use it every day as one stood in the queue in the shop to buy it). Woof!
So the assumption of habitually being rational and in full control of our waking lives is incorrect because of the multitude of heuristics and biases, priming effects, subconscious manipulations, our personal internal models, the reliance on arbitrary, crude, superficial and/or irrelevant factors, and so on, that work on our subconscious and unconscious (which, by definition, we’re not consciously aware of). We’re not always being rational even when we believe we’re being rational. We all tend to overestimate our voluntary, autonomous, conscious and rational control of our own judgements, choices and predictions, and tend to underestimate our involuntary, automatic, subconscious, unconscious and irrational influences i.e. we think we’re almost always in manual control of ourselves and know exactly why we do and choose the things we do and choose, and think that virtually nothing gets past our conscious minds or that nothing is surreptitiously influencing us beyond our conscious awareness (whether from external sources or from within ourselves).
Logically, unless our lives are constantly being analysed in detail (such as in scientific experiment contexts), we’re not aware of how much gets past our conscious minds precisely because it’s gotten past our conscious minds! (Well, our lives are increasingly being constantly analysed in detail – not for our education but for companies to exploit our largely predictable patterns of behaviour! Their algorithms wouldn’t make them a lot of money if humans weren’t so by and large predictable rather than idiosyncratic. I’m not saying don’t use their services – just demand to know, and be aware of, what’s going on in the background, and make better-informed choices.)
Irrationality doesn’t just arise from emotions but also from the search for mental shortcuts and from the limitations of our cognitive machinery. So systematic errors in thinking and illogical choices can arise from the design of our cognitive machinery rather than merely from the corruption of thought by emotions; not that emotions are irrational in and of themselves – it just depends on how appropriate, excessive or insufficient they are for a given situation.
People may claim, regarding a psychological manipulation that’s pointed out to them for their conscious awareness, “That’s obvious” or, “I knew that already” – yet they’ll still fail to guard against it in the future and won’t be aware of it every time (e.g. when was the last time you reframed a statistic between relative and absolute figures, or reframed a decision problem between narrow and broad perspectives, when one was presented to you?) We all generally think we know ourselves and assume we’re mostly rational, but we don’t know what crude, automatic intuitions are working within us below our consciousness at almost every single moment of our lives, unless we’re taught about them and are extremely vigilant at all times – but that’s incredibly effortful, and system two is lazy (or likes to be ‘economical’).
Indeed, much of the time we’re fine, less stressed and more efficient living on automatic pilot, but there are particular situations where there are high probabilities of our intuitions failing (e.g. we’re frequently poor at intuiting logic or statistical information) and/or potentially high costs for failure (e.g. when the long-term is at stake, such as our health or the environment). It’s primarily in these situations that we’re better off slowing down and questioning our instincts.
We evolved to naturally cope better with social problems than abstract ones, even those that have the same logical structure. We also tend to try to anthropomorphise abstract problems, if possible, to make them easier to comprehend. We tend to be overconfident in what we (think we) know and underestimate how much we don’t know. We tend to overestimate how much we understand the world and underestimate the role of chance in events (hence e.g. interpreting patterns, causalities and stereotypes where they don’t or shouldn’t exist). We frequently fail to acknowledge the full extent of our fuzzy ignorance and the uncertainty of the world we live in. And a major problem with all our cognitive biases and logical fallacies is that they are routinely (intentionally or unintentionally) exploited by others without our awareness – and this alone is a good reason why we should all get educated on this subject and be more alert!
Woof! There’s still a lot to be said about this subject, and the mind is one of my favourite subjects too. I hope it’ll all be valuable to you to better protect you from being manipulated and misled by others who wish to unduly profit from you.