Post No.: 0651
Our ‘system one’ relies on simple inborn, or adapted, instincts and heuristics, like ‘fat is delicious so gimme more’. And, unfortunately, even when we try to apply our conscious ‘system two’, we can still end up searching for and relying on overly simple rules. Moreover, our reflex reactions are frequently (as I call them) ‘knee-jerk 180° flip’ reactions, such as ‘if fat is bad for our health because it makes us obese then I must cut it out of my diet completely’. But some fat is vital for our health.
If a person mistrusts someone then they might start to believe they should never trust anyone ever again. It’s been discovered that there are beneficial bacteria and that being too clean may be unhealthy, so some have taken this to mean that it’s okay to be dirty, or even to eat dirt! Some additives and preservatives are harmful in large amounts but that doesn’t mean natural is always best or that we should ban all chemicals (not that we could, because all matter is made of them!) Not all statistics are lies just because some are. Not all fears are unjustified, hence there are things to fear other than fear itself. Just because politicians often lie, it doesn’t mean we should therefore unquestioningly trust wealthy businesspeople. In history, we’ve gone from pushing eugenics (believing it’s all about genetics/nature) to pushing behaviourism (believing it’s all about the environment/nurture). There are ‘false dichotomies’ like ‘if you’re not with us then you must be against us’, or ‘if an authoritarian communist state is bad then a selfish “me” society must be best’.
Some more knee-jerk ‘one extreme to the opposite extreme’ reactions include parents who felt neglected, starved or deprived of nice things when they were young spoiling their own kids excessively with materialistic stuff and food. There’s going from having no control in one’s life before to wanting and expecting absolute control now. Or just because some illegal drugs cause problems in society, this doesn’t, alone, necessarily mean that they’d therefore be better off legalised. (Think about the greater problems that alcohol and cigarettes cause, or firearms where they’re relatively easily accessible.)
We shouldn’t knee-jerk into totally eradicating all salt or carbohydrates from our diets. An unhealthy person who eats all manner of highly-processed junk might have a knee-jerk ‘epiphany’ that a raw food diet or ‘clean eating’ lifestyle must therefore be healthy. There are plenty of extreme diets or lifestyles that claim to be healthy but aren’t, because being healthy doesn’t involve extremes but balance and harmony.
Somewhere in between, or a bit from both ends, almost invariably produces the optimum – whether we’re talking about health, socio-politics, upbringing methods or whatever. A (more reliable) general rule is that no extreme is good (e.g. zero government versus all government, starvation versus obesity), hence, theoretically, the optimum should always be sustainable and within reach.
But it’s as if we’re drawn to believing that things are always ‘black if not white’ or ‘white if not black’. We like simple rules – all-or-nothing, black-or-white – because they’re cognitively easier to remember and apply, even when we consciously search for patterns and try to learn, never mind when we rely on our innate instincts. In the quest for efficiency, we oversimplify.
There are lots of scary headlines that can cause knee-jerk reactions in parents, such as learning that vaccines can cause rare side-effects and flipping from believing that they’re beneficial to believing their children should never take them. Sometimes parents give too much by being over-protective, over-comforting or over-feeding, out of worry about depriving their child when they’re nowhere near doing so. Parents often understand and accept there’s a problem and that it’s down to them to change it but then, in a knee-jerk reaction, go completely to the opposite extreme by, for example, banning all sugary snacks, hence battles during mealtimes; after which many parents eventually give in to their children’s screams and tantrums. (Parents must be able to stick to their guns when they say ‘no’. They won’t starve their children by feeding them healthier meals or snacks – only if they don’t feed them at all.) Banning is such an ultimatum for a child too, and makes them desire what’s banned even more (due to reactance – which is a kind of unthinking knee-jerk reaction itself). Thus the occasional sweet/candy isn’t a problem – just not too much or too often.
There are cases where we cut funding for police patrols because there seems to be so little crime – but that might be precisely because there were police visibly on the beat! (It’s like wondering what’s the point in continuing to exercise when one hardly gets ill?!) So things can knee-jerk flip from extra security to too little, or from some other working solution to cutting corners, profligacy or complacency.
Voters, in reaction to perceiving that there’s too much immigration and a loss of national identity, might vote for a right-wing demagogue. You’d think that things should settle on the most moderate centrist parties perpetually winning – but instead it periodically swings from right to left to right, etc. Protest votes are examples of voting in an opposite way just because people feel dissatisfied with the current way, rather than because they really understand that opposite way. There are cycles of strong regulation then deregulation, booms then busts, high public spending then austerity, relative collectivism then individualism; or binges then ‘detoxes’ or yo-yo crash dieting. When things are going well, we start to get lax or wasteful, and then when things go badly, we have to tighten up again, etc.
People who hear the views of someone who’s adamant about a prediction of a far-future outcome in a chaotic domain like politics or economics, compared to someone who’s tentative, will tend to trust the adamant person rather than the better-informed tentative person. Genuine experts who try to make long-range predictions in chaotic domains are never cocksure, yet laypeople will tend to regard the surer person as the wiser one instead – after all, no one would be that sure unless they knew they were absolutely right, right? But the less one knows about expertise, the more one stereotypes expertise. And when they find out this ‘expert’ was totally wrong, laypeople can crudely deem all experts as useless in a knee-jerk 180° flip from one extreme stance to another. They’ll blame the experts when they should really understand that adamant predictions are foolish in chaotic domains.
Too many of us fall for deceptions in this so-called ‘post-truth’ era – we may keep on trying to blame the liars but we’ll only start to learn once we accept that we’ve been too easily misled ourselves. But a common response is a knee-jerk 180° flip towards not believing in anything the ‘establishment’ says, and then being too easily misled into believing in ‘alt news’ sources that espouse opposing lies. That’s crudely swinging from one extreme to another – believe everything, believe nothing, believe the complete opposite. Reverse psychology can occasionally work because of this blunt heuristic.
Jumping from one extreme to another demonstrates naivety and/or panic, like over-correcting left then over-correcting right during a vehicle skid. It’s a problem of learning things to only a superficial level, concluding patterns from too few data points, and thinking a little yet too little. We therefore need to continue learning more and more rather than terminate our education on a subject once we think we’ve sussed out the patterns or once we’ve accepted the first view that seems to make sense. Our knowledge and heuristics will then become more refined and nuanced rather than overly crude and simple.
Unfortunately, however, our first accepted view on an issue tends to be sticky because confirmation bias then sets in, which keeps our worldviews stable. For example, we might first hear about, learn about and accept that climate change is a hoax, and even if new climate change evidence is later presented to us, our initial reaction will be to reject it because we’ve already invested in believing that it’s a conspiracy.
So rather than now seeing an issue as more complex than it at first appeared, we tend to stick to the simple picture that we’ve already invested in. Hence – more than just gathering more information – we need to continue to remain open-minded even after we’ve pitched our tent on one side of a fence.
A little bit of knowledge can be a dangerous thing – it can make us arrogant because we’ll think we’ve learnt all there is to know to make our mind up about something. But we don’t know what we personally don’t know, and the other side might actually be the side that has more going for it, or somewhere in the middle might be fairest. It’s of course okay for anyone to argue their case, whatever it may be – as long as they and we remain humble, and keep listening to each other’s opposing views and concerns.
Well, there are often two or more sides to every story, but this isn’t always the case, and so being passive or lazy in thinking that ‘the truth must therefore be in the middle’ is a blunt heuristic too. It comes down to the balance of evidence and critical analysis on a case-by-case basis. Often in debates, even though, perhaps, 98 scientists take one stance and 2 scientists take another, each side will get represented by 1 scientist each, which therefore doesn’t reflect the true consensus – it makes it look like the argument is 50:50 when it’s more like 98:2, misleading us into thinking that the balance in the debate is more equal than it really is. This is done in the media to try to be balanced and ‘fair’ to both sides of an argument – but it’s actually an unfair arrangement if this balance didn’t exist in reality. It makes the less-supported side seem more supported, or more common a view, than it really is.
It takes a lot more to change an existing worldview than to adopt a worldview starting from a neutral position. Religions, cults, businesses and brands know this, hence try to shape children to their ways of thinking as young as they can get them because most children have yet to settle on a view on most issues. What we tend to believe in is also affected strongly by our peers or ingroup, who we generally don’t want to depart from, hence they play a key role in shaping or changing our minds too.
Few of us want to put in the effort ourselves to sift through a lot of data, so we defer this task to others (sometimes to those with brash, confident, charismatic personalities who superficially say they love us a lot). But they’ll cherry-pick which information and conclusions to present to us to suit their agenda. So even when people lose trust in official experts, they end up listening to others who believe they know better, i.e. self-styled ‘experts’ by another name! If ordinary folk don’t or can’t sift through lots of raw information for themselves, or at least apply critical thinking to what they hear from others, then they, as voters and consumers, will forever be at the mercy of various sides with their own self-interests telling them what’s ostensibly the truth.
Woof. So a knee-jerk 180° flip reaction is an overly crude heuristic that assumes the directly opposite or opposing side must be the correct, truthful or optimal side if one side has been wrong, untruthful or sub-optimal before. It’s related to false dichotomies. One oversimplistic rule-of-thumb gets replaced by another oversimplistic rule-of-thumb because our intuitions desire cognitive ease. We can therefore go from one extreme to another. We believe there’s a clear battle between good and evil. Our simple minds prefer simple rules and beliefs, but the world is seldom simple.