Post No.: 0775
It’s better to treat your beliefs as hypotheses that are to be tested rather than to be defended.
So whenever you form a belief, try to immediately think of some conditions or pieces of evidence that would cause you to change your mind if they ever transpired. Part of this approach includes looking for reasons why you might be wrong, not just reasons why you might be right. This in turn means regularly exposing yourself to those who challenge your views, not only those who agree with them; and listening to views that tax your mind, not only those that make you feel good. If you believe that nothing could ever change your mind then you’ve become a non-thinking zealot. You’re no longer the owner of an idea but a slave to one.
Instead of thinking of your beliefs or ideas as being your identity – think of them more as experiments that are to be tested. So if they fail then it’s no problem to change direction and try something else. For instance, you’re not a ‘socialist’, ‘democrat’, ‘capitalist’, ‘communist’ or whatever – you’re just someone who’s searching for the best way to run a country at a particular time. And if you’re not wedded to any idea, then the sooner you can admit that something isn’t working, the fewer and less catastrophic your mistakes will be. Be humble in what you think you know. Concentrate on improving rather than proving yourself.
View arguments as opportunities for collaborations to reach the right or best conclusion, not as contests where you’re trying to defeat your opponent. A ‘scout mindset’ seeks to understand and get to the truth, whatever it may be. A ‘soldier mindset’, meanwhile, seeks to win first and foremost, and so attacks opponents and defends positions. The latter mindset descends to barking, using threatening stares and physical postures, deepening one’s voice, using a patronising tone, interrupting other speakers, being insistent, and basically making attempts at displaying dominance in the exchange – as if dominance makes one’s arguments more correct. (It doesn’t.)
Someone with a soldier mindset will also potentially use dirty tricks by employing fallacious argumentation, like attacking a straw man, ad hominem attacks, slippery slope fallacies, guilt by association, false dichotomies, thought-terminating clichés, and appealing to emotions more than evidence. An example of attacking a straw man is misquoting someone in order to try to discredit them on that point – i.e. attacking a point they didn’t actually make.
A scout mindset involves remaining calm and polite, showing curiosity towards why the other party disagrees with you and why they believe what they do. It involves talking about your own beliefs with varying degrees of confidence (like ‘I’m 79% sure of x’) instead of making black-or-white assertions. It’s sometimes not about reaching the correct answers (because sometimes we cannot verify them) but acknowledging the limits of our knowledge. This mindset also involves examining the causes of your own beliefs rather than assuming that they’re objective facts. Logical reasoning can sometimes be accomplished fast and intuitively, but it usually requires slow and deliberate thinking.
Our rationality quotient, or RQ, could be said to be a measure of our rational thinking. Although they overlap, ‘epistemic rationality’ involves trying to achieve the most accurate beliefs about the world; whereas ‘instrumental rationality’ is motivated reasoning – that is, reasoning in order to achieve a goal, like winning or steering a consensus towards what one most prefers.
Epistemic rationality requires questioning one’s present beliefs, considering the viewpoints of those one disagrees with, updating one’s beliefs based on the fluffy flow of new evidence, and trying hard to consciously mitigate any cognitive biases and errors of perception, processing and memory. Present and ask for evidence. It’s about caring about what’s true without allegiances to any side or conclusion.
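Updating one’s beliefs in proportion to new evidence – rather than flipping between ‘certain’ and ‘certainly not’ – can be sketched with Bayes’ rule. Below is a minimal, hypothetical illustration (the prior of 0.79 echoes the ‘I’m 79% sure of x’ example, and the likelihood numbers are made up purely for demonstration):

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a belief after observing
    one piece of evidence, via Bayes' rule:
    P(belief | evidence) = P(e|b) * P(b) / [P(e|b)*P(b) + P(e|not b)*(1 - P(b))]
    """
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Start off '79% sure of x'...
belief = 0.79

# ...then observe something that's twice as likely if x is FALSE
# (hypothetical numbers): confidence should drop, not collapse to zero.
belief = bayes_update(belief, p_evidence_if_true=0.3, p_evidence_if_false=0.6)
print(round(belief, 2))  # prints 0.65
```

The point of the sketch is the shape of the behaviour: disconfirming evidence lowers confidence by degrees, and evidence that’s equally likely either way leaves the belief unchanged – which is exactly the opposite of accepting confirming ‘proof’ wholesale while rejecting disconfirming ‘proof’ outright.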
Meanwhile with instrumental rationality – presenting fallacies and falsehoods might even be useful if it’ll ultimately help one ‘win’. If we detest this kind of behaviour amongst politicians then we should detest it from ourselves too!
But it’s far easier to spot and criticise other people’s irrationalities and biases than our own! Also, whenever we’re presented with any purported proof that confirms our beliefs, we’ll just happily accept it all as true; whereas whenever we’re presented with purported proof that disconfirms them, we’ll apply hard and effortful critical thinking to scrutinise and try to poke holes in its veracity!
This is an asymmetric or biased evaluation of the evidence depending on whether we initially agree or disagree with something, or want it to be true or false. If we, say, have a bias against charities and believe in for-profit businesses in a low regulation environment, then if we hear a news story about a scandal involving a charity, we’ll take it as inherently damning for all charities generally. Yet if we hear a news story about a scandal involving a business, we’ll rationalise it as only a one-off and not representative of all businesses.
We might search for design flaws in the methodologies of any studies we don’t like the conclusions of, while we don’t bother doing the same with studies that present any conclusions we do like. We might interrogate a stranger harder than our own kin after they’ve been involved in a kerfuffle we didn’t witness. Or we might doubt and question the ‘propaganda’ that’s coming from another country’s media but we’ll readily accept as true the ‘facts’ that come from our own country’s media concerning geopolitical disputes.
Unfortunately, university debating clubs are typically set up to emulate adversarial duels, which doesn’t encourage students to consider nuanced arguments or to concede points to the other side; because if one did, one would likely be kicked out of the team! Indeed, many politicians honed their ability to BS by partaking in such debates when younger, and it shows!
It’s not just politicians though. Whenever we wonder how we can reason with unreasonable people who aren’t aware of their own unreasonableness and who just automatically think they’re right when they’re wrong, we’re usually thinking ‘yeah, those people are really annoying’ – but we don’t usually include ourselves in this category(!) And that’s how biased we can be. The key is for us all to understand this to hopefully tame our condescension and encourage us to have more reasonable conversations with each other – by seeking out and relying on hard evidence if a debate concerns an objective scientific fact, and by exploring and being open to alternative views if a debate concerns a subjective philosophical position.
Human bias is a tough nut to crack though – even knowing, understanding and accepting the logic of all the known cognitive biases isn’t enough to get people to stop being biased. It’s not an inescapable part of being human though. It just requires the right mindset – a scout mindset!
It’s nevertheless worrisome how little people understand about themselves. Humans still have a lot to learn about humans. Well, if human brains were simple enough for us to understand them, we’d be too simple to understand them(!) (Then again, we don’t fully understand the brains of the animals we consider less sophisticated than humans either!)
Without a humble mindset, we have little hope of solving many of the world’s major problems because everyone will stick to doing exactly what they’re doing right now because everyone already thinks they’re right and doing the right things. Arrogance is a major barrier rather than something to be proud of – we need humility in order to understand that we need to build collaborative relationships to make things happen at scale, rather than seeking to ‘win’ arguments for the sake of winning. We need to understand each other above thinking in terms of ‘them dumb’ and ‘us superior’. We might have to say to ourselves every morning, “I could be so arrogant that I won’t realise that I’m arrogant” and, “I’m not at the centre of the universe even though my perceptions place me at the centre of my own universe.”
One of the key attributes of true experts is knowing how much one doesn’t yet know. Of course they do know a lot – plenty more than laypeople – but they also know about what they don’t know and so will factor that in. So just because an expert admits there are still unknowns, it doesn’t mean ‘anyone’s guess is as good as any other’ – we should still sensibly follow the best of what we currently collectively know, and there’s still plenty that true experts do know. Individuals who have trouble admitting to being wrong, or who feel smug rather than humble towards those who are percipient and courageous enough to admit to their own mistakes, won’t grow. We’ve all made howlers and said the wrong things occasionally, so those who cannot notice or admit to them cannot be credible.
So reminding ourselves that we don’t know everything is probably the key to staying humble and thus open to listening to others, staying in ‘learning mode’ and cooperating to find the best, most supported, truths and stances.
Things automatically taken as ‘you’re wrong if you don’t accept this wisdom’ should really be questioned too. For instance, it’s now highly questionable whether the advice to ‘complete the full course of antibiotics even if you already feel better, in order to avoid antibiotic resistance’ is still sound, because the unnecessarily prolonged use of antibiotics may actually pose a greater risk of antibiotic resistance – with a few exceptions based on case-by-case evidence for specific diseases. (Although I guess one should still adhere to the instructions of one’s doctor.) It was also unquestioned for a long time that dinosaurs were all dull-coloured and didn’t have feathers unless they flew, yet we must realise or remember that all birds alive today are descendants of theropod dinosaurs.
So when we lack humility, we’re at risk of exposing ourselves as the ignorant ones for failing to understand the far-thinking and cutting-edge ideas that others are trying to get us to consider. They may know something that we don’t, hence our assumptions, jumped-to conclusions and negative judgements of them may backfire spectacularly. Thus if someone poses different ideas to the ‘accepted wisdom’ – don’t automatically presume they’re the ones being ignorant. Embrace the questioning of ‘accepted wisdom’ and don’t terminate thought by calling people names if they disagree with the current majority. Yes, extraordinary claims require extraordinary evidence, but the questioning itself shouldn’t be curtly curtailed – pause, think, then reply; or just pause and think on it. If a tradition has ‘always been this way for a hundred years’ then does this mean one should stick with the formula or does it mean it’s about time to freshen things up? (A hundred years is hardly that long anyway in the scheme of things.)
Don’t be a strut noddy, or someone who’s too idiotic to realise that they’re idiotic. This means the only wise thing is to always remain modest, which isn’t really the same thing as being submissive, since one can still try to persuade others towards one’s side – it’s just that one won’t try to patronise others. It takes intelligence to recognise intelligence. We can never work out quite how stupid we are because we cannot work out how much more intelligent we could be. Just as cockroaches probably cannot comprehend what humans are doing even when staring right at human activity – the smartest humans ever probably won’t be able to comprehend what a ‘higher evolved’ natural or artificial intelligence will be able to comprehend. And if we cannot even comprehend such minds, how could we justifiably call them stupid?! Therefore the simple logic is that all arrogance is itself asinine.
Woof. So let’s all practise the scout mindset, and exercise our epistemic rationality.
(I’ll never think I know enough. That’s why I’ll never stop wanting to learn more… and why I’ll probably never ever feel ready enough to actually promote this blog(!))