Post No.: 0833
Furrywisepuppy says:
You’ll struggle to reason people out of views they didn’t reason themselves into, and this includes yourself and your own views too.
We aren’t purely rational creatures, so it can at times be frustrating trying to reason with each other because we aren’t simply guided by reason. Commercial advertisements routinely and manipulatively appeal to our emotions more than to our reason for a simple explanation: it works.
When we think our opinions or beliefs are automatically facts, it’s almost pointless trying to reason with each other because we’re being unreasonable and don’t even realise how unreasonable we are! How can we be objective when our subjective perceptions are our own realities? We have such a high capacity to see what we want to see, hear what we want to hear, and believe what we want to believe – the reality we perceive through our attentional biases and fallible memories is our own reality, which isn’t always the reality. We can delude ourselves into believing almost anything we want to believe despite the most robust evidence to the contrary (e.g. that the inauguration of the 45th President attracted the largest audience ever, or that an election was stolen). It’s as if 2 + 2 = 5.
Seeing can be believing. But what did we actually see? Perhaps we saw an effect but jumped to a conclusion about its cause. Besides, believing something doesn’t make that something true. Conversely, we often don’t believe what we can’t see or haven’t seen with our own eyes. These heuristics mean that we’re frequently duped and frequently blind.
We don’t have to be liars because we can inadvertently fool ourselves and genuinely believe what we believe from our own intuitions and experiences, due to various conscious and unconscious cognitive biases that affect our judgements. Due to confirmation bias, we tend to actively avoid seeking any information or people that might disconfirm our deeply held beliefs, in order to preserve them. We hate actively proving ourselves wrong since we think this will reflect badly on our social reputation as someone who’s knowledgeable and worth listening to. Yet in doing so, our reputation is damaged by our stubbornness.
Our worldviews are resistant to change. If we do come across evidence that disconfirms them, our first reaction is to hunker down and go into defence mode. We work hard to question and rationalise away troubling facts if doing so serves our greater personal agenda, while we accept anything that supports our existing beliefs without much question. We rationalise events away, move the goalposts and present fallacious arguments. Our beliefs harden and we press our own messages more aggressively to try to drown out the voice of the opposition.
Everyone says they want to know the truth, but they don’t really – most people actually want reassurance that the world is the way they already think it is. Genuine revelation, or knowledge that changes minds, upsets our worldviews, and we’ll typically detest anyone who upsets a worldview in proportion to how deeply we believe in it. We don’t like anyone who basically makes us feel stupid!
Cognitive ease is another reason why we’re more ready to accept, without critique, facts or ‘facts’ that we want to be true (e.g. research that says chocolate boosts our health so eating lots of it is fine if we enjoy consuming chocolate, or that climate change is a myth if we want to carry on with our polluting lifestyles). Sometimes we don’t want to learn that someone whom we thought was a good person was actually bad because, if we have great, long and cherished memories with this person, we don’t want these memories tarnished. (After an event is over, the memories are all we have left of it.) We also don’t wish to feel that we’ve been duped and are bad judges of character. So sometimes we don’t apply reason at all – we just want something to be true, and that satisfies our minds because the alternative is more aversive to believe. If something makes us feel good, is easy to accept or is attractive then we’ll question it less.
So we don’t really want to hear things that challenge our existing beliefs – we want to hear things that make us feel good and morally and intellectually superior to others, i.e. things that tell us we are and have been correct rather than fools. One of our most pervasive biases is the desire to make ourselves feel good and to avoid feeling bad, and it feels bad to be wrong or to lose, hence we seek to avoid this bad feeling via biases such as confirmation bias, even if this risks going against the actual truth. We don’t pay as much attention to, or will discount, data that doesn’t fit with our existing worldviews and expectations – we’ll instead seek and highlight data that does support the ideas and beliefs we already believe to be true, and we’ll naturally gravitate towards sources that’ll more likely do this for us, like our social media filter bubbles and echo chambers. Listening only to these sources can also lead us to think that we hold the ‘silent majority’ view – so, if we don’t get our way, we might assume that a ‘loud minority’ got theirs.
Beliefs are therefore to do with tribalism too – we tend to automatically accept the positions that come from our own tribe (e.g. our political party or nation) and scrutinise more closely the positions that come from rival tribes. Believing what our ingroup believes is another cognitive shortcut – instead of individually and effortfully scrutinising (or even paying attention to in the first place) every piece of information we receive, we look towards other members of our tribe and tend to follow them. We’ve picked the right side because we (believe we) are smart. We’re a member of the smart group – and so many smart people cannot be wrong, right?!
Being correct is subordinate to feeling good or to serving our agendas or sides – and that’s why ‘survival of the fittest’ doesn’t result in lies or BS automatically going extinct over time.
Since we like to feel clever and correct, and don’t like feeling dumb or wrong, we can also dislike anything that’s too technical, complex or nuanced for us. Meanwhile, a conspiracy theory can present a simplified story of the world (e.g. that everything can be explained by a powerful hidden organisation that’s out to get us) and make us feel smart.
We hate feeling defective or weak and thus hate admitting to being wrong. That’s why we should never make our opponents feel humiliated – they’re more likely to change their minds if they can do so whilst saving face. Otherwise they’ll fight for dear life to reason why they’re not fools, particularly concerning deep-seated beliefs that reflect their identities and life choices. There are many costs to being seen as wrong, as well as many costs to changing – reputational costs and invested stakes; one’s close social circle may all believe in the same things, so giving up a belief could mean upheaval in our friendships and family; and giving up a belief may be perceived as giving up a certain hope or sense of security.
Change alone is difficult because it takes much time and energy, reflection and upheaval – but change is often necessary for growth. It takes a lot of internal and external resources to dismantle something that seems to work satisfactorily, has been stable in our lives and that we’ve heavily and/or long invested in, and there’s uncertainty as to whether whatever replaces our old belief will work any better for us.
But what we need to know is that there’s a difference between having disagreements with others and letting those disagreements define who we are. The latter can make these debates feel like matters of life or death. Your heart races, pulse pounds and you can feel a surge of adrenaline. Is a leopard mauling you? No. You’re just arguing with someone on an online forum!
So we can defend our beliefs as if they were our bodies. Our worldviews are often too tied to our personal identities, as if any attack on our views were an attack on our very core identity as a person, hence they must be defended to protect our name and honour. As social animals, our perceived social reputations can seem as important to protect as our bodies, but we must critically understand the difference between the discomfort or anxiety of having our worldviews challenged and the danger of being barked at with aggressive language, being called names or being physically threatened. Yes, sometimes people will use aggression to attack you personally in lieu of attacking your arguments with verified evidence, hence you must protect yourself – yet you may at times feel like you’re being physically threatened by an opponent even though they’re only causing you to question your beliefs. Woof.
We could perhaps disengage from debates altogether to keep our blood pressure down – but this denies us the opportunity to understand other viewpoints. It’s not good when we leave a discussion or ‘cancel’ someone simply because we disagree with what we’re hearing. We can listen to understand their point of view. We can accept that we alone don’t have all the answers and certainly don’t have a monopoly on the truth. And if you don’t want to feel like your ego is being attacked – simply don’t have an ego! This won’t necessarily mean we’ll change our own minds or meet somewhere in the middle, but we can make a serious effort to incorporate the genuine concerns of others into our own picture of the world. Discomfort should therefore be tolerated.
Don’t form or join closed cliques that exclude outside views. Expose yourself to as many diverse faces and views as possible and try to connect with as many different people as you can; although a line can be drawn at connecting with, or staying in touch with, the depraved, hate-mongers and/or categorical liars. And if someone genuinely wishes to harm you and deny your right to exist then you shouldn’t feel compelled to engage with them. Sometimes you need to draw boundaries, stand up to someone and protect yourself, perhaps because they’re trying to force their beliefs onto you, take advantage of you, or take liberties with your tolerance and patience.
Beliefs shape the world. They have major consequences for us all. Companies survive or die based more on how investors feel a company will do in the future than on how much turnover or profit it has generated – hence ‘unicorn’ companies that are apparently worth billions despite never having made a profit, or even a sale, yet. Beliefs shape justice because it can come down to what judges or juries believe is the right verdict when the evidence isn’t clear-cut. It’s even worse when verdicts aren’t decided by a fair trial but by something like religious faith or pure intuition. You’ll know someone has applied pure intuition when you query the reason for their stance and they cannot immediately give you a cogent reason. They might subsequently search for a cogent reason, but they decided upon their conclusion first and are only later attempting to justify it.
In short, we’re biased to believe we’re superior to most others, and admitting that we’re wrong carries many ego and social costs, hence we’ll try hard to hang onto our current beliefs and resist admitting defeat. It takes a lot of courage and strength to question one’s own beliefs.
Yet our perceptions may not be shared – you may think you’ll lose face if you admit defeat, but others will likely regard you more highly for it, and you’ll gain or preserve valuable social relationships.
Woof!