
Post No.: 0166

 

Furrywisepuppy says:

 

If you’ve been reading the posts in this blog consecutively, you’ve probably noticed that cognitive dissonance and confirmation bias have been mentioned several times in previous posts, so I’d like to offer more clarification of what these terms mean here…

 

‘Cognitive dissonance’ happens when we feel awkward or uncomfortable holding two or more conflicting or inconsistent cognitions (such as beliefs, attitudes or perceptions of our own behaviours) at the same time, and so we have an urge to reduce or eliminate this dissonance by altering our existing cognitions – adding new ones, changing them, or reducing the importance of a dissonant element – all in order to ultimately maintain a consistent belief system or set of views.

 

For example, if someone who has a massive ego fancies another person but cannot get this person to reciprocate their interest, he/she may start to rationalise the situation by thinking that this other person wasn’t worth it after all, is probably a sociopath, has a sexually transmitted infection or whatever! For some people, this kind of rationalisation can come almost immediately after any kind of rejection. The fuzzy dissonance here is between believing one is an attractive person versus getting snubbed by another person i.e. these things essentially conflict or contradict each other hence something has to give. One can either cease to believe that one is an attractive person or start to believe that the other person wasn’t as desirable as one initially thought. And what mostly tends to happen is that maintaining one’s longest-held or most deep-seated existing beliefs takes priority. Thus in this example, the person with the massive ego would likely much rather continue to believe that he/she is still ‘God’s gift to the opposite sex’ hence would rather put down this other person they had once fancied than consider needing any self-improvement. (Whether this is a healthy response because it protects one’s self-esteem, or an unhealthy response because it’s arrogant – I think the best attitude is to not have such a big yet fragile ego in the first place that such rejections can crush one’s self-esteem.)

 

Another example is if one firmly believes that a particular industry can self-regulate without any problems, but then undeniable news of a major, widespread fraud or scandal in this industry hits the headlines. In this case, one can either cease to believe that self-regulation is enough, or believe that the fraud was somehow not that serious, or mount some other defence. And if one has politically and publicly supported industry self-regulation for a long time then one will likely automatically lean towards rationales like ‘it was only a case of a few bad apples’ or it was still somehow the fault of the government. Or for one more example, if one has firmly believed in and voted for a particular political party for all of one’s adult life, but when in power it haemorrhages public funds all the time, then one’s initial response will likely be to try one’s best to rationalise that one hasn’t been making a mistake in supporting this party all this time, such as by convincing oneself that no other party would’ve done any better.

 

This is one reason why it’s incredibly difficult to get people to change their vehemently held worldviews or attitudes even when they’re presented with consistent evidence that ought to at least soften their beliefs and make them see the world as less black-or-white. It’s not impossible, but it takes a lot to convince people that they’ve been wrong about something they’ve long believed in and mentally invested a lot in believing.

 

Cognitive dissonance is not always harmful though. For instance, helping a person you thought you didn’t like or were indifferent about can build positive feelings towards that person, in order to rationalise or justify the help you’ve just given them. This can be fostered by, for instance, being grouped with them in a cooperative team task in school where you need each other to succeed, them asking for your help, or you having to intervene to save their life. People only tend to willingly help people they like or at least have no problems with, so one can either believe that one is simply the type to help anybody and everybody that needs help, or that one helped this particular person because they’re actually all right – and either belief is good. We don’t tend to hate people we autonomously put in personal effort to help because that is dissonant. Why am I helping the enemy? Well maybe they’re not the enemy after all, or maybe this conflict is senseless. Woof!

 

The opposite can happen too though i.e. you hurt someone whom you were neutral towards or even slightly liked. To avoid feeling like one is the sort of person who hurts such people – when one nevertheless has hurt such a person and didn’t have the chance and/or courage to apologise to them – one might convince oneself that this other person somehow deserved it anyway.

 

So cognitive dissonance happens when people find themselves doing things that don’t fit with what they know, holding opinions that don’t fit with other opinions they hold, or when their expectations or attitudes don’t meet reality or their own behaviours. Reducing the dissonance lets us rationalise that what we’ve just done is what we actually wanted to do, so that we don’t seem stupid for doing it, or lets us justify that our beliefs are coherent despite real-world evidence, or other views or behaviours of our own, that contradict them. So playing hard to get (up to a point) can build up another person’s feelings towards you, because the more effort they put into you (if they do!) the more they’ll likely like you, to justify the amount of effort they’ve put into you. (For better or worse, it’s sometimes the case that how much we like a person or thing isn’t down to how much they or it gives us but how much we give them or it, which is related to sunk costs.)

 

And this is why people have strong ‘confirmation biases’ and will tend to avoid situations or informational sources that could give rise to the discomforting feelings of dissonance. We tend to avoid looking for things we won’t like to see or hear just in case we might find them – for example, a person who is staunchly politically left-leaning avoiding politically right-leaning news sources, or vice-versa – hence the echo chamber effect and filter bubbles. (See Post No.: 0103 for more about echo chambers and filter bubbles.) Another example is that people tend to want to feel smart for coming up with what they think is a unique invention, so they don’t tend to actively seek proof that it’s already been thought of before – yet this is exactly what you need to do if you think it’s patentable.

 

This all means that we tend to be overconfident in our judgements and views because our informational sources are biased and we find it easier to imagine why we might be right than why we might be wrong. We naturally look for and gravitate towards whatever’s consistent with what we already think, feel or want, thus leading us to instinctively avoid, disregard, devalue, misremember or forget information that would require us to change our minds. Anything ambiguous will be interpreted in our favour too. We like to think we personally understand the world (and if others disagree with us then it’s them who are wrong) and that it’s neat, simple and coherent – but it isn’t always so.

 

People don’t want to lose face and credibility by admitting they were wrong, lose the justification for their actions, or feel stupid for doing something that seems stupid (e.g. understanding the risks of global warming yet not modifying one’s lifestyle, or smoking when one knows it can cause cancer). They don’t want to feel like they’ve gotten something fundamentally wrong all this time and be left not knowing what to believe in anymore (which can even leave them with a sense of losing a main part of their identity e.g. when considering giving up their religion or political stance), and they don’t want to face a great personal and social network upheaval for changing their beliefs and therefore ingroup affiliations. So people will do almost whatever it takes to not change their minds – including kidding themselves, sticking with sources of information or people that confirm their biases, and covering their ears and saying, “La la la” to sources and people that disconfirm them.

 

…But then we’ll never grow. So we need to always proactively question and test our beliefs and hypotheses. Have a sceptical yet open mind. Don’t plant your flag anywhere too soon, and maybe never inflexibly ally yourself to one side if it means totally shutting out any other side. Probably the biggest bias people have is thinking they’re not biased, or that it doesn’t matter because everyone else is biased too, or that other people are worse (or some other excuse – the capacity for humans to make excuses or rationalisations for anything imaginable is probably infinite!).

 

Confirmation bias and cognitive dissonance are going to crop up in a lot of topics about attitudes, beliefs and behaviours because they play major parts in why these are so hard to change.

 

Woof.

 
