
Post No.: 0636

 

Furrywisepuppy says:

 

There are a multitude of cognitive reasons for the denial, or dismissal, of empirical evidence.

 

One is that threats which are intentional, directly immoral, and instantaneous or imminent are perceived to be greater. Global warming or becoming obese involves none of these qualities, whilst terrorism involves all of them, for instance; hence we’ll intuitively care more about tackling the latter kind of problem than the former, even though terrorism empirically kills or harms fewer people.

 

Long-term threats can seem too abstract or vague to respond to, and can unfold so slowly that they don’t register as a serious, pressing danger at all. This relates to hyperbolic discounting.
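
To make the hyperbolic discounting point concrete, here’s a minimal sketch, assuming the standard one-parameter model V = A / (1 + kD), where A is the undiscounted severity of an outcome, D is its delay and k is a personal discount rate – the rate k = 0.1 and the severity of 100 below are purely illustrative values, not claims about real psychology:

```python
# A minimal, hypothetical sketch of hyperbolic discounting, assuming the
# standard one-parameter model V = A / (1 + k * D):
#   A = undiscounted severity of an outcome, D = delay, k = discount rate.
# k = 0.1 and the severity of 100 are illustrative values only.

def perceived_severity(severity: float, delay_years: float, k: float = 0.1) -> float:
    """Hyperbolically discounted severity of an outcome `delay_years` away."""
    return severity / (1 + k * delay_years)

for delay in (0, 1, 10, 50):
    print(f"{delay:>2} years away -> feels like {perceived_severity(100, delay):.1f}")
# Output:  0 years -> 100.0,  1 year -> 90.9,  10 years -> 50.0,  50 years -> 16.7
```

Under these illustrative assumptions, a harm 50 years away registers at barely a sixth of its true severity – one way of seeing why slow-burning threats fail to feel pressing.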

 

Personal or political ideologies play a major role, such as a worldview that holds it’s wrong for the government to interfere with our lives. Zealously holding onto the ideology of laissez-faire capitalism is probably the number one reason for climate change denial. One possible way to persuade these people is to promote the financial sense of investing in environmentally-friendly businesses, and to stress the threats that different political groups hold in common, since disparate groups can care about the same issues if those issues are framed correctly.

 

This connects with the self-preservation of one’s current lifestyle – many people don’t want to change their comfortable lifestyles, so it’s far more convenient to simply attempt to discredit any evidence or argument for behaviour change in order to justify carrying on as usual. We can find change difficult. And if we dislike the proposed solutions, we can be in denial about there being a problem in the first place, or bury our heads in the sand so as not to think about the problem on a daily basis.

 

The illusory superiority and self-serving biases make us like to think that we’re superior in intelligence to most others, thus we’ll hate to believe that we’ve been wrong or been duped all along by groups that deny climate change for their own interests – particularly when it concerns our fundamental worldviews. Therefore if we sense that our worldviews are being attacked, we’ll tend to defend them and hang on even harder to the belief that we’re right, as if our entire identity depends on it. This relates to the backfire effect.

 

Another reason can be social proof, or not changing one’s behaviour because others aren’t changing theirs either. But of course everyone is looking at and following everyone else’s inaction. This relates to the bystander effect.

 

Ignorance means that there are many other, more immediately gratifying, things one would rather pay one’s limited attention to, and maybe some people are truly ignorant of all the evidence – although it’s much harder for anyone to claim they’ve not been informed about the dangers of global warming nowadays(!) Tons of tough reading is sometimes required when assessing complex issues, but unfortunately many of us are too lazy and will avoid or give up on reading heavily technical or long texts in favour of overly-simplified, quick and catchy pieces, and will be more persuaded by short and emotive sound bites. This relates to cognitive ease or fluency.

 

The sunk cost fallacy occurs when we’ve invested so much time, effort and public image/reputation towards supporting a certain worldview that we can find it psychologically difficult to abandon it and change our minds because we believe that we’d lose too much face if we did – we’d rather be in denial and carry on as if we’ve not lost (yet) than accept conclusively that we have. (In armed conflict contexts, this can mean lives being continually lost for nothing, all for trying to salvage some ‘national pride’.)

 

Social ingroup biases, filter bubbles and echo chambers mean that if you’re ill-informed then your close social network is likely to be ill-informed too, because we tend to prefer to congregate around others who hold similar views to us. And this’ll communally reinforce our beliefs and make us think they are more popular or representative of the population as a whole, and therefore (fallaciously) more correct, than they really are. We also have a tendency to choose, consult with, click on or read news or information sources that already express the same worldviews as us, thus giving the effect of an echo chamber for our existing stances. We’re often preaching to the already converted on social media based on which hashtags or accounts we voluntarily use or follow.

 

Related to confirmation bias, we think we’re scientists discovering the truth, but we’re actually more like lawyers arguing for positions we’ve already arrived at by other means, i.e. we tend to form our beliefs based on our self-interests and personal desires/biases first, and then try to find or only accept evidence that supports those views, via cherry-picking (mis)information or one-sided data. This is instead of assessing the entire set of evidence with an open furry mind first, and then letting that entire set of evidence guide our conclusions, positions and views. Lawyers argue for whatever suits their desired conclusions – the interests of their clients in particular. They want to win more than be right (although when the facts aren’t known, e.g. whether one’s client really did commit a murder or not, winning is considered being right).

 

A politician might gloat about their own country doing better than other countries, based on the statistics provided, when it comes to handling a pandemic; but once their country is doing worse than other countries based on the same source of statistics, they’re very keen to point out (although occasionally correctly) that we cannot reliably compare like-for-like(!) A similar thing also happens when trying to compare the drug laws of different countries, their successes or failures, and whether they would create the same outcomes in one’s own (culturally non-identical) country.

 

So we don’t gravitate towards opposing evidence but towards information or opinions that we want to hear, i.e. that confirm what we want to be true; and we also automatically treat evidence that backs up our desired conclusions as more legitimate, accurate or important than any evidence that goes against them.

 

A true investigative scientist, journalist or best-truth-seeker gives fair attention to all possible alternative explanations and hypothetical counterfactuals from an agnostic stance. Statisticians hope to ‘reject the null hypothesis’ in order to exclude the likelihood of random chance explaining any alleged patterns. A lawyer, meanwhile, might consider the opposing arguments – but not to wonder whether the other side might be correct, nor to spend resources that might add support to the other side’s case, but merely to pre-empt and counterattack the other side, with techniques like manufacturing controversy or confusion.
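
As a minimal sketch of what ‘rejecting the null hypothesis’ means in practice – assuming SciPy’s one-sample t-test, with made-up measurements and a made-up baseline purely for illustration:

```python
# Hypothetical example: test whether some measurements genuinely exceed a
# baseline of 2.5, or whether the apparent excess is plausibly random chance.
from scipy import stats

observed = [2.9, 3.1, 3.4, 3.2, 3.0, 3.3, 3.5, 3.1]  # made-up data

# Null hypothesis: the true mean is 2.5 (any apparent pattern is chance).
t_stat, p_value = stats.ttest_1samp(observed, popmean=2.5)

alpha = 0.05  # conventional significance threshold
if p_value < alpha:
    print(f"p = {p_value:.4f} < {alpha}: reject the null hypothesis")
else:
    print(f"p = {p_value:.4f} >= {alpha}: cannot rule out random chance")
```

A small p-value says the observed pattern would be very unlikely under chance alone, so the ‘it’s just chance’ explanation is rejected – which is the agnostic, evidence-first stance described above.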

 

Through doubt and confusion, a bold and strenuous denial of even a blatant fact will perplex and stagger some observers. Even the least impressive argument can make us at least doubt scientific data, and there are lobbying campaigns that exist only to influence people’s opinions to serve particular industry agendas, whose main strategy is to spread doubt and confusion, such as via ‘astroturfing’ sources. Those implicated, if well-funded, can flood the web with other theories that deflect attention away from them. This can include counter-accusing those who accuse them (e.g. bankers and Wall Street journalists claiming that the 2007/2008 Financial Crisis was primarily down to governments not being lax enough with financial sector regulation). Corporate interest lobby groups and governments might even sometimes spread contradictions by supporting both sides of an issue, so that the public don’t know what to believe, who the overall good or bad guys are, or what to fight for and what to counter!

 

Uncertainty and bewilderment are powerful ways to paralyse attitude or behaviour change, and one doesn’t even need to present cogent arguments that support one’s side – one only needs to cast doubt on the opposition’s standpoint in order to at least get people to sit on the fence again. And when people sit on the fence, they do the default thing, which means they end up doing nothing and carrying on as usual. They don’t act or change. And although being open-minded is crucial, it means that casting doubt can be used against any fact. Hence we need more than just an open mind – we need to employ critical thinking methods.

 

People also tend to shut off or go into denial if things get too emotionally overwhelming, too fearful to think about, or too complicated. They’ll then end up taking the path of least resistance, which again means going with the status quo (which might be precisely the strategy and hope of those currently in power or making billions). Denial can be temporarily comforting – along the lines of ignoring demand letters when one is in debt. Even when you tell a group of people that a disaster is about to hit them (e.g. a tsunami after an earthquake that’s just struck many miles away), they’ll initially collectively go into denial, because no individual wants to be the one who takes it seriously only to find out it was a hoax. Embarrassment is worse than death for many people, it seems. When facing events one has never experienced before (e.g. preparing for or mitigating climate change, or deciding whether or not to evacuate one’s home because of an incoming storm), people tend to be over-optimistic, go for the default course of action (which is usually ‘stay’ and ‘carry on as before’) and follow the immediate herd – so if their immediate neighbours do nothing then they’ll likely do the same, thinking that their neighbours surely must know better, while neglecting the possibility that those neighbours know just as little and are merely following them in turn.

 

Other cognitive biases that might contribute to the denial or ignorance of scientific evidence include the following – the less we know about a subject, the more strongly we feel about our own opinions and the more we over-generalise conclusions. So if you knew nothing whatsoever about a subject then you’d reserve all opinion about it. But if you knew only a little bit then you might only understand an over-simplified, one-sided version of that subject and therefore believe in a strong and starkly black-or-white opinion about it. Yet if you knew a lot more (I recommend at least taking some short courses from a reputable institution or two on the subjects one wishes to express strong views about) then you’d understand more about the complexities, dilemmas and known unknowns of that subject, and therefore hold more nuanced and humble yet better-informed opinions about it. Therefore knowing only a little bit about something can be more dangerous than knowing nothing at all.

 

We’re not rational truth-seekers as much as rationalisers who are trying to find ways to justify or confirm our existing desired conclusions. We can and frequently do convince ourselves of whatever we want to believe, whether deliberately or not (like convincing ourselves that being ‘self-interested’ – or serving the interests of the self – isn’t the same thing as being ‘selfish’). Presenting cold statistics and facts isn’t as persuasive as emotional and impassioned pleas. And, as usual, lies that hold a grain of truth convince best. Even hard evidence isn’t enough. For example, even after prophecies of near-future Armageddon evidently fail, time and time again, to materialise – instead of ditching these beliefs, believers double down on them by rationalising an excuse for why the predictions never came true, and constantly move the goalposts (e.g. by reinterpreting the supposed dates and other ambiguous information). We can also keep debunked ideas in our heads because it’s less effort than correcting them.

 

…In short, it’s incredibly difficult to get people to change their worldviews! Unless and until doing so lines their pockets, protects their interests/lifestyles and preserves their political/religious beliefs, the denial of anthropogenic climate change, for instance, will likely continue to exist amongst pockets of society regardless of what empirical evidence or logical reasoning is presented.

 

Woof! You cannot reason people out of beliefs they did not reason themselves into.

 
