
Post No.: 0491

 

Furrywisepuppy says:

 

It would be impossible to ever sift through all of the academic papers out there in the world, so we must seek some fluffy shortcuts, such as relying on the verdicts of journalistic reviews or news articles.

 

But if we are confirmation biased, we’ll likely be able to find at least one person with a PhD who’ll endorse the beliefs we want to believe in, or we could focus on a tenuous published study, or cherry-pick or suppress data from a particular study, to find things that’ll fit the worldview we want to maintain or promote. We seek the truth – but an even greater internal goal is the desire to prove a point or to prove that our current beliefs are justified.

 

Scientists and experts are human and thus fallible beings too, with potential conflicts of interest that can interfere with exploring or telling the full truth and only the truth – such as when they’re being paid or funded by a corporate-interest sponsor, or when they have a self-interest in making a big name for themselves, which one cannot do by agreeing with the established status quo. In other words, if one wants to become famous in a scientific field, one must come up with a finding that’s new, controversial or extreme. One must have something different to say, hence one will be incentivised to force or exaggerate such a novel or original finding. As a consequence, they might believe that the evidence supports their conclusions when it does not, and use confirmation bias to maintain their beliefs. The number of times an academic paper has been cited is an indirect measure of its credibility, thus some scientists might even unethically pay other authors to cite their papers.

 

Some original or contrary findings are genuine, whilst others are chance results, wishful over-extrapolations, accidental misreporting or even intentional fraud.

 

On the odd occasion, weak or bad manuscripts do slip through the scientific journal peer-review process and get published. Journals differ in the rigour of their reviews – and indeed, fallible beings perform the peer reviews too. It’s also important to understand that peer reviewers don’t check whether the data of a given study is accurate or even genuine – they only check how well the manuscripts are written up.

 

In the general media, almost any angle to a story can be promoted – such as that a city or country has a serious crime problem – because you’ll almost always be able to find at least one person with an extreme, eccentric or one-sided perspective, or a person with a unique, rare or unusual outcome, who’ll agree to go on camera to present his/her opinions or personal story. Examples include a person who has smoked all of his/her life and is now 100 years old, an unhappy customer or employee, or a medical disaster story for a product or operation that is, overall, quite routine, statistically safe and reliable. A reporter or news outlet can also just cherry-pick confirming views and evidence and/or edit out disconfirming views and evidence.

 

So we can cherry-pick examples of people who played truant when young yet still became successful businesspeople, or cherry-pick examples of Prime Ministers who didn’t come from privileged backgrounds – but do they represent the norm in the bigger picture? Does a ‘rags to riches’ story represent the rational expectation, or the odds, for anyone who comes from a poor background? It’d be flawed logic to argue that smoking is good or fine for one’s health just because we can find a few examples of people who’ve lived to a very old age whilst regularly smoking cigarettes. They’re just exceptionally lucky, and lived to that age despite, not because of, their smoking.

 

If you really want to find them, you’ll always be able to find examples of people winning bets that had incredibly long odds, survivors of falls from a plane at altitude without a parachute, people who allegedly only ate potato chips and drank sugary pop yet appeared fine, or whatever – but it doesn’t make these people sensible, and it doesn’t make following their example rational. Just because something is proven to be possible, that alone doesn’t make attempting it rational.

 

We can always apply confirmation bias and cherry-pick data to serve a particular agenda we want to sell or believe in, but we’ve got to be aware of all of the data we’re ignoring that could be telling us the fuller, more truthful picture – such as that most people who fall out of a plane from a high altitude without a parachute don’t survive!

 

But say we like eating a lot of doggy chews – we’ll be more likely to gravitate towards stories that tell us that eating a lot of doggy chews is healthy, or at least okay, for us. We are naturally drawn to cherry-pick evidence or opinions that confirm our desired biases – just like lawyers fighting for a particular side who want a predetermined verdict to be true – whilst we ignore, or work hard to dismiss, any and all counterevidence that comes our way, which might actually make up the majority of the total evidence. Attention is limited, so when we’re focused on one thing, we can miss literally everything else. We’re not good at processing complete sets of data even when they’re available – we cherry-pick data. This is how we can easily justify our own beliefs to ourselves in all kinds of contexts. (I don’t like picking cherries though because they’re not good for my tummy, so my equivalent is blackberry-picking – woof!)

 

Due to confirmation bias, many people who drink wine want to believe that wine is good for their health, so they’ll pay more attention to anything (from the media, marketing, gossip or wherever) that suggests that ‘wine is good for your health’, and will more likely trust it without question or with laxer scrutiny; and vice-versa for anything they don’t want to believe is true.

 

Even without an agenda except to grab as many viewers as possible, to generate link clicks or to sell papers for the sake of a TV channel, website or news outlet’s profitability – extreme views or examples of something, even if not popular or common, tend to get disproportionately reported because they make more inflammatory and therefore eye-grabbing headlines, or a more dramatic programme that stokes up divisive debate on social media afterwards. But, for instance, in a show about ‘the best parenting methods’ that pitches various parents with extreme parenting philosophies against each other – really none of them should win, because a method that balances the good points of all of them would actually be the best. That’s why we, as media consumers, must not be too focused on just a few extreme sources of information, and should look at the broader frame of statistics and opinions, i.e. take a step back and look at the bigger picture. Things aren’t always so binary or mutually exclusive.

 

‘Confirmation bias’ is ultimately an unconscious bias – one doesn’t have to purposely wish to deceive in order to fall foul of it. The bias is most pervasive for desired outcomes, deeply-entrenched worldviews and emotionally-charged issues. It leads to overconfidence and to more extreme and polarised stances between groups. It can happen whenever one has a motive to win an argument, make a point or sell something, such as an idea, opinion or product. Confirmation bias occurs whenever one strongly holds or desires a particular conclusion first and then seeks to back that belief up – like a lawyer defending a position – rather than starting as a true agnostic, seeking information and then coming to a conclusion or picking a side. One needs to ask what x plus y equals – not hunt for numbers that add up to 10 as if one already knows the answer should be 10.

 

Our upbringing or early exposures to particular beliefs will therefore profoundly influence our own beliefs, at least for a while in our lives, because the first belief we come to strongly hold regarding an issue will be the one that subsequently gets continually reinforced. We’ll prefer to hang around others who share the same beliefs as us if we can, and this will create an echo chamber effect.

 

We tend to distort, dismiss or neglect any evidence that doesn’t match our current worldviews. We can even attack or despise any person or source who or that gives us uncomfortable facts that challenge our worldviews. We have a tendency to gravitate towards, cherry-pick, pay more attention to and pursue only information that supports our existing stances. And we’ll tend to automatically like, apply the halo effect to (as in ‘he/she agrees with me, thus he/she must be a clever and good person too’), and prefer to hang around with, like-minded people and groups who support the same attitudes as us. People frequently say, “Don’t believe everything you read in the news.” Yet we still do – especially if it’s news that confirms our biases.

 

So the natural inclination to cherry-pick information is rife on both the side of the media content creators and on the side of the media consumers, as expressed in Post No.: 0411 too. Social media algorithms and filter bubbles also mean that users are increasingly exposed to information and sources that they already like, i.e. sources that confirm their current worldviews.

 

When we scan the news and see a headline that appears to say what we want to believe, we’re more likely to read it and to already trust it; whereas if we see a headline that appears to say what we don’t want to believe, we’re more likely to skip it and be cynical of it. So people tend to pay less attention to sources that disconfirm what they want to hear – their preconceptions are already doubting the arguments before they’ve even read beyond the headlines. And if they don’t or can’t avoid paying attention to information that disconfirms their existing beliefs or desires, they’ll scrutinise the minutiae of the counterevidence more deeply and put more effort into conjuring up a counterargument to that counterevidence (maybe with the help of fallacious logic or some kind of superficial attack, such as an ad hominem attack, or whatever emotionally satisfies the holder of a view that his/her view is intact) – far more effort than if they did want to believe in it. So beliefs persevere despite being shown contrary evidence.

 

Ambiguous evidence will be interpreted as confirming our existing views, which might also mean that we start to falsely perceive correlations between unrelated events (which could seed conspiracy theories in our minds).

 

Confirmation bias – the tendency to cherry-pick, search for, favour and more easily recall evidence we want to hear, and to ignore, dismiss or more easily forget evidence we don’t want to hear – leads us to reinforce rather than question our views, and makes us less tolerant of any idea that differs from what we accept, or of any person who disagrees with us; as if they’re the idiots for thinking differently to us, rather than us being the idiots for thinking differently to them!

 

In short, we are biased to test ideas in a highly one-sided way, ignoring alternative hypotheses in favour of justifying the one we already hold. It’s difficult for us to assess information in a neutral way. We don’t find it natural or pleasant to seek disconfirming evidence or to listen to views that antagonise us. But although confirmation bias cannot be quelled completely, critical thinking skills can help to reduce it. We need to play ‘devil’s advocate’ against our own existing beliefs more often.

 

Woof!

 
