Post No.: 0183
Just about every person has told lies before, at least small ones, and if you deny that you have then you’re either lying right now or you’re extremely good at deceiving even yourself! Sometimes lies are told because people make a rational calculation based on the rewards or payoffs to be gained, the probability of getting caught and the punitive costs if one does get caught. Sometimes people rationalise the moral acceptability of their lies to themselves (e.g. “It’s just a game” or, “They lied to me in the past so I’ll lie to them back”). But sometimes small lies do maintain social harmony and therefore aren’t always selfish or bad (e.g. to keep someone’s spirits up or to avoid hurting their feelings). Regardless, we all lie; yet even someone who tells a lot of small lies can still consider themselves an honest person!
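The “rational calculation” above can be sketched as a simple expected-value comparison. This is a toy illustration with made-up numbers (the reward, probability and penalty figures are hypothetical, not from this post):

```python
def expected_payoff_of_lying(reward, p_caught, penalty):
    """Toy expected-value model: a lie 'pays' (on this crude account)
    when the expected reward outweighs the expected cost of being caught."""
    return (1 - p_caught) * reward - p_caught * penalty

# Hypothetical numbers: a 100-unit gain if the lie works, a 30% chance
# of being caught, and a 500-unit cost (fines, lost trust) if caught.
ev = expected_payoff_of_lying(reward=100, p_caught=0.3, penalty=500)
# Here 0.7*100 - 0.3*500 = -80, so lying doesn't pay in this scenario
```

Of course, real people rarely compute this explicitly, and the “costs” (guilt, reputation) are hard to quantify, but the structure of the trade-off is the same.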
The most convincing lies usually involve a small grain of truth, or a half-truth, which has been twisted or exaggerated into something that is simply not true (and this is how a lot of pseudoscience is created too). We tend to be far better at telling lies than at detecting them, even though most people are vastly overconfident in their ability to detect lies.
Young children frequently attempt lies but start off not being very good at it. Nonetheless, they improve quickly, and parents soon become generally no better than chance at successfully identifying the lies of even their own children. Children telling white lies, and elaborate and plausible stories to cover their lies, do indicate empathy, imagination and intelligence though. But by the time they become adolescents and adults, they’ll need to have developed the refined judgement to understand when they will or won’t have a good chance of getting away with lying (e.g. it looks tragic when an adult tries to manipulate others with fake tears), and the many situations when lying isn’t worth it in the bigger picture; otherwise they could end up being labelled as untrustworthy.
Deception is like warfare, and detection is like counter-warfare, and they’ll constantly push each other into counter-counter-warfare, counter-counter-counter-warfare and so forth as each side gets more and more sophisticated (e.g. better algorithms to detect fraud, more powerful artificial intelligence to create deepfakes). As the police get smarter, so will the criminals, and so must the police, etc.
If lies are greatly rewarded and aid a liar’s survival (whether the liar is an individual or an organisation) because we fall for them too easily, then lies will logically continue to persist in human nature – i.e. deception is a strategy that survives because of its advantages when it works, partly because of those who try it and succeed and partly because of those who fall for it. Most of us understand that lying is normally wrong, but also understand the benefits if one gets away with it. Many other creatures in the animal kingdom employ deception too, whether physically (e.g. mimicking the colours of a closely related poisonous species) or behaviourally (e.g. cuckoldry).
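The idea that deception persists because it’s rewarded when it works – but can’t take over entirely, because people grow wary as lies become common – can be illustrated with a speculative, replicator-style toy model. All the numbers and the payoff function here are invented assumptions for illustration only:

```python
def step(p_liars, benefit=2.0, cost=1.0):
    """One update of a toy frequency-dependent model: liars gain when
    trusting 'believers' are common, but their advantage shrinks (and
    turns into a cost of getting caught) as liars become common."""
    believers = 1 - p_liars
    liar_payoff = benefit * believers - cost * p_liars
    honest_payoff = 0.0  # baseline
    avg = p_liars * liar_payoff + believers * honest_payoff
    # Replicator-style update, kept within [0, 1]
    if (1 + avg) <= 0:
        return p_liars
    p_new = p_liars * (1 + liar_payoff) / (1 + avg)
    return min(max(p_new, 0.0), 1.0)

p = 0.1  # start with 10% liars
for _ in range(200):
    p = step(p)
# p settles near benefit / (benefit + cost) ≈ 0.67 in this toy setup:
# deception persists at an intermediate level rather than dying out
# or taking over completely.
```

The equilibrium here falls where lying no longer pays on average – exactly the “survives because of its advantages when it works” dynamic, in miniature.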
We fall for lies easily because we have a general ‘truth bias’, which means that our default is to trust other people and what they say. This allows society and commerce to function more efficiently (i.e. there’s no need to check every minute detail of information before doing something with someone, which would take a very long time every time). This bias is stronger if the person telling the story is a close friend, family member or colleague, as these relationships would become strained if informational veracity were constantly queried.
But being trusting by default is often exploited by those who wish to profit from this tendency, such as salespeople. Minor discrepancies/inconsistencies also tend to be excused away, especially when we want to believe that a person is telling the truth (e.g. when we want to believe their product will solve our health concerns). But it helps to be sceptical and judicious if a decision to trust or not trust is critical or if someone is trying to profit from you. The truth bias diminishes rapidly when people become aware of the possibility of deception. So, save for any preconceived discrimination against certain individuals or groups, we tend to trust people until it is suspected that they should not be trusted.
‘White lies’ are arguably socially important. It’s arguably acceptable to tell white lies to preserve another person’s self-esteem or motivation, for example, whilst it’s not acceptable to lie at all in a professional context. So being bluntly honest isn’t always wise or useful in social contexts. A person who, in an inappropriate context or position, will only ‘say it like (they think) it is, and if you don’t like it then they don’t care’, therefore lacks a degree of social intelligence, especially if they express such a sentiment in an arrogant and uncaring manner (even in a professional context). It’s not socially intelligent to just always speak what’s on one’s mind and to tell others, “I’m just being directly straight with you so deal with it.” Besides, an honest opinion isn’t the same thing as an objective fact!
However, it becomes difficult to judge the boundaries of when white lies are acceptable when one’s social life overlaps with one’s professional life (e.g. whether to help a friend and colleague out with an exam question or job reference). White lies may lead to bigger lies down the line too. And white lies might not help someone else in the long run (e.g. telling someone that they don’t look overweight might not help them to realise that they’d be better off changing their lifestyle for a healthier one, or telling someone you’re okay with them when you’re not won’t nip your grievances in the bud before they potentially fester and blow up into a massive and sudden outburst one day!).
‘Blue lies’ fall in between well-meaning white lies and selfish ‘black’ or regular lies – they’re told to benefit one’s ingroup but at the expense of an outgroup, thus making them simultaneously selfless and self-serving (e.g. lying to cover up some cheating by your own teammates is antisocial but can help your team, and even strengthen the bonds amongst the members of your team). Whether ingroup members need to truly believe in those falsehoods is debatable, and possibly irrelevant – these lies can be utilised as useful ammunition in a tribal ‘us against them’ competition (e.g. between two political parties or ideologies). These lies help one’s own side’s cause against an opposition’s cause. Blue lies are therefore ‘groupish’ – which is the social analogue of ‘selfish’.
People are much more likely to be convinced of a fact or ‘fact’ when it originates from ideologically sympathetic sources or from people who look and sound like them, i.e. ingroup members (e.g. if we’re white, old and male, then we’re more likely to believe what other old, white males say compared to what members of other demographic groups say).
We are social creatures, yet are prone to dividing into competitive groups, largely for the purpose of allocating resources – people can be prosocial towards their ingroups but antisocial towards their outgroups. When we divide ourselves into groups, we open the door to conflict and socially-sanctioned deceit (e.g. propaganda that dehumanises outgroup members – read Post No.: 0178 for more about dehumanisation). We can support our own ingroups and existing worldviews to such zealous extents because they represent a part of our self-concept and identity in deep and meaningful ways. Directionally motivated reasoning results in conclusions that are driven more by feelings than by facts (and this is our default behaviour) – this is why, when the truth threatens our identity, that truth tends to get automatically dismissed.
Much more will be written about the subject of lies because it is a big topic in psychology. In the meantime – you smell fresh today!