
Post No.: 0630 – overconfidence

 

Furrywisepuppy says:

 

The ‘planning fallacy’ occurs when one estimates unrealistically close to the best-case scenario and thus ends up being overoptimistic about a task. It’s an extremely frequent and widespread fallacy committed by individuals, businesses, governments and other organisations – which is why projects have a tendency to run over time and over budget! These estimates could be improved by consulting the statistics of similar past cases, but we seldom learn from other people’s past experiences, or often even from our own!

 

Our pervasive overconfidence or ‘optimism bias’ expresses itself via the planning fallacy, through viewing the world as more predictable than it really is, through judging our own fluffy attributes as more favourable than they truly are, through deeming the goals we adopt as more achievable than they’re likely to be, and through exaggerating our own ability to forecast the future. In terms of its consequences, both good and bad (e.g. inventors experimenting when trying to make new discoveries, or nations entering wars they believe they will (easily) win), it’s arguably the most significant of our cognitive biases.

 

Overconfidence is dangerous because it leads to taking risks that may be too great. Yet having an optimistic temperament can be a blessing because it means that one tends to look on the bright side of life, which is good for one’s well-being – you’re lucky because you already feel fortunate! Optimists are normally cheerful and happy, and therefore popular. They’re more resilient in adapting to failures, setbacks and hardships, and less likely to become depressed; their immune systems tend to be stronger; and they take better care of their health, feel healthier and live longer on average (although these are all correlations that may not all be causations). I explored this in Post No.: 0464.

 

Optimism perhaps contributes to resilience by defending one’s self-image in a mildly biased way. Taking credit for the successes but little blame for the failures, and exaggerating the importance of what one is doing, can help one to carry on in situations where a high rate of failure is expected whilst searching for that one success or those few successes (e.g. when dating).

 

Those who are overconfident and lucky (and we recognise now that luck plays a key role in ultimately everything) will have their self-confidence reinforced by the admiration of others. Hence those who have the greatest influence on the lives of others are likely to be optimistic, overconfident and (historically) lucky, and they take greater risks than they realise because they under-acknowledge the effects of luck and/or over-attribute their ability to control complex external events. Yet because they misjudge the odds (and seldom bother to even find out what the statistical odds are), they believe they’re prudent and feel cocksure about the future. Their positive mood helps others to back them (financially, labour-wise and with other resources) and raises the morale of their team, which might actually enhance their prospects for success – thus some mildly delusional overconfidence can be useful.

 

But the blessings of optimism are reserved for those who are only mildly biased and are able to accentuate the positive without losing track of reality. Optimism encourages persistence in the face of obstacles, but it can also lead to a ‘double or nothing’ gambling mentality and other dangerous fallacies.

 

Overconfidence is the combination of emotional and wishful thinking along with various cognitive biases. We focus on our goal and anchor on our plan (a best-case scenario), neglecting relevant base rates. We focus on what we want to do and can do, neglecting the plans and skills of our competitors. We focus on the causal role of skill and neglect the role of luck (the illusion of control), both when explaining the past and when predicting the future. And we focus on what we know and neglect what we don’t know, which means that we become overly confident in our beliefs.

 

The chances of a small business surviving past 5 years are currently ~50%, yet individuals who start such businesses usually believe that such statistics don’t apply to them, thus they estimate their own chances of success to be at least 70% or even 100%! Discouraging advice is often dismissed (though it’s more likely to be heeded if one has paid for it). Over 70% of inventions never get off the ground commercially, and most of those that do will still fail. Overall, on average, the return on private inventions is lower than from private equity or high-risk securities (although most inventions make life richer for everyone), and self-employment produces lower returns than external employment (although autonomy and flexibility are desirable things).

 

Most entrepreneurs think that >80% of the outcome of their business depends on what they themselves do. They think their fate is almost entirely in their own paws. But all start-ups depend at least as much on what their competitors do (‘competition neglect’), as well as on changes in the market and national/world events. So we tend to neglect outside forces and the effects of other people’s decisions and outcomes (things that we cannot control) on our own lives and decisions.

 

It’s the result of ‘what you see is all there is’ (WYSIATI) again (e.g. one’s own plans and actions and the most immediate opportunities and threats, like the availability of funding). If there’s a fantastic opportunity for you then you must consider that there’s a fantastic opportunity for your competitors too – everybody else might be thinking the same things, and they might have bigger budgets, teams, facilities, etc. than you. Maybe the better question is therefore – considering what others will likely do with what they have, how do you think you will do with what you have? (By taking the competition into account, one may, possibly, decide to operate during off-peak times rather than when nearly every other major competitor is operating e.g. the hour with the highest number of bidders in an online auction may be 7 o’clock, but the hour with the highest number of bidders relative to sellers may be 12 o’clock.)
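
As a toy illustration of that last point – here’s a rough Python sketch with entirely made-up hourly figures (not real auction data), showing how the hour with the most bidders needn’t be the hour with the most bidders per seller:

```python
# Hypothetical hourly counts – invented purely for illustration.
bidders_per_hour = {"12:00": 600, "15:00": 500, "19:00": 900}
sellers_per_hour = {"12:00": 100, "15:00": 125, "19:00": 300}

# Bidders per seller is a crude proxy for how little competition a seller faces.
bidders_per_seller = {hour: bidders_per_hour[hour] / sellers_per_hour[hour]
                      for hour in bidders_per_hour}

print("Hour with the most bidders:", max(bidders_per_hour, key=bidders_per_hour.get))                # 19:00
print("Hour with the best bidder-to-seller ratio:", max(bidders_per_seller, key=bidders_per_seller.get))  # 12:00
```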

 

It’s common for an investor to use last year’s financial figures as a guide to forecast the value of a firm several years from now, even though other information about the firm’s present state and future plans is more crucial. Regardless, most financial forecasts beyond a year are no better than chance guesses – yet investors still love to hear about them, perhaps because they appeal to their emotional hopes. (These figures could also act as surreptitious anchors to influence a company’s perceived valuation.) Who wants to tell potential investors that their company is forecasting a loss (unless this looks pretty certain) or has a 50% chance of going under within 5 years? So the worst thing about these forecasts is that people refuse to acknowledge that they’re next to worthless. Investors may like to hear that there exists a long-range plan, but they’re usually delusional about the error bars, which tend to be many times wider than they think. Massive public companies often publish healthy financial forecasts even as their share prices are crashing (as some major banks did at the start of the 2007/2008 Financial Crisis!) Most people, most of the time, shouldn’t be confident in their confidence in these contexts.

 

Social psychology is also at play here though, because anyone who sensibly states a wide confidence interval (greater uncertainty) in their predictions – when their job is to make predictions – appears ‘ignorant’ to others, even though they’d actually be displaying the opposite! A ‘7/10’ certainty will be mocked compared to an exaggerated and arrogant ‘10/10’ certainty about the mid-to-long-term effects of an exit from an economic union, for instance.

 

Therefore even when one knows how little (any)one knows about the far future, one might be penalised for admitting it. The public naïvely seeks black-or-white forecasts from their leaders when most situations are nuanced and have multiple possibilities, pros and cons, threats and opportunities. So overconfidence is, unwisely, highly valued – politically, socially and in the markets. Those who express nuance are ignored. Overconfidence gets rewarded with votes, attention, investment and custom more than fair assessments do, hence amidst the competition for these things, ‘experts’ have become ever more collectively blind to risk and uncertainty, and more prone to exaggeration. One could therefore at least partly blame the electorate and markets for the way politicians and businesspeople behave – they wouldn’t use hyperbole if they weren’t rewarded for it. An overconfidence in being right is essentially fundamentalism; and if combined with power, it is tyranny.

 

So there’s a competitive pressure to err towards overconfidence and exaggeration. But those who take the word of overconfident ‘experts’ can expect costly consequences – an inadequate appreciation of uncertainty leads people to take risks they should’ve avoided or to make diagnoses that are wrong. An expert worthy of the label is expected to display high confidence, so the overly confident ones are the most likely to be picked as guests for news programmes or TV debates, and over-hyped or pseudoscientific wares are the most likely to be talked about and purchased. Expert overconfidence is thus encouraged by their clients – it can seem weak, incompetent or untrustworthy to appear unsure. Overconfidence is trusted over nuance, hence any acknowledgement of one’s ignorance or uncertainty tends to be kept quiet, lest one be replaced by someone who’s more delusionally certain.

 

An appreciation of uncertainty is a cornerstone of rationality, but that’s not what people want to hear! Extreme uncertainty can indeed be paralysing under dangerous circumstances, and the admission that one is just guessing is especially unacceptable when the stakes are high – hence acting upon bluffed knowledge is, unfortunately, sometimes the preferred solution.

 

So there are emotional, cognitive and social factors that explain why overconfidence is rife. But many of these people won’t consider themselves risk-takers – they’re simply unaware of the true extent of the risks they’re taking.

 

‘Risk adjustment/modification’ and ‘risk homeostasis’ describe how the more safety equipment one perceives one has on one’s side, the more risks one will likely take. Hence using safety gear (e.g. a skull cap in rugby) can ironically mean that one gets injured more because one will just go into tackles harder; or relying on assumed-to-be-foolproof mathematical formulae (e.g. value at risk) can make investors think that their investments are safe even against the risk of rare events – thus they’ll try to stretch their investments even further, i.e. take greater risks, leading to periodic economic crashes.
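
To make the value-at-risk point a little more concrete – below is a minimal Python sketch (using made-up return data and assuming the numpy library; it’s an illustration, not how any real bank computes risk) of a historical 99% VaR alongside the average loss on the days that breach it. The figure looks reassuringly precise, yet it says nothing about how bad the worst ~1% of days can get:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up daily returns: mostly benign days plus a handful of rare, severe crashes
# (a crude stand-in for real market data with fat tails).
normal_days = rng.normal(0.0005, 0.01, size=2500)
crash_days = rng.normal(-0.15, 0.05, size=5)
returns = np.concatenate([normal_days, crash_days])

# Historical 99% value at risk: the loss exceeded on only ~1% of days.
var_99 = -np.percentile(returns, 1)

# Average loss *given* that the VaR threshold is breached (the tail VaR ignores).
tail_losses = -returns[returns <= -var_99]

print(f"99% one-day VaR:              {var_99:.1%}")
print(f"Average loss beyond that VaR: {tail_losses.mean():.1%}")
```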

 

Overconfidence can be modestly tamed through techniques and training, such as stating confidence intervals, reviewing lots of past similar cases where the results were known, and listening to diverse voices and considering competing hypotheses. But like all biases, it’ll never be completely eliminated. Subjective confidence is determined by the coherence of the story one has mentally constructed – not by the quality or quantity of the information that supports it.
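
As a rough sketch of the ‘review lots of past similar cases’ idea – here’s a short Python example with hypothetical numbers (invented for illustration): instead of trusting one’s own best-case schedule, take the distribution of overruns from comparable past projects and quote an interval rather than a single confident figure:

```python
import numpy as np

# Hypothetical: actual duration divided by originally planned duration for 12
# roughly comparable past projects (1.0 means a project finished exactly on plan).
overrun_factors = np.array([1.1, 1.3, 0.9, 1.6, 2.2, 1.4,
                            1.2, 1.8, 1.0, 2.5, 1.5, 1.3])

best_case_plan_weeks = 20   # the optimistic 'inside view' estimate

# The 10th/50th/90th percentiles of past overruns give a median and an ~80% interval.
lo, mid, hi = np.percentile(overrun_factors, [10, 50, 90])
print(f"Outside-view estimate: ~{best_case_plan_weeks * mid:.0f} weeks "
      f"(an ~80% interval of roughly {best_case_plan_weeks * lo:.0f} to "
      f"{best_case_plan_weeks * hi:.0f} weeks)")
```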

 

Groupthink suppresses doubt, but questioning the ideas of leaders and others within our groups shouldn’t be viewed as disloyalty when it’s constructive criticism. However, leaders naturally tend to select those who already concur with them for their cabinets, boards or inner circles, and to give them a voice.

 

One helpful technique is a kind of ‘pre-mortem’ exercise. Just before committing to an important plan – get a (preferably independent and) knowledgeable team to imagine that the intended plan has ended up in disaster a year later. Now take several minutes to (each separately) write a brief history of that disaster. This can help combat the groupthink that can set in once a decision appears to have been made, legitimise critique and doubt, uncover neglected threats, and unleash the imagination of knowledgeable individuals to create coherent counterfactual stories that can counterbalance any overconfidence in one’s existing beliefs or ideas.

 

Woof! What do you think about how overconfidence is often rewarded and what we could do about it? You can share what you think via the Twitter comment button below.

 

Comment on this post by replying to this tweet:

 
