
Post No.: 0282

 

Furrywisepuppy says:

 

The ‘hindsight bias’ or ‘I-knew-it-all-along effect’ is the tendency to see past events as being more predictable than they actually were at the time. It’s when something seems obvious… but only after we’re given the benefit of hindsight! It doesn’t necessarily mean that it was obvious before the outcome was finally confirmed though.

 

It’s common for people to say they ‘knew’ something was going to happen, but only after it had happened. Overconfidence is fed by the illusory certainty of hindsight. Some people, for instance, may have thought in advance that there’d be a financial crisis in the latter part of the 2000s, but they did not know it – they could only stress that they ‘knew’ it after it had happened. There were more people who ‘remembered knowing it’ than who actually thought there’d be a crisis before the crisis. Only a relative few put their money where their convictions were (although some may argue that these people should’ve tried harder to avert or minimise that crisis rather than personally profit from it!) Still, these people didn’t know it as if omniscient. Predictions of market crashes are made all of the time by at least somebody in the world too, so we cannot pay attention only to the times somebody happened to get one right whilst ignoring all of the wrong predictions.

 

Many things may appear obvious in hindsight but could not have been known in advance. We can only know something if it’s both knowable and true, but no one can prove a prediction conclusively before the event happens. And placing a bet at, say, 1/6 odds beforehand cannot be regarded as knowing the outcome either, any more than one can ‘know’ that a fair 6-sided die will come up 3 before it’s rolled. A guess is not the same as knowing.
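To make that concrete, here’s a minimal Python sketch (the numbers are illustrative assumptions, not from any real study): if thousands of people each guess the outcome of a fair die roll, roughly a sixth of them will be ‘right’ purely by chance – and, with hindsight bias, those are exactly the ones who might later claim they ‘knew it all along’.

```python
import random

random.seed(42)  # seeded purely so the illustration is reproducible

NUM_GUESSERS = 6000  # hypothetical number of people making a guess

# Each person commits to a guess (1-6) before the die is rolled.
guesses = [random.randint(1, 6) for _ in range(NUM_GUESSERS)]

# The single actual outcome of the fair die roll.
outcome = random.randint(1, 6)

# Count how many happened to guess correctly by pure chance.
correct = sum(1 for g in guesses if g == outcome)

print(f"Outcome: {outcome}")
print(f"{correct} of {NUM_GUESSERS} guessers were 'right' "
      f"({correct / NUM_GUESSERS:.1%}) - close to the 1/6 (about 16.7%) "
      "expected by pure chance, with no 'knowing' involved.")
```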

 

Some people therefore receive credit for prescience that they do not deserve. The hindsight bias, and the language we use, perpetuate the illusion that the chaotic world is more knowable and predictable than it really is. The crystal-clear coherency provided by biased hindsight gives us the illusion of understanding something – we believe that we understand the past, and therefore that we can predict the future, but we actually understand the past less than we think we do because our view of it is built on biased perceptions (e.g. confirmation bias, availability bias, cognitive ease).

 

We also seldom say or admit that ‘my intuition/prediction turned out to be wrong’, whilst we almost always vociferously point out whenever ‘my intuition/prediction turned out to be right’ – even though our intuitions/predictions are far more frequently wrong than right, if only we each bothered to keep an honest, written record of all of our hunches! Without doing so, we tend to forget, dismiss or keep quiet about all of our misses, and overstate how certain we felt about our hits.

 

In life, we must make decisions in advance and live with uncertain foresight in a highly complex, chaotic and thus unpredictable world. But our furry minds are eager to make sense of the world, and this manifests in overly simplistic yet coherent narratives and internal models. We learn, and revise our personal internal models of how the world works, whenever we encounter surprising or unpredicted events in order to accommodate such surprises; and our view of the past, as well as the future, can be altered as a result.

 

Memories are malleable rather than fixed. This means that once part of our internal model has changed, we lose much of our ability to recall what we used to believe before our minds changed. Quiz show contestants frequently say, “I knew it!” after having picked an incorrect answer – which strongly suggests they didn’t know it! Our minds have an imperfect ability to reconstruct past states of knowledge, therefore we believe we knew then what we only know now. This mostly happens regarding issues where one’s mind wasn’t completely made up one way or the other. Once such beliefs are altered though, many cannot believe they ever felt differently, and thus underestimate the extent to which they were surprised by unpredicted events (e.g. if someone thought an event had a 70% chance of happening but it then didn’t happen, they may subsequently believe they had only thought it had a 50% chance of happening i.e. the history of one’s beliefs is revised in light of what actually happened).

 

The ‘outcome bias’ leads people to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad (e.g. a low-risk surgical procedure is subsequently deemed high-risk after an unpredictable or improbable complication occurs; and with the benefit of hindsight, others will assume the doctor ‘should’ve known better’ too). This hindsight bias makes it hard to evaluate decisions fairly – in terms of what was reasonable and probable when the decision was made i.e. without crystal-clear hindsight and with only fuzzy foresight. This especially happens regarding decisions made by those who decide things on other people’s behalf (e.g. politicians and financial advisers – albeit voters don’t like blaming themselves for election/referendum decisions they make that affect future generations who couldn’t vote at the time – that’ll still be the fault of (elected) politicians!)

 

This means that we may blame people for rational, positive-expected-value decisions that unluckily turned out badly, and give people too little credit for rational decisions that only appear obvious after the fact, and vice-versa. The worse the consequence, the greater the hindsight bias. For example, any terrorist attack will be judged to have been easily preventable. A perpetrator may have previously been on a watch list, but so were many other people who evidently never subsequently posed a threat, and it’d be a dystopian nation that intimately followed people and invaded their privacy without a strong, specific national security reason to do so. (Operations cost money too, but most people don’t want to pay more in taxes.) As a result, agents who make decisions on other people’s behalf are driven towards risk-aversion and (costly, slow or burdensome) bureaucratic solutions, because if the decision followed some ‘standard operating procedure’ then blame, if something goes wrong, can be placed on the procedure instead of the person. Or people might feel incentivised to constantly pass the problem onto others and never take a major issue on themselves. (Increased accountability can therefore be regarded as a mixed blessing because agents may feel less inclined to take risks that might actually benefit their principals.)
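To see why a positive-expected-value decision can still turn out badly, here’s a minimal Python sketch with entirely made-up numbers: a gamble with a 70% chance of winning 100 and a 30% chance of losing 150 is worth taking every time (its expected value is +25), yet it still produces a bad outcome nearly a third of the time – so judging the decision purely by its outcome is misleading.

```python
import random

random.seed(1)  # seeded purely so the illustration is reproducible

# A hypothetical gamble: 70% chance of winning 100, 30% chance of losing 150.
P_WIN, WIN, LOSS = 0.70, 100, -150

# Expected value: 0.7 * 100 + 0.3 * (-150) = +25, so taking it is rational.
expected_value = P_WIN * WIN + (1 - P_WIN) * LOSS
print(f"Expected value per decision: {expected_value:+.0f}")

# Yet any single instance still turns out badly about 30% of the time.
trials = 10_000
bad_outcomes = sum(1 for _ in range(trials) if random.random() >= P_WIN)
print(f"Bad outcomes: {bad_outcomes / trials:.0%} of the time, "
      "despite the decision process being sound every single time.")
```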

 

A ‘black swan event’ occurs when a severe event comes as a surprise to an observer, yet after the first recorded instance of the event, it’s rationalised via hindsight as if it could’ve been reasonably expected and predicted – that is, as if the relevant data were available at the time but were simply unaccounted for.

 

In a similar fashion, reckless risk-seekers who gambled and, against the odds, luckily won are showered with undeserved rewards and bonuses. In these scenarios, the hindsight bias can increase risk-taking. People who’ve been lucky don’t get punished for irrationally taking on too much risk – instead they’re lauded for their success (well, luck) as if they had ‘exceptional flair and foresight’, whilst rational or sensible people are seen as ‘timid and mediocre’. (And exceptional luck will likely regress back to the mean/average too, which means that those who were lucky likely won’t be lucky again – but hey, by then they’d have likely already been rewarded with a top-paying job!)
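Here’s a minimal Python sketch of that regression to the mean, again using made-up numbers where luck deliberately dwarfs skill: the top ‘performers’ of one period, selected largely by luck, fall back towards the pack the next period once their luck is re-rolled.

```python
import random

random.seed(7)  # seeded purely so the illustration is reproducible

NUM_PEOPLE, SKILL_SD, LUCK_SD = 1000, 1.0, 3.0  # luck dominates skill here

# Each person has a fixed (modest) skill; each period adds a big dose of luck.
skills = [random.gauss(0, SKILL_SD) for _ in range(NUM_PEOPLE)]
period1 = [s + random.gauss(0, LUCK_SD) for s in skills]
period2 = [s + random.gauss(0, LUCK_SD) for s in skills]

# Pick the top 5% of 'performers' from period 1 (the lauded lucky ones).
cutoff = sorted(period1, reverse=True)[NUM_PEOPLE // 20 - 1]
top_idx = [i for i, p in enumerate(period1) if p >= cutoff]

avg1 = sum(period1[i] for i in top_idx) / len(top_idx)
avg2 = sum(period2[i] for i in top_idx) / len(top_idx)
print(f"Top 5% average in period 1: {avg1:.2f}")
print(f"Same people in period 2:   {avg2:.2f}  <- regressed toward the mean")
```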

 

We tend to overestimate how much we would’ve known if we hadn’t discovered what the actual outcome was, or misremember how much we did know before we discovered it. We’ll think – only after an event has occurred – that the event was more likely to occur than it actually was. A hindsight analysis makes most things seem crystal clear and ‘obvious’. It’s impossible to un-know what’s just been revealed, so it’s difficult to see the world again as it looked before we acquired that new information – hence it can feel like we ‘always knew it’ even though we didn’t. For this reason, it can be difficult being a teacher, or any person trying to explain something to another person, because it’s hard to see things from the perspective of someone who doesn’t know what you know. You didn’t always know it yourself, yet may behave arrogantly towards someone who doesn’t yet know it by exclaiming in exasperation, “Duh, it’s obvious!”

 

Once you know the correct answer or know where something is (e.g. where the remote control is on a table full of mess), this bias makes you think that others should easily know the correct answer or where that something is too. Relatedly, something that’s obvious to you may not be obvious to others – just like something obvious to others may not be obvious to you, and something obvious to you now may not have been obvious to you in the past. Many things are only obvious or easy if you’ve already been given, seen or worked out the answer previously.

 

Most things look obvious in hindsight, and one can act delusionally smug when exploiting its benefit, especially when judging other people’s errors – hence the sense of superiority many viewers feel when watching business reality show competitions! Many people like watching other people fail in order to feel relatively smart and superior, even though much of the time they wouldn’t have known what to do if they were in the contestants’ shoes. So this can happen even when people haven’t personally attempted, and thus possibly failed at, a similar task themselves, or when they don’t understand the challenges firsthand enough to empathise with the difficulties of the situation – they’re simply judging with the vicarious benefit of hindsight.

 

Viewers of these types of reality show competitions also typically get the ongoing commentary of experts on the show, whose views are expressed only to the viewers and not to the contestants; plus the editing and the musical score help viewers forecast whether a contestant is just about to succeed or fail (e.g. a dark filter and a dirge to foretell that a contestant is about to meet their doom). These fluffy cues are normally picked up subconsciously. Of course, in real life and in real time, people don’t get to receive such cues or have the benefit of hindsight from their own performances until it’s personally too late for them. So without the benefit of hindsight, or watching other people approach the problem before you – if you were to find yourself in a similar situation, how would you know for sure how you would’ve done?

 

We can mention something to someone that seems obvious to them once it’s said (“No need to patronise me”), but if we didn’t mention it at all then that person might’ve complained about why we didn’t mention it beforehand (“You should’ve told me!”) People say, “You should’ve pushed harder” if someone came second, yet will say, “You shouldn’t have pushed so hard” if they crashed out and came last(!) Likewise, ‘patience is a virtue’ if haste was someone’s downfall, but ‘you snooze, you lose’ if capitulation was their downfall instead. In other words, words of wisdom and catchy quotes come easily afterwards with the benefit of hindsight!

 

Woof! How would you feel about keeping a ‘prediction diary’, where every prediction you ever make is recorded – along with when it was made, how strongly you felt about it, and its eventual outcome – so that you can check your accuracy? Do you think it’ll make you feel better or worse about your perceived ability to make good predictions? Please share your thoughts via the Twitter comment button below.
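For anyone who fancies trying it, here’s a minimal Python sketch of what such a prediction diary could look like – the structure, the helper names and the example entries are all illustrative assumptions. Each entry records the claim, your stated confidence and the eventual outcome, and the Brier score summarises your calibration (0 is perfect; lower is better).

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Prediction:
    made_on: date        # when the prediction was committed to, in writing
    claim: str           # what you predicted would happen
    confidence: float    # your stated probability it'd happen (0.0 to 1.0)
    came_true: bool | None = None  # filled in once the outcome is known

def brier_score(diary: list[Prediction]) -> float:
    """Mean squared gap between stated confidence and actual outcome."""
    resolved = [p for p in diary if p.came_true is not None]
    return sum((p.confidence - float(p.came_true)) ** 2
               for p in resolved) / len(resolved)

# Illustrative, made-up entries:
diary = [
    Prediction(date(2019, 1, 5), "Team A wins the cup", 0.80, came_true=False),
    Prediction(date(2019, 2, 9), "It rains on Saturday", 0.60, came_true=True),
    Prediction(date(2019, 3, 1), "Shop shuts this year", 0.30, came_true=True),
]

print(f"Brier score: {brier_score(diary):.3f}")  # an honest record beats hindsight
```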

 
