After every disaster - earthquakes, floods, industrial accidents, terrorist attacks... - there will be breathless coverage for days on end, with lots of expert analysis. It quickly becomes rather tiring.
In The Black Swan, Nassim Nicholas Taleb suggests a thought experiment. Suppose a far-sighted manager had implemented, some years ago, the safety measures that are now being discussed by the media. He would have been told that he was wasting scarce resources on superfluous areas; the scenario that he had painted would have been dismissed as a figment of his imagination, since it had never happened before. He would have got a bad annual review because his department had 'squandered resources on non-productive expenses'.
If he continued along his 'foolish' path, he might have lost his job. He might have been replaced by a 'dashy-pushy' (see note below) guy with more 'confidence in the future' (i.e. one who ignores the possibility of Black Swans), who would be obsessive about buzzwords like efficiency, cash flow and the bottom line, and would cut the 'unproductive expenses'. He would focus on 'leveraging intellectual capital and intangible assets' to create a 'knowledge-based' firm. (Add a few more buzzwords to impress CNBC.) And suppose some of those 'unproductive expenses' had been retained and had helped mitigate the effects of a disaster that struck some years after they were put in place: by then, the far-sighted manager who had risked his career over them would have been long forgotten.
Accidents often happen because of seemingly trivial faults and minor malfunctions that were overlooked. Small faults like a tiny leak or a rusted bolt, which would earlier have been routinely picked up, would now be missed by the smaller number of overworked employees left after the downsizing/layoffs/ramping down/right-sizing/restructuring undertaken to 'trim costs' and 'remain competitive'. (The 'flattening of the organizational pyramid' would be to take 'nimble advantage of market nuances'.) The mistake that caused the accident may be the straw that broke the camel's back. Taleb writes in The Black Swan:
Who gets rewarded, the central banker who avoids a recession or the one who comes to 'correct' his predecessors' faults and happens to be there during some economic recovery? Who is more valuable, the politician who avoids a war or the one who starts a new one (and is lucky enough to win)?
[SNIP]
...everybody knows that you need more prevention than treatment, but few reward acts of prevention. We glorify those who left their names in history books at the expense of those contributors about whom our books are silent. We humans are not just a superficial race (this may be curable to some extent); we are a very unfair one.

Hindsight bias, also known as the knew-it-all-along effect, is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it. After an event, people often believe that they knew the outcome of the event before it actually happened. They often forget a dictum that one historian stated - what is now in the past was once in the future - and assume that a decision-maker at the time had the same information that they have now.
In his book Thinking, Fast and Slow, Daniel Kahneman gives an example of hindsight bias. Some two months before the attacks on the World Trade Center, the CIA obtained information that al-Qaeda might be planning a major attack in the US. This information was given to the National Security Adviser rather than to President Bush. When this became known later, the executive editor of The Washington Post said, 'It seems to me elementary that if you've got the story that's going to dominate history you might as well go right to the president.' But at the time, no one knew - or could have known - that this piece of intelligence would turn out to 'dominate history'.
After a terrorist attack, you will often be told that the suspect had figured in some police record somewhere for a petty crime. The implication drawn is that if there had been better coordination between the different agencies, the person would have been caught then and the terrorist incident would not have happened. But at the time, there was no way for the police to know that he would plant a bomb in a bus a few months later.
Note: I came across the word 'dashy pushy' in an article in The Caravan magazine. It is a corrupted combination of two English words and is used in West Bengal:
By chopping the last three letters off “dashing,” and adding a “y” to ease its coupling with “pushy,” we get a new word. It denotes a go-getter with an unsubtly aggressive edge about him — a slightly pejorative term in its early days, but now one of approval, if not admiration.