This post was first published in The Straits Times by Vikram Khanna.
Businesses make decisions all the time. Some have good outcomes, but many, if not most, do not. Countless management books have been written on strategies for success with bestselling titles such as In Search Of Excellence, Built To Last and Good To Great, which purport to draw lessons from successful companies. But the lessons have not always stood the test of time.
For instance, research by McKinsey & Co on how the 50 companies featured in these three books fared two decades later showed that collectively they barely outperformed the market, while many, such as Circuit City, Kodak, Kmart and Wang Labs, went bankrupt or were acquired.
One can think of many other companies worldwide in industries as varied as technology, music, retail, financial services and the media that were once high flyers but are now either a shadow of their former selves or have disappeared. These once-great companies were led by competent, experienced management teams. Yet they took bad decisions.
Interestingly, many of them took the same bad decisions – such as making terrible acquisitions, trying to imitate successful competitors in the name of following “best practices” or denying that their companies were vulnerable to disruption.
Predictable mistakes
In recent years, behavioural economists and psychologists have tried to understand why decision-makers are prone to making systematic, predictable mistakes. One of them is Olivier Sibony, professor of strategy at the HEC Paris business school and a former partner at McKinsey, who wrote the book You’re About To Make A Terrible Mistake!
He also co-authored the bestseller Noise: A Flaw In Human Judgment, together with Nobel laureate Daniel Kahneman and Cass R. Sunstein.
Many mistaken decisions are rooted in cognitive biases, some of which are unconscious. Overcoming these biases requires “thinking critically about how decisions are made, or ‘deciding how to decide’,” he writes. The key to good decisions, in other words, lies not in the brilliance of the decision-makers, but in the design of the decision-making process.
“About 70 per cent of all business decisions involve yes or no answers,” Prof Sibony tells The Straits Times in a recent interview. The sort of questions typically posed are “should we make this acquisition?”, “should we hire this person?”, or “should we expand into this new market?”. These become “yes or no” decisions because, in the process of making the decision, the management team tends to weed out ideas until it arrives at a single plan it is trying to sell. “They are actually not trying to make a decision,” he points out. “They are trying to sell a proposal.”
For example, they say: Here’s the investment we are planning to make: yes or no? There may have been other ideas – plan Bs and plan Cs – but they have usually been eliminated prior to the final meeting. “That is not a good way to make decisions,” says Prof Sibony. “It would be much better to keep a larger portfolio of options until very late in the process.”
The most glaring and costly mistakes companies make involve how they decide to take big risks. Often, a charismatic CEO declares, “I know this is a big risk, but I’m going to take it anyway”, reflecting overconfidence and excessive optimism about the accuracy of one’s predictions, a common bias both in business and in life. Management teams often go along with the views of highly successful and confident CEOs, with no decision-making process.
In business, the classic example of such risk-taking happens during major acquisitions. Prof Sibony recalls, for instance, the US$10 billion (S$13.8 billion) purchase by the IT giant HP of the British tech firm Autonomy in 2011, followed a year later by HP writing off US$8.8 billion of that investment. “How do you decide to acquire a company for US$10 billion and then a year later say: ‘Oops, we overpaid by a factor of almost 10x’? There are stories like that every few months,” he says.
But the most frequent mistake is the opposite of the one above. Bosses kill many risky projects that they ought to pursue, a reflection of risk aversion, another common bias.
Prof Sibony explains: “There are many small projects – not US$10 billion, but US$10 million projects – that involve some risk but have a good chance of returning enough to justify the risk.” Suppose, for example, CEOs are presented with an R&D project that will cost US$10 million. If it succeeds it will return US$30 million, and if it fails, the US$10 million will be lost.
To the question, “What is the minimum probability of success that you would demand to invest in this project?”, the rational answer would be anything above 25 per cent. “But in reality, people say: ‘If it’s not 80 per cent likely to succeed, I won’t invest’,” he says. “Managements tend to be very risk averse with projects like this, because they don’t want to be associated with a project that lost US$10 million. However much you try to assure people that it’s OK to fail, they believe it’s not.”
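A quick expected-value check makes the arithmetic explicit (a rough sketch, reading the US$30 million as the net gain if the project succeeds, an assumption consistent with the 25 per cent figure quoted): with success probability p, the expected payoff is p × 30 − (1 − p) × 10 = 40p − 10, in millions of US dollars, which turns positive once p exceeds 10/40, or 25 per cent. Anything above that threshold is, on average, a bet worth taking, yet managers routinely demand far higher odds.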
Another common mistake companies make is mis-defining the problem they need to solve. Prof Sibony cites the example of the record companies that were confronted in the early 2000s with Napster, the file-sharing service through which users could get songs for free. The music companies successfully sued Napster, which was forced to file for bankruptcy. But this did not solve their problem. Soon after, Apple started selling music digitally on iTunes, and others joined the digital music and streaming business, which disrupted the record companies.
So the real question these companies needed to ask was not “how can we get rid of this pesky competitor?”, but “how can we make money from digital?”
At the root of this mis-definition was the companies’ desire to preserve their high profit margins, a reflection of “status quo bias”: the urge to maintain the status quo regardless of a changing business environment.
“If you define the problem in such a way that it cannot be solved, then you will not solve it,” says Prof Sibony.
Many industries face disruption, he points out, but what is striking is that a lot of incumbents don’t think it will happen to them, citing their unique strengths or circumstances. “Companies overemphasise the differences which enable them to believe that this problem does not apply to them. I have seen this in industry after industry.”
Elements of a good process
What then is a good decision-making process that can minimise biases?
The first element in a good process is to trust the process, he points out. “You may say, the process tells me I should do this, but my gut tells me I should do that. You should listen to the process, not your gut. You may think your gut reflects your own experience, but it doesn’t do that very well. It reflects your perceptions of your experience. Our ability to learn from experience is overrated.
“People say: we learn from our failures. I wish that were true. We often learn the wrong lessons from our failures – it’s very hard to know what we should learn. We tell stories about the past which make perfect sense in hindsight. But we forget how uncertain the past was when it was the future.” People also constantly re-interpret their experiences.
Another key element of a good process, says Prof Sibony, is collegiality. “It’s easy to disregard the process when nobody’s looking at you. But it’s much harder to do it in front of other people. When there is collegiality, you’re much more likely to follow the process.”
Collegiality leads to discussion and debate, which are important parts of the process – more so than analysis. He points to a study where he and his team considered more than 1,000 investment decisions, comparing those which were based on a lot of analysis and number crunching with those where there was a good debate prior to the decision. “It turned out that the good debate made a much bigger difference to the outcome of the decision than the analysis,” he says. “Not because analysis doesn’t matter, but because everybody does the analysis.”
Techniques to minimise bias
Behavioural economists and psychologists have come up with various techniques to minimise biases during a decision-making process.
One, proposed by the psychologist Gary Klein, is the idea of a “pre-mortem”: When a decision on a project is almost finalised, the decision-makers are asked a hypothetical question: “Assume we have gone ahead with the project. Three years have passed and the project is a total disaster. Write down why this happened.” This helps doubters overcome the tendency to keep their doubts to themselves and surfaces flaws that might not otherwise have been discussed.
Another technique, suggested by Professor Kahneman of Princeton University, is to bar decision-makers from voicing their views at the outset of a discussion; instead, each writes down the pros and cons and reads out what he or she has written. This prevents them from being influenced by previous speakers or from echoing the CEO’s opinions. It also leads to a greater diversity of views, a richer discussion and a better decision.
In his book, Prof Sibony lists 40 decision-making techniques that can help minimise biases and improve decisions. Among them:
- Appoint a “devil’s advocate”, or even a team, to make the opposite case to the one being presented.
- Run the “vanishing options” test. Ask decision-makers what they would do if, for any reason, the option on the table became impossible. This often leads to new, unexpected ideas.
- Stress-test assumptions, to ensure the worst case is really the worst case, not just a slight downgrade from the base case.
And a final tip: Sleep on it.
After all the meetings are over and the research examined from every angle, wait till the next morning before taking a decision. This creates some distance and avoids decisions being made while people may still be in the grip of emotions. This age-old piece of advice can be invaluable – both in business and in life.
Olivier Sibony has been a Senior Advisor and Investment Committee Member of Qualgro, a Southeast Asian venture capital firm, since 2015.
A professor and award-winning author, and an advisor to corporates, Olivier is renowned for his work on cognitive biases in decision making and strategic thinking.
He was a Senior Partner at McKinsey, with experience in Europe and the United States.