If you were to jump from a tower, wouldn’t you check whether it is safe before you jumped? Wouldn’t you think about possible risks so that you could take the necessary precautions? In short, wouldn’t you perform some kind of risk management, or would you just jump?
Risk Management: Some Facts
“Facts do not cease to exist because they are ignored,” British writer Aldous Huxley once said. Huxley probably was not referring to risks in project work. He probably also was not aware of a certain project for the construction of a certain airport, a project that could not be finished on time because certain systems didn’t function properly and no alternatives were in place. No, I am not talking about the infamous Berlin Airport BER, and it is not 2013. It is 1993, and I am talking about Denver International Airport.
Ten years later, in 2003, Tom DeMarco and Timothy Lister published “Waltzing with Bears: Managing Risk on Software Projects,” in which they analyze the failures of risk management in that project. Another ten years later, another airport was not finished on time. This time it happened in Berlin, and again no effective risk management was performed. How is this possible?
With BER, a number of factors certainly contributed to the delays: regulations that changed over the course of the project, a lack of transparency regarding requirements and project progress, and unclear responsibilities. Were these problems not identified as risks? Well, once the consequences became apparent, they surely were.
Why didn’t those responsible learn from the mistakes made in Denver, or in Munich for that matter? Well, some dangers are harder to anticipate than others, and not only for developers, engineers, and stakeholders. We succumb to “illusions of prognosis,” we overestimate ourselves, we yield to the temptations of incentive systems, we ignore alternatives, we omit countermeasures, we align ourselves with the majority, and we judge decisions by their results.
What is Hindsight Bias and How Do Illusions of Prognosis Develop?
Another quote, this time from Winston Churchill: “Politics is the ability to foretell what is going to happen tomorrow, next week, next month and next year. And to have the ability afterwards to explain why it didn’t happen.” Shouldn’t a politician be able to anticipate consequences better? In hindsight, consequences often seem inevitable. This habit of judging decisions by their results is called hindsight bias.
We consider decisions that lead to satisfying results to be good decisions, and decisions that lead to dissatisfying results to be bad ones. In doing so, we fail to fully capture the moment of decision: the information that was available and the circumstances that led to it. It is always easy to point your finger at the operator of the Berlin Airport, isn’t it?
Is there actually a way to reliably estimate the scope, content, and workload of a project? By definition, each project is unique: individuals interact, and surrounding conditions, experiences, and tools differ. Something new is created right here and right now, which is why it is so difficult to anticipate the course of a project. We use models to anticipate reality. We use best practices and adjust them to concrete situations. The more complex the planned system and the longer the time frame, the more unstable a prognosis becomes. Can an invention be predicted? Why do so many companies fail?
Philip Tetlock, then a professor at the University of California, Berkeley, published a study in 2005 containing more than 82,000 predictions by so-called experts. Their hit rate was barely higher than that of a random generator. We are wrong to think we can anticipate or plan the future.
Why do we overestimate ourselves?
I have never heard of a large-scale project that was finished faster and at lower cost than planned. Have you? The Elbphilharmonie in Hamburg, the soccer stadiums in Brazil, the German Transrapid train, the Hanau nuclear fuel plant, Stuttgart 21 – all of them projects that did not achieve the desired results within the planned time and scope. Is that surprising?
I always need more time than I think I do. Cooking, tidying up, writing – it’s always the same, a chronic condition that does not get better with time. Good to know I am not the only one.
Dan Lovallo describes this behavior as the tendency of people and organizations to underestimate the time, costs, and risks of future actions while overestimating their benefits. This leads not only to overrun time frames but also to cost explosions and fewer benefits than planned.
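This gap between plan and outcome can be made concrete with a small Monte Carlo simulation. The sketch below is purely illustrative: the task list and the optimistic/most-likely/pessimistic estimates are invented, and the right-skewed triangular distributions encode the everyday observation that delays can be large while savings rarely are. Summing the “most likely” estimates, as naive plans do, systematically understates the expected total.

```python
import random

random.seed(42)

# Hypothetical tasks: (optimistic, most likely, pessimistic) duration in days.
# Deliberately right-skewed: a task can slip badly, but rarely finishes much early.
tasks = [(3, 5, 15), (2, 4, 12), (5, 8, 25), (1, 2, 8)]

# The naive plan assumes every task hits its "most likely" estimate.
naive_plan = sum(most_likely for _, most_likely, _ in tasks)  # 5 + 4 + 8 + 2 = 19 days

runs = 100_000
totals = [
    sum(random.triangular(low, high, mode) for low, mode, high in tasks)
    for _ in range(runs)
]

mean_total = sum(totals) / runs
on_time = sum(t <= naive_plan for t in totals) / runs

print(f"Naive plan: {naive_plan} days")
print(f"Simulated mean duration: {mean_total:.1f} days")
print(f"Chance of finishing within the naive plan: {on_time:.0%}")
```

With these invented numbers, the simulated mean comes out around 30 days against a 19-day plan, and the project finishes “on time” in only a few percent of the runs – the planning fallacy in miniature.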
The Problem of Poor Incentives
The easiest way to reach your goals is to aim low. Are there groups that benefit from cost underestimations? Politicians do, because voters like low-cost projects. Does it make sense to pay consultants on a time-and-materials basis if, as a result, their main goal becomes to use as much time and material as possible? Does it make sense to pay a doctor more if he performs more examinations?
Rolf Dobelli, the Swiss entrepreneur and writer, tells the following story: “The investor Charlie Munger entered a store for fishing equipment. He stopped in front of a shelf, looked at a particularly glittering lure and asked the shop owner, ‘Do fish really go for this stuff?’ The shop owner smiled and said, ‘Charlie, we don’t sell to fish.’”
Incentive systems are important but poor incentives lead to unexpected results. How do incentive systems work in your projects?
There are many more human aspects we tend to overlook: We ignore alternative means of achieving our goals. We act when we should wait. We do nothing when we should act. We accept the group consensus. We succumb to an illusion of control. We construct inadequate causalities.
Can we eliminate risks? No. But we can take adequate precautions and perform effective risk management. Even when jumping from a tower.
1) Tom DeMarco and Timothy Lister, Waltzing with Bears: Managing Risk on Software Projects
2) Rolf Dobelli, The Art of Thinking Clearly
3) Baruch Fischhoff, For Those Condemned to Study the Past: Heuristics and Biases in Hindsight
4) Daniel Kahneman, Thinking, Fast and Slow
5) Dan Lovallo and Daniel Kahneman, Delusions of Success: How Optimism Undermines Executives’ Decisions
6) Philip E. Tetlock, Expert Political Judgment: How Good Is It? How Can We Know?