Saturday, 11 January 2014

How To Lie With Risks

 I first heard about the book "How To Lie With Statistics" while taking a Coursera course on Information Risk Management, and immediately thought I had to give it a go. Having devoured its thought-provoking and well-illustrated contents in a few days, I know I will never look at statistics and surveys in the same way again. Questions about reliability, bias, omissions and misrepresentations, as well as unfounded correlations and ridiculous extrapolations, will always crop up.



 Since statistics is inherent in the work I do in risk management, I thought I would examine some of the ways to lie with risk assessment and risk management. In a previous post, I talked about managing risks using Oracle Primavera P6 R8. P6 requires you to choose a probability range, a cost impact range and a schedule impact range for an identified risk, before and after any mitigation if required. It then calculates the risk score and the risk exposure in £, also known as value at risk.
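To make the arithmetic concrete, an exposure figure can be sketched as probability times cost impact, taking the midpoint of each chosen band. This is a simplified illustration, not P6's documented calculation; the midpoint convention and the function names are my own assumptions.

```python
# Hypothetical sketch of deriving risk exposure from banded inputs.
# P6's internal method may differ; using band midpoints is my own
# simplifying assumption, not P6's documented behaviour.

def midpoint(low, high):
    """Midpoint of a band, e.g. probability 10-20% -> 0.15."""
    return (low + high) / 2

def risk_exposure(prob_band, cost_band):
    """Expected exposure in GBP = probability x cost impact."""
    return midpoint(*prob_band) * midpoint(*cost_band)

# A risk with a 10-20% probability and a GBP 50k-100k cost impact:
exposure = risk_exposure((0.10, 0.20), (50_000, 100_000))
print(round(exposure))  # -> 11250
```

The point of working with bands rather than point values is that the output is only ever an order-of-magnitude estimate, which is exactly why pretending to know an exact probability or a cost impact to the nearest pence is misleading.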

 All of the below are real-life examples from analysing tons of project risk registers. Some look mischievous, while others are just ridiculous or show a lack of understanding.

Be precise to the nearest pence and decimal: since P6 only lets you choose your probability from a range, say 1-10% or 10-20%, it is perhaps natural at first to find this limiting and to pretend you know the exact probability of a risk event occurring, say 8% or 2%. The truth is that, unless it is the chance of heads or tails in a coin toss, most probabilities are estimates or gut feelings. It is therefore more logical to agree that the probability will fall within a certain range. The same goes for the cost impact. Knowing that the cost impact of a risk is exactly £71.20 is unrealistic, and insisting that it is so is just ridiculous.
 
Exaggerate the risk impact in millions of £: As P6 allows a maximum of a 9×9 matrix, there is a limit to the maximum cost impact you can use. In a £3M project, identify risks with a potential impact of £1M, such as the eruption of a Third World War or a tsunami in central London. Be annoyed that the matrix does not allow for such large values. Get even more frustrated that you cannot assign such risks the precise probability of 0.5% or even 2%. The truth is that if a project of that value were realistically exposed to risks of that magnitude, it would be a reason for escalation to the corporate or global risk register, or for project closure. This is scaremongering!


 Have a wide range of risks that would happen even without your project: this is very similar to the previous one. Listing risks that are everyday occurrences, not triggered by the project environment, is a waste of space and time. This includes the risks of general uncertainties about people, time and costs. A real risk needs a trigger, otherwise it is just a worry.
 
Have vague risks and response plans: This might be a one-word risk description, or an exaggerated risk with an impact of £100K followed by the suggestion that you can mitigate the exposure to zero by "close liaison with the contractor." I wonder what you are being paid for in the first place! This is something you ought to do anyway.

Make your post-response exposure higher than the pre-response exposure: This is more like an anecdote. If you make things worse by mitigating a risk, perhaps you should just let it be. This could be a sign of someone randomly assigning numbers without much thought.

Make the response cost higher than the exposure: If the money you spend solving a problem is bigger than the problem itself, then something is wrong. Of course, there are things that are more valuable than money, such as human life and reputation. Unless you can make that argument, you need to consider other possible risk responses or just accept the risk. It's a risk; it may never happen.

Tolerate a risk and then go on and spend £10,000 on it: When you tolerate or accept a risk, all you do is watch and hope it doesn't happen. Once you start acting on it, it is no longer just accepted. The only justifiable case is when a risk is actively accepted. In that case you will need to have a contingency in place if it does happen. The cost of this contingency should be the risk exposure.
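The last three points boil down to simple consistency checks that could be run over any risk register export. The sketch below is illustrative Python with made-up field names, not P6's actual schema or reports.

```python
# Hypothetical red-flag checks for a risk register entry.
# Field names ("pre_exposure", "response", etc.) are illustrative
# assumptions, not P6's actual data model.

def register_warnings(risk):
    """Return a list of red flags for a single risk entry."""
    warnings = []
    if risk["post_exposure"] > risk["pre_exposure"]:
        warnings.append("mitigation increases the exposure")
    if risk["response_cost"] > risk["pre_exposure"]:
        warnings.append("response costs more than the exposure")
    if risk["response"] == "accept" and risk["response_cost"] > 0:
        warnings.append("risk is accepted yet money is being spent on it")
    return warnings

# An entry that trips all three checks:
risk = {"pre_exposure": 8_000, "post_exposure": 9_500,
        "response": "accept", "response_cost": 10_000}
for w in register_warnings(risk):
    print(w)
```

A clean register should produce no warnings at all; any entry that trips one of these checks is worth a second look before the register goes anywhere near a review board.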


Copy risks one to one from another project: This is often sheer laziness. While a lot of risks are generic in similar projects, the circumstances under which they can occur can be very different. This will have an effect on the likelihood and impact of those risks. In addition, projects by definition are never the same. The location, client, and delivery team can make a huge difference.

Just make up the probability and impacts: Assigning these values is not easy. It is therefore imperative to give a bit of description detailing how you arrived at them. This will show that you have given the specific project risks some thought.

If you think of any other ways to lie with risks or disagree with what I have said, please say so in the comments section.
