In a recent twitter exchange, a formula was presented that goes like this, from the Original Post
Rewriting this into algebra in LaTeX, with the term names taken from the unpacking below, gives us
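\[
\text{Total Uncertainty} = \frac{\text{Estimated Past} + \text{Estimated Future}}{\text{Actual Past} + \text{Actual Future}}
\]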
Let's see if we can unpack the equation:
- The total uncertainty is the ratio of the estimated past plus the estimated future to the actual past plus the actual future.
- There can certainly be an estimate of the past. In our domain, for example, this estimate of the past would be used to compute the variance of the estimating process once the actuals come in. We'd compare the actuals to the estimate to determine the fidelity of the estimating process. This would be the Estimate at Completion (EAC) and Estimate to Complete (ETC) recorded at a specific time in the past.
- But that past EAC and ETC are updated when the actuals come in (monthly).
- So it's not clear what it means when Actual Future is used in an equation of Total Uncertainty.
It's also not clear:
- Before the project is complete, we can know the actual past, but how do we know the actual future?
- So it can certainly be that the estimate of the past is not equal to the actual past - and it rarely is. Otherwise, that estimate wouldn't be an estimate, it'd be the actuals.
Let's see if we can unpack the last part of the OP:
- When the project starts there are only future uncertainties since no work has been done, only estimates made about those future uncertainties.
- When the project is done, those uncertainties are gone, since all we have is actuals and there are no uncertainties about those.
- As well, Actual Future Uncertainty is an oxymoron. The value under measurement is either an actual or an uncertain value, but it can't be both.
- Since we can't know the actual value of the future uncertainty (it's uncertain), it's not clear what the number in the denominator is.
Let's Try Another Approach
All projects (or product developments) have planned work. This can be a traditional planning approach or an Agile planning approach. In the traditional approach, work is usually planned out in work packages of scope, according to some sequence needed to produce the desired value for the cost and duration. The level of detail is usually domain specific. A schedule of the work is built. This schedule can represent all the details, or it can be a rolling wave of details, with the planning horizon defining how much detail is in the schedule at any one time.
This is one of the fallacies agilists like to repeat. The rolling wave process, the incremental spiral commit process, and other incremental and iterative processes are baked into our Federal Acquisition process and used in our enterprise IT processes.
There is no such thing as Waterfall, other than on badly managed projects that willfully ignore current best practices for project management that have been around for decades.
So let's look at some ways to assess the uncertainty
Here's a notional 35-day project, with one approach of just doing all the work over the 35 days and a second approach of breaking the work down into its Work Packages and Tasks. In both cases, the assigned uncertainty for the aleatory (irreducible) risk of the work effort is -5% and +15%.
Both these values are for the Total Uncertainty at the beginning of the project, whose work looks like this...
Now when we hear - oh, we don't use Gantt Charts for our work - fine, whatever you use: Kanban board, Scrum Product Backlog, sticky notes on the wall, strings, it doesn't matter. Somehow you're sequencing the work to be done, assessing the progress of that work, and measuring compliance with what the customer ordered. It doesn't matter. The uncertainty in the effort, duration, cost, and technical performance resulting from that work is still there.
If this is not the case, stop reading now, you're not managing project work in the presence of uncertainty. You're doing something else, but it's not the management of other people's money to produce value.
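To make the -5%/+15% aleatory range concrete before looking at the simulation results, here's a minimal sketch in Python (not the Risky Project tool used below) that samples a single 35-day effort. The triangular distribution shape and the sample count are assumptions for illustration only.

```python
import random

PLANNED_DAYS = 35
LOW, MODE, HIGH = PLANNED_DAYS * 0.95, PLANNED_DAYS, PLANNED_DAYS * 1.15

# 10,000 samples of the possible duration of the single 35-day effort
samples = sorted(random.triangular(LOW, HIGH, MODE) for _ in range(10_000))

p50 = samples[len(samples) // 2]           # median duration
p80 = samples[int(len(samples) * 0.80)]    # 80th-percentile duration
print(f"P50: {p50:.1f} days, P80: {p80:.1f} days")
```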
The two simulation runs below are for the simple 35-day task on the left and the more complex 35-day collection of Work Packages on the right.
For the simple (do all work) approach, there is a 50% chance that we'll finish on or before Jan 29, 2018. For the more complex set of Work Packages, there is a 39% chance that we'll finish on or before Jan 29, 2018. This difference has to do with how the Monte Carlo Simulation (Risky Project) treats the network of work. This is a separate topic.
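One common reason for that difference is merge bias: when parallel paths of work must all finish, the network's completion date is the worst of them. Here's a minimal sketch of that effect under an assumed two-path layout; the task durations and the need date below are illustrative, not the values from the notional project above.

```python
import random

N = 20_000

def dur(planned):
    """One sample of a task duration with the -5% / +15% triangular range."""
    return random.triangular(planned * 0.95, planned * 1.15, planned)

target = 35 * 1.05   # a notional need date: the 35-day plan plus 5% margin

# Simple model: all the work as one 35-day bar
p_simple = sum(dur(35) <= target for _ in range(N)) / N

# Network model: the same 35 days split into two parallel paths of Work
# Packages that must both finish before the project is done (merge bias)
p_network = sum(max(dur(20) + dur(15), dur(18) + dur(17)) <= target
                for _ in range(N)) / N

print(f"P(on time), single bar: {p_simple:.0%}")
print(f"P(on time), network:    {p_network:.0%}")   # lower, for the same total work
```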
As the project progresses and we get status for each of the tasks - simple and more complex - let's see how the risk of NOT finishing on time changes. The status date for the project is Jan 4, 2018, and let's assume the work complete for both the simple model and the more complex model is the same. On Jan 4th, we're 57% complete. This measure of Physical Percent Complete is a Critical Success Factor for all project work. This means:
- We have units of measure meaningful to the decision makers.
- These measures can be Measures of Effectiveness, Measures of Performance, Key Performance Parameters, and/or Technical Performance Measures.
- But they have to produce tangible evidence of progress to plan. This is called Quantifiable Backup Data where we work.
In other words, we NEVER measure progress to plan (whatever type of plan you have) without tangible evidentiary materials to confirm that progress represents Physical Percent Complete. No handwaving, no ratios that hide the individual measures, no personal opinion. Just data from a predefined plan for the data. What Percent Complete SHOULD we be on this date? What Percent Complete ARE we? If we're less - in the aggregate - then we're late, and likely over budget as well, since more money will be spent to either show up late or get back on schedule.
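As a sketch of what that Quantifiable Backup Data can look like, here's a minimal example where physical percent complete is computed only from predefined, verifiable quantities. The work items, quantities, and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    planned_units: int                 # quantity that defines 100% for this item
    completed_units: int               # verified, tangible evidence of completion
    planned_units_by_status_date: int  # what SHOULD be done by the status date

items = [
    WorkItem("Interface spec sections approved", 10, 6, 7),
    WorkItem("Test procedures dry-run",           8, 5, 5),
    WorkItem("Modules passing acceptance tests", 20, 9, 12),
]

physical = sum(i.completed_units for i in items) / sum(i.planned_units for i in items)
planned  = sum(i.planned_units_by_status_date for i in items) / sum(i.planned_units for i in items)

print(f"Physical % complete: {physical:.0%}, planned % complete: {planned:.0%}")
if physical < planned:
    print("Behind plan -> corrective action needed to protect the need date.")
```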
So here's the project status on Jan 4, 2018, with the Percent Complete bars in dark blue. Technically (and this is a very technical issue) the bar shown here is the percent duration complete, not the other 4 measures of complete. But that requires a more complex setup, with profiling of the labor spreads across the work and the supporting calculations. This is a notional example, and on any REAL program we work, the MSFT Project Percent Complete is never used, since it can hide the Physical Percent Complete. But for the typical programs we work (> $100M), our approach is much too complex for the typical software development process.

So now that we've moved along in our project and made progress - but not actually the progress as planned, since we're showing late on a few tasks - we now have a new probability of completing on or before the planned completion date.
We have a 47% chance of completing on or before Jan 30, 2018, for the Simple Linear Work and a 35% chance of completing on or before Jan 30, 2018, for the More Complex Work. So our probability of completing as needed is going down.
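A minimal sketch of how that status-date update works: the completed portion is now an actual with no uncertainty, and only the remaining work carries the aleatory range. The actual days spent and the need date below are assumptions for illustration; they are not the Risky Project results quoted above.

```python
import random

N = 20_000

def dur(planned):
    """One sample of the remaining duration with the -5% / +15% triangular range."""
    return random.triangular(planned * 0.95, planned * 1.15, planned)

target = 35 * 1.05        # notional need date, in working days from the start
actual_days_spent = 21    # actuals to date -- no uncertainty about the past
physical_complete = 0.57  # physical percent complete at the status date
remaining_planned = 35 * (1 - physical_complete)   # only this part is uncertain

p_on_time = sum(actual_days_spent + dur(remaining_planned) <= target
                for _ in range(N)) / N
print(f"P(finish on or before the need date): {p_on_time:.0%}")
```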
This is a critical understanding.
- We want to know the probability of arriving on or before the need date, at or below the needed cost.
- To say uncertainty is not being reduced is of little use unless we know how that uncertainty is impacting the probability of success.
- So when the performance of the project doesn't match the planned performance - all those measures - AND the reducing uncertainty, we need to do something beyond just pointing out that uncertainty didn't reduce. That something is to take corrective or preventative actions to keep the performance measures inside the planned bounds.
The Punch Line
Just stating what the uncertainty is in a project - the approach taken by critics of the Cone of Uncertainty - is useful, but not actionable. We need to answer the question:
Knowing something about the uncertainty in the future, our past performance, the current Physical Percent Complete, the remaining work, and any changes in the uncertainty of that work (I didn't make any changes to the work that was baselined at -5%/+10% aleatory uncertainty for each task, but I could have updated that uncertainty for the future) - knowing all these things - WHAT IS THE PROBABILITY OF COMPLETING ON OR BEFORE THE NEED DATE, AND AT OR BELOW THE PLANNED COST? (I didn't cost load the work either, but that's an easy task.)
Knowing the Total Uncertainty is interesting but not very useful. The cost and schedule are driven by this uncertainty, but that uncertainty had better be reducing as the project progresses - and reducing at some planned rate - otherwise, you're late, over budget, and the product is not likely to be working.
So the critics of the Cone of Uncertainty are criticizing the wrong problem. They claim:
- They have data that shows the uncertainty does not reduce.
- OK, fine - Why is the uncertainty not reducing as needed to maintain the probability of completing on or before the need date and at or below the needed cost?
- One article states the reasons for the data not showing reducing uncertainty, but it doesn't appear that any corrective or preventative actions were taken. So the result was the uncertainty didn't reduce.
- A follow-up letter (IEEE Software, 2006) to that article in the same magazine states the same.
- As well, it states that using relative uncertainty hides the drivers of the uncertainty, since the Cone of Uncertainty was meant to consider, and always considers, the absolute uncertainty. This is also stated in the Letter to the Editor of IEEE Software.
- Since the Cone represents the best-case uncertainty, it's always possible to be worse - that is, to NOT control the aleatory and epistemic risks to the project and have data that goes outside the Cone. This is not an issue with the Cone; it's an issue with the management of the project.
The Cone of Uncertainty Does Not Reduce Itself. The CoU is defined as the desired reduction of uncertainty at specific phases of the project, needed to inform the decision makers of the Probability of Project Success.
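A minimal sketch of that definition: the cone is a planned band of uncertainty at each phase gate, and the observed estimate spread is checked against it. The phase names and band multipliers below are notional placeholders, not quoted values.

```python
# Planned cone: (low multiplier, high multiplier) around the baseline at each gate
planned_bands = {
    "Concept":            (0.50, 2.00),
    "Requirements":       (0.67, 1.50),
    "Preliminary Design": (0.80, 1.25),
    "Detailed Design":    (0.90, 1.10),
}

def inside_cone(phase, estimate_low, estimate_high, baseline):
    """Is the current estimate spread within the planned band for this phase?"""
    lo_mult, hi_mult = planned_bands[phase]
    return estimate_low >= baseline * lo_mult and estimate_high <= baseline * hi_mult

# Example: at Preliminary Design the estimate still ranges 70..140 against a
# baseline of 100 units -- outside the planned 80..125 band, so corrective or
# preventative action is needed; the cone will not narrow on its own.
print(inside_cone("Preliminary Design", 70, 140, 100))   # False
```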
So when someone has data that doesn't have its uncertainty reduced in accordance with some plan, and doesn't take corrective or preventative actions to reduce that uncertainty, then they're going to be disappointed in the results for the cost and schedule performance of the project. They can rationalize that the customers loved the product. But that doesn't remove the fact that they showed up late and over budget.
Project Management in the presence of uncertainty is a closed-loop control system. Cost, Schedule, Risk, and production of Value are a few of the dependent variables of this closed-loop control system. When those variables have excursions outside the planned boundaries, corrective or preventative actions must be taken to get them back inside the boundaries. Keep the Program GREEN is a favorite saying where we work.
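Here's a minimal sketch of that closed-loop idea as a monthly review: measure the dependent variables, compare them to their planned boundaries, and flag any excursion for corrective or preventative action. The variable names and thresholds are illustrative, not from any specific program.

```python
# Planned boundaries for each dependent variable, expressed as indices
PLANNED_BOUNDS = {
    "cost_performance":     (0.95, 1.10),   # e.g., a CPI-like index
    "schedule_performance": (0.95, 1.10),   # e.g., an SPI-like index
}

def monthly_review(measurements):
    """Return the variables that need corrective or preventative action."""
    excursions = []
    for name, value in measurements.items():
        low, high = PLANNED_BOUNDS[name]
        if not (low <= value <= high):
            excursions.append(name)
    return excursions

status = {"cost_performance": 0.92, "schedule_performance": 1.02}
actions_needed = monthly_review(status)
print("GREEN" if not actions_needed else f"Act on: {', '.join(actions_needed)}")
```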
Here are two useful resources for applying the Cone of Uncertainty that Google will find for you:
- "Improving Software Development Tracking and Estimation Inside the Cone of Uncertainty," Pongtip Aroonvatanaporn, Thanida Hongsongkiat, and Barry Boehm.
- "Reducing Estimation Uncertainty with Continuous Assessment:Tracking the “Cone of Uncertainty,” Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm
Read these and learn how the CoU is to be properly used to increase the probability of project success.