The notion of the Cone of Uncertainty has been around for a while, starting with Barry Boehm's work in "Software Engineering Economics" (Prentice-Hall, 1981). The poster below is from Steve McConnell's site and makes several things clear.
But first, let's establish a framing assumption. When you hear of projects where uncertainty is not reduced as the project progresses, ask a simple question: why is this the case? Why, as the project progresses with new information, delivered products, and reduced risk, is the overall uncertainty not being reduced? Go find the root cause of this before claiming uncertainty doesn't reduce. On all projects, uncertainty should be reducing through the direct action of Project Management. If uncertainty is not reducing, the cause may be bad management, an out-of-control project, or work in a pure research world where that behavior is expected.
So what is the Cone of Uncertainty?
- The Cone is a project management framework describing the uncertainty of estimates for project attributes - cost, schedule, and technical performance parameters. Estimates made on the left side of the cone have a lower probability of being precise and accurate than estimates made on the right side. One reason is the level of uncertainty early in the project - the aleatory and epistemic uncertainties that create risk to the success of the project. Other uncertainties that create risk include:
- Unrealistic performance expectations with missing Measures of Effectiveness and Measures of Performance
- Inadequate assessment of risks and unmitigated exposure to these risks without proper handling plans
- Unanticipated technical issues without alternative plans and solutions to maintain effectiveness
- Since all project work contains uncertainty, reducing this uncertainty - which reduces risk - is the role of the project team and its management: either the team itself, the Project or Program Manager, or, on larger programs, the Risk Management owner.
Here's a simple definition of the Cone of Uncertainty:
The Cone of Uncertainty describes the evolution of the measure of uncertainty during a project. For project success, uncertainty must not only decrease over time, it must also diminish its impact on the project's outcome. This is done by active risk management through probabilistic decision-making. At the beginning of a project, little is usually known about the product or work results. Estimates are needed, but they are subject to a large degree of uncertainty. As more research and development is done, more information is learned about the project, and the uncertainty decreases, reaching 0% when all risk has been mitigated or transferred. This usually happens by the end of the project.
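This narrowing is often illustrated with the classic estimate-multiplier ranges reported by Boehm and McConnell. The milestone names and multipliers below are the commonly cited approximate values, not measurements from any particular project; a minimal sketch:

```python
# Approximate Cone of Uncertainty multipliers (commonly cited from
# Boehm/McConnell): an estimate at initial concept can be off by 4x in
# either direction; after detailed design it is within roughly 10%.

cone = [  # (milestone, low multiplier, high multiplier)
    ("Initial concept",             0.25, 4.00),
    ("Approved product definition", 0.50, 2.00),
    ("Requirements complete",       0.67, 1.50),
    ("UI design complete",          0.80, 1.25),
    ("Detailed design complete",    0.90, 1.10),
]

point_estimate = 100  # hypothetical point estimate, e.g. 100 staff-weeks
for milestone, lo, hi in cone:
    print(f"{milestone:28s} {point_estimate * lo:6.0f} "
          f"to {point_estimate * hi:6.0f} staff-weeks")
```

Note that the range is multiplicative, not symmetric: a 4x overrun pairs with a 0.25x underrun, which is why early single-point estimates are so unreliable.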
So the question is - how much variance reduction needs to take place in the project attributes (risk, effectiveness, performance, cost, schedule - shown below), and at what points in time, to increase the probability of project success? This is the basis of Closed Loop Project Control. Estimates of the needed reduction of uncertainty, estimates of the possible reduction of uncertainty, and estimates of the effectiveness of these reduction efforts are the basis of the Closed Loop Project Control System.
This is the paradigm of the Cone of Uncertainty - it's a planned development compliance engineering tool, not an after-the-fact data collection tool.
The Cone is NOT the result of the project's past performance. The Cone IS the planned boundaries (upper and lower limits) of the needed reduction in uncertainty (or other performance metrics) as the project proceeds. When actual measures of cost, schedule, and technical performance are outside the planned cone of uncertainty, corrective actions must be taken to move those uncertainties back inside the cone if the project is going to meet its cost, schedule, and technical performance goals.
If your project's uncertainties are outside the planned boundaries at the time when they should be inside them, then you are reducing the probability of project success.
The measures modeled in the Cone of Uncertainty are the quantitative basis of a control process that establishes the goals for the performance measures. Capturing actual performance, comparing it to planned performance, and checking compliance with the upper and lower control limits provides guidance for making the adjustments that keep the variables inside their acceptable limits.
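The capture-compare-act cycle can be sketched as a few lines of code. All numbers here are invented for illustration; `control_status` is a hypothetical helper, not part of any tool named in this post:

```python
# Minimal sketch of the closed-loop check: compare an actual measure
# against its planned value and control limits, and flag when the
# variable needs corrective action.

def control_status(actual, planned, ucl, lcl):
    """Classify an actual measurement against the planned limits."""
    if actual > ucl or actual < lcl:
        return "out-of-limits: corrective action required"
    return f"in-limits (variance {actual - planned:+.2f})"

# Hypothetical cost performance index: planned 1.00, limits +/- 0.10
print(control_status(0.87, 1.00, ucl=1.10, lcl=0.90))  # out-of-limits
print(control_status(0.95, 1.00, ucl=1.10, lcl=0.90))  # in-limits
```

The point of the sketch is that the limits exist before any actual arrives - the comparison is against the plan, not against past actuals.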
The Benefits of the Use of the Cone of Uncertainty
The planned value, the upper and lower control limits, and the measures of actual values form a Closed Loop Control System - a measurement-based feedback process to improve the effectiveness and efficiency of the project management processes by:
- Analyzing trends that help focus on problem areas at the earliest point in time - when the variable under control starts misbehaving, intervention can be taken. There is no need to wait until the end to find out you're not going to make it.
- Providing early insight into error-prone products that can then be corrected earlier and thereby at lower cost - when the trends are headed to the UCL and LCL, intervention can take place.
- Avoiding or minimizing cost overruns and schedule slips by detecting them early enough in the project to implement corrective actions - by observing trends toward breaches of the UCL and LCL.
- Performing better technical planning, and making adjustments to resources based on discrepancies between planned and actual progress.
A critical success factor for all project work is Risk Management, and risk management covers risks from all sources of uncertainty: technical risk, cost risk, schedule risk, and management risk. Each of these uncertainties, and the risks they produce, can take on a range of values described by probability and statistical distribution functions. Knowing what ranges are possible, and knowing what ranges are acceptable, is a critical project success factor.
We need to know the Upper Control Limit (UCL) and Lower Control Limit (LCL) of the ranges of all the variables that will impact the success of our project, and we need to know these ranges as a function of time. With this paradigm we have logically connected project management processes with Control System processes: corrective action is triggered when the variances created by uncertainty go outside the UCL and LCL. Here's a work-in-progress paper, "Is there an underlying Theory of Project Management," that addresses some of the issues with control of project activities.
Here are some examples of Planned variances and managing of the actual variances to make sure the project stays on plan.
A product weight as a function of the program's increasing maturity. In this case, the projected base weight is planned and the planned weights of each of the major subsystems are laid out as a function of time. Tolerance bands for the projected base weight provide management with actionable information about the progression of the program. If the vehicle gets overweight, money and time are needed to correct the undesirable variance. This is a closed loop control system for managing the program with a Technical Performance Measure (TPM). There can be cost and schedule performance measures as well.
Below is another example of a weight-reduction attribute with error bands. In this example (an actual vehicle, like the example above) the weight must be reduced as the program proceeds left to right. We have a target weight at Test Readiness Review of 23 kg. A 25 kg vehicle was sold in the proposal, and we need a target weight with a safety margin, so 23 kg is our target.
As the program proceeds, UCL and LCL bands follow the planned weight. The orange dots are the actual weights from a variety of sources: a design model (3D CATIA CAD system), a detailed design model, a bench-scale model that can be measured, a non-flying prototype, and then the 1st Flight Article. As the program progresses, the measured weight of each of these models, through to the final article, is compared to the planned weight. We need to keep these values inside the error bands of the NEEDED weight reduction if we are to stay on plan.
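A weight burndown of this kind can be sketched numerically. The trajectory, band widths, and milestone weights below are invented for illustration (the real program's bands would come from its risk analysis, not a linear formula):

```python
# Hypothetical weight burndown toward a 23 kg TRR target, with tolerance
# bands that narrow as the program matures, and actual weights from each
# model compared against the plan.

def planned_weight(t, start=27.0, target=23.0):
    """Linear planned burndown from start to target over maturity t in [0, 1]."""
    return start - (start - target) * t

def tolerance(t, start_band=2.0, end_band=0.25):
    """Band that narrows as maturity t goes from 0 to 1."""
    return start_band - (start_band - end_band) * t

milestones = [  # (maturity, weight source, actual kg) - made-up values
    (0.2, "CAD model",      26.8),
    (0.5, "bench model",    26.4),
    (0.8, "prototype",      24.1),
    (1.0, "flight article", 23.2),
]

for t, source, actual in milestones:
    plan, band = planned_weight(t), tolerance(t)
    status = "OK" if abs(actual - plan) <= band else "OUT OF BAND"
    print(f"{source:14s} plan={plan:5.2f} band=±{band:4.2f} "
          f"actual={actual:5.2f}  {status}")
```

In this made-up run the bench model is out of band even though its absolute weight dropped, because the needed reduction had not kept pace with the plan - exactly the signal the cone is there to produce.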
This is the critical concept in successful project management
We must have a plan for the critical attributes of the end items - Mission Effectiveness, Technical Performance, Key Performance Parameters. If these are not compliant, the project becomes subject to one of the Root Causes of program performance shortfall. We must have a burndown or burnup plan for producing the end item deliverables that match those parameters over the course of the program. Of course, we have a wide range of possible outcomes for each item in the beginning. As the program proceeds, the measured variances on those items move toward compliance with the target number - in this case, weight.
Here's another example of the Cone of Uncertainty, in this case, the uncertainty is the temperature of an oven being designed by an engineering team. The UCL and LCL are defined BEFORE the project starts. These are used to inform the designer of the progress of the project as it proceeds. Staying inside the control limits is the Planned progress path to the final goal - in this case, temperature.
The Cone of Uncertainty defines the signaling boundaries of the Closed Loop Control System used to manage the project to success.
It turns out the cone can also be a flat range, with Upper and Lower Control Limits on the variable being developed - a design-to variable, in this example a Measure of Performance. In this case, the Measure of Performance needs to stay within the upper and lower limits as the project progresses through its gates. If this variable goes out of bounds, the project will have to pay in some way to get it back to Green.
A Measure of Performance characterizes physical or functional attributes relating to the system operation, measured or estimated under specific conditions. Measures of Performance are (1) attributes that assure the system has the capability and capacity to perform, (2) assessments of the system to assure it meets the design requirements that satisfy the Measures of Effectiveness, and (3) corrective actions to return the actual performance to the planned performance when the actual performance goes outside the upper and lower control limits. Again, this is simple statistical process control, using feedback to take corrective actions that control future outcomes - feedforward. In the probabilistic and statistical program management paradigm, feedforward control uses past performance, with models of future behavior (a Monte Carlo model), to determine what corrective actions are needed to Keep The Program Green.
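The feedforward idea can be sketched with a toy Monte Carlo model: performance to date plus simulated remaining work yields a probabilistic completion forecast, which is then checked against the plan. The task durations and plan limit below are invented; `forecast_completion` is a hypothetical helper, not any named tool's API:

```python
# Toy feedforward forecast: combine work done to date with a Monte Carlo
# model of the remaining tasks, then ask whether the forecast still fits
# inside the planned limit.
import random

random.seed(1)  # fixed seed so the sketch is repeatable

def forecast_completion(done_weeks, remaining_tasks, trials=10_000):
    """Return the 80th-percentile completion time in weeks."""
    outcomes = []
    for _ in range(trials):
        total = done_weeks
        for low, mode, high in remaining_tasks:
            # triangular(low, high, mode) draws one task duration
            total += random.triangular(low, high, mode)
        outcomes.append(total)
    outcomes.sort()
    return outcomes[int(0.80 * trials)]

remaining = [(2, 3, 6), (1, 2, 4), (3, 4, 8)]  # (min, likely, max) weeks
p80 = forecast_completion(done_weeks=10, remaining_tasks=remaining)
plan_limit = 22  # hypothetical planned upper limit, in weeks
print(f"80% confidence completion: {p80:.1f} weeks "
      f"({'inside' if p80 <= plan_limit else 'outside'} plan)")
```

The decision variable is a percentile of the simulated distribution, not the mean - that is what makes the control probabilistic rather than deterministic.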
Another cone style is the cone of confidence in a delivery date. In this actual case, it's a Low Earth Orbit vehicle launch date. As the program moves from left to right, we need to assure that the launch date moves from a low-confidence date to a date that has a chance of being correct. The BLUE bars are the probabilistic ranges of the current estimated date. As the program moves forward, those ranges must be reduced if we're going to show up as needed. The planned date and a date with margin are the build-to dates. As the program moves, the confidence in the date must increase and move toward the need date.
- The probabilistic completion times change as the program matures.
- The efforts that produce these improvements must be defined and managed.
- The error bands of the assessment points must include the risk mitigation activities as well.
- The planned activities show how the error band narrows over time:
- This is the basis of a risk tolerant plan.
- The probabilistic intervals become more reliable as the risk mitigation and the maturity assessments add confidence to the planned launch date.
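The narrowing intervals above can be sketched as data. The gate names are standard review milestones, but the week numbers are invented for illustration:

```python
# Hypothetical narrowing of the launch-date confidence interval at each
# review gate: the p10-p90 spread must shrink toward the need date as
# risk-mitigation activities retire uncertainty.

need_week = 52  # hypothetical need date, in program weeks

assessments = [  # (gate, p10 week, p90 week) - made-up values
    ("SRR", 40, 70),
    ("PDR", 45, 63),
    ("CDR", 48, 58),
    ("TRR", 50, 54),
]

for gate, p10, p90 in assessments:
    covered = "covered" if p10 <= need_week <= p90 else "at risk"
    print(f"{gate}: p10-p90 = [{p10}, {p90}] weeks, "
          f"spread {p90 - p10}, need date {covered}")
```

If a later gate's spread were wider than the previous gate's, that trend itself would be the out-of-control signal the post describes.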
Just a reminder again - the Cone of Uncertainty is a DESIRED path, NOT the result of an unmanaged project outcome.
Risk Management, as shown below, is how Adults Manage Projects
Wrap Up On the Misunderstanding of the Purpose and Value of the Cone of Uncertainty
When you hear...
I have data that shows that uncertainty (or any other needed attribute) doesn't reduce and therefore the COU is a FAKE ... OR ... I see data on my projects where the variance is getting worse as we move forward, instead of narrowing as the Planned COU tells us it should be to meet our goals ...
...then that project is out of control, starting with a missing steering target. That means it's Open Loop Control, and the project will be late, over budget, and likely not perform to the needed effectiveness and performance parameters. When you see these out-of-control situations, go find the Root Cause and generate the Corrective Action.
This data is an observation of a project not being managed as Tim Lister suggests - Risk Management is How Adults Manage Projects.
And if these observations are taking place without corrective actions on the Root Causes of the performance shortfall, the management is behaving badly. They're just observers of the train wreck that is going to happen very soon.
The Engineering Reason for the Cone of Uncertainty Model and the Value it Provides to Decision Makers
The Cone of Uncertainty is NOT an output from the project's behaviour - by then it's too late.
The Cone of Uncertainty is a Steering Target input to the Management Framework for increasing the probability of the project's success.
This is the Programmatic Management of the project in support of the Technical Management of the project. This process is an engineering discipline. Systems Engineering, Risk Engineering, and Safety and Mission Assurance Engineering are typical roles where we work.
To suggest otherwise is to invert the paradigm and remove any value from the post-facto observations of the project's performance. At that point it's too late - the horse has left the barn and there's no getting him back.
Defining the planned and needed variance levels at planned points in the project is the basis of the closed loop control system needed to increase the probability of success.
When variances outside the planned variance appear, the Root Cause of those variances must be found and corrective action taken.
Here's an example from a Galorath presentation, using the framework of the Cone of Uncertainty and the actual project cones, showing how to put this all together. Repeating again: the Cone of Uncertainty is the framework for the planned reduction of the uncertainty in critical performance measures of the project.
If your project is not reducing the uncertainty as planned for these critical performance measures - cost, schedule, and technical performance - then it's headed for trouble and you may not even know it.
 Systems Engineering Measurement Primer, INCOSE
 System Analysis, Design, and Development Concepts, Principles, and Practices, Charles Wasson, John Wiley & Sons
 SMC Systems Engineering Primer & Handbook: Concept, Processes, and Techniques, Space & Missile Systems Center, U.S. Air Force
 Defense Acquisition Guide, Chapter 4, Systems Engineering, 15 May 2013.
 Program Managers Tool Kit, 16th Edition, Defense Acquisition University.
 "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty'," Pongtip Aroonvatanaporn, Chatchai Sinthop, Barry Boehm. ASE’10, September 20–24, 2010, Antwerp, Belgium.
 Boehm, B. “Software Engineering Economics”. Prentice-Hall, 1981.
 Boehm, B., Abts, C., Brown, A. W., Chulani, S., Clark, B. K., Horowitz, E., Madachy, R., Reifer, D. J., and Steece, B. Software Cost Estimation with COCOMO II, Prentice-Hall, 2000.
 Boehm, B., Egyed, A., Port, D., Shah, A., Kwan, J., and Madachy, R. "Using the WinWin Spiral Model: A Case Study," IEEE Computer, Volume 31, Number 7, July 1998, pp. 33-44
 Cohn, M. Agile Estimating and Planning, Prentice-Hall, 2005
 DeMarco, T. Controlling Software Projects: Management, Measurement, and Estimation, Yourdon Press, 1982.
 Fleming, Q. W. and Koppelman, J. M. Earned Value Project Management, 2nd edition, Project Management Institute, 2000
 Galorath, D. and Evans, M. Software Sizing, Estimation, and Risk Management, Auer-bach, 2006
Jorgensen, M. and Boehm, B. “Software Development Effort Estimation: Formal Models or Expert Judgment?” IEEE Software, March-April 2009, pp. 14-19
 Jorgensen, M. and Shepperd, M. “A Systematic Review of Software Development Cost Estimation Studies,” IEEE Trans. Software Eng., vol. 33, no. 1, 2007, pp. 33-53
 Krebs, W., Kroll, P., and Richard, E. Un-assessments –reflections by the team, for the team. Agile 2008 Conference
 McConnell, S. Software Project Survival Guide, Microsoft Press, 1998
 Nguyen, V., Deeds-Rubin, S., Tan, T., and Boehm, B. "A SLOC Counting Standard," COCOMO II Forum 2007
 Putnam L. and Fitzsimmons, A. “Estimating Software Costs, Parts 1,2 and 3,” Datamation, September through December 1979
 Stutzke, R. D. Estimating Software-Intensive Systems, Pearson Education, Inc, 2005.