Let's say you're the project or program manager of a large, complex system. Maybe an aircraft, or a building, or an ERP system deployment. Your project is valued in the hundreds of millions of dollars. Or say your project is a simple, straightforward set of activities. Maybe the planting of a new row of trees on your land, or remodeling your kitchen.
No matter the project, the domain, the product or service, there are Five Immutable Principles of project success. For each of the Five Principles, there is uncertainty. The uncertainty is always there, it doesn't go away with specific actions in specific domains, or with the use of any tools, processes, or practices.
All project work operates in the presence of uncertainty. This immutable principle impacts planning, execution, performance measurement, decision making, risk, budgeting, and the overall business and technical management of the project - and of the business funding the project - no matter the domain, context, technology, or methods.
We cannot escape these two uncertainties - reducible and irreducible - and must learn how to manage in the presence of these uncertainties.
If we're going to successfully manage project work in the presence of this uncertainty, we need a framework in which we can make decisions based on the underlying probabilistic and statistical processes that create the uncertainties. The raw material for managing in the presence of uncertainty includes answers to the following questions:
- Do we have a plan that defines the attributes of the deliverables in units of measure meaningful to the decision makers?
- Do we have Measures of Effectiveness, Measures of Performance, and all the ...ilities?
- Are we making progress to plan for the technical and operational aspects of the deliverables? This includes the Measures of Effectiveness (MoE), Measures of Performance (MoP), Key Performance Parameters (KPP), and Technical Performance Measures (TPM) of the deliverables.
- Is each of these measures being met for the planned cost at the planned time?
- Are the upper and lower control limits of each measure inside the planned acceptable performance range at any specific point in time on the path to Done?
- If not, what is the root cause, and what corrective actions will bring the performance back inside the bounds?
- Are we being effective with our budget - that is, are we earning our budget?
- Does $1 spent produce $1 of value at the planned time for that return?
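The last two questions are answered with basic Earned Value arithmetic. Here's a minimal sketch - the dollar figures are hypothetical, chosen only to illustrate the calculation:

```python
# Minimal Earned Value sketch: are we earning our budget?
# The BCWS, BCWP, and ACWP values below are hypothetical illustrations.

def earned_value_indices(bcws, bcwp, acwp):
    """Return (CPI, SPI) from the three standard EV quantities.

    bcws: Budgeted Cost of Work Scheduled (Planned Value)
    bcwp: Budgeted Cost of Work Performed (Earned Value)
    acwp: Actual Cost of Work Performed (Actual Cost)
    """
    cpi = bcwp / acwp   # Cost Performance Index: value earned per dollar spent
    spi = bcwp / bcws   # Schedule Performance Index: value earned vs. planned
    return cpi, spi

cpi, spi = earned_value_indices(bcws=100_000, bcwp=90_000, acwp=120_000)
print(f"CPI = {cpi:.2f}")  # 0.75: each $1 spent earned only $0.75 of value
print(f"SPI = {spi:.2f}")  # 0.90: 90% of the planned work performed to date
```

A CPI or SPI below 1.0 at the planned time is the quantitative answer of "no" to the question of whether we are earning our budget.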
Let's start with a clear and concise description of the problem of successfully managing projects in the presence of uncertainty:
Accurate software cost and schedule estimates are essential for non-trivial software projects. In many cases, once the estimates have been made (at proposal or authorization to proceed), recalibration of the estimates and reduction of their uncertainty are not performed. As a software project progresses, more information about the project becomes known. This knowledge can be used to assess and re-estimate the effort required to complete the project. With more accurate estimates and less uncertainty, the probability of a successful project outcome can be increased. (Abstracted from the references below.)
This paradigm of using past performance to update the estimates and risks of the project fits the notion of the Cone of Uncertainty. The Cone is a convenient way to describe the upper and lower confidence limits of any parameter of the project - technical, cost, or schedule - that must be adhered to for project success. The notion of the Cone of Uncertainty has a long history, in which reducing the uncertainty in key parameters of the project was recognized as a critical success factor.
The Cone of Uncertainty can be used to assess the past performance of the project. If the actual data falls outside the Cone of Uncertainty, then the project was subject to some root cause. Four sample causes are shown below.
Before going further, let's look at the definition of the Cone of Uncertainty, because it is misdefined and misused by some, including in Wikipedia. Here's the generic Cone of Uncertainty. As the project progresses, the uncertainty of the project attributes should be reducing. If it is not reducing, then the project is headed for trouble, and the management processes of the project have failed to perform their job.
If the actual value of the measured parameter, when compared to the planned value, is outside the bounds of the planned range, then the project is no longer headed for success. Some corrective action is needed to get back inside the bounds. In the domains where we work, this is called getting back to green.
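This in-or-out-of-bounds check can be pictured as planned upper and lower limits that narrow as the project approaches Done, with each actual measurement tested against them. A minimal sketch, assuming a linearly narrowing cone and hypothetical spread values:

```python
# Sketch of a Cone of Uncertainty check: do actuals stay inside the
# planned, narrowing bounds? All numbers here are hypothetical.

def planned_bounds(fraction_complete, initial_spread=0.4, final_spread=0.05):
    """Planned +/- spread around the estimate, narrowing linearly toward Done.

    Returns (lower, upper) as ratios of actual-to-planned value.
    """
    spread = initial_spread + (final_spread - initial_spread) * fraction_complete
    return 1.0 - spread, 1.0 + spread

def inside_cone(fraction_complete, actual_over_planned):
    """Is the measured value inside the planned range at this point in time?"""
    lower, upper = planned_bounds(fraction_complete)
    return lower <= actual_over_planned <= upper

# At 50% complete the planned spread here is +/-22.5%
print(inside_cone(0.5, 1.10))  # True  - inside bounds, no action needed
print(inside_cone(0.5, 1.40))  # False - corrective action needed to get back to green
```

The shape of the cone (linear here) is an assumption for illustration; on a real program the planned reduction of uncertainty comes from the program plan itself.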
There still seems to be confusion around the Cone of Uncertainty. Let me take another crack at describing how the CoU came about, how it's used in our Aerospace, Defense, and Enterprise IT domains, and how not having a plan to reduce the uncertainty of the TPMs - or any other measure of program performance - is one of the top four root causes of program performance shortfall.
Let's start with Mr. Bliss's chart. From research conducted at the Performance Assessment and Root Cause Analyses department in the Office of the Secretary of Defense for Acquisition, Technology, and Logistics, here are the top four root causes of unanticipated cost and schedule growth. Many of the programs we work on are Software Intensive Systems of Systems. Software is now a major component of most weapons and space flight systems, so getting the software right means increasing the probability of program success.
These four root causes are:
- Unrealistic performance expectations - we are optimistic that the resulting product or service will perform at the levels needed to deliver the capabilities that meet the business goals. Notice the term unrealistic. Without a realistic assessment of what can be delivered, we have no way to assess whether our expectations can be met.
- Unrealistic cost and schedule estimates based on inadequate risk-adjusted growth models. Without models, informed by empirical data, showing that our risk-adjusted cost and schedule estimates are credible, we're going to be late, over budget, and the deliverables likely won't meet our needs on day one. These estimates can certainly be based on empirical data, but reference classes, nearest-neighbor models, parametric models, and Monte Carlo simulations are all methods for making estimates as well.
- Inadequate assessment of risk and unmitigated exposure to these risks. Risk Management is How Adults Manage Projects - Tim Lister. We need a formal risk management process. Mitigation of the risks needs to be on the master schedule, or we need margin for cost and schedule.
- Unanticipated technical issues without alternative plans and solutions to maintain effectiveness. Technical issues always arise. Plans to address them when they arise are needed.
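The second root cause mentions risk-adjusted estimating models, including Monte Carlo simulation. Here's a minimal Monte Carlo sketch of a risk-adjusted schedule estimate; the three-point task duration estimates are hypothetical:

```python
# Minimal Monte Carlo sketch of a risk-adjusted schedule estimate.
# Task duration ranges (min, most likely, max, in days) are hypothetical.
import random

tasks = [(5, 8, 15), (10, 12, 25), (3, 4, 9)]  # triangular 3-point estimates

def simulate_schedule(n_trials=10_000, seed=42):
    """Sample total duration n_trials times; return the sorted totals."""
    random.seed(seed)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks))
    totals.sort()
    return totals

totals = simulate_schedule()
p50 = totals[len(totals) // 2]        # 50% confidence completion duration
p80 = totals[int(len(totals) * 0.8)]  # 80% confidence completion duration
print(f"50% confidence: {p50:.1f} days, 80% confidence: {p80:.1f} days")
```

The gap between the 50% and 80% confidence values is the schedule margin a credible, risk-adjusted estimate carries; quoting only the most-likely values ignores it.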
If someone suggests that the cone of uncertainty doesn't work for their project, it is critical to determine why, through root cause analysis.
For example, from a recent paper abstract:
Software development project schedule estimation has long been a difficult problem. The Standish CHAOS Report indicates that only 20 percent of projects finish on time relative to their original plan. Conventional wisdom proposes that estimation gets better as a project progresses. This concept is sometimes called the cone of uncertainty, a term popularized by Steve McConnell (1996). The idea that uncertainty decreases significantly as one obtains new knowledge seems intuitive. Metrics collected from Landmark's projects show that the estimation accuracy of project duration followed a lognormal distribution, and the uncertainty range was nearly identical throughout the project, in conflict with popular interpretation of the "cone of uncertainty"
So here are some unanswered questions:
- Why are the attributes not staying inside the Upper and Lower control limits that were planned for those parameters to be successful?
- Why did only 20% of the projects finish on time?
- Why did the estimates NOT get better?
- BTW the term was NOT popularized by McConnell; the term goes all the way back to 1958 in the chemical plant construction industry.
- Why were the estimate ranges essentially the same? Why did they not improve with new information about the project attributes?
The referenced paper doesn't answer these. Instead, it suggests the Cone of Uncertainty is not the proper model for software estimating, with no evidence other than anecdotal examples from observed projects.
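These questions can at least be tested against your own project data. One way to ask whether the estimates improved is to compare the dispersion of the estimate-to-actual ratio early in the project against late in the project. A minimal sketch, with hypothetical sample ratios:

```python
# Does the estimate-to-actual ratio tighten as the project progresses?
# A simple dispersion comparison; the sample ratios below are hypothetical.
import statistics

# ratio = (estimate made at that point) / (final actual)
early_ratios = [0.55, 1.80, 0.70, 1.45, 0.60]   # estimates made near project start
late_ratios  = [0.90, 1.10, 0.95, 1.05, 1.02]   # estimates made near project end

early_spread = statistics.stdev(early_ratios)
late_spread = statistics.stdev(late_ratios)

# If the management processes are reducing uncertainty with intent,
# late estimates should be measurably tighter than early ones.
print(f"early spread = {early_spread:.2f}, late spread = {late_spread:.2f}")
print("uncertainty reduced" if late_spread < early_spread else
      "uncertainty NOT reduced - find the root cause")
```

If the late spread is not smaller than the early spread, that is the signal to go find the root cause, not to discard the Cone.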
If the observed data is outside the Cone, then an answer to why this is the case is needed before assuming the Cone is of no use.
This approach ignores the core principle of reducing uncertainty - with management intent - as the project progresses and discovers new information, collects past performance to be used for estimating future performance, and applies active risk management.
This approach also ignores the core principle of closed-loop control. If the control system is designed to reduce uncertainty - as any project would want - and the uncertainty is not reduced, go find out why and take corrective action. Don't conclude that the Cone of Uncertainty is the wrong paradigm for controlling project performance.
It's not about the project performance data not fitting inside the Cone; it's about WHY the project performance data did NOT fit inside the Cone of Uncertainty.
Without an answer to this why, and testable corrective action for the Root Cause, the project has little hope of showing up on time, on budget, and delivering the needed capabilities the customer has paid for.
The resources listed below provide some more background.
The Cone of Uncertainty is the planned reduction of uncertainty for the project attributes - cost, schedule, and technical. When your project numbers are outside the planned upper and lower control limits, you've got a problem that requires management intervention - a corrective action. When someone says the Cone of Uncertainty is not applicable because of observed excursions outside the control limits, that project is out of control relative to its planned control limits, NOT because the Cone of Uncertainty is wrong in principle.
- Cone of Uncertainty definition (Wikipedia)
- Steve McConnell's Cone of Uncertainty
- "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty,'" Pongtip Aroonvatanaporn, Chatchai Sinthop, and Barry Boehm, ASE '10, September 20-24, 2010, Antwerp, Belgium.
- "A Production Model for Construction: A Theoretical Framework," Ricardo Antunes and Vicente Gonzalez, Buildings 2015, 5(1), 209-228
- "Accuracy Considerations for Capital Cost Estimation", Carl H. Bauman, Industrial & Engineering Chemistry, April 1958.
- Software Engineering Economics, Barry Boehm, Prentice-Hall, 1981.
- "The COCOMO 2.0 Software Cost Estimation Model," Barry Boehm, et al., International Society of Parametric Analysis, May 1995.
- “Coping with the Cone of Uncertainty: An Empirical Study of the SAIV Process Model,” Da Yang, Barry Boehm, Ye Yang, Qing Wang, and Mingshu Li, ICSP 2007, LNCS 4470, pp. 37–48, 2007.
- "Shrinking the Cone of Uncertainty with Continuous Assessment for Software Team Dynamics in Design and Development," Pongtip Aroonvatanaporn, Ph.D. Thesis, University of Southern California, August 2012.