The Cone of Uncertainty describes the evolution of uncertainty during a project. Not only does the planned uncertainty need to decrease as time passes, but this reduction also diminishes the impact of risk on the decision-making processes.
It seems there is still some confusion (intentional or accidental) about the Cone of Uncertainty, its purpose, and its use in software development. Some feel that the Cone does not provide any value for the work they do and does not match the reduction of uncertainty in their estimates. Rather than update Cone of Uncertainty - Part Cinq (Updated) again, let me provide a bibliography of papers describing the origins of the Cone and its use in the management of Programs.
First, the Cone of Uncertainty is a Principle used to define the needed reduction in the variances of estimates on Programs. It is NOT a post-hoc assessment of project performance; rather, it is a guide stating what level of confidence is needed at what point in the program to increase the Probability of Program Success. The Cone does NOT need data to validate the principle of reducing uncertainty as the program progresses. That is an Immutable Principle of good project management.
If your uncertainty is not reducing at some planned rate, you're not managing the project for success, and you're going to be late, over budget, delivering products not likely to work, or some combination of those.
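To make "reducing at some planned rate" concrete, here is a minimal sketch in Python of checking an estimate's spread against a planned reduction profile. The milestone fractions, band widths, and numbers are illustrative assumptions on my part, not the canonical Cone values or any program's actual profile.

```python
# A minimal sketch (not any program's actual procedure): encode a planned
# uncertainty-reduction profile and flag a status point whose estimate
# spread is wider than planned. Milestones and band widths are illustrative.

# Planned band: (fraction of program complete, allowed +/- spread as a
# fraction of the most-likely estimate)
PLANNED_BAND = [
    (0.00, 0.40),  # at authorization, estimates may vary +/- 40%
    (0.25, 0.25),
    (0.50, 0.15),
    (0.75, 0.10),
    (1.00, 0.05),  # near completion, the remaining spread should be small
]


def planned_spread(fraction_complete: float) -> float:
    """Linearly interpolate the planned +/- spread at a point in the program."""
    for (x0, y0), (x1, y1) in zip(PLANNED_BAND, PLANNED_BAND[1:]):
        if x0 <= fraction_complete <= x1:
            t = (fraction_complete - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return PLANNED_BAND[-1][1]


def within_cone(fraction_complete: float, low: float, likely: float, high: float) -> bool:
    """True if the current estimate range (low..high) is no wider than planned."""
    actual_spread = max(likely - low, high - likely) / likely
    return actual_spread <= planned_spread(fraction_complete)


# Half way through, the estimate to complete is 900..1,150 hours around a
# most-likely 1,000 hours: a 15% spread, just inside the planned band.
print(within_cone(0.50, low=900, likely=1_000, high=1_150))  # True
```

The same check can be run on cost or duration estimates at each status point; a spread wider than planned is the signal to find the root cause, not to discard the Cone.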
Asking for data to show the CoU is valid is a fallacy and shows a lack of understanding of the principle. Remember:
Risk Management is How Adults Manage Projects - Tim Lister
The chief critic of the Cone has data collected from his projects, which develop Products, described in his article in an IEEE magazine often quoted by No Estimates advocates.
The domain for the Cone can certainly be product development, but its origins are in Programs, where a budget, period of performance, and needed capabilities are On Contract. For product development, as mentioned above, there is an unanswered question.
If the product with the needed Features shows up late and over budget, is that good business management? And if the competitor is shipping software with greater value, how does the Cone impact that? It seems that if you're shipping software, potentially late and over budget, that doesn't have competitive value, that's a Product Marketing problem, not a project management problem.
Setting those confusing issues aside, here are some materials on the Cone of Uncertainty applicable to contract-based delivery of needed Capabilities, at the needed time, for the needed cost.
Actually reading the materials below may improve the understanding of those conjecturing that the Cone of Uncertainty does not represent their data and is of little use, as well as of those conjecturing that data is needed to apply the Cone of Uncertainty to a program.
The Cone of Uncertainty is a build-to framework - that is, if your program's uncertainty is not reducing at some pre-planned rate as it progresses, then it is unlikely it will show up at the needed time, for the needed budget, with the needed Capabilities. This is regardless of the competition's better offerings. That is a Product Marketing problem (you're building the wrong product), not a problem of managing the product or Program you are building.
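To illustrate the build-to use of the Cone, here is another small sketch, again with invented milestone labels and dollar figures, asking at each status point whether the contracted budget still falls inside the current, narrowing estimate band. The dataclass, milestones, and numbers are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class EstimateRange:
    """Current estimate band for a contracted quantity (cost, in this sketch)."""
    low: float
    high: float


def contract_still_achievable(contracted: float, band: EstimateRange) -> bool:
    """Build-to check: the contracted value must stay inside the narrowing band."""
    return band.low <= contracted <= band.high


# Illustrative status points for a $10M contracted budget (numbers invented).
CONTRACTED_BUDGET = 10.0e6
status_points = {
    "ATP": EstimateRange(low=7.0e6, high=13.0e6),   # wide band at authorization to proceed
    "PDR": EstimateRange(low=8.5e6, high=11.5e6),   # narrower by preliminary design review
    "CDR": EstimateRange(low=10.5e6, high=12.5e6),  # narrowed, but drifted above the contract
}

for milestone, band in status_points.items():
    ok = contract_still_achievable(CONTRACTED_BUDGET, band)
    print(f"{milestone}: contracted budget inside the current band? {ok}")
# At CDR the contracted $10M sits below the current band, so the program will
# not show up at the needed cost unless the root cause is found and corrected.
```

The point of the sketch is the direction of use: the band is planned in advance and the program is managed to stay inside it, rather than the band being fitted to the estimates after the fact.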
For those with IEEE Membership, be sure to read the Letter to the Editor about the published Cone of Uncertainty article conjecturing that the CoU does not follow the data for the projects. This letter states:
In “Schedule Estimation and Uncertainty Surrounding the Cone of Uncertainty” (May/June 2006), (the author) expresses some concern about whether uncertainty really does decrease over time. In particular, he wonders whether estimates for work remaining are better (obviously, uncertainty’s not an issue for work already done). His results, however, might instead reflect the techniques used in his company, which, as described, appear to be the same technique used throughout the whole project—that is, “expert judgment.”
In other words, the underlying management of the projects was using expert judgment to produce estimates. When this happens, the reduction in estimate uncertainty may not - and was observed to not - follow the Cone of Uncertainty. As has been stated here before, until the Root Cause of why the estimates did not follow the CoU has been found, corrected, and prevented, any conjecture that the CoU is not applicable has no basis in fact.
But before delving into the background and applicability of the Cone of Uncertainty materials below, I'd suggest reading this manual, which shows how estimates are made in our Software Intensive System of Systems domain, where the Cone of Uncertainty guides the needed confidence in all the estimates to increase the probability of program success as the program moves left to right in its execution.
Google will find each of these for you, as a start on the extensive body of literature for estimating agile projects in the presence of uncertainty:
- "Improving Software Development Tracking and Estimation Inside the Cone of Uncertainty," Pongtip Aroonvatanaporn, Thanida Hongsongkiat, and Barry Boehm, Technical Report USC-CSSE-2012-504, Center for Systems and Software Engineering, University of Southern California, 2012.
- "Coping with the Cone of Uncertainty: An Empirical Study of the SAIV Process Model," Da Yang, Barry Boehm, Ye Yang, Qing Wang, and Mingshu Li, ICSP 2007, LNCS 4470, pp. 37-48, 2007.
- "Software Intensive Systems Cost and Schedule Estimation," Technical Report SERC-2013-TR-032-2, Principal Investigator: Dr. Barry Boehm, University of Southern California, AFCAA Sponsor: Dr. Wilson Rosa, Naval Postgraduate School: Dr. Ray Madachy, University of Southern California and Software Metrics: Dr. Bradford Clark, University of Southern California: Dr. JoAnn Lane, Dr. Thomas Tan, Mr. Ramin Moazeni, Stevens Institute of Technology, Systems Engineering Research Center, 31 June 2013.
- "Shrinking The Cone Of Uncertainty With Continuous Assessment For Software Team Dynamics In Design And Development," Pongtip Aroonvatanaporn," Ph.D. Thesis, University of Southern California, August 2012.
- "Reducing Estimation Uncertainty with Continuous Assessment: Tracking the 'Cone of Uncertainty'," Pongtip Aroonvatanaporn, Chatchai Sinthop, and Barry Boehm, Center for Systems and Software Engineering, University of Southern California, Los Angeles, CA 90089, ASE'10, September 20-24, 2010, Antwerp, Belgium, 2010.
- “Accurate Estimates Without Local Data?” Tim Menzies, Steve Williams, Oussama Elrawas, Daniel Baker, Barry Boehm, Jairus Hihn, Karen Lum, and Ray Madachy, Software Process Improvement And Practice, (2009).
- "Management Challenges to Implementing Agile Processes in Traditional Development Organizations," Barry Boehm, Richard Turner, IEEE Software, September/October 2005.
- "Phase Distribution of Software Development Effort," Ye Yang, Mei He, Mingshu Li, Qing Wang, Barry Boehm, ESEM'08, October 9-10, 2008.
- "Software Development Cost Estimation Approaches - A Survey," Barry Boehm, Chris Abts, and Sunita Chulani, Ph. D. Qualifying program, Computer Science Department, University of Southern California, 1998.
- "Modern Tools to Support DoD Software Intensive System of Systems Cost Estimation," Jo Ann Lane and Barry Boehm, A DACS State of the Art Report, DACS Report Number 347336, 31 August 2007.
- "Software Development Effort Estimation: Formal Models or Expert Judgment?", Magne Jorgensen and Barry Boehm, IEEE Software, March-April, 2009.
- The Incremental Commitment Spiral Model: Principles and Practices for Successful Systems and Software, Barry Boehm, Jo Ann Lane, Supannika Kollmanojwong, and Richard Turner, Addison Wesley, 2014.