Decision Making on Software Development Projects Is Both Simple and Complex at the Same Time

All projects operate in the presence of uncertainty.

Real-world decision making is performed under uncertainty.

Decision makers must make decisions that best account for these uncertainties.

These uncertainties come in two forms – reducible (I can do something about it) and irreducible (I can't do anything about it).

Irreducible (Aleatory) uncertainty is the natural variability of the processes and technology on the project. These processes are stochastic, possibly nonstationary, and their variability arises by chance from underlying statistical distributions.

Aleatory uncertainties are modeled as random variables described by statistical distributions (the Triangle distribution is a common choice when the actual distribution is not known).

When modeling aleatory uncertainty, decision makers make assumptions about the distribution's descriptive statistics: at a minimum the Mean and Variance, along with the Most Likely value (the Mode). The shape of the curve is needed as well.
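As a minimal sketch of these descriptive statistics, the snippet below computes the analytic mean and variance of a Triangle distribution and checks them against a Monte Carlo sample. The optimistic, most likely, and pessimistic duration values are hypothetical, chosen only for illustration.

```python
import random
import statistics

# Hypothetical task-duration estimate in days:
# optimistic (a), most likely (mode), pessimistic (b).
a, mode, b = 8.0, 10.0, 15.0

# Closed-form descriptive statistics of the triangular distribution.
mean = (a + mode + b) / 3.0
variance = (a**2 + mode**2 + b**2 - a * mode - a * b - mode * b) / 18.0

# Monte Carlo sample of the same distribution for comparison.
random.seed(42)
samples = [random.triangular(a, b, mode) for _ in range(100_000)]

print(f"analytic mean = {mean:.2f}, variance = {variance:.2f}")
print(f"sampled  mean = {statistics.fmean(samples):.2f}, "
      f"variance = {statistics.variance(samples):.2f}")
```

With these assumed inputs the analytic mean is 11.0 days, and the sampled statistics converge to the analytic ones as the sample grows.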

Irreducible uncertainty can only be dealt with by margin: cost margin, schedule margin, and technical margin.
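One common way to size such margin is a Monte Carlo simulation: sample each task's duration from its distribution and set margin as the gap between a chosen confidence level and the deterministic point estimate. The tasks and their three-point estimates below are assumptions for illustration.

```python
import random

random.seed(7)

# Hypothetical project of three tasks, each with
# (optimistic, most likely, pessimistic) durations in days.
tasks = [(4, 5, 9), (8, 10, 15), (3, 4, 7)]

# Deterministic point estimate: sum of the most likely values.
deterministic = sum(mode for _, mode, _ in tasks)  # 19 days

def simulate_total() -> float:
    """One Monte Carlo trial: total duration with triangular sampling."""
    return sum(random.triangular(low, high, mode) for low, mode, high in tasks)

totals = sorted(simulate_total() for _ in range(50_000))
p80 = totals[int(0.8 * len(totals))]  # 80th-percentile completion time

margin = p80 - deterministic
print(f"point estimate: {deterministic} days, 80% confidence: {p80:.1f} days")
print(f"schedule margin needed: {margin:.1f} days")
```

Because the most likely value sits below the mean of a right-skewed triangular distribution, the 80th-percentile completion exceeds the point estimate, and the difference is the schedule margin needed to protect the commitment.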

Reducible (Epistemic) uncertainty is subjective, with the subjectivity coming from lack of knowledge.

This lack of knowledge arises from the probabilistic, nondeterministic behavior of the system or its environment.

Reducible uncertainty is addressed with redundancy, experiments, prototypes, models, measures, and empirical data that reveal knowledge about the underlying probabilities of the process.
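A minimal sketch of how empirical data reduces epistemic uncertainty is a Bayesian update: start with an uninformative belief about an unknown probability, then narrow it with observed trials. The Beta model and the defect-escape scenario below are assumptions chosen for illustration, not a prescribed method.

```python
from math import sqrt

def beta_stats(a: float, b: float) -> tuple[float, float]:
    """Mean and standard deviation of a Beta(a, b) belief."""
    mean = a / (a + b)
    var = (a * b) / ((a + b) ** 2 * (a + b + 1))
    return mean, sqrt(var)

# Epistemic uncertainty about a hypothetical defect-escape probability,
# modeled as a Beta(alpha, beta) belief. Beta(1, 1) is the flat prior:
# "we know nothing yet."
alpha, beta = 1.0, 1.0
prior_mean, prior_sd = beta_stats(alpha, beta)
print(f"prior   mean={prior_mean:.3f} sd={prior_sd:.3f}")

# An experiment (e.g., a prototype test campaign) observes 2 escapes
# in 50 trials -- new knowledge that shrinks the uncertainty.
escapes, trials = 2, 50
alpha += escapes
beta += trials - escapes
post_mean, post_sd = beta_stats(alpha, beta)
print(f"updated mean={post_mean:.3f} sd={post_sd:.3f}")
```

The standard deviation of the belief shrinks sharply after the experiment: the uncertainty was reducible, and spending on the experiment bought the reduction.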

All Uncertainty Creates Risk.

Reducible risk requires estimating the probability distribution of the event's occurrence.

Irreducible risk requires estimating the statistical distribution of the naturally occurring processes.

"Risk Management is How Adults Manage Projects" – Tim Lister.

Risk management requires estimating.

Adult management of projects requires estimating.

Not Estimating means not managing as an Adult.
