Starting with Tversky & Kahneman's (1974) "Judgment under Uncertainty: Heuristics and Biases," there is a catalog of error sources that should lay the foundation for abandoning the simple, and many times naive, 3-point estimate approach to work duration and effort.
- Representativeness - occurs where probabilities are evaluated by the degree to which A is representative of B.
The last time we did this, we saw durations that looked like this. Or, we've asked the guys down in Florida and they think we should be able to get this done in 1,400 hours; that's what it took them the last time they did it.
- Insensitivity to prior probability of outcomes - we ignore the past statistical behavior of our work efforts. This is ignorance of Bayesian statistics.
I know they took 1,400 hours, but we've learned a lot from their mistakes, so we can do it for less. We don't really have evidence, but our gut feel is they were not very good at their job and took too long.
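Ignoring the base rate can be made concrete with a minimal Bayesian update. This is a sketch with illustrative numbers (the function name and probabilities are assumptions, not from the paper): even a genuinely optimistic signal like "we've learned from their mistakes" barely moves the odds when most similar past efforts overran.

```python
# Minimal sketch of a Bayesian update on an overrun probability.
# All numbers are illustrative assumptions.

def posterior_overrun(prior_overrun, p_signal_given_overrun, p_signal_given_on_time):
    """P(overrun | optimistic signal) via Bayes' rule."""
    p_signal = (p_signal_given_overrun * prior_overrun
                + p_signal_given_on_time * (1.0 - prior_overrun))
    return p_signal_given_overrun * prior_overrun / p_signal

# Assume 70% of similar past efforts overran -- the prior we are tempted
# to ignore. "We've learned from their mistakes" is a weak signal: assume
# it is only somewhat more likely to be heard on projects that finish on time.
p = posterior_overrun(prior_overrun=0.70,
                      p_signal_given_overrun=0.5,
                      p_signal_given_on_time=0.7)
print(round(p, 3))  # -> 0.625: still well above even odds of an overrun
```

Gut feel updates the prior far more aggressively than the evidence warrants; Bayes' rule does not.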
- Insensitivity to sample size - this is a very common error, where the estimator makes a forecast independent of the sample size.
My students - all 23 of them - answered a survey I wrote about problems with Earned Value, and they came up with answers I'll apply to a broad set of situations. I sampled the people I know, and they came up with some interesting statements about what works and what doesn't. We hand-picked 12 managers in our firm, gave them a survey we put together from some outside suggestions, and we'll be making changes based on their answers.
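Why sample size matters can be shown in a few lines. This sketch (assumed numbers: estimates with a sample standard deviation of 300 hours; `ci_halfwidth` is a hypothetical helper) computes the approximate 95% confidence interval half-width for a sample mean, which shrinks only with the square root of the sample size.

```python
import math

def ci_halfwidth(sample_sd, n, z=1.96):
    """Approximate 95% confidence interval half-width for a sample mean."""
    return z * sample_sd / math.sqrt(n)

# Assumed: effort estimates with a sample standard deviation of 300 hours.
for n in (12, 23, 1000):
    print(n, round(ci_halfwidth(300, n), 1))
```

With 12 hand-picked managers the mean is uncertain by roughly +/-170 hours; 23 students narrow that only to about +/-123. Conclusions drawn from either sample carry far more uncertainty than the confident statements built on them.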
- Misconceptions of Chance - the belief that a short observed sequence of outcomes represents the core behavior of the underlying random process.
I've seen this happen 3 or 4 times in other places; it's got to be the same thing happening here. It's happening in all kinds of places, so it's got to be the same here. "For every up there is a down" - let's just wait and we'll be back on track pretty soon.
- Insensitivity to predictability - the belief that the future can be predicted from intuitive impressions, without regard for the reliability of the evidence.
I've seen this happen before, it's got to be the cause of what's happening this time too.
- Illusion of Validity - the observations we're seeing are a good fit with what we're expecting.
We interviewed a list of people selected by management, and they supported what management thinks is going wrong here. Management has a target budget and schedule in mind, and when we asked the developers - after they had heard the management numbers - they came up with about the same numbers.
- Misconceptions of Regression - failing to account for regression to the mean, assuming all those random processes add up to an average we can live with.
When we get all the estimates in, we can find the average variance and use that to forecast the work we're going to be given in the future. This concept is actually misrepresented in a government cost estimating guidebook: adding a sample set of distributions results in a "normal" distribution.
- Biases due to the effectiveness of a search set - when asked to recall some data or event, people come up with whatever is most familiar to them.
We met some guys in the cafeteria, and they had lots to say about how we should estimate our costs.
- Biases due to the retrievability of instances - the data I have at hand is the data I used for my forecast.
We had all the data from the projects they did in Dallas, and used that as our sample data for forecasting our performance.
- Adjustment and Anchoring - the estimate is based on the starting value and is then adjusted to get the final answer.
Management gave us a starting point for the cost and schedule, so we need to improve that estimate.
- Biases in the evaluation of conjunctive and disjunctive events - the probability of conjunctive events is overestimated, while the probability of disjunctive events is underestimated.
Our rule of thumb has served us well in the past - and it turns out to be biased.
- Anchoring in the assessment of subjective probability distributions - stated confidence intervals are too narrow; the assessor is overconfident.
Let's not broaden the variances too much on those cost estimates; it just doesn't feel right in this situation.
These topics are from Michael Axelsen's review of Tversky & Kahneman's 1974 paper, "Judgment under Uncertainty: Heuristics and Biases."
These topics will hopefully put an end to the simple-minded 3-point estimates extracted from the people tasked with doing the work. Those numbers certainly have a contribution to make to the estimating process. But serious problems arise when you take them and start making management decisions. In exactly the same way, you create serious problems when you take a management directive and start making estimates "anchored" on the bounds of cost or schedule they want you to produce to.
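One way to see why raw 3-point numbers mislead is a quick Monte Carlo sketch, standard library only, with assumed illustrative figures: ten tasks, each with a right-skewed triangular duration (min 8, most likely 10, max 20 hours). Summing the "most likely" values understates the expected total whenever the distributions are skewed, which is exactly the decision-making trap.

```python
import random
import statistics

random.seed(1)

# Assumed, illustrative 3-point estimate for each of ten tasks:
# min = 8, most likely = 10, max = 20 hours (right-skewed).
def project_total():
    """One simulated project outcome: the sum of ten sampled task durations."""
    return sum(random.triangular(8, 20, 10) for _ in range(10))

totals = [project_total() for _ in range(20000)]

naive_total = 10 * 10  # summing each task's "most likely" value: 100 hours
print(naive_total, round(statistics.mean(totals), 1))
# The simulated mean is near 10 * (8 + 10 + 20) / 3 = 126.7 hours,
# roughly 27% above the naive "most likely" total.
```

The 3-point inputs are not the problem; treating the most-likely column as the forecast is. Sampling the distributions and reading off the mean and spread is the cheap fix.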