A recent Fallacy of Estimation piece contains an interesting phrase, and the post goes on from there.
Here's an ongoing collection of fallacy-of-estimating commentary: some useful, some misinformed, some not even wrong. Moving beyond personal opinion into the realm of actual processes, tools, and people who estimate for a living probably has value when you're assigned a project that spends non-trivial amounts of money.
Ignoring for the moment the uninformed notion that estimating is the smell of dysfunction (no dysfunctions have actually been named in that context, let alone corrective actions, other than Not Doing Estimates), there is in fact a critical issue about estimating in all domains.
Of course that last statement also ignores the time phasing of estimates. Before the project starts, what's our risk exposure for the cost of this project? The "let's get started and find out how much this will cost" approach is like Steve Levitt's Freakonomics description of the drug dealers: here, just try this, I'll give it to you for free. Now of course Not Estimating has no connection to getting people hooked on crack cocaine, but "let's start spending your money and we'll find out later how much you're going to have to commit to this project" sounds a bit like the 1970s bait and switch of Cal Worthington in Southern California, who'd advertise a car that didn't exist, get you to come down, then up-sell everything. Ah, the good olde days.
Just heard the story on NPR yesterday about the Spanish firm that will stop work on the widening of the Panama Canal because it has overrun by a billion dollars.
The Fallacy of Estimating term is used many times without attribution. It starts with Daniel Kahneman and Amos Tversky and the difficulties humans have in making estimates. Kahneman's recent book Thinking, Fast and Slow is a continuation of this thesis. But the core of the thesis is contained in a few critical papers that must be read before drawing any conclusion, and most importantly, read before listening to anyone else who has read them.
- Anchoring and Adjustment in Software Estimation, Jorge Aranda and Steve Easterbrook, ESEC/FSE '05, ACM.
- Anchoring and Adjustment in Software Project Management: An Experimental Investigation, Timothy Costello, Naval Postgraduate School
- Anchoring Unbound, Nicholas Epley and Thomas Gilovich, Journal of Consumer Psychology, 20 (2010), pp. 20-24.
- Assessing Ranges and Possibilities, Strategic Decision and Risk Management Stanford Certificate Program, Decision Analysis for the Professional, Chapter 12. It is popular to speak about favorite methods as part of decision making without actually saying what decision-making paradigm is being used. The Stanford course is a good start.
- Implications of Attitude Change Theories for Numerical Anchoring: Anchor Plausibility and the Limits of Anchor Effectiveness, Duane T. Wegener, Richard E. Petty, Brian T. Detweiler-Bedell, and W. Blair G. Jarvis, Journal of Experimental Social Psychology, 37, pp. 62-69.
- The Anchoring-and-Adjustment Heuristic: Why the Adjustments Are Insufficient, Nicholas Epley and Thomas Gilovich, Psychological Science, Vol. 17, No. 4.
- Judgment Under Uncertainty: Heuristics and Biases, Amos Tversky and Daniel Kahneman, Science, New Series, Vol. 185, No. 4157 (Sep. 27, 1974), pp. 1124-1131.
But there's another set of knowledge needed to be successful in the estimating business, and that's the acknowledgement that all estimates are probabilistic. This can't be said enough. The place to start is Probability Methods for Cost Uncertainty Analysis: A Systems Engineering Perspective, Paul R. Garvey. This book is the anchor for everything we do in the cost, schedule, and technical performance estimating business on software-intensive programs. Mr. Garvey's work at MITRE is the basis of many of the tools and processes used in our domain.
The last paper is one you must have on your desk if you're actually interested in solving the fallacy of estimating.
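Since all estimates are probabilistic, even a few lines of script make the point: roll up per-element cost distributions with Monte Carlo and quote a confidence level, never a single number. This is a minimal sketch, not any particular tool's method; the WBS elements and triangular parameters below are invented for illustration.

```python
import random

# Hypothetical WBS elements: (low, most_likely, high) cost in $K.
# All numbers here are invented for illustration.
WBS = {
    "requirements": (80, 100, 160),
    "design":       (120, 150, 260),
    "coding":       (200, 240, 420),
    "testing":      (90, 110, 230),
}

def simulate_total(trials=20_000, seed=1):
    """Monte Carlo roll-up: sample each element, sum, return sorted totals."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, ml) for lo, ml, hi in WBS.values())
        for _ in range(trials)
    )
    return totals

totals = simulate_total()
p50 = totals[len(totals) // 2]           # median cost
p80 = totals[int(len(totals) * 0.80)]    # cost at 80% confidence
print(f"50% confidence: ${p50:,.0f}K   80% confidence: ${p80:,.0f}K")
```

Notice the 80% confidence number sits well above the sum of the most likely values, because the distributions are right-skewed. That gap is the optimism you give away when you quote a point estimate.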
How Did We Get Into This Estimating Fallacy Mess?
The original explanation for the estimating fallacy was that planners focused on the most optimistic scenario for the project, rather than using past performance, subject-matter experts, or parametric models of how much time similar work would require under similar conditions. This is the Optimism fallacy. What could possibly go wrong? Well, we know the answer to that now, don't we? This is common in our defense and space procurement world, and in most other worlds where high-stakes projects are driven by politics. It's a fundamental axiom of life: No Guts, No Glory. When the value at risk is high, conservative actions go by the wayside.
Another explanation, one found in our domain as well, is the Authorization Imperative. If we want to get our program approved, we'd better not tell them how much it will cost. The James Webb Space Telescope and the Joint Strike Fighter are good examples of that. JWST is currently at something like $7B; it started out at less than $1B. Similar for JSF.
So What Next?
It's the olde saw: "Doctor, Doctor, it hurts when I do this." "Then stop doing that." This of course is utter nonsense when it comes to estimating. Doctor, doctor, we can't make good estimates. Estimates are the smell of dysfunction (with none listed). OK, then stop estimating and start spending. Yeah, right!
A few facts of life:
- Building products or supplying services for money almost always means spending other people's money. If it's your own money, do as you wish. If it's other people's money, they get to say what you do with it. They shouldn't be acting like Dilbert's boss. BTW, we can find examples of Dilbert bosses everywhere. That's trivial. Pointing out those problems is child's play. How about providing solutions? They should understand enough about business management to know that all estimates are probabilistic. That all estimates have built-in risks, some reducible, some irreducible, all knowable, many not yet known. If you have unknowable risks, you'd better not start the project until you get those back into the knowable column.
- When those with the money give you the money, they expect you to spend it wisely. That means you have some notion of what you're going to spend it on to produce value for those who gave it to you, and that you also know, to some degree of confidence, how much money you'll need to spend to deliver the expected value the customer has given you the money for.
How Can We Get Better Estimates?
Let's assume for a moment that we understand why we need to estimate how much of other people's money we're going to spend.
How do we get better? The answer is simple: we're spending other people's money, and they likely want to know how much, when, and what they're getting. We start by looking to the tried and true, field-proven, tractable approaches to estimating. This is not a platitude. We start by doing our homework: reading books and papers, looking at tools, asking others "how do you do this?" We do what any person learning to do something new would do. We look to others first.
What's Out There to Learn?
There are lots of sources for learning to estimate. But the first, field-proven, way is Reference Class Forecasting. It's the basis of most estimating processes in use today.
Bent Flyvbjerg has lots to say on this. But some care is needed; he tends to overstate the intent of planners and estimators who make poor estimates, calling them Liars. Maybe that's the case at times, I know of a few programs, but it's overstating nonetheless. Calling people Liars is fightin' words where I come from in the Texas Panhandle, home of T. Boone Pickens, Dog the Bounty Hunter, and Randy Matson.
Let's assume you actually want to improve your estimating skills, abilities, and probability of success. Where do you start? First, ignore those who say it can't be done, because it can. Then ignore those who say estimates are the smell of dysfunction, because estimates are part of any credible business process, period. OK, if you believe in Unicorns and Pixie Dust, you might believe that making estimates is a dysfunction. Making estimates for the wrong reason is a dysfunction. But doing anything for the wrong reason is a dysfunction. Learn to do things for the right reason, and as the poster campaign at Rocky Flats said:
Don't do stupid things on purpose
By the way, the notion of drip funding is fine. But it does not answer the question: what is our estimate at complete? Drip funding, also called Time Boxed Scheduling, has been around for decades. Here's a small amount of money and a list of things I want you to do. Go do them, come back, and we'll talk more. If you did them for more or less the money provided, good. If not, you've now got information to calibrate the future capacity for work. This is called Reference Class Forecasting.
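That calibration can be sketched in a few lines. One common form of Reference Class Forecasting takes the actual-to-estimated cost ratios of completed, similar projects and applies the ratio at a chosen confidence level as an uplift to the new inside-view estimate. The historical ratios and the $500K estimate below are invented for illustration, not real program data.

```python
# Reference Class Forecasting, minimally: calibrate a new estimate against
# the actual/estimated ratios of completed, similar projects.
# These historical ratios are invented for illustration.
history = [1.10, 0.95, 1.40, 1.25, 1.05, 1.60, 1.15, 1.30]

def rcf_uplift(ratios, confidence=0.8):
    """Return the uplift factor at the requested confidence level."""
    ordered = sorted(ratios)
    idx = min(int(confidence * len(ordered)), len(ordered) - 1)
    return ordered[idx]

raw_estimate = 500  # $K, the inside view
adjusted = raw_estimate * rcf_uplift(history)
print(f"Inside view: ${raw_estimate}K, reference-class adjusted: ${adjusted:.0f}K")
```

The design choice here is deliberate: the uplift comes from the outside view (what similar projects actually cost) rather than from anyone's opinion about this project, which is exactly the anchoring correction the Kahneman and Tversky papers call for.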
If you can't estimate credibly, you won't be in business long, from the lawn care guy who cuts our grass to the builder of the Joint Strike Fighter. OK, they're still in business ;<(
So the first place to start is to inform yourself about how others do credible estimating. Let's start with How to Estimate if You Really Want To. But that isn't enough; you'll need some tools. And before you listen to anyone telling you tools get in the way of innovation and understanding, ask if you can do non-stationary stochastic modeling of Markov chain Monte Carlo over dependent process flows in your head, or by exchanging words between people. OK, back to the problem at hand.
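To see why a tool, even a few lines of script, beats doing it in your head, here is a minimal sketch of a schedule simulation with dependent durations. The task network, triangular parameters, and the shared "common driver" that correlates two tasks are all invented for illustration; real tools model far richer dependency structures.

```python
import random

def simulate_schedule(trials=10_000, seed=7):
    """Monte Carlo over a tiny task network: B and C depend on A, and D
    starts only when both B and C finish. B and C share a common multiplier,
    a crude stand-in for dependent (non-independent) process flows.
    Durations are in days; all parameters are invented for illustration."""
    rng = random.Random(seed)
    finishes = []
    for _ in range(trials):
        common = rng.triangular(0.8, 1.4, 1.0)    # shared driver hits B and C
        a = rng.triangular(5, 15, 8)
        b = a + common * rng.triangular(10, 25, 14)
        c = a + common * rng.triangular(8, 20, 12)
        d = max(b, c) + rng.triangular(4, 10, 6)  # D waits for B AND C
        finishes.append(d)
    finishes.sort()
    return finishes

f = simulate_schedule()
print(f"80% confidence finish: {f[int(len(f) * 0.8)]:.1f} days")
```

Note the merge point: because D waits on the later of B and C, the project finish is worse than either path alone suggests. This merge bias is invisible to single-point arithmetic and to estimates done "by exchanging words between people."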
There are many starting points for probabilistic estimating, but they all have one thing in common: we need to know what the problem is. For software, we need to know what capabilities the customer would like to possess when the project is DONE. This is called Capabilities Based Planning. Capabilities aren't requirements. Capabilities reveal the requirements, and requirements enable the capabilities to exist. Here's an example of a set of evolving capabilities for a health insurance ERP system.
So once we have something like this, we can start to decompose the parts into bite-sized chunks, exactly the chunks those suggesting drip funding need for success. These are drips of work.
Next you'll need to suspend disbelief, just like with the Unicorns. The suspension goes like this:
For the vast majority of commercial and a whole lot of military and space software systems, there is nothing new under the sun.
You may not personally have had experience with this new requirement. You may not even have heard of such a thing. But help is at hand: Google. Start there; someone, somewhere, somehow has built something similar. Find it, ask them, do your homework, build a Reference Class. Can't build the Reference Class? OK, then spend some money to build a prototype. Charge the customer for this exploratory effort. This, by the way, is called agile development. Try a little, learn a little. Try some more, learn some more. Improve your probability of success with direct experience. But make sure you get paid for this. It's part of the project. Exploring like this on other people's money without them knowing it, or without them paying you for it, is really bad business. That's a true dysfunction.
Now For Some Tools
Here's my list of favorites. They're favorites because I use them or know people who do:
- Risky Project
- Risk+ - it's gone, but still one of my favorites
- @Risk for Project
- Booz Allen's Polaris
- Deltek Acumen
- Crystal Ball
- COCOMO and its cousins COSTAR and other derivatives. Steve McConnell's Estimate Pro was like this.
- Price Systems
There are likely others, so if you have one send me a note.
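For a taste of what the COCOMO family of parametric models does, the original Basic COCOMO (Boehm, 1981) fits effort as a power law of code size. This sketch uses the published coefficients for the three original project classes; the 50 KLOC input is just an example, and real tools like COSTAR add many cost drivers on top of this core equation.

```python
# Basic COCOMO (Boehm, 1981): effort in person-months = a * KLOC^b.
# Coefficients (a, b) for the three original project classes.
MODES = {
    "organic":       (2.4, 1.05),  # small teams, familiar problems
    "semi-detached": (3.0, 1.12),  # mixed experience, moderate constraints
    "embedded":      (3.6, 1.20),  # tight hardware/schedule constraints
}

def cocomo_effort(kloc, mode="organic"):
    """Estimated effort in person-months for a project of `kloc` thousand lines."""
    a, b = MODES[mode]
    return a * kloc ** b

for mode in MODES:
    print(f"{mode:>13}: {cocomo_effort(50, mode):6.0f} person-months for 50 KLOC")
```

The exponent above 1.0 is the whole point: effort grows faster than size, so doubling the scope more than doubles the cost, another thing a point estimate scaled by gut feel gets wrong.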
In the End
It's not your money; behave appropriately.