Principles, Practices, and Processes to Increase Probability of Project Success
I listened to the final rant of Jon Stewart (BIG CAUTION this show is from Cable and for Adults Only, just like Risk Management) and came away with inspiration for a post, which I've edited a bit to remove the phrases not applicable here.
Pseudoscience and science – the former is a belief based on logical fallacies, supported by some people who may seem rational; the latter is an actual rational methodology to discover facts about the natural universe. The former is utter bullshit. The latter is fact. Deal with it. What follows is an almost literal translation of Jon Stewart's last broadcast.
Bullshit is everywhere. There is very little that you will encounter in life that has not been, in some way, infused with bullshit.
Not all of it bad. Your general, day-to-day, organic free-range bullshit is often necessary. That kind of bullshit in many ways provides important social-contract fertilizer. It keeps people from making each other cry all day.
But then there's the more pernicious bullshit - your premeditated, institutional bullshit, designed to obscure and distract. Designed by whom? The bullshitocracy.
It comes in three basic flavors. One, making bad things sound like good things. "Estimates are the smell of dysfunction, so let's not estimate and the dysfunction will disappear." Because "we're just developers who can't even make high-level estimates of how much it will cost, and we work for bonehead managers who can't tell the difference between a good estimate and a bad estimate" doesn't have the same ring.
"Estimates inhibit creativity, restrict our ability to be flexible, and place other restrictions on our creativity" sounds better than "we have no clue what we're doing, how much it will cost, or when we'll be done, so just give us the money so we can start spending." So whenever something's been titled Pure Agile, or promises a 10X improvement in productivity, or a search for the Magic, take a good long sniff. Chances are it's been manufactured in a facility that may contain traces of bullshit.
Number two. Hiding the bad things under mountains of bullshit Complexity. You know, I would love to download Drizzy's latest Meek Mill diss. But I'm not really interested right now in reading Tolstoy's iTunes agreement. So I'll just click agree, even if it grants Apple prima nocta with my spouse. This comes to the discussion of Value at Risk - and what are estimates for, other than to protect Value at Risk? - and the willful ignorance of how every business works: projects with probabilistic events and statistical variance, and how those uncertainties must be dealt with for the business to have any hope of surviving.
And finally, finally, it's the bullshit of infinite possibility. The Unicorn approach to solving hard problems, by claiming all big problems can be broken down into little problems. These bullshitters cover their unwillingness to act under the guise of unending inquiry. We can't do anything because we can't possibly know anything in the presence of uncertainty. We cannot take action to improve that knowledge until everyone in the world agrees we're not headed down the slippery slope of governance of how we spend other people's money. Until then, I say, it leads to controversy.
Now, the good news is this. Bullshitters have gotten pretty lazy. And their work is easily detected. And looking for it is kind of a pleasant way to pass the time.
So when you encounter some claim - estimates are the smell of dysfunction, or the latest, there are countless good ways to make decisions: estimates are one - then ask for evidence. Ask for working examples that can be tested against some set of principles. Not just personal anecdotes. If there are no principles, no testable evidence, then ...
The best defense against bullshit is vigilance. So if you smell something, say something.
One of the escape clauses of #NoEstimates is to re-label Forecasting as NOT Estimating: it is forecasting, based on empirical data. Let's ignore for the moment the empirical data discussion, since ALL data is empirical - otherwise you wouldn't have the data.
Empirical is defined as based on, concerned with, or verifiable by observation or experience rather than theory or pure logic.
And just to beat this horse some more
Empirical also means observed, factual, experimental, experiential, pragmatic. This is in contrast, for estimating purposes, with a theoretical model that produces data, or a parametric model built from empirical or theoretical models that produce data. In all cases the data is used to estimate some outcome in the past, present, or future. When that estimate is about the future, it can also be referred to as a Forecast. Weather forecasts, sales forecasts, market forecasts, earnings forecasts - all are estimates about some outcome. The antonyms of empirical are theoretical, un-observed (as would be the case for a model), hypothetical, conjectural, speculative, provisional.
So just to repeat - empirical data is observed, and empirical data can be one of the sources for making estimates about the past, present, or future.
Since Scrum is an empirical process, it also shares its attributes with empirical process control systems. Project management - the management of work - is a process control system. This work - writing software for money - is a process. And as a process it needs to be controlled. That control makes use of empirical data. Empirical process control systems have three major attributes.
This type of control is found everywhere, from your home thermostat, to the dynamic flight controller used to autonomously rendezvous and dock a vehicle with the International Space Station, to the corrective actions that keep a project on track toward a planned finish date. It's all the same principle.
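The closed-loop idea is the same at every scale. Here's a minimal sketch of a proportional feedback controller in Python - the names and numbers are invented for illustration, not taken from any specific system - showing how a corrective action computed from the measured error pulls a process back toward its setpoint:

```python
# A minimal closed-loop (feedback) control sketch.
# All names and numbers here are illustrative, not from a real system.

def proportional_controller(setpoint, measurement, gain=0.5):
    """Return a corrective action proportional to the error."""
    error = setpoint - measurement
    return gain * error

# Simulate a process pushed off course by a constant disturbance,
# corrected once per period - the thermostat / project-control pattern.
setpoint = 20.0   # where the plan says we should be
value = 10.0      # where we actually are
for period in range(10):
    correction = proportional_controller(setpoint, value)
    value += correction - 0.5   # the -0.5 models a constant disturbance
# value converges near the setpoint (with a small steady-state offset)
```

Open the loop - drop the `correction` term - and the disturbance drives the value steadily away from plan, which is the "watch it fly into the ditch" case below.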
So Back To Estimating and Forecasting
No matter how many times the #NoEstimates advocates make the unsubstantiated claim that Forecasting is not Estimating, it's not true.
Estimating can be based on empirical data, or theoretical models, or - better yet, from my accelerator days - theoretical models informed by empirical data. Forecasts are estimates of outcomes in the future.
But forecasts are estimates, period. Anyone claiming otherwise needs to come up with reference materials - from control systems books, financial modeling books, statistics books - that show Forecasts are not Estimates about future outcomes. Even one of the founders of eXtreme Programming makes the cockamamie claim that Forecasts are not estimates. Time to send them back to the high school math class.
Modern Control Systems, Richard Dorf. This is the 5th edition. Mine is the 1st edition, 1970, from the control systems course needed to write FORTRAN 77 code to control the sampling processes on the particle accelerator for our experiment. Google will find Modern Control Systems, Dorf, in PDF form if you want to explore further. In that book is the architecture of a general control system, which can be used to manage software development while spending other people's money, fly a spacecraft to Mars, or fly the Boeing 737 you're riding home on. Jeff Sutherland knows this from his days of driving around in an Air Force F-4 about the same time I was driving around in an Army CH-47, and from his further development of Scrum out of John Boyd's Air Force work on the OODA Loop. All project work is process control, all the time. There's no way out of it - unless your conjectured method is open loop, in which case you're not controlling anything, you're just watching it fly into the ditch or crash into the ground. Not usually what your customer or commanding officer would find desirable behaviour.
So it's time again to call bunk on this notion.
A popular mantra in Agile is deliver fast, deliver often. In some domains this may be applicable. Where we work and where agile is becoming the norm, we have another view.
Deliver as planned
The Plan for the delivery of value is shown in the Product Roadmap and implemented in the Cadence Release Plan, or sometimes the Capabilities Release Plan. Fast is a term replaced by Planned. The Plan is based on a Capabilities Based Plan, which shows the increasing maturity of the Capabilities produced by the project. These Capabilities are needed by the business to accomplish the business case or fulfill a Mission.
Showing up fast is defined by showing up when needed. The need is defined in the Capabilities Based Planning process.
A Capability is the ability to accomplish something of Value. Here's a sample of what a Capability sounds like.
These are mission and business case terms, defined by the owners of the mission or Business Case. If you show up Fast, that also means you can't show up early, for a simple reason - early means you may not be able to put that Value to work. It may mean that Value is not needed yet, and that Value may have to change by the time we're ready to use it. This is the role of the Plan.
In what order do we need which Capabilities - with all their associated technical and operational requirements fulfilled - to meet the needed cost, effectiveness, and performance requirements as well?
A final example - one of my favorites - is the notion of the intent of the commander as the basis for defining capabilities. I have a colleague who was General Schwarzkopf's logistics person in the first Gulf War. She was an Army Colonel and one of a small number of women combatants at the time. There are many more now, but she was a pioneer. One of the reasons the US Army was able to move up the coast prior to crossing into Iraq so rapidly was her and her staff's planning skills. The notion of agile is the basis of all military process, not just 5 coders in a room with their customer.
So this statement says it all in terms of needed Capabilities
So when you hear deliver early and deliver often, ask a simple question - what are we delivering? Is that deliverable arriving in the right order for the end user - the customer, the warfighter? Are there any predecessors to that deliverable that have to be in place for the FAST deliverable to be of any use?
This is the role of a Capabilities Based Plan. If your project has no interdependencies - if everything that is produced can be used as a standalone deliverable, arriving in any order - then Capabilities Based Planning is not likely to be of much value. And that's fine. But when we enter the Agile At Scale domain - ERP, Enterprise IT, Software Intensive System of Systems - we've got a separate issue. Order does matter. Fast is no longer of much value. As planned and as needed are the Critical Success Factors.
And a final thought
If you're going to Deliver Fast, do you have a Plan for how to do that? No? Then how in the world are you going to deliver fast if you don't know what you are going to deliver, when that delivery will be done, and how you are going to deliver that value? Without a plan of some sort, how can you assert that the naive notion of deliver fast and deliver often can ever be executed? It's just a platitude, without any actionable outcomes, unless there's a Plan for how to do it.
The post Top 5 Ways Agile Mitigates Risk is one of those posts that's not wrong in what it says at the detail level - more or less - but is not right in principle.
Agile Mitigates Risk - is not correct. Agile provides rapid identification of risk. That's all.
First let's start with risk, risk management, and risk management frameworks.
First, all risk comes from uncertainty.
Uncertainties are things we cannot be certain about. Uncertainty is created by our incomplete knowledge - not by our ignorance.
What is risk management?
Risk management is an endeavor that begins with requirements formulation and assessment, includes the planning and conducting of a technical risk reduction phase if needed, and strongly influences the structure of the development and test activities.
There are two types of uncertainty that create risks to an Agile project. Aleatory (irreducible) uncertainty creates risk from the naturally occurring random behaviours of the project - things like defects, productivity of the development staff, estimates of effort, random performance issues. Epistemic (reducible) uncertainty creates risk from event-based occurrences. These are probability based. The probability that we didn't order enough SAN for our first 3 months of operations and we'll run out of fast access storage. The probability that there will be 5 times the number of users logging in, and this will crater our current server configuration - remember the Affordable Care Act site's first months of operation? Or the probability that our test coverage is not sufficient for the needed reliability of our offering. These risks are risks to the success of the project. The blog's risks are mostly process failure risks.
There are several sources of risk from both Aleatory and Epistemic uncertainties.
The relationship between Uncertainty and Risk is:
This distinction is important for modeling the future performance of cost, schedule, and technical outcomes of a program. Both Aleatory and Epistemic uncertainty create risks.
And these unavoidable uncertainties create risks for software projects.
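One way to see how both kinds of uncertainty create risk to a completion date is a small Monte Carlo sketch. The distributions, probabilities, and impacts below are invented for illustration only:

```python
import random

random.seed(42)

def simulate_duration():
    """One trial of a hypothetical project's duration, in days."""
    # Aleatory (irreducible): five task durations that naturally vary.
    duration = sum(random.triangular(8, 15, 10) for _ in range(5))
    # Epistemic (reducible): an assumed 20% chance an event occurs
    # (e.g. the SAN shortfall), adding 20 days when it does.
    if random.random() < 0.20:
        duration += 20
    return duration

trials = sorted(simulate_duration() for _ in range(10_000))
p80 = trials[int(0.80 * len(trials))]   # an 80% confidence completion point
```

The point: both uncertainty types must be estimated - the range of natural variance and the probability and impact of the event - before any confidence can be attached to a date.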
So let's look at the suggestion from the blog that Agile Mitigates Risk.
The notion that most risk can be avoided is naive at best, along with the notion that Agile prevents risks from occurring. Neither is true.
Agile provides a good means of identifying risk. Agile is NOT a risk management process. Agile does not prevent risk - only margin or explicit risk reduction activities can mitigate risks. Risks cannot be prevented; they can only be mitigated, avoided, transferred, or ignored.
Posts like this are common. I'd suggest it's because the actual sources of information needed to identify, manage, and reduce risk on software projects are not being read by the developers applying agile. It's one of those repeated occurrences of using a word whose meaning is not known.
And of course one final thought
In order to handle risks - reducible and irreducible - through mitigating, transferring, ignoring, or accepting the risk, we MUST be able to estimate several things about the risk:
- The probability of occurrence of the reducible risk
- The range of naturally occurring variances for the irreducible risks
- The probability that the mitigation will be effective
- The probability of any residual risk post-mitigation
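Those four estimates combine into a simple expected-exposure calculation. A sketch with hypothetical numbers:

```python
# Hypothetical reducible (epistemic) risk with a planned mitigation.
# Every number below is an estimate - there is no way to skip making them.
p_occurrence = 0.30        # probability the risk occurs
impact_cost = 100_000      # impact in dollars if it occurs, unmitigated
p_mitigation_works = 0.80  # probability the mitigation is effective
residual_impact = 25_000   # impact remaining even when mitigated

# Expected exposure with no handling at all:
exposure_unmitigated = p_occurrence * impact_cost          # = 0.30 x $100k

# Expected exposure with the mitigation in place:
exposure_mitigated = p_occurrence * (
    p_mitigation_works * residual_impact
    + (1 - p_mitigation_works) * impact_cost
)
```

Whether the mitigation is worth buying is then a comparison of the exposure reduction (here about $18,000) against the mitigation's own cost - a decision that cannot be made without the estimates above.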
This is Risk Management. Agile is NOT risk management. Agile is one part of risk management - identification, potentially tracking, a contributor to control, and a contributor to communicating the risks.
A burst of Tweets claiming No Estimates fixes these problems came across Twitter this morning. I won't repeat who they are attributed to, to protect the guilty. But here's a core concept that is totally missing not only from the No Estimates conjecture, but from almost every discussion where a solution is proposed before the problem has been defined. Let's start with Dean Gano's introduction to Apollo Root Cause Analysis, a George Bernard Shaw quote so fitting to the discussion of ignoring the root cause and going straight for a solution to the symptom - in many cases an unnamed symptom.
Ignorance is a most wonderful thing.
It facilitates magic.
It allows the masses to be led.
It provides answers when there are none.
It allows happiness in the presence of danger.
All this while, the pursuit of knowledge can only destroy the illusion.
Is it any wonder mankind chooses ignorance?
~ George Bernard Shaw
So until the symptom is named - and the smell of dysfunction is not a symptom - and until the search for the root cause of that symptom is conducted properly - applying the 5 Whys to an unnamed symptom is not proper - the root cause will remain undetermined. Without discovering the root cause, there is no chance that any suggested process, method, or change in behavior will have any impact on the symptom, named or unnamed.
To see how the statement Estimating is the smell of Dysfunction is seriously flawed - and the approach of asking 5 Whys equally flawed - please read RealityCharting® Seven Steps to Effective Problem-Solving and Strategies for Personal Success, Dean L. Gano.
The Seven Steps are:
When one or all of these steps are missing, anyone conjecturing a solution - or worse, conjecturing we're just exploring for the solution - has NOT produced a solution. It's just unsubstantiated conjecture.
One of my favorite quotes when hearing unsubstantiated claims is this:
How many legs does a dog have if you call the tail a leg? Four. Calling a tail a leg doesn't make it a leg - Abraham Lincoln
Calling estimating the smell of Dysfunction doesn't make estimating the smell of dysfunction. You've only identified an unsubstantiated symptom. Until you have found the cause, you certainly can't suggest Not Estimating as the corrective action.
When we have willful ignorance of the Microeconomics of decision making, of managerial finance as a governance process for managing other people's money, and denial that the uncertainties of projects - aleatory and epistemic - require estimates of their impact, then we are no better than the people in George Bernard Shaw's quote above. And we are doomed to repeat the symptoms that result from ignoring these principles of managing in the presence of uncertainty.
Perhaps Mr. Elliott could provide answers to these questions to our clients, when he suggests that estimates are worthless.
Instead of just delivering demos, how about delivering capabilities in the order needed to earn the value in exchange for the cost to produce that value? And doing this according to the plan described in the Product Roadmap, using the Cadence or Capability Release Plan? That way those paying for the production of value can have confidence they'll be getting their investment returned as needed to fulfill the business plan or accomplish the mission.
A recent Twitter post started out with I predict my train will depart from platform 12 in 10 minutes: degree of predictability correlates with length of time. I then asked and what is the evidence on which you base this estimated time of departure? I got back I didn't estimate departure, there is a timetable there, but trains sometimes run late & platforms sometimes change.
But in fact - mathematical fact - there was an estimate made. The trains are sometimes late informs the probability of leaving as planned.
With the timetable there is a target departure time, but unless you're riding the S-Bahn from our Eching office to the downtown Munich office, as I did for a year, the train departure is approximate. The S-Bahn departure was "exactly" 8:04 - first because it was Germany in 1986, and second because the train was parked at the end of the line.
But no matter if the train is departing Eching station, a London station, or the station in Lower Downtown Denver, "margin" is needed for both you the traveler and the train. This "margin" protects against the aleatory uncertainties that exist in ALL systems, even trivial ones. The airlines bake this into their schedules. I fly a lot. Many times we arrive early to no gate and have to wait. Rarely in good weather do we arrive late.
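That margin can be sized from the observed variation itself. A sketch with an invented travel-time distribution - the lognormal shape and its parameters are assumptions for illustration only:

```python
import random
import statistics

random.seed(7)

# Hypothetical door-to-platform travel times in minutes, standing in
# for the naturally varying (aleatory) trip time.
samples = [random.lognormvariate(3.0, 0.25) for _ in range(10_000)]

typical = statistics.median(samples)              # the "usual" trip
p95 = sorted(samples)[int(0.95 * len(samples))]   # covers 95% of trips
margin = p95 - typical                            # schedule margin to add
```

Planning to the median means missing the train about half the time; adding the margin covers the natural variation 95% of the time. No event-based (epistemic) risk is handled here - that takes a separate process.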
When we do arrive late on Southwest Airlines, it's almost always due to some event-based uncertainty. These are epistemic uncertainties.
Both aleatory and epistemic uncertainties exist on projects and in real life. They are part of all life.
Aleatory uncertainty is handled by margin. Epistemic uncertainty is handled with risk buy-down processes. Here are the details on managing in the presence of uncertainty. Uncertainties that are ALWAYS present. Uncertainties that ALWAYS require making an estimate of the probability of occurrence (epistemic), the range of variance (aleatory), the probability of outcomes, the probability of the impact from those outcomes, and the probability of the residual uncertainty and associated risk when the initiating uncertainty is not 100% removed.
In other words, you can't make a decision in the presence of uncertainty without estimating all those variables - occurrence, outcome, impact, residual uncertainty. That's the way life - and projects - work. Saying decisions can be made without estimating - #NoEstimates - doesn't change the way nature and projects work, no matter how many times it is said. Especially when it is said without evidence of how to actually make those decisions without estimates.
It is common to use the phrase
If we feel pain X. Let's explore solutions? & possibly "Solutions A & B might be a starting point?"
This approach seeks a solution to the symptom, not a solution to the problem.
This is the false notion used by #NoEstimates advocates: the conjecture that estimates are the smell of dysfunction, without stating what the dysfunction is, followed by the statement that Not Estimating will fix it. So in the end nothing is fixed - the dysfunction is not identified, the Root Cause is not identified, and therefore it is not possible to claim Not Estimating will fix anything, for the simple fact that the problem has not been defined. It's a solution - a doubtful solution - looking for an undefined problem to solve. A lose-lose situation.
So don't fall for the fallacy of the smell, and most of all don't fall for the fallacy of we're just exploring and I have no recommended solutions.
The only way to provide an effective solution to any problem is to determine the Root Cause of that problem and confirm that the proposed solution will in fact prevent the problem from recurring. If you want to learn how to do this - and not follow the naive 5 Whys - read this. Then, if you're working in a domain that does Root Cause Analysis, buy the RealityCharting tool. It's saved us from catastrophe several times.
Neil Killick posted a good question: what's the common ground for talking about estimates?
In this post I'd suggest predictability is very difficult to achieve in the presence of uncertainty. And all projects operate in the presence of uncertainty. All estimates have two attributes - accuracy and precision. The values of these two attributes are what those needing the estimates are after. With knowledge of these two values, the decision makers can assess the "value" of the estimate.
It may well be all they need is an order of magnitude (10X). "How much does it cost to install 50 sites of SAP, with 100 users at each site?" - a question asked of our firm in the past. 100M, 200M, 500M? Or something more accurate and precise: "Are these two Features, 2015007 and 2015008, going to make it into the next Cadence release, scheduled for the end of November?"
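Accuracy and precision are distinct attributes - an estimate can be precisely wrong or roughly right. A sketch with invented numbers:

```python
import statistics

ACTUAL_COST = 200.0   # $M - hypothetical actual outcome of the program

# Three estimating rounds ($M) from two hypothetical teams:
team_a = [120.0, 121.0, 119.0]   # tight spread, far from the actual
team_b = [180.0, 230.0, 195.0]   # wide spread, centered near the actual

def accuracy_error(estimates):
    """How far the mean estimate lands from the actual outcome."""
    return abs(statistics.mean(estimates) - ACTUAL_COST)

def precision_spread(estimates):
    """How tightly the estimates cluster around their own mean."""
    return statistics.stdev(estimates)

# Team A is precise (spread 1.0) but inaccurate (off by 80).
# Team B is imprecise (spread ~26) but accurate (off by ~1.7).
```

Which attribute matters depends on the decision at hand: a 10X sanity check tolerates Team B's spread; a commitment to a November Cadence release does not tolerate Team A's bias.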
Predictable is not actually an attribute without knowing the "desired" accuracy and precision. That request comes from those asking for the estimate. In our domain that starts with a Broad Area Announcement, which provides a "sanity check" for those asking for the solution - to test the bounds of the budget they may need to acquire the capabilities from the project.
Regarding the dysfunctions of business that are connected to estimates - there are many, in our domain and most other domains. But that dysfunction is not "caused" by the estimate. It may be a "symptom" of poor estimates, but it is not the "cause." Without first identifying the Root Cause of the symptom, no suggestion for improvement will be effective. Root Cause Analysis is the key here. It's been suggested Estimates are the smell of dysfunction, but no root cause is defined, nor any Corrective Actions - just the conjecture of a symptom. This is bad root cause analysis. No matter how many times the 5 Whys are suggested, it's still bad RCA. So if we want to actually find the root cause of the dysfunction, it's going to have to follow a process.
Finally, without a context for asking for the estimate and providing the estimate, the needed precision and accuracy cannot be determined. Here's a paradigm for agile project management, where we can separate the domains and ask when it is appropriate to estimate and when it is not needed.
My suggestion for a common ground is to establish
Then those exchanging ideas about the need for estimates can have a common set of principles for the exchange. At this point there is no common set of principles. I'd suggest that the #NoEstimates advocates have not provided the principles by which decisions can be made without estimating the impact of those decisions. And until there are principles from the #NoEstimates advocates, it's going to be hard to actually have that conversation.
Two Critical Videos for Agile Transformation. Here's Part 1
Here's Part 2
So when we hear estimates are the smell of Dysfunction - what are the actionable outcomes needed to address these dysfunctions? To date there have been ZERO suggestions. In fact, one of the originators claims he's not going to provide any - we're just exploring.
For anyone accountable for delivering Value in exchange for the Cost of that Value, we're just exploring is the very definition of Muda (waste) for the business.
Had a Twitter conversation today about deadlines and how to remove the stigma of a deadline and replace it with a more Politically Correct term. Like most Twitter conversations there is a kernel of truth in there somewhere that sparks a thought that turns into a Blog post. This was one of those.
In our Software Intensive System of Systems world, we are not 5 people sitting around the table with the customer building a warehouse management application for our privately held gadget-making company. We work on large-scale, mission critical systems - critical to the Enterprise, or critical to the sovereign funding an Enterprise application, building a weapon, or accomplishing something that you'll read about in the paper. This is not to say those 5 developers sitting around the table with the Product Owner and the Scrum Master are not working on vitally important code. But Agile at Scale has a different paradigm than Agile at the Table.
One critical paradigm is the Product Roadmap and the resulting Release Plan. Release Plans come in two flavors.
Capability Releases have variable delivery dates for the capabilities - Cadence Releases have fixed delivery dates for the Capabilities
What's the Schedule For Getting the Products in the Product Release Plan?
Customers have a fiduciary obligation to have some sense of when the work they are paying for will be completed, how much it will cost, and some assurance that what they are paying for will deliver the needed capabilities in exchange for that investment.
This is the basis of managerial finance and decision making in the presence of uncertainty - Microeconomics of Decision Making.
In both the Cadence and Capabilities Release Plans, the Product Roadmap speaks to what is planned to be released. The Release Plan is built during the Release Management process, which plans, manages, and governs the release of products resulting from the development effort. The owners of this governance process have the decision rights to determine what gets released.
The Capabilities are laid out in Cadence Releases in the chart below. The Sprints contain the Stories that implement the Features that produce the Capabilities, delivered on regular periods of performance.
With the Product Backlog, the Product Roadmap, and the Cadence or Capabilities release plan, we've got all we need to define what Done looks like.
By Done I mean the Capabilities needed to fulfill the business case or accomplish the mission delivered by the project. It doesn't mean requirements, it doesn't mean tasks, it doesn't mean the details of the work. But if you don't know what Done looks like in units of measure meaningful to the decision makers, the only other measure is we ran out of time and money. This has been shown through extensive research to be in the Top 5 Root Causes of project failure. The other 4 include Bad Estimates for the cost and schedule to reach Done.
And by the way, the term Done here is NOT the Definition of Done in Scrum. That's a term used to state what processes have been applied to the work - a list of criteria which must be met before a product increment is considered complete. It's a developer's definition of Done. A critical activity for sure, but still far removed from the Mission Accomplished definition of Done. Both are needed, both are Critical Success Factors, but at the business management level the developer's DoD is just the start of recognizing that the project has fulfilled the business case or accomplished the mission.
Done Done and Contrastive Reduplication
While I was the VP of Program Management at a nuclear weapons cleanup program, one of the Project Managers working for me introduced an idea. When we talked about being Done, he chimed in and said, no Glen, that's not the question. We need to know when we are Done Done. The term Done Done is Contrastive Focus Reduplication, and it is very useful in the Agile world, where the notion of the Definition of Done is NOT based on a formal technical specification, but on customer approval that the results will deliver the needed Capabilities.
Risk management is about making informed decisions by consciously assessing what can go wrong, as well as the likelihood and severity of the impact when that possibility comes true. The heart of Risk Management is making informed decisions in the presence of the uncertainties that create risk.
Managing in the presence of these uncertainties involves the evaluation of the trade-offs associated with risk mitigation in terms of their costs, benefits, and risks, and the evaluation of the impact of current decisions on future options. This process of risk management embodies the identification, analysis, planning, tracking, controlling, and communication of risk.
The sources of uncertainty for all domains, project or not, come in two forms:
A straightforward process for Managing Risk looks like this. There may be others, but whatever process is suggested, it needs to address the six areas of Risk Management in the upper left corner of this diagram and the data model shown here.
The development and deployment of software intensive systems continues to suffer large cost overruns, schedule delays, and poor technical performance. This should be of no surprise to anyone working in the software business.
Research shows these failures result from failing to deal appropriately with the uncertainties in the development of complex, software-intensive and software-dependent systems. Many development communities lack a systematic way of identifying, communicating, and resolving the technical uncertainties that are present on all projects. Often the focus is on the symptoms of cost overruns and schedule delays rather than on the root causes in product development. Simply asking the 5 Whys is never sufficient to manage in the presence of risk. The cause and effect processes must be understood, and corrective actions produced to fix the cause rather than treat the symptom. This is the basis of Root Cause Analysis. So when you hear something (like estimating) is the Smell of Dysfunction, you'll know that can't possibly be true, and it's not made true by asking why without a process to find and fix the Root Cause.
When we hear #NoEstimates is Risk Management, that's not only not true, it can't possibly be true.
By Progress to Plan I mean actually Progress to Plan. The Plan is the Product Roadmap for the delivery of the needed Capabilities the customer is paying for. These capabilities need to appear at the needed time for the business strategy to be fulfilled. In some parts of our work the business strategy is replaced by a Mission. But the needed Capabilities at the needed time is still the same.
If those Capabilities don't appear at the needed time, the simple Return on Investment calculation is disrupted, since the time cost of money is ALWAYS at play in any business.
Now for the Punch Line
If we are managing in the presence of uncertainty, using time cost of money as one (just one, more are below) of the measures of our business or mission success, we need to make decisions in the presence of uncertainty to increase the probability of success.
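The time cost of money makes late delivery a measurable loss, not a matter of opinion. A sketch with hypothetical cash flows and an assumed 10% discount rate:

```python
def npv(cash_flows, annual_rate=0.10):
    """Net present value of yearly cash flows ($M), first flow at year 0."""
    return sum(cf / (1 + annual_rate) ** year
               for year, cf in enumerate(cash_flows))

# Hypothetical: invest $10M now; the capability earns $4M/yr for 4 years.
on_time = npv([-10, 4, 4, 4, 4])            # about $2.68M
# Same capability delivered one year late: revenue shifts right a year.
one_year_late = npv([-10, 0, 4, 4, 4, 4])   # about $1.53M
```

A one-year slip erases over $1.1M of value even though the same software eventually ships - which is why Capabilities must show up at the needed time, not just eventually.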
But first, a short interlude on Project Success. It is popular in the Agile and #NoEstimates community to conjecture that project success is not about on-time, on-budget, or performance. First, if you show up late and over budget with your working software, you need to ask those paying you if they consider that a success.
So now what does project success mean?
Success means different things to different people. Projects are conceived with a business perspective in mind. The goal of the project is usually focused on producing a better result or increasing organizational performance—more profits, additional growth, and improved market position. Otherwise why are we funding this work? 
There are four dimensions of project success.  Each of these dimensions operates in the presence of uncertainty.
These four dimensions can be applied across a spectrum of projects and tailored to fit the needs of those projects. Here's one approach to identifying projects across that spectrum.
You can't manage risk without estimating. And as we know, Risk Management is how Adults Manage Projects - Tim Lister. If you're saying you're managing risk and you're not estimating, then you're not managing risk. And you're not behaving like an adult with other people's money.
Project Success: A Multidimensional Strategic Concept, Aaron J. Shenhar, Dov Dvir, Ofer Levy, and Alan C. Maltz, Long Range Planning 34 (2001) 699–725