using Principles, Practices, and Processes to Increase Probability of Success
It's been a busy month for reading. I've been on the road, so I try to focus on reading rather than working while on the plane. Here are three books underway that are related to the programs we work on.
This book contains processes for improving the performance of Scrum teams when they are distributed.
Two of my clients are in this situation, mainly because the cost of living near the office is prohibitive and travel distances in Metro DC are among the worst anywhere.
The book shows how to develop User Stories with a distributed team, engage in effective release planning, manage cultural and language differences, resolve dependencies, and use remote software processes.
It seems many of the debates over ideas we get into are based on logical fallacies.
Here's a nice book on how this happens and how to address the issues when it comes up.
I've saved the best for last.
This is a MUST READ book for anyone working with agile or thinking about it.
With the Logically Fallacious book in hand, Agile! can be read in parallel.
There is so much crap out there around Agile, this book is mandatory reading.
From the nonsense of #NoEstimates to simply bad advice, Bertrand calls it out, along with all the good things about agile.
I work in a domain where engineered systems are developed for complex, software-intensive systems of systems. These systems are engineered with a variety of development methods, ranging from traditional to agile and combinations in between. They are developed for external and internal customers. Some are products in their own right; some are embedded in larger missions.
In all cases, we start our work with ...
Risk Management is How Adults Manage Projects - Tim Lister
Since all risk comes from uncertainty - reducible (epistemic) and irreducible (aleatory) - estimating is a foundation of all we do. There is no room here for the conjecture that estimates are a waste, that estimates can't be done, that estimates are evil and must be stopped immediately. That would be like saying risk management is a waste, control system engineering is a waste, thermal analysis of the computer system is a waste, and assessment of reliability, repairability, survivability - all the ...illities - is a waste.
In our domain of engineered systems there is a broad range of problems, complex issues, and approaches to solving them. Here is a familiar example of that range...
This book provides a clear, complete understanding of how to estimate software costs, schedule, and quality using real-world information.
This includes planning for and executing the project with phase- and activity-level cost estimates. How to estimate regression, component, integration, and stress tests for the software. How to compensate for inaccuracies in data collection, calculation, and analysis. How to test the design principles and operational characteristics of the product using prototyping. How to handle configuration change, research, quality, and documentation costs.
Software projects are often late and over budget leading to major problems for customers. There is a serious issue in estimating realistic software project budgets and schedules. Generic models cannot be a reliable source of estimating for complex software projects.
This book presents a number of examples using data collected over the years from various organizations that build software. It presents an overview of the International Software Benchmarking Standards Group, which collects data on software projects. This data collection is based on ISO standards for measuring the functional size of software.
Dr. Abran shows how to build estimation models from an organization's own data using statistically sound processes, and how to focus on the quality of the estimation models.
Often referred to as a black art because of its complexity and uncertainty, software estimation is not as difficult or puzzling as people think.
Generating accurate and precise estimates is straightforward.
This book shows how to estimate schedule and cost and the functionality that can be delivered within a time frame. How to avoid common estimating mistakes. The estimation techniques needed for specific project activities. How to apply estimation techniques to any type of project - small to large.
Many software projects fail because their leaders don't know how to estimate, schedule, or measure them accurately. Proven tools and techniques exist for every facet of software estimation. This book brings them together in a real-world guidebook to help software managers, engineers, and customers immediately improve their estimates - and drive continuous improvement over time.
Software engineering has become procedural and controlled. Agile, like more traditional development methods, is a highly procedural process.
The estimating of the development process still needs maturing. This book provides a concise guide for estimating software development efforts. It shows why accurate estimates are needed, what different estimating methods can be used, and how to analyze risks to make appropriate contingency allowances for the uncertainties encountered on all projects, not just software development projects.
This book is a practical guide to estimating and planning agile projects.
The book speaks to why conventional planning fails and why agile planning works. How to estimate feature size using story points and ideal days and how to use each measure to make decisions. How and when to reestimate. How to prioritize features using financial and technical approaches. How to split large features into smaller features. How to plan iterations and predict the team's initial rate of progress. How to schedule projects that have unusually high uncertainty or schedule-related risk. How to estimate projects that will be worked by multiple teams.
These books are just a small sample of the resources available for estimating. When you hear someone say it's too hard, can't be done, never seen it done, it's a waste, or a variety of other reasons - ask: have you read any of these books and found them wanting for your needs?
When Agile development moves beyond a single Scrum team and into the Enterprise IT domain, several considerations must be addressed. For Scrum (and most other agile methods), focus on the team is paramount. The behaviours of the team should follow Katzenbach's definition of a team as...
...a small group of people with complementary skills who are committed to a common purpose, performance goals and approach for which they are mutually accountable - Katzenbach, J. R. and Smith, D.K. (1993), The Wisdom of Teams: Creating the High-performance Organisation, Harvard Business School, Boston
This definition for single Scrum teams needs to come in contact with corporate governance, managerial finance, and decision making in the presence of uncertainty when spending the firm's money - spending that has impacts beyond the team and its natural desire to control its own destiny. Governance is about decision rights. IT Governance: How Top Performers Manage Decision Rights for Superior Results is a place to start.
In a recent exchange on Twitter, it was mentioned in response to a post of mine that some teams consider the corporate money their own and imagine a world where ownership has faith and trust in the workers to spend the money appropriately. This seeming lack of trust in the team's ability to consider external governance as part of their behaviour ...speaks to a lack of trust that the people they have hired are capable of making proper decisions.
Governance is the basis of business management in some form for all organisations beyond that small group of individuals Katzenbach speaks of. The title of this Blog - Herding Cats - speaks to the issue of organisational governance. Herding cats is "an idiomatic saying that refers to an attempt to control or organize a class of entities which are uncontrollable or chaotic." (Warren Bennis, Managing People is Like Herding Cats (1997)).
When Agile encounters governance, here are some thoughts from a book chapter on how to assure the benefits of both paradigms are delivered.
“Would you tell me, please, which way I ought to go from here?”
“That depends a good deal on where you want to get to,” said the Cat.
“I don’t much care where–” said Alice.
“Then it doesn’t matter which way you go,” said the Cat.
“–so long as I get SOMEWHERE,” Alice added as an explanation.
“Oh, you’re sure to do that,” said the Cat, “if you only walk long enough.”
(Alice’s Adventures in Wonderland, Chapter 6)
When we hear
The idea behind the #NoEstimates approach to software development isn't to eliminate estimates but, rather, to explore other ways to solve problems without specifically asking, 'How long will it take?'
We first need to ask: by what principle of decision making in the presence of uncertainty does a business project have no interest in how long it will take? The answer seems to be a de minimis project.
Because in the real world - not Wonderland with unicorns - those paying have some sense of where they want to go, how much they are willing to pay, how long they are willing to wait to get there, and what value will be produced by their investment. De minimis projects have no concern for those answers.
And those spending that customer's money appear not very interested in applying known solutions, instead answering with the Cheshire Cat's words, since the No Estimates advocates appear to live in Wonderland.
For those interested in learning how to produce credible estimates and make decisions in the presence of uncertainty, here are some starting places that have served me well:
There are many dozens of other books, and hundreds of papers, describing how to make estimates in the presence of uncertainty.
After you do some reading and you hear someone say estimates are hard, estimates are guesses, estimates are always wrong, estimates are a waste, we can decide without estimates - you'll know they didn't read any of these, haven't a clue what they're talking about, and are just making things up as they go. Just like the Cat.
As readers of this blog know, managing in the presence of uncertainty is how adults manage projects. This is called Risk Management.
And as always in order to make decisions in the presence of the uncertainties that create risk, we need to make estimates.
No estimates? No decisions based on probabilistic choices. No probabilistic choices? No understanding of the resulting risks (both epistemic and aleatory).
No understanding of the probabilistic outcomes and the statistical variances? No adult management of risk. (Remember Tim Lister's quote - Risk Management is How Adults Manage Projects.)
So it's this simple: no estimates of the outcome, no risk management. No risk management, no adult management. No estimates means no adult management of the naturally occurring and probabilistic risk on all project work.
Here's another Tim Lister presentation showing how estimates and estimating are part of all decision making when spending other people's money.
I've started reading Vasco's book #NoEstimates and will write a detailed deconstruction. I got the Kindle version, so I have a $10 investment at risk. Let's start with some graphs that have been around for a while and the misinformation they carry, which forms the basis of the book.
The Chaos Report graph is the first one. This graph is from old 2004 numbers - 12-year-old numbers. Many times the book uses 12-, 16-, even 25-year-old reports as the basis of the suggestion that Not Estimating fixes the problems in the reports. The Chaos reports have been thoroughly debunked as self-selected samples using uncalibrated surveys for the units of measure of project failure. Here are a few comments on the Standish reporting process. But first remember: Standish does not say what the units of measure are for Success, Challenged, or Failure. Without the units of measure, the actual statistics of the projects, and the statistical ranges of those projects for each of the three categories, the numbers are essentially bogus. Good fodder for selling consulting services or for use by those with an idea to sell, but worthless for decision making about the root cause of Failure, Challenged, or even Success. Any undergraduate design-of-experiments class would require all that information to be made public.
So the first thing to read when you encounter data like this is Project Success: A Multidimensional Strategic Concept, by Aaron J. Shenhar, Dov Dvir, Ofer Levy, and Alan C. Maltz. Only then start to assess the numbers. Most likely, like the numbers in the book, they're not credible enough to support the hypothesis - and by the way, there is no stated hypothesis that you can make decisions in the presence of uncertainty without estimating.
So let's look further at the difficulties with Standish and why NOT to use it as the basis of a conjecture.
A simple Google search would have found all this research and many, many more. I get the sense V didn't do his homework. The bibliography has very few references to actual estimating - no actual estimating books, papers, or research sites. Just personal anecdotes from a set of experiences as a developer.
The Standish Report failure mode is described in Darrell Huff's How to Lie With Statistics - self-selecting the samples from the survey. Standish does not provide any population statistics for their survey.
None of these questions are answered in the Standish reports. No Estimates picks these serious statistical sampling errors up and uses them as the basis of the pure conjecture that Not Estimating is going to fix the problems. This would garner a high-school D in the stats class.
Next comes a chart that makes a similar error. This reference is attributed to Steve McConnell's book, but is actually from another source. The No Estimates book does a poor job of keeping its references straight; it commonly misattributes a report, a graph, even a phrase. The book needs a professional editor.
The graph below is used to show that estimates are usually wrong. But there is a critical misunderstanding about the data.
I'm in the early parts of the book and already have a half dozen pages of notes on fallacies, incorrect principles, 30-year-old references, and other serious misunderstandings of how decisions are made in the presence of uncertainty. My short assessment is:
It's a concept built on sand.
We all know estimates are hard. But there are lots of hard things in the development of enterprise software. We wouldn't whine about how hard it is to construct a good First Normal Form database schema, or to bulletproof our cyber security front end from attack by the Chinese, would we?
So why is estimating a topic that seems to be the whipping boy for software developers these days?
My first inclination is that estimating is not taught very well in the software arts. In engineering schools it is - estimating is part of all engineering disciplines. My undergraduate degree and one graduate degree are in physics, and estimating is at the very heart of that discipline. A second graduate degree is in Systems Management - a combination of systems engineering and managerial finance - how to manage the technical processes of engineering programs with the principles of managerial finance, contract law, and probabilistic decision making.
This book comes with a spreadsheet for making the estimates needed to increase the probability of project success. It opens with an important quote that should be a poster on the wall of any shop spending other people's money:
For which of you, intending to build a tower, sitteth not down first, and counteth the cost, whether he have sufficient to finish it? Lest haply, after he hath laid the foundation, and is not able to finish it, all that behold it begin to mock him, saying, This man began to build, and was not able to finish. - Luke 14:28-30
To predict the future of a software project with acceptable accuracy and precision, it is necessary to measure past projects and keep track of current and ongoing projects. Estimation and measurement are closely aligned, and good historical data is of great value in estimating the outcomes of future software projects. - Opening of Chapter 2 of the book to the left.
Software development, using other people's money in the presence of uncertainty, is a microeconomics paradigm. What choices are needed to assure project success and confirm that the funding invested produces the planned return on that spend? What choices are best for the current understanding of the uncertainties - both reducible and irreducible - faced by the project?
To suggest decisions can be made without estimating the outcome of those choices is to willfully ignore the foundational principles of microeconomics and the managerial finance of software development projects.
Those conjecturing that decisions can be made without making estimates are participants in this willful ignorance.
I've started writing more book reviews for ACM Computing Reviews this year. I'll start putting references to the reviews here. The current review is for
I started my career in Fortran, doing signal processing in graduate school for Mie scattering and processing particle accelerator pictures from digitized film. This is a heavy-duty book for science and engineering developers.
Education is not the learning of facts, but the training of the mind to think - Albert Einstein
So if we're going to learn how to think about managing the spending of other people's money in the presence of uncertainty, we need some basis of education.
Uncertainty is a fundamental and unavoidable feature of daily life - personal life and the life of projects. To deal with this uncertainty intelligently, we must represent and reason about these uncertainties. There are formal ways of reasoning (logical systems for reasoning found in the formal logic and artificial intelligence domains) and informal ways of reasoning (based on the probability and statistics of cost, schedule, and technical performance in the systems engineering domain).
If Twitter, LinkedIn, and other forum conversations have taught me anything, it's that many participants base their discussion on personal experience and opinion. Experience informs opinion. That experience may be based on gut feel learned from the school of hard knocks. But there are other ways to learn as well. Ways to guide your experience and inform your opinion. Ways based on education and frameworks for thinking about solutions to complex problems.
Samuel Johnson has served me well with his quote...
Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information upon it.
Hopefully the knowledge we know ourselves has some basis in fact, theory, and practice, vetted by someone outside ourselves - someone beyond our personal anecdotal experience.
Here's my list of essential readings that form the basis of my understanding, opinions, principles, practices, and processes as they are applied in the domains I work in - Enterprise IT, defense, and space, and their software-intensive systems.
So In The End
This list is the tip of the iceberg for access to the knowledge needed to manage in the presence of uncertainty while spending other people's money.
I had the pleasure of meeting Mark Kennaley a week ago in Boulder when he attended the Denver PMI conference. I have talked with Mark on several occasions on social media about the SDLC 3.0 book and its concepts. And now there is his Value Stream: Generally Accepted Practice in Enterprise Software Development, a continuation of the first book, focused not just on the development life cycle but on the entire enterprise process.
I say all this for a simple reason. Mark's book is unique in that from the first page it resonated with the ideas I hold dear. It is not only well written, it contains powerful ideas that need to be read by anyone in the enterprise IT business. Mark signed the title page with a phrase that reflects my feelings on many modern topics in project management and software development.
This pretty much sums up my world view for those suggesting solutions to complex problems can be had with simple - and many times simple-minded - ideas. Mostly ideas borrowed from quotes about completely different domains, or recycled words like Systems and Systems Engineering turned into psycho-babble terms that cannot be tested in practice.
I'm going to write a review of Mark's book chapter by chapter. I'm on chapter 3. So far it's a breathtaking read - mandatory for anyone claiming to work in the Enterprise IT world and be accountable for the spend of other people's money.
I met Ron at an Agile Development Conference long ago, where I was speaking about agile in government contracting. Ron also spoke at a local ACM meeting that included many software engineering staff from US West, now CenturyLink.
It was interesting to observe the interactions between eXtreme Programming and RBOC software engineers.
I've just started this book, but thought it would be a good read just to see how far the eXtreme Programming paradigm has come. I expect I'll get something out of both books. I'll do the same chapter-by-chapter review here as I'll do for Mark's book.
Making decisions in the presence of this uncertainty is part of our job as project managers, engineers, and developers on behalf of those paying for our work.
It's also the job of the business, whose money is being spent on the projects to produce tangible value in exchange for that money.
From the introduction of the book to the left...
Science and engineering, our modern ways of understanding and altering the world, are said to be about accuracy and precision. Yet we best master the complexity of our world by cultivating insight rather than precision. We need insight because our minds are but a small part of the world. An insight unifies fragments of knowledge into a compact picture that fits in our minds. But precision can overflow our mental registers, washing away the understanding brought by insight. This book shows you how to build insight and understanding first, so that you do not drown in complexity.
So what does this mean for our project world?
In both these conditions we need to get organized in order to address the underlying uncertainties. We need to put structure in place in some manner. Decomposing the work is a common way in the project domain. From a Work Breakdown Structure to simple sticky notes on the wall, breaking problems down into smaller parts is a known successful way to address a problem.
With this decomposition, now comes the hard part. Making decisions in the presence of this uncertainty.
Reasoning about things that are uncertain is done with probability and statistics. Probability is a degree of belief.
I believe we have an 80% probability of completing on or before the due date for the migration of SQL Server 2008 to SQL Server 2012.
Why do we have this belief? Is it based on our knowledge from past experience? Is this knowledge sufficient to establish that 80% confidence?
The answers to each of these informs our belief.
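That degree of belief can be grounded by simulating the work from past-performance ranges. Here is a minimal sketch, assuming three hypothetical migration activities, each with a triangular (best, most likely, worst) duration estimate - all numbers are illustrative, not from any real project:

```python
# Monte Carlo sketch: turning past-performance ranges into a degree of belief.
# The activity durations and due date below are hypothetical assumptions.
import random

random.seed(1)

# Each activity: (best, most likely, worst) duration in days.
activities = [(5, 8, 14), (10, 15, 25), (3, 4, 8)]
due_date = 32  # days available for the whole migration

trials = 10_000
on_time = 0
for _ in range(trials):
    # random.triangular takes (low, high, mode)
    total = sum(random.triangular(low, high, mode) for low, mode, high in activities)
    if total <= due_date:
        on_time += 1

confidence = on_time / trials
print(f"P(complete on or before day {due_date}) = {confidence:.0%}")
```

If the simulated confidence comes out near 80%, the belief has a defensible basis; if not, the belief is just a hope.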
Chaos, Complexity, Complex, Structured?
A well known agile thought leader made a statement today
I support total chaos in every domain
This is unlikely to result in sound business decisions in the presence of uncertainty. Although there may be domains where chaos might produce usable results, when we need some degree of confidence that the money being spent will produce the needed capabilities, on or before the need date, at or below the budget needed to be profitable, and with the collection of all the capabilities needed to accomplish the mission or meet the business case, we're going to need to know how to manage our work to achieve those outcomes.
So let's assume - with a high degree of confidence - that we need to manage in the presence of uncertainty but have little interest in encouraging chaos. Here's one approach.
So In The End
Since all the world is a set of statistical processes, producing probabilistic outcomes, which in turn create risk to any expected results when not addressed properly, the notion that decisions can be made in this condition without estimates can only be explained by willful ignorance of the basic facts of the physics of project work.
Microeconomics is a branch of economics that studies the behavior of individuals and small impacting organizations in making decisions on the allocation of limited resources.
All engineering is constrained optimization: how do we take the resources we've been given and deliver the best outcomes? That's what microeconomics is. Unlike the models of mechanical engineering or classical physics, the models of microeconomics are never precise. They are probabilistic, driven by the underlying statistical processes of the two primary actors - suppliers and consumers.
Let's look at both in light of the allocation of limited resources paradigm.
In both cases time, money, and capacity for productive value are limited (scarce) and compete with each other and with the needs of both the supplier and the consumer. In addition, since the elasticity of labor costs is limited by the market, we can't simply buy cheaper to make up for time and capacity. It's done, of course, but always to the detriment of quality and actual productivity.
So cost is inelastic, time is inelastic, capacity for work is inelastic, and other attributes of the developed product are constrained. The market need is likewise constrained. Business needs are rarely elastic - oh, we really didn't need to pay people in the timekeeping system; let's just collect the time sheets and we'll run payroll when that feature gets implemented.
Enough Knowing, Let's Have Some Doing
With the principles of microeconomics applied to software development, there is one KILLER issue that, if willfully ignored, ends the conversation for any business person trying to operate in the presence of limited resources - time, money, capacity for work.
The decisions being made about these limited resources are being made in the presence of uncertainty. This uncertainty - as mentioned - is based on random processes. Random processes produce imprecise data: data drawn from random variables. Random variables with variances and instability - non-linear stochastic processes.
Quick Diversion Into Random Variables
There are many mathematical definitions of random variables, but for this post let's use a simple one.
A simple example - silly but illustrative - would be HR wanting to buy special shoes for the development team, with the company logo on them. If we could not for some reason (it doesn't matter why) measure the shoe size of all the males on our project, we could estimate how many shoes of what size would be needed from the statistical distribution of male shoe sizes for a large population of male coders.
This would get us close to how many shoes of what size we need to order. This is a notional example, so please don't place an order for actual shoes. But the underlying probability distribution of the values the random variable can take on can tell us about the people working on the project.
Since all the variables on any project are random variables, we can't know the exact value of them at any one time. But we can know about their possible ranges and the probabilities of any specific value when asked to produce that value for making a decision.
The variability of the population values and its analysis should be seen not as a way of making precise predictions about project outcomes, but as a way of ensuring that all relevant outcomes produced by these variables have been considered, that they have been evaluated appropriately, and that we have a reasonable sense of what will happen for the multitude of values produced by a specific variable. It provides a way of structuring our thinking about the problem.
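The shoe-order example can be sketched in a few lines. This is a notional version, assuming - for illustration only - that male shoe sizes follow a normal distribution with mean 10.5 and standard deviation 1.5 (these parameters are invented, not survey data):

```python
# Notional sketch of ordering shoes from a population distribution rather
# than measuring every individual. Distribution parameters are assumptions.
from statistics import NormalDist

team_size = 40
sizes = NormalDist(mu=10.5, sigma=1.5)  # assumed shoe-size distribution

# Expected count of team members falling into each whole-size bin.
for size in range(7, 15):
    p = sizes.cdf(size + 0.5) - sizes.cdf(size - 0.5)
    print(f"size {size:2d}: order ~{round(team_size * p)} pairs (p = {p:.1%})")
```

The point is not the exact counts - it's that the distribution of the random variable lets us act sensibly without knowing each individual value.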
Making Decisions In The Presence of Random Variables
To make a decision - a choice among several options - means making an opportunity cost decision based on random data. And if there is only one option, then the choice is either take it or don't.
This means the factors that go into that decision are themselves random variables. Labor, productivity, defects, capacity, quality, usability, functionality, produced business capability, time - each is a random variable, interacting in nonlinear ways with the other random variables.
To make a choice in the presence of this paradigm, we must make estimates not only of the behavior of the variables, but also of the behavior of the outcomes.
In other words
To develop software in the presence of limited resources driven by uncertain processes for each resource (time, money, capacity, technical outcomes), we must ESTIMATE the behaviors of these variables that inform our decision.
It's that simple and it's that complex. Anyone conjecturing decisions can be made in the absence of estimates of the future outcomes of that decision is willfully ignoring the Microeconomics of business decision making in the software development domain.
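A hedged sketch of what "estimating the behaviors of these variables" can look like in practice: an opportunity-cost comparison of two delivery options whose cost and value are random variables. All distribution parameters below are assumptions for illustration, not real project data.

```python
# Opportunity-cost sketch: choosing between two options when cost and value
# are random variables described by estimated triangular distributions.
# All parameters ($K) are illustrative assumptions.
import random

random.seed(2)

def expected_net(cost_range, value_range, trials=10_000):
    """Mean net value when both cost and value are random variables."""
    net = 0.0
    for _ in range(trials):
        cost = random.triangular(*cost_range)    # (low, high, mode)
        value = random.triangular(*value_range)
        net += value - cost
    return net / trials

option_a = expected_net(cost_range=(80, 140, 100), value_range=(120, 200, 150))
option_b = expected_net(cost_range=(50, 90, 60), value_range=(70, 160, 90))

print(f"Expected net value: A = ${option_a:.0f}K, B = ${option_b:.0f}K")
```

Without estimates of those distributions there is nothing to compare - which is the point of the paragraph above.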
For those interested in further exploring the core principles of the software development business beyond this willful ignorance, here's a starting point.
These are the tip of the big pile of books, papers, journal articles on estimating software systems.
A Final Thought on Empirical Data
Making choices in the presence of uncertainty can be informed by several means:
This is empirical data. But there are several critically important questions that must be answered if we are not going to be disappointed with our empirical data outcomes:
Calculating the number of samples needed for a specific level of confidence requires some statistics, but here's a place to start. Suffice it to say, those conjecturing estimates based on past performance (the number of story points in the past) will need to produce the confidence calculation before any non-trivial decisions are made on their data. Without those calculations, the use of past performance will be very sporty when spending other people's money.
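As a starting point for that confidence calculation, here is a minimal sketch of the standard sample-size formula for estimating a mean to within a chosen margin of error. The velocity standard deviation and margin used below are illustrative assumptions:

```python
# Sketch: how many past samples (e.g., sprint velocities) are needed before
# a mean is trustworthy at a given confidence? Parameters are assumptions.
import math
from statistics import NormalDist

def samples_needed(sigma, margin, confidence=0.95):
    """Samples needed to estimate a mean to within +/-margin at the given confidence."""
    z = NormalDist().inv_cdf((1 + confidence) / 2)  # two-sided critical value
    return math.ceil((z * sigma / margin) ** 2)

# Suppose past sprint velocities have an (assumed) standard deviation of
# 8 story points, and we want the mean known to within +/-3 points.
n = samples_needed(sigma=8, margin=3, confidence=0.95)
print(f"Need at least {n} sprints of history")  # → 28
```

Note what this implies: a handful of sprints is rarely enough history to make non-trivial commitments on "past performance" alone.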
Thanks to Richard Askew for suggesting the addition of the random variable background.
The 3-volume set is still in our library. Mine are hardbound; there are paperbacks available now.
The books are not actually very good textbooks. The Lectures are just that - transcriptions of lectures. When reading them, you can hear Feynman talk, in the way several other authors write the way they talk.
The point of this post is the Lectures are now available electronically at The Feynman Lectures on Physics.
Anyone interested in physics, or who has ever heard of Richard Feynman, should take a look.
My memory - after many decades - is that Feynman loved students in ways not all physics professors do. He was a professional teacher as well as a physicist. His Nobel Prize never got in the way of his love of students. One of our own Nobel Laureates, Fred Reines, had a similar view of students - both undergrad and grad. Dr. Reines would invite us to his house for BBQ and entertain us with stories of Los Alamos and other adventures.
Take a look, see how a true teacher writes about a topic he loves.
In Twitter discussions and email exchanges there is a notion of populist books versus technical books used to address issues and problems encountered in our project management domains. My recent book Performance-Based Project Management® is a populist book. There are principles, practices, and processes in the book that can be put to use on real projects, but very few equations and numbers. It's mostly narrative about increasing the probability of project success. The means to calculate that probability from other numbers, processes, and systems is not there. That's the realm of technical books and journal papers.
The content of the book was developed with the help of editors at the American Management Association, the publisher. The Acquisition Editor contacted me about writing a book for the customers of AMA. He explained up front that AMA is in the money-making business of selling books, and that although I may have many good ideas, even ideas people might want to read about, it's an AMA book and I'd be getting lots of help developing those ideas into a book that will make money for AMA.
The distinction between a populist book and a technical book is the difference between a book that addresses a broad audience with a general approach to the topic and a deep-dive book focused on a narrow audience.
But one other distinction is that for most of the technical approaches, some form of calculation takes place to support the material found in the populist treatment. One simple example is estimating. There are estimating articles and some books that lay out the principles of estimates. We have those in our domain in the form of guidelines and a few texts. But to calculate the Estimate To Complete in a statistically sound manner, technical knowledge and the underlying mathematics of non-linear, non-stationary, stochastic processes (Monte Carlo Simulation of the project's work structure) are needed.
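To make the distinction concrete, here is a deliberately simplified Monte Carlo sketch of an Estimate To Complete. It assumes the remaining work is a simple chain of tasks with three-point (min, most likely, max) duration estimates; the task values are hypothetical, and a real model would also handle task networks, correlations, and non-stationary behavior.

```python
# Sketch of a Monte Carlo Estimate To Complete over three-point estimates.
# Hypothetical task durations; illustrative only.
import random

def simulate_etc(tasks, trials=10_000, seed=1):
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(trials)
    )
    # Read off percentiles of the completion-duration distribution
    return {p: totals[int(trials * p / 100)] for p in (50, 80, 95)}

remaining = [(5, 8, 14), (3, 5, 9), (10, 15, 28)]  # days: (min, likely, max)
for p, days in simulate_etc(remaining).items():
    print(f"P{p}: {days:.1f} days")
```

The point of the exercise: the answer is a distribution with confidence levels, not a single number, and that is what separates the technical treatment from the populist one.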
Two examples of populist versus technical
Two from my past, two from my current work.
These two books are about the same topic: General Relativity and its description of the shape of our universe. One is a best-selling popularization of the topic, found in the home libraries of many interested in this fascinating subject. The one on the left is on my shelf from a graduate school course on General Relativity, along with Misner, Thorne, and Wheeler's Gravitation.
Dense is an understatement for the math and the results of the book on the left. So if you want to calculate something about a rapidly spinning Black Hole, you're going to need that book. The book on the right will talk about those Black Holes in non-mathematical terms, but no numbers come out from that description.
The book on the left is about probabilistic processes in everyday life that we misunderstand or are biased to misunderstand. The many cognitive biases we use to convince ourselves we are making the right decisions on projects are illustrated through nice charts and graphs.
We use the book on the left in our work with non-stationary stochastic processes of complex project cost and schedule modeling. Making these decisions is critical to quantifying how technical and economic risks may affect a system's cost. This book is a treatment of how probability methods are applied to model, measure, and manage risk, schedule, and cost engineering for advanced systems. Garvey shows how to construct models, do the calculations, and make decisions with these calculations.
Here's The Point - Finally
If you come across a suggestion that decisions can be made in the absence of knowing anything about the future numbers or about actually doing the math, put that suggestion in the class of populist descriptions of a complex topic.
If you can't calculate something, then you can't make a decision based on the evidence represented by numbers. If you can't decide based on the math, then the only way left is to decide on intuition, hunches, opinion, or some other seriously flawed non-analytical basis.
Just a reminder from Mr. Deming, stated in yesterday's post:
If it's not your money, there's likely an expectation that those providing the money are interested in the calculations needed to make those decisions.
With the plethora of opinions on estimating - some informed, many uninformed - here's my list of books and papers that inform our software estimating activities for Software Intensive Systems. These books range from hard-core engineering to populist texts.
is not actually true after you have read the book. So please read the book and see how McConnell provides step-by-step actions for producing credible estimates.
Estimating software development starts with understanding what the software system is supposed to do and how we're able to measure that. This process is based on defining the needed Capabilities, the Measures of Effectiveness, Measures of Performance, Key Performance Parameters, and Technical Performance Measures needed for the ultimate success of the project, along with a Plan showing the increasing maturity of the delivered capabilities. If we don't have these in some form, it's going to be a disappointment for those paying for our efforts when they get to the end and the outcomes are not what they were expecting.
Capabilities are not Requirements. Requirements implement Capabilities. Capabilities are pretty much fixed while the Requirements evolve. Capabilities Based Planning is the basis of project management in many Software Intensive Systems.
The project's capabilities must be defined to the level needed to start the project - failing to do this results in a Death March at worst, and spending the customer's money to discover what should have been discovered before starting. With the capabilities in hand, the project then needs to be managed in a way that will increase the probability of success.
So when you hear of some new approach to project management, ask if there is any connection to a domain and a context in that domain. There are many ideas about how to improve the probability of project success, but without a domain and context it'll be hard to assess whether they are applicable to your specific situation. Here's one way to think about this domain dependency: from solo projects to national assets, the methods, processes, and tools are different, as is the Value at Risk.
There is a popular quote used by many in the #NoEstimates community, that is sadly misinformed.
Those who have knowledge, don’t predict. Those who predict, don’t have knowledge. − Lao Tseu
This of course was from a 6th-century-BC Chinese philosopher, who could not have been familiar with the notions of probability and statistics, developed some two millennia later. The quoting and re-quoting of Lao Tseu as an example of why estimates can't be made brings to light one of the more troublesome aspects of our modern age.
The lack of understanding of basic probability and statistics when applied to human endeavors.
Or possibly the intentional ignoring of probability and statistics as applied to the development of software systems. I can't really say if it is for lack of understanding, lack of exposure, or a simple intent to ignore.
But for any of those reasons and more, here's a starting point on how to become a member of the modern statistical estimating community, once it is decided that is better than ignoring the basic knowledge needed to be a steward of other people's money.
Here are some starting points, in no particular order other than that's how they came off the office book shelf.
These are just a small sample of the information readily available at your local book store or through the mail. If you google "software cost estimating" (all in quotes), there will be hundreds more articles, papers, and web sites, as well as tools for estimating software that are used every single day in a variety of domains.
The Value at Risk is a starting point as well. Low value is defined by those providing the money, not by those doing the work; low risk is usually defined by those doing the work, not by those providing the money - at least in the domains we work in. This Value at Risk sets the tone. Low Value, Low Risk - and this is in absolutely no way an assessment of the relative value and risk - usually doesn't need much estimating.
Got a 6-week, 2-person database update project? Just do it. Got a 38-month, 400-person National Asset software project? You'll almost certainly need estimates. Everything and anything in between needs to ask and answer that Value at Risk question before deciding.
So poor Mr. Tzu was sadly misinformed when he made his statement, as are those repeating it. In the 21st century:
Those who have knowledge of probability, statistics, and the processes described by them can predict the future behavior of those processes. Those without this knowledge, skill, or experience cannot.
From the Introduction of the book to the left.
Good estimates are key to project (and product) success. Estimates provide information to make decisions and to define feasible performance objectives and plans. Measurements provide data to gauge adherence to performance specifications and plans, make decisions, revise designs and plans, and improve future estimates and processes.
Engineers use estimates and measurements to evaluate the feasibility and affordability of proposed products, choose among alternative designs, assess risk, and support business decisions. Engineers and planners estimate the resources needed to develop, maintain, enhance, and deploy a product. Project planners use the estimated staffing level to identify needed facilities.
Planners and managers use the resource estimates to compute project cost and schedule, and to prepare budgets and plans. Estimates of product, project, and process characteristics provide "baselines" to assess progress during the execution of the project. Managers compare estimates and actual values to identify deviations from the project plan and to understand the causes of the variation.
For products, engineers compare estimates of the technical baseline to observed performance to decide if the product meets its functional and operational requirements. Process capability baselines establish norms for process performance. Managers use these norms to control the process and detect compliance problems. Process engineers use capability baselines to improve the production process.
Bad estimates affect everyone associated with the project - the engineers and managers, the customer who buys the product, and sometimes even the stockholders of the company responsible for delivering the software. Incomplete or inaccurate resource estimates for a project mean that the project may not have enough time and money to complete the required work.
If you work in a domain where none of these conditions are in place, then by all means don't estimate.
If you do recognize some or all of these conditions, then here's a summary of the reasons to estimate and measure, from the book.
Unless you're building software as a hobby, someone is paying you to do that work. Those paying aren't likely doing it as a hobby either; they have some expectation of getting their money back sometime in the future. Somewhere in the discussion of writing software for money, the notion of the money was lost.
Those with money pay those with software writing capabilities to produce products that can be sold or put to use to create value in return. Along the way came a disconnect: that software is an end in itself; that the needs of the developers trump the needs of those providing the money for the developers; that those spending the money get to say what they'll do, how they'll do it, or what they won't do with that money.
Writing software for money as practiced in a sole contributor paradigm provides nearly infinite flexibility on requirements, cost and schedule forecasting, and the current notion of making business, programmatic, and technical decisions in the absence of estimating the cost and impact of those decisions.
When that paradigm leaks into the larger domain of producing a return on the investment from that cost, there are two variables that must enter every conversation: the Value generated and the Cost expended to produce it.
ROI = (Value - Cost) / Cost
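The formula above can be sketched directly; the figures here are illustrative placeholders, not from any real project.

```python
# Minimal sketch of the ROI relationship: ROI = (Value - Cost) / Cost
def roi(value, cost):
    return (value - cost) / cost

# Spend $100k to produce $150k of value
print(roi(value=150_000, cost=100_000))  # 0.5, i.e. a 50% return
```

Note that both variables must be estimated before the work is done for ROI to inform the decision to proceed, which is the whole point.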
A Value at Risk is one approach to assessing what processes should be in place when spending other people's money. The larger the Value at Risk, the greater the discipline needed to manage both the Cost and the Value. There are many paradigms of Agile, and the domain and context of software development - or any project for that matter - are important to assess before stating any method is applicable outside the anecdotal domain of the speaker.
The first assessment is always Value at Risk. That is, what is the cost of making a wrong decision? This is the basis of Microeconomics. This is the opportunity cost assessment of decision making.
Microeconomics studies the behavior of individuals and small organizations in making decisions on the allocation of limited resources. Cost, schedule, and technical capabilities are certainly limited resources.
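A small sketch of that opportunity-cost view of a decision, with entirely hypothetical figures: each option's expected value comes from estimates of its probability of success, its value, and its cost, and the opportunity cost of a choice is what you give up relative to the best alternative.

```python
# Sketch of opportunity-cost decision making with estimated inputs.
# All probabilities, values, and costs below are hypothetical.
def expected_value(p_success, value, cost):
    return p_success * value - cost

options = {
    "build feature A": expected_value(p_success=0.7, value=200_000, cost=80_000),
    "build feature B": expected_value(p_success=0.9, value=120_000, cost=50_000),
}
best = max(options, key=options.get)
for name, ev in options.items():
    # Opportunity cost = best expected value minus this option's expected value
    print(name, ev, "opportunity cost:", options[best] - ev)
```

Without estimates of probability, value, and cost, none of these quantities exist, and the opportunity cost of a wrong decision cannot be known.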
Those conjecturing decisions can be made in the absence of estimating the cost and impact have yet to show the viability of those ideas in practice, at least outside small projects with low Value at Risk.
The book The Incremental Commitment Spiral Model: Principles and Practices for Successful Systems and Software, by Barry Boehm and Jo Ann Lane, is a good bridge between small, low-Value-at-Risk agile, Scaled Agile for the Enterprise, and the full-up formal DOD 5000.02 acquisition processes that are trying very hard to move into the agile domain.
The book starts with four principles:
There are extensions to these principles:
From another source: The right principles trump practices every time - Dean Leffingwell.
This notion that practices and processes can be put forward in the absence of testing them against principles has become popular.
The most visible, of course, is that decisions can be made in the absence of estimating the cost and impact of that decision. The principle of the Microeconomics of software development was first stated by Dr. Boehm. Early in the #NoEstimates discussion was a comment that all those ideas are old and no longer applicable. Of course that ignores the principle of Microeconomics, along with most every other principle of managing projects while spending other people's money.
As well, there are other principles of project success.
Here's how to develop the answers to those Principles questions.