Jim Highsmith's presentation at the PMI Agile Community of Practice, titled Beyond Scope, Schedule, and Cost: Optimizing Value, caused me to think about where we're going with agile development methods on our large programs. There are words used in Jim's presentation that are also used in our domain (aerospace, defense, and embedded systems).
Let me start by saying the notion of "optimizing" value is NOT unique to agile - although many in the agile community may make that claim. From Value Stream Mapping, to Business Process Improvement, to my beloved Integrated Master Plan / Integrated Master Schedule method embedded in the DoD 5000.02 Integrated Logistics guidance, all of these measure increasing "value" to the stakeholder as the primary assessment of progress.
Any credible business management process measures increasing value as the primary performance attribute. I say credible so that processes used as "red herrings" by the agile community are eliminated from the discussion.
IT is typically a "cost center," rather than a "value center." I've experienced this first hand.
The presentation has some useful pieces of information. But let me walk through the slides and provide my observations from the point of view of where the words go off track.
- I'm motivated to comment here for several reasons. I'm now involved in the integration of agile development methods and Earned Value Management on IMP/IMS programs for the US DoD. Take a look at the IMP/IMS guidance above. This is a paradigm of defining - in a Plan - how the program will produce increasingly mature elements from the work efforts. There is a move afoot to use agile software development in the acquisition process for DoD IT. I'm speaking at an upcoming conference on this very topic, Earned Value + Agile = Success. My motivation is to remove the hype, fog, and many of the unsubstantiated claims around agile and get down to measurable beneficial outcomes - to put agile to work on these programs, within the boundaries of the Federal Acquisition Regulations.
- While the slides need the audio to go with them for a complete understanding of the topics, some slides are crystal clear in their message. And sometimes that message is wrong, or at least misinformed, in the domain of our software intensive systems development.
- As well, this type of PowerPoint violates the principles espoused in Cliff Atkinson's Beyond Bullet Points. And that makes it hard not to have a gut response to the slides alone, in the absence of the audio. Since we earn our living writing proposals for DoD, DOE, and NASA contracts, I've become sensitive - maybe overly sensitive - to communication methods using PPT. The "oral presentation" is many times the opportunity to win or lose. So I'm hyper sensitive to poorly constructed PowerPoints.
So now that you've downloaded the presentation - sorry, the audio requires a PMI membership - let me speak to some issues. Too bad about the audio, since some of the criticism of the briefing is corrected by it.
This, by the way, is the disconnect with this style of presentation. The visual content of the charts MUST stand alone in the absence of the speaker; otherwise readers of the charts will be confused, or even misinformed, about the message.
So let's look at some charts. It would have been nice to have page numbers ;>)
- Page 5 - the notion of "conforming to plan" is simply poor project management. The Plan is the strategy for completing the project. Plans change: the weather causes changes on large construction projects. The failure of a launch vehicle changes the plan for an earth observing mission (just last week). The disruption of a work site due to an accident changes the plan. You think the Plan stays fixed in the presence of a rain storm when we're pouring concrete? When organizations "conform to Plan" in the absence of any process to change the plan in the presence of new information, they get what they deserve. This is one of those arguments that says "people doing bad things need to change how they do them." The real answer is "people doing bad things need to stop doing bad things."
- Page 14 - This example does not tell us whether these are the same projects, same teams, same environment. Correlation does not automatically mean causation. It might be true, but it'd be nice to see the real numbers. I know this firm personally. I worked there "after" they brought in an XP (eXtreme Programming) method. In the absence of product development governance processes, the results went in the ditch. Several leaders of the XP initiative were let go, budgets were cut, and product lines were disrupted.
- Page 22 - The diagram on this page is actually from a briefing I provided to this same firm during the recovery from the XP experience. Glad to see it again. This approach actually comes from the Quadrennial Defense Review's Capabilities Based Planning (CBP) paradigm. CBP originated with the Canadian Forces and came to the US DoD after its effectiveness was recognized by other defense organizations. The key here is you must define what DONE looks like in units of measure meaningful to the decision makers. Without this knowledge no development method is going to save you from failure. This of course is harder than it looks, which is why the failure rate in IT is so high.
- Page 26 - this is a typical "cooking the books" slide. The term "waterfall" has to be thrown in to start the conversation. Then a graph showing no value until the end. "Doctor, Doctor, it hurts when I do this." "Then stop doing that, you bonehead." I wish Jim would actually ask to see a curve from a "real" IMP/IMS program where both Earned Value and Program Maturity (capabilities delivered to the war fighter, for example) are in place - and stop using and reusing these completely bogus charts. Read the IMP/IMS guidance and stop doing stupid things on purpose. I once mentioned to Jim that we had a poster campaign at Rocky Flats with that phrase, "doing stupid things on purpose." Jim took offense that I was demeaning the audience (in this case in the cafeteria), but it's even more true today. Why do projects, their managers and participants, and the sponsors continue to do stupid things, like not measuring increasing maturity and increasing value to the customer (in our case, the war fighter)? The only answer I still have is that they do it on purpose.
- Page 37 - this is called the Integrated Master Plan (IMP). This is great advice. There is detailed guidance for building the IMP. Google will find it for you.
- Page 38 - this is the lamest Gantt chart I have ever seen. This is an example of how those who don't actually work in a domain where Gantts are used to successfully manage programs use them - misuse them actually - to make a point that is essentially a "red herring." Charts in this style are forbidden in the domain where I work. We use many other notations for showing how the work elements (Work Packages) depend on each other. This is absolutely mandatory for any non-trivial project. Without this understanding the project will fail over time. I've written about this in Long Live the Gantt, Revisiting the Gantt, and just plain Gantt Charts. The Gantt is just a "picture." There are good Gantts and bad Gantts. It's another "red herring," brought about by what I think is simple inexperience in domains where planning and scheduling are professional skills - large construction, manned space flight, mission critical software intensive domains. But inexperience or poor experience all the same.
- Page 39 - where are the dependencies? No dependencies, just a list of work? How hard is it going to be to manage this list when there are actual interdependencies between those little pieces of paper on the wall? In the perfect agile project there must not be any dependencies. But here's the real problem - what is the domain where this approach (sticky notes on the wall) is appropriate? On the program where $600M of software flight avionics is being built for an "on schedule, on cost, on specification" fixed launch date of the spacecraft? Probably not. But then where?
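To make the dependency point concrete, here's a minimal sketch - my own illustration, not anything from the presentation - of why a flat list of work items isn't enough. The package names are hypothetical. Once interdependencies exist, the feasible order of Work Packages is no longer arbitrary; it falls out of a topological sort of the dependency network, and a circular dependency means no feasible sequence exists at all:

```python
from collections import deque

def schedule(work_packages, dependencies):
    """Order work packages so every predecessor finishes first.

    work_packages: iterable of package names.
    dependencies: dict mapping a package to the packages it depends on.
    Raises ValueError if the dependencies contain a cycle.
    """
    indegree = {wp: 0 for wp in work_packages}   # count of unfinished predecessors
    dependents = {wp: [] for wp in work_packages}
    for wp, preds in dependencies.items():
        for pred in preds:
            indegree[wp] += 1
            dependents[pred].append(wp)

    # Start with the packages that have no predecessors.
    ready = deque(wp for wp, count in indegree.items() if count == 0)
    order = []
    while ready:
        wp = ready.popleft()
        order.append(wp)
        # Finishing this package may release its dependents.
        for nxt in dependents[wp]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)

    if len(order) != len(indegree):
        raise ValueError("circular dependencies: no feasible sequence")
    return order

# Hypothetical work packages: a flat "sticky note" list hides the fact
# that integration cannot start before its inputs exist.
packages = ["requirements", "design", "flight_code", "test_harness", "integration"]
deps = {
    "design": ["requirements"],
    "flight_code": ["design"],
    "test_harness": ["design"],
    "integration": ["flight_code", "test_harness"],
}
print(schedule(packages, deps))
```

The point isn't the algorithm - any scheduling tool does this for you - it's that you cannot even run it without first capturing the dependencies the sticky notes leave out.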
Next month's Agile PMI presentation is about the 7 Deadly Sins of Project Management. Look here for the pre-response. Mike's got good points, but offers similarly "domain and context free" advice.
So Why This Criticism?
While most of these items are "small" in the overall scheme, there is a critically important point.
Until we start to have "real" discussions about the known and verified benefits of agile in domains where agile is most needed - defense, large enterprise (ERP), mission critical systems (process control) - we're going to get presentations like this.
These presentations are informative - to a point - but many times contain red herrings, misrepresentations, and generally fluffy information. No units of measure, no comparative data baselines, no specific domain and context guidance. We're asking specifically: how do you deploy agile in a flight avionics environment, or apply agile in a front line provisioning system for combat materiel? These domains involve rapid response, changing requirements, and all the other "drivers" agile claims to address.
But we've got to "get real" here, and stop making claims from small projects that are conjectured to work outside the domain and context of their anecdotal examples.