Michael Hatfield conjectured that once the baseline schedule is established, risk management is just institutional worrying. I'm here to tell you that just ain't the case on any real program. Institutional worrying sounds like a phrase used by someone who doesn't have to worry about showing up on time, on budget, and on specification and not killing anyone or the program along the way.
For those of us working programs whose contracts describe the outcome in specific terms, continuous risk management is a way of life. With terms like budget limits, schedule commitments, and a set of promised system capabilities, "worrying" about compliance with those contractual terms is replaced by the management of actionable outcomes, using the information we get from the risk management processes, Earned Value Management, systems engineering models, and the general program planning and controls processes. These processes are usually wrapped up in a Program Performance Management Process Guide, used by everyone on the program.
An example of one of these guidebooks is the Project Performance Management Process Description for Corporate Items Greater Than $20 Million.
A simple example is the probability of completing on or before a specific date, and the probability of that work costing a target amount or less. Below are two "notional" pictures generated by the Risk+ Monte Carlo Simulation tool to provide this information.
The first is the Cumulative Distribution Function for the completion date. The Monte Carlo Simulation tool generated this distribution by picking possible durations for all the work activities driving this deliverable. The tool then records the completion dates and constructs this graph and the histogram. What this means is that the sampled durations are "plugged" into the duration field of the work activities (tasks), and the "watched" activity's FINISH date is recalculated. This is done a few thousand times using a special random-number sampling process. The result is the Probability Distribution and Cumulative Distribution graph.
The table on the right is a summary of the probability of "completing on or before" a specific date. We like to use 80% as a guide. This means there is an 80% probability (chance) of completing on or before 2/23/99. If the target completion date is something earlier than that, then we have a problem. If the target date is later than 2/23/99, then there is margin in the schedule.
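The mechanics behind a table like this are simple enough to sketch. Below is a minimal Monte Carlo simulation in Python, not Risk+ itself: the chain of tasks and their three-point duration estimates are made-up illustrative numbers, and a real tool samples against the full schedule network rather than a simple sequence of tasks.

```python
import random

# Hypothetical three-point estimates (best, most likely, worst) in days
# for a simple chain of tasks driving one deliverable -- illustrative only.
tasks = [(8, 10, 16), (4, 5, 9), (12, 15, 25)]

def simulate_totals(trials=5000, seed=1):
    """Sample a triangular duration for every task and sum the chain."""
    rng = random.Random(seed)
    return [sum(rng.triangular(best, worst, likely)   # args: (low, high, mode)
                for best, likely, worst in tasks)
            for _ in range(trials)]

totals = sorted(simulate_totals())
# The 80th percentile: the duration we have an 80% chance of
# finishing on or before -- the "on or before" date in the table.
p80 = totals[int(0.8 * len(totals))]
print(f"80% probability of finishing within {p80:.1f} working days")
```

Note that the 80th-percentile total is larger than the sum of the "most likely" durations, because the worst-case tails are longer than the best-case ones. That gap is exactly the schedule margin question the table answers.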
The same can be said for the cost of reaching the target delivery date. With a resource loaded schedule and the associated costs for those resources, a cumulative distribution function of the probability of the effort costing some value or less can be modeled. In this notional model the target cost for the "watched" deliverable has an 80% probability of costing $181,611 or less.
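The cost side works the same way. Here is a minimal sketch, again with invented three-point cost estimates rather than the numbers behind the notional Risk+ model:

```python
import random

# Hypothetical resource-loaded work packages with (best, most likely, worst)
# cost estimates in dollars -- invented for illustration.
work_packages = [(40_000, 55_000, 90_000),
                 (25_000, 30_000, 50_000),
                 (60_000, 70_000, 110_000)]

rng = random.Random(2)
costs = sorted(
    sum(rng.triangular(best, worst, likely)   # args: (low, high, mode)
        for best, likely, worst in work_packages)
    for _ in range(5000)
)
# The cost we have an 80% probability of staying at or under.
p80_cost = costs[int(0.8 * len(costs))]
print(f"80% probability the deliverable costs ${p80_cost:,.0f} or less")
```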
This is Not Institutional Worrying or Mumbo Jumbo Numbers
First, this modeling is mandated on DoD, DOE, NASA, and DHS contracts through DID 81650. Second, every Program Performance Management Process Guide at every aerospace, defense, and DOE program I have ever worked on mandates a "Schedule Risk Analysis" (SRA) of the cost and schedule for the program. In 81650, for example, it says:
A schedule risk assessment predicts the probability of project completion by contractual dates. Three-point estimates shall be developed for remaining durations of remaining tasks/activities that meet any of the following criteria: (1) critical path tasks/activities, (2) near-critical path tasks/activities (as specified in the CDRL), (3) high risk tasks/activities in the program’s risk management plan. These estimates include the most likely, best case, and worst case durations. They are used by the contractor to perform a probability analysis of key contract completion dates. The criteria for estimated best and worst case durations shall be applied consistently across the entire schedule and documented in the contractor’s schedule notes and management plan.
This activity is performed monthly...
Monthly analysis is a monthly assessment of schedule progress to date and includes changes to schedule assumptions, variances to the baseline schedule, causes for the variances, potential impacts, and recommended corrective actions to minimize schedule delays. The analysis shall also identify potential problems and an assessment of the critical path and near-critical paths. Thresholds for reporting significant variances to the baseline schedule and near-critical paths shall be specified in the CDRL. If a CPR Format 5 is required, the monthly analysis shall be submitted to the procuring activity prior to or concurrently with the CPR Format 5.
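The three-point estimates named in 81650 map directly onto a triangular probability distribution. A small sketch, with a made-up estimate for one high-risk task, shows why this matters: a long worst-case tail pulls the expected duration above the single "most likely" number a naive plan would use.

```python
import random

# Hypothetical three-point estimate for one high-risk task, in days.
best, likely, worst = 5, 8, 20

# The mean of a triangular distribution is (best + likely + worst) / 3,
# so the skewed worst-case tail pulls the expected duration above
# the "most likely" single-point estimate.
analytic_mean = (best + likely + worst) / 3   # 11.0 days vs. 8 "most likely"

# Cross-check the closed form by sampling.
rng = random.Random(3)
n = 50_000
sample_mean = sum(rng.triangular(best, worst, likely) for _ in range(n)) / n
print(analytic_mean, round(sample_mean, 1))
```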
To do otherwise would be foolish of Program Management and nonsense from the program planning and controls point of view. Not to assess future performance from past performance is equally foolish.
Simple add-ins that do this work for you are available for Microsoft Project and other scheduling tools. These examples are from Risk+.
The term NOT CREDIBLE comes to mind when I hear people talk about cost and schedule in the absence of these approaches. Baselining the program and then failing to reexamine the assumptions and the impacts of past performance every time new performance status is taken is a nice way to drive right off the cliff.
Here are some background materials for building and executing a CREDIBLE project plan. These processes are performed on a weekly basis on most of our programs, and on a monthly basis at a minimum with the submission of the Contract Performance Report (CPR), the Integrated Master Schedule (IMS), and the related Risk Management reports.
This actionable information is used by the Program Manager to make decisions that increase the Probability of Success for the Program. To do otherwise would be like:
Driving in the rear view mirror in the dark.
It can be done, but it is very sporty business.
How To Build A Credible Performance Measurement Baseline