This edition of Defense Acquisition, Technology & Logistics has an article titled "Replacing Risk with Knowledge to Deliver Better Acquisition Outcomes."
If we:
- Focus actionable, meaningful, and relevant knowledge on reducing risk. We need to discover all the elements of risk, probing each path of the system architecture for possible and probable failure modes and their effects on the system. The system can include machines as well as the personnel using those machines.
- Capture all information in some form of "model." SysML (Systems Modeling Language) is one approach. SysML is a cousin of UML, used for systems modeling the way UML is used in software development. There are tools for SysML, but simple pencil and paper is a starting point. Ask and answer the questions: what are the components of the system, how do they interact, what happens when this part fails, and what externalities of the system will cause problems during development or operations? A minimal code sketch of such a model appears after this list.
- Capture all knowledge from the subject matter experts, both internal and external. Who knows the most about the type of system we're building? Go find that person or persons and have them tell you all the reasons it will fail. Find people who know people and ask the same question. If there is any one motivation for social networking in the program management world, it is finding people who know more than you do and asking them to identify people who know more than they do. There is no such thing in the modern world as "knowing what to do" without testing that knowledge on others. The hubris of development knowledge, the "I know what I'm doing" attitude, is the number one risk.
- Provide the best solutions through Analysis of Alternatives (AoA). AoA is a systems engineering discipline rarely practiced in the commercial IT world, again many times from the hubris of failing to see beyond one's own experience. A weighted-scoring sketch of an AoA appears after this list.
- Make the best decisions based on the outcomes of the AoA. Test those decisions against benchmarked systems, models, actual development, and, again, external subject matter experts.
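To make the pencil-and-paper model concrete, here is a minimal sketch in Python rather than SysML. It captures components, the interactions between them, and candidate failure modes with their effects, then walks the model to list every place a failure can start. The component names, failure modes, and effects are hypothetical placeholders for whatever your subject matter experts actually identify.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FailureMode:
    description: str               # what can go wrong in this component
    effect: str                    # effect on the system when it does
    likelihood: str = "possible"   # "possible" or "probable"

@dataclass
class Component:
    name: str
    interacts_with: List[str] = field(default_factory=list)
    failure_modes: List[FailureMode] = field(default_factory=list)

# Hypothetical system: a pump, its controller, and the operator using them.
# People are part of the system, so the operator is modeled like any other component.
system = [
    Component("Pump", ["Controller"], [
        FailureMode("Bearing seizure", "Loss of coolant flow", "probable")]),
    Component("Controller", ["Pump", "Operator"], [
        FailureMode("Sensor drift", "Pump driven outside its rated envelope")]),
    Component("Operator", ["Controller"], [
        FailureMode("Misreads alarm", "Late response to a controller fault")]),
]

# Probe each element of the architecture: for every component, ask what
# happens when this part fails and which interactions carry the effect.
def enumerate_risks(components: List[Component]):
    for c in components:
        for fm in c.failure_modes:
            yield (c.name, fm.likelihood, fm.description, fm.effect, c.interacts_with)

for name, likelihood, what, effect, neighbors in enumerate_risks(system):
    print(f"{name} [{likelihood}]: {what} -> {effect}; interacts with {neighbors}")
```

Even this much structure forces the questions in the list above to be asked explicitly, component by component, instead of being answered from memory.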
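An AoA does not need heavyweight tooling either. Here is a minimal weighted-scoring sketch; the alternatives, criteria, and weights are hypothetical stand-ins for a program's real ones. The point is that every candidate solution is scored against the same criteria, so the decision is traceable to the analysis rather than to one person's experience.

```python
# Minimal Analysis of Alternatives (AoA) sketch: score each candidate
# solution against weighted criteria. All names and numbers are placeholders.
criteria = {                 # weights should sum to 1.0
    "performance": 0.4,
    "schedule_risk": 0.3,    # higher score = lower risk
    "lifecycle_cost": 0.3,   # higher score = lower cost
}

alternatives = {
    "Modify existing system":   {"performance": 6, "schedule_risk": 8, "lifecycle_cost": 7},
    "New development":          {"performance": 9, "schedule_risk": 4, "lifecycle_cost": 3},
    "Commercial off-the-shelf": {"performance": 5, "schedule_risk": 9, "lifecycle_cost": 8},
}

def weighted_score(scores: dict, weights: dict) -> float:
    """Sum of criterion score (0-10) times criterion weight."""
    return sum(scores[c] * w for c, w in weights.items())

# Rank the alternatives so the decision is traceable to the analysis,
# not to whoever argued loudest in the room.
ranked = sorted(alternatives.items(),
                key=lambda kv: weighted_score(kv[1], criteria),
                reverse=True)

for name, scores in ranked:
    print(f"{name}: {weighted_score(scores, criteria):.2f}")
```

Those scores then become the thing you test against benchmarked systems, models, actual development, and external subject matter experts, per the last item in the list.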
What This Means in the Face of Unknown Unknowns
Unknown Unknowns (UnkUnk) can only exist if in fact that class of risk is UNKNOWABLE. That is, the risk to your project, program, system, or any "engineered" outcome is in fact hidden from all efforts to discover it.
If you don't have enough money to seek the risk, if you don't have enough time to seek the risk, if you don't have the political will to seek the risk, that does not mean the risk is an UnkUnk. Regardless of the naive and possibly lame definition provided by some sources, the risk is still there, and it is KNOWABLE with sufficient time, money, and skill. You're just not willing to pursue it.
So when you proceed using the definition of an UnkUnk:
The term unknown unknown refers to circumstances or outcomes that were not conceived of by an observer at a given point in time.
Ask yourself: am I lazy, am I competent enough to actually manage this project, am I behaving with the hubris of Donald Rumsfeld when he failed to pursue all the consequences of invading a foreign land with insufficient, but readily available, understanding of those consequences?