The uncertainty triangle

I am currently reading the “Analyst-Oriented Volume: Code of Best Practice for ‘Soft’ Operational Analysis” of the “NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making”, which was published in 2012 and is available online. The guide is the product of an international team and makes for very interesting reading for those of us who are deeply interested in soft systems thinking. I came across an interesting diagram, which I would like to discuss briefly.

Uncertainty         In chapter 2 “Problematic situations and ‘soft’ OA” we find the standard explanation of the nature of wicked problems or messes. One of the takeaways of the chapter is that “Uncertainty is one of the key phenomena that need to be addressed in a problematic situation as it characterises a problematic situation to a large extent.” There is no doubt about that. Churchman developed the (dialectical) systems approach as his philosophical life’s effort to enable management “to secure improvements in the human condition by means of the human intellect.” (Churchman 1982, 19). By the verb “secure” he meant that the improvement persists over time in the larger system.

The problem with messes       Problem solving often appears to produce improvement, but the so-called “solution” frequently makes matters worse in the larger system. This has given rise to the concept of “wicked problems” or “messes”, which chapter 2 of the NATO report explains really well. “The important point is that in managing messes, the best we can hope for is to dissolve them. Next best is to resolve them. The notion that [wicked] problems can rarely be solved is so important that we cannot stress it enough.” (Mitroff et al. 2013, 21) Hence, the pervasive sense of uncertainty.

The uncertainty triangle         … was first suggested by Dreborg et al. (1994). In order to cope with uncertainty, three basic attitudes or biases towards it may be distinguished: (1) prediction, which tries to make predictions that are as accurate as possible; (2) control, which seizes the initiative and takes control of the relevant parts of the environment; and (3) acceptance, which admits that one can never know exactly what is going to happen, so 100% accurate prediction or 100% complete control can never be achieved, especially when preparing a response to wicked problems or complex messes. One possible response would be to design a flexible, module-based response strategy in preparation for any contingency or challenge that might occur. In general, the best response uses a mix of all three attitudes or biases; adopting only one will rarely be sufficient. That’s good advice.

Churchman’s search for assurance      In 1968 Churchman designed a generic framework to better cope with the problem of uncertainty in social planning, i.e. all planning involving humans, including military planning. In 1979 he completed this teleological framework, which contains 12 categories of assurance or uncertainty. The question is how these 12 categories relate to the three attitudes. Here is my attempt at an answer: (1) Churchman accepts the idea of uncertainty, so acceptance is part of his approach; (2) the best resolutions involve an optimum mix of insights from the 12 categories, which resembles the optimum mix of attitudes; (3) the categories of control are linked to the role category of the decision-maker, but the conditions for control are linked to the categories of environment and measures of success; (4) the notion of prediction can be linked to the categories of expertise (I am using here the terminology developed by one of Churchman’s students, Werner Ulrich), which are linked to the role category of the planner, who is charged with making a plan that can be implemented.

The problem of acceptance       … is that in Churchman’s framework acceptance can only result from optimizing all twelve categories (or fundamental teleological considerations) interdependently. Acceptance is conditioned by assurance, which is found in a concept of the purposive whole, which itself can be linked to ideas of purpose, effectiveness, sustainability and legitimacy. This doesn’t mean that Churchman was opposed to making the best possible models for prediction and control, quite the contrary. He developed his (dialectical) systems approach as a complement to his earlier, seminal work in operations research (syn. operational analysis), which he had begun during the Second World War at Frankford Arsenal.

Assessment       The triangle is particularly good as a warning against control and prediction bias, which must come as quite a shock to some planners. It underscores the problem of uncertainty in planning for messes or wicked problems. The conditional nature of acceptance should be clarified, but perhaps Dreborg et al. have done that in their original 1994 article. I would add a little asterisk to denote that acceptance needs to be qualified, conditioned, or clarified by a range of other considerations.


Wijnmalen, D.J.D. et al., “NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making – Analyst-Oriented Volume: Code of Best Practice for ‘Soft’ Operational Analysis”, RTO-TR-SAS-087, 2012

Dreborg, K-H., Eriksson, E.A., Jeppson, U. and Jungmar, M., “Planera för det okända? Om hantering av osäkerhet”, FOA-R–94-00005-1.2–SE, 1994 (In Swedish. Translation of title: “Planning for the unknown? On handling uncertainty”)
