Systems thinking for evaluation design

Video transcript, summary, and concept map

In March 2015, Bob Williams, the main author of Wicked Solutions and several other books on systems thinking, gave a 2-day workshop, “Wicked Solutions: A Systems Approach to Complex Problems”, on the use of the systems approach in evaluation design at the Research Institute for Humanity and Nature (RIHN) in Kyoto, Japan. RIHN is a development organization that conducts practical transdisciplinary studies of development problems and their solutions. Some of the researchers participating in the workshop were involved in peatland management research on Bali and Sulawesi, Indonesia. Peatland is a major part of coastal wetland geomorphology around the world (4 million km2). The catch-22 is that peat grows naturally, but the process is reversed when the land is drained for agriculture. Peat covers half of the Netherlands, where the peatlands – which are often already below sea level – subside to ever lower levels, while sea levels are rising. A small part of the 2-day workshop was taped on video, edited, and posted on YouTube in two 30-minute parts: youtu.be/lFcWhGE7moQ and youtu.be/5RRHpXl2hrw. They are not so much about peat but rather about the principles of systemic design. The transcription and slides have been combined in a single pdf. This post contains the concept map and summary. Have fun.

The videos      … are in two parts: (1) an introduction to systems thinking and the concepts; and (2) the core concepts & the deep dive (that is, mostly “Wicked Solutions”). In the concept map these two parts are clearly separated by a dotted line, yet they are closely related. The introductory part provides the background for the second part, and the second part explains in adequate detail what the first part is mostly about.

Introductory part       Systems thinking is an interdisciplinary field of science and practice. As an academic interdiscipline it borrows from and contributes to almost all ‘straight’ disciplines, including biology, philosophy, engineering, economics, cognition (i.e. psychology), computing, and management science. Systems thinking is special in that it integrates elements of ontology and epistemology. Sometimes it is more on the ontological side (e.g. systems analysis), sometimes more on the epistemological side (e.g. critical systems), sometimes it is in the middle (management sciences, information systems). As a result of this enormous variety there is obviously no such thing as THE systems theory; there is just the “systems field”, with a lot of ramifications in theory and practice that do not always seem to have much in common with each other. There are a number of concepts that pop up quite often, e.g. planning, uncertainty, complexity, learning, (problematic) situations, and change.

The problem       Most people agree that systems thinking is a good antidote to the multitude of more narrow approaches to human reality. The trouble is that the systems field is so wide that hardly anybody knows where to start. Learning how to apply systems thinking in a practical way is itself a complex problem. People who need to learn systems thinking are usually experts in a field of their own (e.g. an ecologist, soil scientist, mechanization expert, etc.), each dealing with a subsystem of a larger whole. Below is an illustration of how an agricultural system is an integration of a natural ecosystem and a social system, with about 20 subsystems (and counting, because I don’t see the peatland system, the irrigation and drainage system, the agricultural research system, the farmer communication system, the agricultural extension system, etc.). My educational background is in tropical agriculture (BSc) and irrigation engineering (MSc), so I get really fired up about this. The picture below is from a set of slides by Ray Ison (here).

The solution       The problem of “allowing evaluation to become more systemic without having to adopt, accommodate and learn specific systems methods” was “solved” in 2004 by bringing a group of (about 16?) systems and evaluation experts together in Berkeley to explore “how to promote the use of systems ideas in monitoring and evaluation, not in terms of specific systems methodologies, but based on a set of principles. In other words, allowing evaluators to modify their existing practice rather than learning entirely new practice.” (Williams, 2016) After three days they came out of their room with three basic concepts: inter-relationships, perspectives, and boundaries.

Publications      From there Bob Williams was involved in a number of publications: (1) Systems Concepts in Evaluation (with Imam, 2007), with a first account of the three concepts (p. 6, only inter-relationships is termed “entangled systems”); (2) Systems Concepts in Action (with Hummelbrunner, 2011), in which 19 systems methods are described in terms of the three basic concepts; (3) Wicked Solutions (with Van ‘t Hof, 2014/2016), which provides a clear-cut and generally applicable methodology for applying the three basic concepts without any previous knowledge of systems thinking whatsoever; and (4) Using Systems Concepts in Evaluation Design (2016), which specifically applies the Wicked Solutions approach to evaluation (also available in Spanish). There is also a recent USAID discussion note on systemic monitoring (2014). The story doesn’t end there. The Systems in Evaluation TIG (Topical Interest Group of the American Evaluation Association) published a final draft of Principles for Effective Use of Systems Thinking in Evaluation in September 2018, which adds “dynamics” as key concept. This may not be necessary if we think of inter-relationships as a spatiotemporal concept, as has always been the case with C. West Churchman (1968, 1971, 1979). In 2014, Bob received the American Evaluation Association (AEA) Lazarsfeld Theory Award for his contribution to systems approaches in evaluation.

Wicked Solutions       … is by far the most practical of all these. In the bottom half of the concept map above its essence is explained. Each of the concepts could be considered a step in the process of designing an intervention, research program, or evaluation. Each step uses a set of tools and ideas to make it work: (1) rich picturing is used to map inter-relationships, which helps identify systemic structures, patterns, processes and dynamics, but also stakeholder values, norms, resources, conflicts, aspirations, goals, and motivations; (2) stakeholder analysis and stake analysis are used to identify possible framings (or “images”) of the problematic situation and possible purposes of the design. Since we are dealing with human, soft or social systems (which includes practically all systems, even the ‘hard’ ones), they can only be properly understood by discussing their purpose and meaning to the people involved; (3) critical heuristics is used to discuss boundaries. It helps us answer questions raised in the previous two steps by “revealing, exploring, and challenging boundary judgments associated with a situation of interest.” (Williams & Imam, 2007).

Participants        During the workshop the RIHN participants apply the Wicked Solutions methodology to their own research design in Indonesia. At the end four of them give comments. What they say is highly relevant. They consider the systems approach particularly useful in multi-stakeholder situations, enabling them to make a more comprehensive design or project plan. Another point is that the approach enables better stakeholder communication. The main problem is that it is difficult to use insights from the systems approach to redesign an ongoing plan. Let me add a few comments to this: (1) the approach makes it less likely that key stakeholders are ignored; (2) it is more likely that all stakeholders are heard and that they hear each other; (3) the key to applying the approach is to use it as early as possible. This will work much better if there is time to conduct project pilots or evaluation tests; and finally (4) there will always be learning as the activity unfolds. Learning will be better if the initial plan is more comprehensive, i.e. more systemic.

Final notes      In the workshop Bob uses cases from Latin America, Africa and Asia to illustrate the practical value of Wicked Solutions. They all happen to be in the field of agricultural development: (1) irrigated rice and malaria control in Peru; (2) irrigation development and pump procurement in Mali; and (3) peatland management in Indonesia. The Mali case is the detailed, worked-out case used in Wicked Solutions to ensure that starting systems learners can directly apply the methodology. Sometimes I use the words “solution” and “problem”. These terms must be understood in a systemic way, which means that most of the time there is not one, easily identifiable “problem”, nor is there one identifiable “solution.” For more information about that (i.e. wicked problems and the like), go here. Bob’s website is here.

This post has been reblogged by The Systems Community of Inquiry at https://stream.syscoi.com, the global network of systems thinkers, scientists and practitioners (Thanks!).

 


The “enemies” of the systems approach

Here is a link to the abstracts of the individual chapters of C. West Churchman’s ‘The systems approach and its enemies’, the last book of his great trilogy. Churchman’s work is the focal point of CSL4D. I used the abstracts to produce a concept map (see below), which I will describe in this post. I believe it could serve as a useful introduction to Churchman’s work. This post has been reblogged by The Systems Community of Inquiry at https://stream.syscoi.com, the global network of systems thinkers, scientists and practitioners (Thanks!).

Systems     Social systems are all systems with humans in them. The human environment is full of systems: organizations, businesses, projects, governments, nations, the world, shopping systems, transport systems, security systems, financial systems, etc. All of these systems have been designed. The systems idea implies that systems have components (subsystems) and boundaries. Systems are not closed; they are mostly conceived as semi-open, which implies that their boundaries are subject to debate. In soft systems lingo we speak of the “boundary critique.”

Problems     Many human “systems” function poorly. The question is: what can be done about it? Fixing the most obvious problem in a system often doesn’t work; it may even make the situation worse, delay a real solution, or simply cost a lot of money and effort to keep the system functioning, albeit poorly. Systems thinkers believe that such problems – often designated “wicked problems” – are caused by taking too narrow a view of the situation. Such a narrow view is what Churchman calls the ‘environmental fallacy.’ This fallacy is not part of classical logic. In fact, a whole set of new principles had to be developed in order to provide a solid foundation for addressing wicked problems. One of them is the Maximum Loop Principle (see here).

People      Among the principles that Churchman developed are also the four principles of deception-perception. The key idea is that when we perceive a problematic situation in a certain way, this obscures certain key aspects that can only be brought to light by perceiving it in a totally different way. The essence of the systems approach, “therefore, is confusion [deception] and enlightenment [perception]” (The Systems Approach, p. 231). In Churchman’s words, we must see through the eyes of others to discover the restrictedness of our own perspective (including that of ‘experts’ or ‘managers’). We must also abandon any claims of absoluteness (without falling victim to the idea of relativity, which is replaced by the idea of approximation). Churchman’s great “trick” is to build a categorical framework centered on people. These people play roles in system design as client, decision-maker and/or planner. “And/or” here means that people can play different roles at the same time. It is the key to systemic inquiry as described in “The design of inquiring systems” (Churchman 1971, see also here).

Enemies       The systems approach is a rational system. Humans are not always rational, or our last name would be Spock or something. Rationality is also not easy to achieve. It is not always very convincing. Many a rational systems plan has disappeared in the bottom drawer (as have many other plans). Churchman identified 4 main “enemies”: politics, morality, religion and aesthetics. Politics is the easiest to explain: politics is all about getting enough support to get things done. In other words, politics is necessary for implementation. This usually means that the plan has to be adjusted to suit the wishes of various supporting parties. The plan becomes less rational. It will not serve the intended client (or beneficiary) as well as the planner would like. That’s when the boundary critique becomes important. Similar things happen with the other enemies.

Heuristics        The categorical framework is a heuristic for operationalizing Churchman’s systemic principles. The categories unfold by asking the basic ethical questions: who is and who should be the client? What is and what should be the purpose? In England Churchman’s framework is used a lot, albeit in the form of Ulrich’s “critical heuristics”. Ulrich was a Swiss student of Churchman in the late 1970s. His “critical heuristics” uses an adaptation of Churchman’s categorical framework. The good thing about critical heuristics is that it is easier to learn: it leaves out the entire “enemies” part. It is also part of the methodology described in Wicked Solutions, which has been used repeatedly at universities in Europe, Australia and the United States.

Significance      … is the last category of Churchman’s 4 × 3 framework. Churchman went flat out to make sure the dialectical systems approach was a significant contribution. Part of his approach was to seek maximum generality. This allowed him to contrast it with existing philosophical schemas, oppose it to classical logic, and ensure the broadest possible applicability. This doesn’t mean there are no problems with it. In the preface Churchman writes that generality in the context of social systems is a highly debatable concept, because social systems are never wholly rational nor wholly irrational. They are beyond that. This means that in the end the effectiveness of the systems tool is in the eye of the beholder. If it works, it is OK. But that type of argument would apply to many other approaches, especially the one associated with politics.

Book abstract       The design of social systems (organizations, businesses, projects, governments, nations, the world) in the most general sense is described and guidelines for system design are provided. It is argued that social system design is beyond dichotomies such as rational-irrational, teleological-ateleological and objective-subjective. Churchman proposes the dialectical systems approach as a rational solution for the so-called “environmental” fallacy, but admits that irrational “enemies” or non-friendly viewpoints are necessary to come to a fuller understanding of the systems approach. The historical roots of the systems approach in Western and Eastern thought traditions are explored, the limitations of classical logic are outlined, and the need for an alternative “whole system” form of logic and objectivity is discussed, in which people are the center of the planning reality. The first nine categories of Churchman’s framework are described, followed by a detailed discussion of the problems of a systems approach defined from within as contrasted with the outside perspectives of the “enemies” such as politics, morality, religion and aesthetics. Suggestions for unfolding the different categories into each other are included.

Well, that’s it. Now you can read the abstracts or the original book. Do not forget to put some of the abstracts, or at least the book abstract, in your university’s catalogue. You may also prefer to make your own abstracts. It’s good exercise. To get a better feel of what it is all about you may also go through a worked-out case in Wicked Solutions.


Thought, knowledge, and the systems approach

As I wrote earlier, I am currently studying two books, one by Jonathan Baron (Thinking and deciding, 2nd edition) and the other by Diane Halpern (Thought and knowledge: an introduction to critical thinking). My need is for a thorough overview of both to better situate the systems approach in the general theory of complex decision making and problem solving, so my posts about both books are not reviews. My last post about Baron can be found here. This post is about Halpern’s. Baron and Halpern serve slightly different audiences, but the subject (critical thinking, deciding) and approach (cognitive psychology) are very similar. Going through both volumes showed me how a book about critical thinking can be a design challenge. Baron solved it neatly by sticking to his search-inference framework throughout the book. I don’t think there are any topics that are dealt with in one book and missing in the other. The differences are mostly a matter of emphasis. Both books are very rich in insights and examples. I summarized both books in concept maps. In this post I describe my concept map of Halpern’s book.

Critical thinking         … is about the application of problem-solving and cognitive (thinking and learning) skills to problems in general and ill-defined situations in particular. The problem-solving skills are used in strategies that help resolve these problems. There is no point in applying one skill (say, problem-solving) and forgetting about the others. They all hang together. This even applies to the memory aspect of the learning skills. The scientific basis for critical thinking comes from the learning sciences, cognitive psychology in particular.

Key intellectual skill      Many of the problems in the 21st century (in business, government and our own lives) occur in complex situations that are characterized by ambiguity. The ambiguity is the result of the inherent complexity of the situation, the individual values (or personal goals) of the actors involved and even the emotional aspects of their perception of the situation.

Creativity      … (or innovativeness) can be enhanced by problem-solving strategies. The problems we face are increasingly complex and lack clear-cut solutions, yet we feel a need for desirable outcomes (competitiveness, popular support, etc.). Halpern describes creativity as searching people’s knowledge nets (i.e. memories) for remote ideas that can be connected to form new ideas or concepts and innovative solutions.

Fallacies and biases     Critical thinking is often associated with learning a large number of fallacies and biases that muddle our reasoning. Good reasoning is as essential in science as it is in real life. Many of our reasoning errors can be avoided by a better understanding of hypothesis testing and the use of statistics. Other forms of reasoning are based on building a correct argumentation. This can be improved by using argument analysis involving the critical assessment of assumptions and counterarguments.

Systems thinking       … also deals with complex situations, yet Halpern’s book contains no mention of it. It does, however, contain a reference to the systems approach in the form of a psychological application to creativity. She writes that some psychologists “believe it [creativity] is the confluence of many different types of variables that make creativity more or less probable. Of course, the creative individual is central to the process, but she or he works in a domain of knowledge (e.g., art, physics, decorating, teaching, horse breeding, and any other domain you can think of). The contributions of an individual to a domain are judged by the people in that field—the art critics, funding agencies for physics research, editors of decorating magazines, etc. The people in the field act as the gatekeepers for a domain—they decide what is and is not valued.” The systems approach is of course well known to psychologists, see e.g. “A brief history of systems approaches in counselling and psychotherapy” (Bauserman and Rule, 1995).

Churchman’s dialectical systems approach           … seems relevant to critical thinking in a variety of ways: (1) it can handle complex situations; (2) it can handle ambiguity; (3) it encourages creativity and innovativeness; (4) by looking at (and ideally involving) actors playing different roles it improves reasoning in a variety of ways, creating a propitious environment of sensitivity, synergy, and serendipity; (5) its full application requires a boundary critique, during which assumptions and counterarguments are identified and critically examined, thus avoiding a great many biases and fallacies. A straightforward methodological application of Churchman’s dialectical systems approach is described in Wicked Solutions (see also here, here or here).

Book abstract        The 5th Edition of Thought and Knowledge: An Introduction to Critical Thinking applies theory and research from the learning sciences, including cognitive science, to teach students the critical thinking skills they need to succeed in today’s world. Critical thinking is the use of purposeful, reasoned and open-ended cognitive skills and strategies to increase the probability of a desirable outcome when solving problems, formulating inferences, calculating likelihoods, and making decisions. The role of memory is elucidated and a broad range of fallacies, biases, and techniques of propaganda is described. Other key topics include deductive reasoning, argument analysis, hypothesis testing, statistics, decision-making, problem-solving skills, and the need for creativity. It is emphasized that most of the problems encountered in life are ill-defined. Twelve different strategies for generating and evaluating solutions are presented, including means–ends analysis, working backwards, random search, rules, hints, split-half method, brainstorming, contradiction, and analogies and metaphors. All of the strategies to enhance creativity involve searching an individual’s knowledge net so that remote ideas can be associated, analogies can be applied across domains of knowledge, and information that is stored in memory can become available.

The companion website to the book is available from routledgetextbooks.com
The abstracts of the book and the individual chapters can be downloaded here (pdf)


Thinking and deciding: concept map

I am currently studying two books, one by Jonathan Baron (Thinking and deciding, 2nd edition) and the other by Diane Halpern (Thought and knowledge: an introduction to critical thinking). My need is for a thorough overview of both to better situate the systems approach in the general theory. The common factor is that all three – deciding in general, critical thinking and the systems approach – use heuristic methods to deal with ambiguity, biases, fallacies, and ill-founded assumptions in one way or another. The question is: how do they relate to each other? From the look of things Baron has the most general theory, which – according to his own theory – should allow for some transfer of learning to the other two. So, that’s why I will start with Baron. I have already made some preparations in 5 previous posts in the form of an abstract for each of the 24 chapters, which I used for creating a meaningful concept map, organized in 4 columns. It works for me, so I hope it will do the same for you, in spite of its roughly 100 concepts. (Halpern’s book will follow in a future post.)

Search-inference framework              The cognitive psychologist Baron describes thinking about beliefs, actions and goals in a single, fundamental framework for the thinking processes of search and inference. This process has the following main steps: (1) thinking is kindled by the realization (dubbed ‘doubt’) that a certain belief in a certain action may not be justified; (2) thinking, especially rational thinking, then seeks to resolve those doubts by searching for alternatives (or possibilities) of action, belief and even goals; (3) evidence is then sought to justify, or rather infer, the strengths of certain courses of action or belief; (4) the process ends with a decision about how well certain actions are expected to help achieve certain goals. The framework embodies the so-called principles of open-mindedness to avoid biases, fallacies, ill-applied heuristics, and ill-founded assumptions in our understanding and decision-making. The framework also serves as a basis for a thorough approach to the normative, descriptive, and prescriptive aspects of judgment and deciding. The framework could also be used as a heuristic or thinking method in its own right.
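The search-inference cycle above can be sketched in a few lines of code. This is a minimal toy sketch, not Baron's own formalization: the possibilities, goals, and numeric weights are all invented for illustration, and the "inference" step is reduced to a weighted sum of evidence against goals.

```python
# Toy sketch of Baron's search-inference framework (invented names and
# weights): search produces possibilities, goals, and evidence; inference
# scores each possibility by how strongly its evidence serves the goals.

def search_inference(possibilities, goals, evidence):
    """Pick the possibility whose evidence best serves the goals.

    goals: {goal_name: importance_weight}
    evidence: {possibility: [(strength, goal_name), ...]}
    """
    def strength(p):
        # Infer a possibility's overall strength by weighing each piece
        # of evidence by the importance of the goal it bears on.
        return sum(w * goals.get(g, 0) for w, g in evidence.get(p, []))
    return max(possibilities, key=strength)

goals = {"health": 0.7, "cost": 0.3}
evidence = {
    "cycle to work": [(0.9, "health"), (0.8, "cost")],
    "drive to work": [(0.1, "health"), (0.3, "cost")],
}
best = search_inference(list(evidence), goals, evidence)
print(best)  # cycle to work
```

Open-mindedness, in this caricature, amounts to keeping the `possibilities` list and the `evidence` dictionary open for additions before calling `max`.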

Part I: Thinking     … is all about inference. Human inference – as opposed to its automated form in artificial intelligence – is studied within the field of cognitive psychology. Charles Sanders Peirce divided inference into deduction, induction, and abduction. Deduction uses formal logic, induction uses modern science to generalize from controlled observations, and abduction uses insight to try to diagnose (i.e. to hypothesize the most likely conclusion in) an ambiguous, uncontrolled situation. Abduction leads to relatively uncertain or provisional conclusions, but in daily life there is often no alternative to some sort of trial-and-error. If it works, it is OK. If it doesn’t, that suggests the need to search for another possibility that fits the evidence better. This may involve reweighing the evidence. Formal logic has little use in the real world. Science is great, but if things get complex, which they do in reality, it has its limitations. The problem with abduction is that trial-and-error is beset with difficulties. So we want to infuse as much insight as possible. Thinking, especially abductive thinking, requires all human faculties, including memory, intelligence and creativity. It is good to be aware of the biases and errors one may fall victim to, including rationalization and resistance to correction. Social systems thinking belongs in the category of abductive thinking methods.

Part II: Probability     … is all about statistics. Statistics can be subdivided into probability theory, hypothesis testing, and correlation theory. Statistics is useful because it is able to handle uncertainty. Baron is interested in it as a tool in decision making but also as a tool for identifying human biases. Cognitive studies of probability judgment have shown a great number of biases, including anchoring, the gambler’s fallacy, the availability heuristic, and hindsight bias. A serious problem in hypothesis testing is that we may be testing the wrong hypothesis. In fact, people tend to rely on direct testing rather than indirect testing. This is known as the congruence heuristic, which works like a confirmation bias or myside bias. Correlation theory poses its own problems. People tend to confuse correlation with causality, which is not a good idea, because addressing a single cause in a complex situation often makes things worse (a well-known phenomenon among systems thinkers). This is mostly due to attention bias.
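One of the biases named above, the gambler's fallacy, is easy to probe with a short simulation. The sketch below (my own illustration, not from Baron's book) estimates the chance of heads immediately after a run of heads; independence predicts it stays at about one half, while the fallacy predicts tails is "due".

```python
# Simulate fair coin flips and estimate P(heads) right after a streak
# of heads. The gambler's fallacy expects this to drop below 0.5;
# independence of flips says it should stay near 0.5.
import random

random.seed(42)  # arbitrary seed, for reproducibility

def heads_after_streak(n_flips=100_000, streak=3):
    flips = [random.random() < 0.5 for _ in range(n_flips)]
    # Collect every flip that immediately follows `streak` heads.
    after = [flips[i] for i in range(streak, n_flips)
             if all(flips[i - streak:i])]
    return sum(after) / len(after)

p = heads_after_streak()
print(0.45 < p < 0.55)  # True: no compensation after a streak
```

With 100,000 flips the estimate is tight enough that any "compensating" effect of the size the fallacy implies would be plainly visible, and none is.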

Part III: Decisions          … is the part that looks at some practical issues in decision making, especially quantitative judgment and moral thinking. Baron stresses the adaptability of utility theory as a normative model for handling trade-offs and for analysing fairness, social dilemmas, and decisions about the future, e.g. by discounting. Cognitive studies have shown that people’s quantitative judgment is often way off, especially when they try to assess a situation without the use of calculations (“unaided holistic judgment”), which suggests the need for thoroughgoing decision analysis. Statistics and utility theory are two normative models with a high transfer of learning, i.e. with general applicability in a wide range of contexts.

Systems approach       Churchman designed his systems approach with generality – of the type that enables learning transfer – in mind. In the concept map we can read that “heuristic methods support decision making, which mostly deals with complex material, where a structure serves a purpose, and that needs to be disentangled.” It also shows that “principles of active open-mindedness prevent insufficient or partial search and the common fallacy of overweighing the evidence.” In my view, utility theory, decision analysis and statistics are all very good normative or prescriptive models, but in many cases these models are best used after applying the systems approach. The main reason is that the systems approach is more powerful for disentangling the ‘structure that serves a purpose’ (i.e. the system, however it may be bounded) and for conducting the search for possibilities, goals and evidence. When it comes to learning critical thinking and decision-making skills, the dialectical systems approach has a lot to offer. The heuristic model of Churchman’s categorical framework would seem highly suited for operationalizing Baron’s search-inference framework and for bringing to life in a natural and coherent manner the notions, theories, models and techniques of critical thinking. Food for thought. The dialectical systems approach is particularly good at avoiding the myside bias and at learning by understanding, using various systems principles (deception-perception, maximum loop).

Logic, statistics, et cetera        Logic is the main theme of part I, while statistics is the main theme of part II. It is interesting to note that Churchman started his career with a PhD on logic, somewhere in the Dust Bowl years of the mid-1930s. It was called “Towards a General Logic of Propositions” (Churchman 1979, xi). Generality is an ideal (e.g. for learning transfer), which is very hard to approximate. Churchman concluded that logic walks open-eyed into the trap of tautology or self-reference. Statistics is another theme in his early career. It had a double appeal to him in those years: statistics was a developing field of study with great practical utility in scientific management, the American discovery at the beginning of the 20th century. Thanks to the war he could apply his knowledge of statistics to improve the quality of small arms munitions. This gave him his first successful experience in real management and set him up for his contributions to the development of management science in the 1950s, which is about decision-making, the main theme of part III. Management science uses computer-based, quantitative models. Churchman found out their moral limitations and designed the dialectical systems approach. The quantitative and moral dimensions correspond to the two main sub-themes of part III.

Conclusion       Reality is ambiguous, evasive, uncertain, unstructured. Decision-making cannot be based on logic or quantitative optimization alone, because they are non-ambiguous in nature and seek certainty where none can be found. Statistics cannot solve this either, mainly because it tries to look at uncertainty in particular relationships, not at the situation as a whole. Certainty is an ideal that can only be approximated by giving the place of honour to ambiguity. The denial of uncertainty is at the root of many human problems. A true framework of ambiguity attempts to look at decisions from a broad perspective. Churchman’s dialectical systems approach is a very good starting point. Wicked Solutions is based on that approach and provides a methodology for its practical application.

Posted in General | Leave a comment

Decisions and plans (Part III of Baron, 1994)

In a previous post I said I would cover parts II and III of Baron’s guide to the psychology of human decision-making entitled Thinking and Deciding (1994 2nd edition). Part III is about decision making, including both the decisions that affect only the decision maker and the decisions that affect others, that is, decisions that raise moral questions. It also examines long-term planning, with special emphasis on the choice of personal goals. The part is particularly concerned with the inference of a course of action from our goals and from evidence concerning the consequences of our options for achieving them.

Normative theory of utility    … concerns the theory of how we should choose among possible actions under ideal conditions. It is concerned with inference, not search. The basic idea is that utility – the desirability of outcomes – should be maximized, so the best decision is the one that best helps us achieve our goals. In the case of conflict it is a matter of maximizing total utility. The concept of utility represents whatever people want to achieve: virtue, productive work, enlightenment, respect, love. It is not the same as monetary reward, happiness or satisfaction; it covers the whole range of human goals. This model serves as a theoretical ideal that we can use to justify prescriptive models for decision analysis, whether in business, government, or medicine. There are three main aspects: (1) expected-utility theory, which considers the trade-off between probability and utility; (2) Multiattribute Utility Theory (MAUT), which considers the trade-off between different goals; and (3) decisions that involve conflict among the goals of different people. Normative utility theory helps justify the prescriptive model of actively open-minded thinking, because such thinking could find a neglected possibility with a higher expected utility. The same applies to evidence, goals, and biased weighing of evidence. Ideally, belief formation should take place after a thorough search and unbiased evaluation of evidence. Search thoroughness must be in proportion to the expected benefits. (Ch. 16 of Baron, J. 1994 Thinking and deciding 2nd ed.)
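
The core calculation of expected-utility theory is simple enough to sketch in a few lines of Python. The options, probabilities and utilities below are made up for illustration; only the maximization rule itself comes from the theory.

```python
# Expected-utility choice among options: each option is a list of
# (probability, utility) pairs over its possible outcomes.
# Illustrative numbers only, not from Baron's text.

def expected_utility(outcomes):
    """Sum of probability-weighted utilities for one option."""
    return sum(p * u for p, u in outcomes)

options = {
    "safe":  [(1.0, 50.0)],                # certain, moderate outcome
    "risky": [(0.5, 120.0), (0.5, 0.0)],   # gamble with a higher mean
}

best = max(options, key=lambda name: expected_utility(options[name]))
# risky: 0.5 * 120 + 0.5 * 0 = 60 > 50, so the gamble maximizes expected utility
```

Note how the certain option loses even though it never pays zero: the theory trades certainty against the size of the prize.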

Decision analysis       … is normative utility theory of decision making applied to real-life situations such as airport location, nuclear waste disposal, and school desegregation. Decision analysis, as discussed in this chapter, attempts to assess the utilities and probabilities of outcomes to calculate the expected utility of various options. It is not a simple cost-benefit analysis, because that would confuse market value with subjective value. Instead, outcomes are represented by how much they are estimated to help achieve a given set of goals. Decision analysis can also be used as a second opinion, to evaluate decisions after they are made, or to resolve conflict among different decision makers. To be useful, the need for thorough search must be respected. The results of the analysis must be contrasted with the initial intuition to see why they differ. Various techniques of utility measurement are discussed, including prediction, direct scaling, difference measurement, a standard gambling approach, conjoint measurement, and Multiattribute Utility Theory (MAUT). Special emphasis is placed on aspects such as trade-offs and the value of human life. When it comes to teaching decision making to 8th-grade students, a promising approach may be to give additional emphasis to the types of errors that people make in the absence of decision analysis, such as short-sightedness (neglect of relevant goals), single-mindedness (neglect of alternatives and evidence), impulsiveness, and neglect of probability. (Ch. 17 of Baron, J. 1994 Thinking and deciding 2nd ed.)
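
MAUT's weighted-sum form can be sketched in a few lines. The attributes, weights and scores below are entirely hypothetical; the point is only that explicit weights make the trade-offs among goals visible.

```python
# Multiattribute utility (MAUT) as a weighted sum: each option is scored
# 0-1 on each attribute, and the weights express trade-offs among goals.
# Attribute names, weights, and scores here are invented for illustration.

weights = {"cost": 0.5, "safety": 0.3, "convenience": 0.2}

options = {
    "site_A": {"cost": 0.9, "safety": 0.4, "convenience": 0.6},
    "site_B": {"cost": 0.5, "safety": 0.9, "convenience": 0.7},
}

def maut_score(scores, weights):
    """Weighted sum of attribute scores for one option."""
    return sum(weights[a] * scores[a] for a in weights)

# Rank the options from best to worst overall score
ranked = sorted(options, key=lambda o: maut_score(options[o], weights),
                reverse=True)
# site_A wins (0.69 vs 0.66), but only because cost carries half the weight
```

Changing the weights, not the scores, is often what a real disagreement between decision makers is about.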

Choice under uncertainty       This chapter provides descriptive theory on the way in which we respond to risk or uncertainty about the outcome of an option. We tend to be averse to risks when the alternative is a gain that is certain. Biases in decisions under uncertainty include: (1) neglect of probability; (2) preference reversals and contingent weighting; (3) the Allais paradox; (4) prospect theory, which provides the simplest possible ‘explanation’ for observed violations of expected-utility theory; (5) the certainty effect; (6) anticipated emotional effects of outcomes, including regret, rejoicing, disappointment, elation and their rationality in decision making; (7) the ambiguity effect in the case of unknown risks, such as those of nuclear power and genetic engineering; (8) aversion to missing information; (9) deferral under conditions of uncertainty or conflict. It is concluded that some apparent violations, such as those caused by regret, are not necessarily violations at all, since an overly simple analysis could have neglected real emotional consequences. Other violations might result from using generally useful heuristics in situations where they are harmful to achieving our goals. In the case of ambiguity, deferral points to the need for acquiring more information. All of these violations occur in holistic judgment, but they can be avoided if utility theory and decision analysis are used. (Ch. 18 of Baron, J. 1994 Thinking and deciding 2nd ed.)
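
Prospect theory's two functions are easy to write down. The parameter values below (alpha = 0.88, lambda = 2.25, gamma = 0.61) are the commonly cited Kahneman-Tversky estimates, used here purely for illustration.

```python
import math

# Prospect theory's value function and probability-weighting function.
# The parameters ALPHA, LAMBDA, GAMMA are the commonly cited estimates,
# not values from Baron's text; the functions are a sketch only.

ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.61

def value(x):
    """S-shaped value function: concave for gains, steeper for losses."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

def weight(p):
    """Inverse-S weighting: small p is overweighted, large p underweighted."""
    return p ** GAMMA / ((p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA))

# Loss aversion: a loss looms larger than an equal-sized gain
loss_aversion = abs(value(-100)) > value(100)
# Certainty effect: a 1% chance feels like more than 1%
overweighted = weight(0.01) > 0.01
```

Together these two curves "explain" the certainty effect, risk aversion for gains, and risk seeking for losses, which is why the chapter calls prospect theory the simplest account of the observed violations.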

Choice under certainty        … concerns decisions in which we analyse as if we knew what the outcomes would be. Many biases found in such decisions involve an incomplete search for goals. In the extreme form only a single goal is considered: the “bottom line,” the defeat of communism, the protection of abstract “rights.” Such single-mindedness often involves social roles, as in the case of soldiers, scholars, politicians, lawyers and the like, but it can be cured by actively open-minded searching for other goals that may on some occasions outweigh professional ones. Attention to a single, dominant dimension can lead to unnecessary disagreement between groups and to violations of each individual’s true preferences. Making trade-offs helps – when we have the time to do it. Biases (or heuristics) in making trade-offs include: (1) the prominence effect; (2) attitude to (non-)compensatory decision strategies; (3) the status quo (endowment) effect; (4) omission (inaction) bias; (5) opportunity costs (easily neglected); (6) reference point framing; (7) integration vs. segregation in mental accounting; (8) reference price manipulation; (9) intransitivity of preferences; (10) elimination by attributes; (11) asymmetric dominance; and (12) compromise. Most of these heuristics save time, so we can use them provided we do so knowingly. Often a good way to avoid getting into trouble is to develop counterheuristics (“What would we do if …”). (Ch. 19 of Baron, J. 1994 Thinking and deciding 2nd ed.)

Quantitative judgment       This chapter examines further our ability to make consistent quantitative judgments without the aid of formal theories. Quantitative judgment seeks to improve on unaided holistic judgment by evaluating cases on the basis of a set of evidence and given goals with respect to a set of criteria or attributes, somewhat similar to Multiattribute Utility Theory (MAUT). It consists of rating (numbers, grades), ranking (order), and classifying (assigning to one of two groups). Topics of quantitative judgment include: (1) multiple linear regression; (2) the lens model (a statistical prediction model that looks at decision making and goal attainment on the basis of cues for piecemeal judgments and their combination into an overall score); (3) the mechanism of judgment; (4) non-numerical impression formation; (5) averaging, adding, and number of cues; (6) representativeness in numerical prediction; and (7) classification. The main lesson of the research on quantitative judgment is that people are overconfident in their holistic judgments. Increased awareness of this ought to make us more tolerant of one another, as well as more accepting of new (computer-based) methods for decision making. This is less obvious when moral issues (public policy, personal lives) are involved, over which we tend to disagree. (Ch. 20 of Baron, J. 1994 Thinking and deciding 2nd ed.)
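
The regression idea behind this line of research ("bootstrapping the judge") can be sketched with a single cue and made-up ratings; real lens-model studies use multiple cues and then compare the model's accuracy with the judge's own.

```python
# Fit a linear model to a judge's holistic ratings from the cue they
# attend to, in the spirit of the lens model. One cue keeps the
# least-squares fit closed-form; the data are invented for illustration.

cases = [  # (cue value, judge's holistic rating)
    (1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8), (5.0, 10.1),
]

n = len(cases)
mean_x = sum(x for x, _ in cases) / n
mean_y = sum(y for _, y in cases) / n

# Least-squares slope and intercept for: rating ~ a + b * cue
b = (sum((x - mean_x) * (y - mean_y) for x, y in cases)
     / sum((x - mean_x) ** 2 for x, _ in cases))
a = mean_y - b * mean_x
# The fitted line is the judge's implicit policy, stripped of random error
```

The striking empirical result the chapter reports is that such a model of the judge often predicts outcomes better than the judge does, because it applies the same weights consistently.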

Moral thinking       Certain issues obviously involve moral questions, including abortion, property rights, capital punishment, aid for the poor, and almost all other political issues, but wherever our choices affect the utilities of others, a moral decision must be made, including in our own lives. In the first part of this chapter, the nature of moral judgment is examined, with special attention to its universality. In the second part, it is argued that utilitarianism provides a way to understand and analyse moral issues in terms of the consequences of decisions. It also provides a way to think critically and constructively about our basic moral intuitions, the beliefs we acquired as children. The distinction between “intuitive” (everyday, prescriptive) and “critical” (reflective, normative) moral thinking is examined in detail. According to R.M. Hare we should usually follow our moral intuitions, but we should also be able to think critically about them, and about our decisions, in particularly important cases. Kohlberg’s research suggests that Hare’s distinction corresponds to different “levels” of moral thinking, which suggests that many people are not capable of critical-level thinking about many issues. Of course, some of the biases in moral thinking are exactly the biases that occur in all thinking: failure to consider goals (in this case, moral goals), biased search for evidence, biased inference, and failure to examine critically our own intuitive rules. (Ch. 21 of Baron, J. 1994 Thinking and deciding 2nd ed.)

Fairness and justice         Moral issues range from the allocation of organs for transplantation to patients to the assignment of penalties for companies that harm the environment. The question is: who gets what in terms of rewards and punishments? Utilitarianism holds that fairness is whatever yields the best overall consequences in the long run, summed across everyone. Alternative normative theories hold that fairness is an extra consideration, beyond consequences. Such alternatives can even work prescriptively in a utilitarian context, provided people have a good understanding of the relation between fairness (to the individual) and utility (for the greater good). Experiments have shown that people desire to see fairness and will try to bring it about, even when utility is not maximized or when it demands a sacrifice on their part. Equity theorists also maintain that people want rewards and punishments to be proportional to “inputs” of some sort (effort, ability, end result). The key question is: what is the ultimate goal? Perhaps the most important problem is that people do not understand the justifications of the principles that they endorse, even when these principles can be linked to utilitarianism or some other theory. (Ch. 22 of Baron, J. 1994 Thinking and deciding 2nd ed.)

Commons dilemmas        Because so many situations can be analysed as social dilemmas, much of the philosophy and psychology of morality is contained in decisions involving problem situations in which the narrow self-interest of each person in a group conflicts with the interest of the group as a whole. Examples include: cheating on one’s taxes, the arms race, accepting bribes, pollution, having too many children. Economic systems induce people to do their share of the labour and moderate their consumption. More cooperation would solve many human problems, including war. The prisoner’s dilemma is a standard example of why two rational persons might not cooperate. We often fail to recognize such dilemmas for what they are and inappropriately decide to defect instead of cooperating. With a utilitarian approach it may be possible to avoid such biases. Prescriptively, we can approximate a normative decision theory by drawing on traditional modes of cooperative behaviour and by using certain heuristics that counteract the most common causes of defection and insufficient thought. The problems with three theories are discussed: cooperative theory, utilitarianism, and self-interest theory, which comes in two forms: (1) nobody cares about other people; and (2) individuals should look out for their own interest. All theories have limitations, even an explicitly more balanced one such as weighted utilitarianism. Important motives in social dilemmas include: altruism, competition, fairness, equality, envy, fear, and greed. (Ch. 23 of Baron, J. 1994 Thinking and deciding 2nd ed.)
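
The prisoner's dilemma itself takes only a few lines to state. The payoffs below (3, 0, 5, 1) are the textbook numbers, chosen for illustration; any payoffs with the ordering T > R > P > S produce the same dilemma.

```python
# One-shot prisoner's dilemma. Moves: "C" (cooperate) or "D" (defect).
# Payoffs follow the standard ordering T > R > P > S.

PAYOFF = {  # (my move, other's move) -> my payoff
    ("C", "C"): 3,  # R: reward for mutual cooperation
    ("C", "D"): 0,  # S: sucker's payoff
    ("D", "C"): 5,  # T: temptation to defect
    ("D", "D"): 1,  # P: punishment for mutual defection
}

def best_reply(other_move):
    """The move that maximizes my payoff against a fixed other move."""
    return max("CD", key=lambda my: PAYOFF[(my, other_move)])

# Defection dominates: whatever the other player does, "D" pays more ...
dominant = {best_reply("C"), best_reply("D")}
# ... yet mutual cooperation beats mutual defection for both players.
```

That last pair of comments is the whole dilemma: individually rational replies lead both players to the outcome each of them ranks below mutual cooperation.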

Decisions about the future       Planning and the potential conflict between goals for future outcomes and goals for immediate outcomes are roughly analogous to those between self and others: we can think of our future selves as somewhat different people from our present selves. We tend to be biased in favour of our present self and need methods of self-control. A distinction is made between plans and policies. Some policies are made all at once, whereas others may evolve from precedents. Plans and policies create new goals or change the strength of old ones. The chapter deals mostly with personal goals, including the choice of personal goals. We must strike a balance between the risks of striving for high achievement and the principle that our goals should be achievable. There are four good reasons to stick to plans: (1) the involvement of others; (2) the value of dependability; (3) increased chances of success; and (4) the value of trust in ourselves. Bad reasons for sticking to plans include: (1) the status quo bias; (2) the sunk-cost effect; and (3) escalation of commitment. Other subjects include the rationality of discounting, methods of self-control, emotional aspects, identity formation and the importance of morality in choosing the basic goals for our lives. A key consideration is that the essence of morality is not self-sacrifice but benefiting others, and there are many ways in which it is possible for individuals to “do well by doing good.” (Ch. 24 of Baron, J. 1994 Thinking and deciding 2nd ed.)
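
The preference reversal behind many self-control problems can be sketched with a hyperbolic discount function. The discount parameter k and the reward amounts below are illustrative only.

```python
# Hyperbolic discounting: present value = amount / (1 + k * delay).
# Unlike exponential discounting, it produces preference reversals:
# the smaller-sooner reward wins when it is imminent, but the
# larger-later reward wins when both lie far in the future.
# The parameter k = 1.0 and the amounts are made up for illustration.

def hyperbolic(amount, delay, k=1.0):
    return amount / (1 + k * delay)

# 10 units now vs. 15 units one period later: take the 10 now ...
now_choice = hyperbolic(10.0, 0) > hyperbolic(15.0, 1)
# ... but push both options 10 periods away and the 15 wins.
later_choice = hyperbolic(10.0, 10) < hyperbolic(15.0, 11)
```

The reversal is why precommitment devices work: the present self can bind the future self while the larger-later option still looks better.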


Climate politics and climate modeling

Last year two Dutch left-wing parties, the social-democratic PvdA and the green GroenLinks, started work on a climate act that aims to reduce Dutch CO2 emissions by 55% by 2030 and by at least 95% by 2050, compared to 1990. Meanwhile work is continuing on a national climate and energy agreement. Over the past months doubts started to creep in as to the costs, feasibility and scientific basis of this act. In this post I will present the view of one of the most reasonable-sounding sceptics, Prof. Dr. Kees de Lange, professor emeritus in chemical physics. The overview is very concise to forestall reader confusion or disinterest. A longer version (in Dutch) can be accessed here. I will refrain from asking certain questions or drawing any conclusions.

Temperature measurement      Climate politics rests on the assumption that the increase of atmospheric CO2 by human activity causes global warming. According to De Lange we lack reliable data to prove a significant and steady rise of the average global temperature. Conventional measurement of the average global temperature is very difficult. However, the highly reliable UAH satellite dataset of the global lower atmosphere (1979-Sept. 2018) shows that the temperature rise over the past 20 years has halted at about 0.2 deg. C above the 1981-2010 average.

Carbon dioxide        It is assumed by many that anthropogenic carbon dioxide is a key driver of global warming. Therefore some people speak of anthropogenic global warming (AGW). Many natural scientists, including geologists, doubt this hypothesis. They point to the ‘inconvenient truth’ that geologically there is hardly any correlation between CO2 and global temperature. Water in its three phases of vapour, liquid and ice is a much more important driver than CO2.

Climate modelling       Climate modelling is incredibly complex. We lack the necessary reliable data over a sufficiently long period, and we lack a whole range of scientific insights necessary to be able to build reliable models. Science progresses, so it cannot be excluded that one among the 100 models we have now is able to predict future climates reliably. However, we won’t know that until after a century has passed. Climate models will have to be very precise and reliable to be able to separate out the effect of CO2 from other natural factors, including the role of water and solar activity.


Cartoon downloaded from https://wattsupwiththat.com/2013/09/03/the-tuesday-tittering-the-big-knobs-of-climate-control/  (watch the CO2 “control” knob!!!)

Climate politics       The main worry of climate change for The Netherlands is sea level rise. The country is notoriously vulnerable and the sea has always been its main enemy (and friend). Work is underway to secure the country against a rise of 1 m by 2100, so there is no immediate threat. During ‘recent’ ice ages the sea level was often 80 m lower than it is at present. The main assumption behind climate politics is that we can regulate the sea level and the average global temperature in a very controlled way. This is what the people are promised, sometimes without saying as much. Supposedly, the main knob is CO2. However, this assumption is unproven.

Energy policies       Modern societies depend on a reliable electricity supply. Without it they come to a halt. In an age of globalization this means that the electricity supply – as well as a host of other production factors – must be as cheap and reliable as possible. Global predictions show that coal, natural gas and oil will produce about 75% of the world’s electricity in 2040. The remainder is mostly nuclear and hydro energy. There is hardly any scope for the latter in The Netherlands, but nuclear power could be an option. In the latest climate agreement, the Netherlands will reduce fossil energy production to only 5% by 2050, a wide difference indeed, especially if one considers that the difference is to be made up mostly by renewables such as wind and solar power (in the IEA prediction no more than 1%). These seem to lack the necessary reliability, especially in the Dutch climate zone, hence the need for cost-efficient nuclear power, even in the short term, especially if the hitherto uncertain AGW hypothesis can be shown to be correct. The costs of the proposed energy transition are enormous, although there may be gains too. The effect on sea level rise will be minimal, except in the highly unlikely event that the IEA predictions are way off.


Probability and belief (Part II of Baron, 1994)     

In my previous post I said I would cover parts II and III of Baron’s guide to the psychology of human decision-making entitled Thinking and Deciding (1994 2nd edition). So here is part II, which is divided into five chapters to explore three major approaches to the analysis of belief formation: probability theory, the theory of hypothesis testing, and the theory of correlation. It concludes with a discussion of biases in belief formation in general. I am now halfway through the book and I can only say that this is one of the greatest books I have ever read. What follows are just abstracts of each of the five chapters of part II. The next blog will cover the nine chapters of part III. I am looking forward to reading the last part of the book.

Probability theory       Probability theory is a normative, and possibly not a prescriptive, theory for evaluating numerically the strength of our beliefs in various propositions. Like logic, it is a theory of inference, but unlike logic, it concerns degrees of belief rather than certainty. Higher numbers are assigned to more certain beliefs. If quantification is impossible, the rules of probability theory do not apply. Probability studies can show: (1) the rationality of beliefs; (2) people’s understanding of the model itself; and (3) the accuracy of communication about belief strengths. Any probability statement involves two subquestions: (1) its truth value; and (2) its evaluation. The three main theories say that probability is about: (1) frequencies (reflecting past observations); (2) logical possibilities; and (3) personal judgments, in which case the answers to the two subquestions are not the same. The problem with the frequency theory is that any event can be classified in many different ways. The problem with the logical theory (e.g. for coin-flipping problems) is that it has little everyday use. In the widely applicable personal theory, probability judgment can be based on any of one’s beliefs and knowledge, including knowledge about frequencies and logical possibilities. Different people have different beliefs and knowledge, so two reasonable people might differ in their judgments. Rules for constructing and evaluating probability judgments and Bayes’s theorem are discussed in detail. (Ch. 11 of Baron, J. 1994 Thinking and deciding 2nd ed.)
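
Bayes's theorem itself is a one-liner. The numbers below are the classic (hypothetical) medical-test illustration: a rare condition, a sensitive test, and a modest false-positive rate.

```python
# Bayes's theorem for a binary hypothesis H given evidence E:
#   P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]
# The prior and likelihoods below are illustrative, not from the book.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of H after observing evidence E."""
    joint_h = p_e_given_h * prior
    joint_not_h = p_e_given_not_h * (1 - prior)
    return joint_h / (joint_h + joint_not_h)

# 1% base rate, 90% sensitivity, 5% false-positive rate:
p = posterior(0.01, 0.90, 0.05)
# Despite the positive test result, the posterior is only about 15%
```

The low posterior surprises most people, which previews the base-rate neglect discussed in the next chapter.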

Errors in probability judgment       Our belief strengths, as manifest in our judgments of probability, seriously violate the normative theory of probability. Since the late 1960s psychologists have shown that they are often systematically out of order. Typically we attend to the wrong variables. Misleading heuristics, naïve theories, and the basic processes that cause poor thinking contribute to these errors. With practice and education we can correct these errors and even become very good judges indeed. We tend to underestimate high frequencies and overestimate low frequencies. A similar pattern was found for confidence in the correctness of judgement. Overconfidence can be reduced by asking for ‘against’ reasons. Consistent errors of judgment can be traced to e.g.: (1) the  representativeness heuristic; (2) the gambler’s fallacy; (3) the availability heuristic; (4) hindsight bias; (5) anchoring and adjustment; and (6) averaging. Active open-mindedness can help to guard against some of these violations. When probability judgments are very important, computers can help us as well. (Ch. 12 of Baron, J. 1994 Thinking and deciding 2nd ed.)
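
The gambler's fallacy is easy to test against simulated coin flips: with independent fair flips, heads is no more likely after a run of tails. The simulation below is illustrative only.

```python
import random

# After three tails in a row, is heads "due"? For independent fair
# flips the conditional frequency of heads after T-T-T stays near 0.5.

random.seed(42)
flips = [random.choice("HT") for _ in range(200_000)]

# Collect the flip that follows every T-T-T run
after_ttt = [flips[i + 3] for i in range(len(flips) - 3)
             if flips[i:i + 3] == ["T", "T", "T"]]
p_heads_after_ttt = after_ttt.count("H") / len(after_ttt)
# ~0.5: the coin has no memory, whatever the preceding run looked like
```

The representativeness heuristic explains the fallacy: a short run of tails does not "look like" a fair coin, so people expect the sequence to correct itself.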

Hypothesis testing      … concerns the selection of evidence: asking the right question to find out how much to believe some hypothesis. Detectives and physicians do this all the time. A hypothesis is a proposition that we evaluate, or test, by gathering evidence concerning its truth or, more generally, its probability. Hypothesis formulation is the search for possibilities to test. Good experimental design involves asking questions that enable efficient inferences. When a predicted result is found, there is still the risk of false confirmation. If not, there is the risk of false rejection. Any experiment involves additional assumptions other than the hypothesis itself. Certain biases can be avoided by the active open-minded consideration of other hypotheses than those we favour. Popper’s theory for refutation of “bold conjectures” is criticized and contrasted with other theories of scientific advance by John Platt and Imre Lakatos. A good heuristic is to “be sure to consider alternative hypotheses.” Several biases prevent this, including the: (1) congruence bias (overreliance on directly testing a given hypothesis); and (2) information bias (seeking information that is not worth knowing). (Ch. 13 of Baron, J. 1994 Thinking and deciding 2nd ed.)
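
The difference between a merely confirming question and a discriminating one can be expressed with likelihood ratios. The probabilities below are invented for illustration.

```python
# Congruence bias: directly "testing" a favoured hypothesis with a
# question whose answer is also likely under the alternatives tells
# us little. A question's diagnosticity shows up in how far the
# posterior moves. All numbers here are hypothetical.

def posterior(prior, p_yes_h, p_yes_alt):
    """P(H | answer 'yes') for a binary hypothesis vs. its alternative."""
    return (p_yes_h * prior) / (p_yes_h * prior + p_yes_alt * (1 - prior))

prior = 0.5
# Confirming question: "yes" is expected under H and under the alternative
weak = posterior(prior, 0.9, 0.8)     # posterior barely moves (~0.53)
# Discriminating question: "yes" is expected under H but not the alternative
strong = posterior(prior, 0.9, 0.2)   # posterior moves a lot (~0.82)
```

This is the heuristic "consider alternative hypotheses" in numerical form: a question is only as informative as the difference it makes between hypotheses.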

Correlation theory      … deals with beliefs that concern the correlation between two variables. We tend to perceive data as showing correlations when we already believe that we will find a correlation. Often we are concerned with such relationships because we want to decide whether to manipulate one thing in order to affect another. The normative theory is provided by statistical definitions of correlation. The assumptions behind it do not apply to every situation. Descriptively, many people systematically violate the normative theory. Prescriptively, we may be able to come closer to the normative model by considering all the relevant evidence, including the evidence against what we already believe. Correlation is sometimes confused with causality, which requires ruling out other explanations. Contingency exists if one variable causally affects another. Our perception of contingency is an important determinant of our success at achieving our goals. The judgment of correlation tends to suffer from attention bias, i.e. ignoring counterevidence. In part, this insensitivity may result from our own naïve theories of the relation between beliefs and evidence. (Ch. 14 of Baron, J. 1994 Thinking and deciding 2nd ed.)

Biases and beliefs      Irrational belief persistence is the major bias in thinking about beliefs of all types, including the strongly held personal beliefs that shape one’s moral, social, and political attitudes. This bias is the main justification for the need to be open to new beliefs and to evidence against our old ones. Ideally, beliefs conform to probability. This does not work for complex beliefs that are formed over a long time. The irrational persistence of belief is one of the major sources of human folly (Bacon 1620). It involves two types of biases: (1) overweighing or underweighing of evidence in favour of or against a belief, respectively; and (2) the failure to search impartially for evidence. Some belief persistence is not irrational. It drives some very good scientific research. The amount of belief persistence that is rational is whatever best serves the goals of the thinker. How much explaining away or ignoring of evidence is acceptable? We could consider the primacy effect and the neutral-evidence principle. Determinants of belief persistence include our desire to believe that we are good thinkers and morally good people, as well as external demands of accountability for decisions, or stress. Kitchener, King et al. identified seven developmental stages in thinking about truth, reality, knowledge, and justification that people move through as they get older. In stage 7, critical inquiry helps identify better approximations of reality. Finally, three major forms of groupthink are discussed. (Ch. 15 of Baron, J. 1994 Thinking and deciding 2nd ed.)
