This post continues the one on Baron’s search-inference framework (here). It follows chapters 2 and 3 of Baron’s textbook (1994 edition) on human decision-making entitled Thinking and Deciding. I will make references to critical thinking and systems thinking where appropriate.
Cognitive psychology … is a relatively young branch of science, which seeks to understand human thinking processes. The two most referenced scientists in Baron’s book are Daniel Kahneman and Amos Tversky (1937-1996), who did most of their important work together in the late 1970s. Kahneman won the 2002 Nobel Memorial Prize in Economics for their joint development of prospect theory, which is also dealt with in Baron’s book (chapter 18 in the 2nd edition). Prospect theory is central to human economic behaviour. It provides a descriptive model of decision-making under conditions of risk. It shows how human decision-making is driven by attitudes to risk rather than by the expected utility theory of classical economics. The general attitude is that of avoiding the bad feelings that follow economic loss. So in general people are more risk averse than optimization theory predicts.
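As a hedged illustration (a sketch of the standard formulation, not taken from Baron’s text), the Tversky-Kahneman (1992) value function with their published median-fit parameters shows why a “fair” 50/50 gamble feels unattractive even though its expected value is zero:

```python
# Illustrative sketch: the Tversky-Kahneman (1992) value function.
# ALPHA and LAMBDA are their median-fit parameters; other fits exist.

ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom larger than gains

def value(x):
    """Prospect-theory value of a single outcome x (gain if x >= 0, else loss)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * (-x) ** ALPHA

# A 50/50 gamble: win 100 or lose 100.
expected_value = 0.5 * 100 + 0.5 * (-100)            # classical expectation: 0
prospect_value = 0.5 * value(100) + 0.5 * value(-100)  # negative: gamble rejected

print(expected_value, prospect_value)
```

Because LAMBDA > 1, the loss branch outweighs the equally probable gain, so the subjective value of the gamble is negative; this is the risk-averse pattern described above.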
Normative and prescriptive models Normative models of human behaviour specify decision-making under ideal conditions. That would be the expected utility theory that was just mentioned. Normative models typically follow from principles. For instance the principle that, given known probabilities of economic utility, the decision would be in favour of optimization, irrespective of one’s emotions following loss. In Churchman’s dialectical systems approach a clear example of such a principle would be the principle of non-separability, i.e. the principle that all relevant factors must be taken into account in a systemic decision. Because ideals generally remain unattainable, prescriptive models are designed to find the best possible decision or solution under the circumstances. These models advance rationality by allowing the kind of thinking that helps us achieve our goals.
Heuristics How people think in reality depends on how people decide to think. In many cases this is not a single decision, but a complex of decisions. People find that certain thinking routines suit them well and use them more often. Most people make this type of decision in a certain cultural climate. In the context of this blog (which is about systems learning) it is pertinent to point out that most people in the West are trained to apply analytical principles of linear causality instead of synthetic principles of non-linear causality. These thinking routines are called heuristics. A linear thinking routine is described here, to be followed by the non-linear model that is better suited to systemic problems. Thinking routines or heuristics come in many forms and are short-cuts in thinking. A simple example is the question “what if everyone did that?”, which helps us reflect on the negative consequences of more people showing the same (risky, dangerous, costly etc.) behaviour.
Limited resources In many real-life problems we cannot spend an eternity collecting all the necessary information for a thorough decision. That’s why Baron came up with the three principles of actively open-minded thinking (see previous post). A very simple model of open-mindedness can be found in the bottom right of the concept map of this post (see above): the first step is to find out the goals and some contrasting possibilities of action and belief, together with some evidence for their justification. The next step is to seek counterevidence, reflect on the consequences for the goals, actions and beliefs, and make adjustments accordingly. This process should continue until we reach the point of diminishing returns, where the additional thinking effort no longer seems to generate worthwhile insights. We should often settle for ‘appropriate confidence’ rather than strive for an unattainable ‘truth’.
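The loop just described can be sketched in code. This is a minimal illustration, not Baron’s model: the callables `seek_counterevidence`, `adjust` and `evaluate` are hypothetical placeholders for whatever the thinker actually does at each step, and `min_gain` is an assumed threshold for “worthwhile insight”.

```python
# Sketch of a search that stops at the point of diminishing returns.
# The three callables are hypothetical stand-ins for the real cognitive steps.

def open_minded_search(state, seek_counterevidence, adjust, evaluate,
                       min_gain=0.05, max_rounds=10):
    """Repeatedly seek counterevidence and adjust beliefs; stop once an
    extra round of thinking changes our evaluation by less than min_gain."""
    confidence = evaluate(state)
    for _ in range(max_rounds):
        counter = seek_counterevidence(state)   # actively look for the other side
        state = adjust(state, counter)          # revise goals, actions, beliefs
        new_confidence = evaluate(state)
        gain = abs(new_confidence - confidence)
        confidence = new_confidence
        if gain < min_gain:                     # diminishing returns reached
            break
    return state, confidence                    # 'appropriate confidence', not truth

# Toy usage: a belief that converges toward the evidence at 0.7.
final_belief, final_confidence = open_minded_search(
    state=0.0,
    seek_counterevidence=lambda s: 0.7 - s,  # gap between belief and evidence
    adjust=lambda s, c: s + 0.5 * c,         # move halfway toward the evidence
    evaluate=lambda s: s,                    # 'confidence' = current belief
)
print(final_belief, final_confidence)
```

The point of the sketch is the stopping rule: the search ends not when the belief is perfect, but when further effort yields too little change to be worth it.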
Thinking methods Over the ages, a great many thinking methods have been developed to achieve our goals. There is the logical reasoning of the classical Greeks, decision analysis of post-war management science, the scientific method as it evolved since the 17th century, legal argumentation which goes back to the Romans and the Middle Ages, and systems thinking, which is an evolving discipline. According to many, including Ackoff, we now live in the systems age, so that would make systems thinking particularly important. It applies an approach that provides a strong and useful contrast with much scientific thinking. Thinking methods are very useful in education. They have often stood the test of time, and mature didactics are usually available to make students acquainted with them at a starting professional level. In the case of systems thinking, however, the didactics are still evolving, which provides the rationale for this blog.
Bias and self-deception In table 2.1 of the 4th edition of Baron’s Thinking and Deciding (2014) we find 53 neatly categorized biases in human thinking. Another book, Heuristics and Biases (co-edited by Kahneman), suggests there are many more. We are not from Vulcan. Emotions get in the way of rationality in many, many ways. A fair degree of self-esteem can help one to perform well in many domains, including decision-making. Excessive self-esteem, however, lies behind many harmful forms of irrational self-deception. One form of self-deception goes unmentioned by Baron, viz. deception-perception. One could call this the perception bias in thinking. In real-life, complex situations, no matter how hard we try, every form of perception will wholly or partially suppress the perception of part of reality. Hence the need to “look through the eyes of others”, i.e. to apply some sort of systems thinking method, which is the true reason behind many participatory approaches.