Probability and belief (Part II of Baron, 1994)     

In my previous post I said I would cover parts II and III of Baron’s guide to the psychology of human decision-making, Thinking and Deciding (1994, 2nd edition). So here is part II, which is divided into five chapters that explore three major approaches to the analysis of belief formation: probability theory, the theory of hypothesis testing, and the theory of correlation. It concludes with a discussion of biases in belief formation in general. I am now halfway through the book and I can only say that this is one of the greatest books I have ever read. What follows are abstracts of each of the five chapters of part II. The next post will cover the nine chapters of part III. I am looking forward to reading the last part of the book.

Probability theory       Probability theory is a normative, and possibly not a prescriptive, theory for evaluating numerically the strength of our beliefs in various propositions. Like logic, it is a theory of inference, but unlike logic, it concerns degrees of belief rather than certainty. Higher numbers are assigned to more certain beliefs. If quantification is impossible, the rules of probability theory do not apply. Probability studies can show: (1) the rationality of beliefs; (2) people’s understanding of the model itself; and (3) the accuracy of communication about belief strengths. Any probability statement involves two subquestions: (1) its truth value; and (2) its evaluation. The three main theories say that probability is about: (1) frequencies (reflecting past observations); (2) logical possibilities; and (3) personal judgments, in which case the answers to the two subquestions are not the same. The problem with the frequency theory is that any event can be classified in many different ways. The problem with the logical theory (e.g. for coin-flipping problems) is that it has little everyday use. In the widely applicable personal theory, probability judgment can be based on any of one’s beliefs and knowledge, including knowledge about frequencies and logical possibilities. Different people have different beliefs and knowledge, so two reasonable people might differ in their judgments. Rules for constructing and evaluating probability judgments and Bayes’s theorem are discussed in detail. (Ch. 11 of Baron, J. 1994 Thinking and deciding 2nd ed.)
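Bayes’s theorem, which the chapter treats in detail, is easy to illustrate numerically. Here is a minimal sketch with hypothetical numbers of my own (a diagnostic test for a rare condition); the base rate and test accuracies are assumptions for illustration only:

```python
# Bayes's theorem: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical numbers: a diagnostic test for a rare condition.
prior = 0.01           # P(H): base rate of the condition
sensitivity = 0.95     # P(E|H): test positive given the condition
false_positive = 0.05  # P(E|not H): test positive without the condition

# Total probability of a positive test, P(E), by the law of total probability
p_evidence = sensitivity * prior + false_positive * (1 - prior)

# Posterior belief after observing a positive test
posterior = sensitivity * prior / p_evidence
print(round(posterior, 3))  # ~0.161: still unlikely despite the positive test
```

The low posterior despite an accurate test is exactly the kind of result that intuition gets wrong when the base rate is neglected.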

Errors in probability judgment       Our belief strengths, as manifested in our judgments of probability, seriously violate the normative theory of probability. Since the late 1960s psychologists have shown that they are often systematically out of order. Typically we attend to the wrong variables. Misleading heuristics, naïve theories, and the basic processes that cause poor thinking contribute to these errors. With practice and education we can correct these errors and even become very good judges indeed. We tend to underestimate high frequencies and overestimate low frequencies. A similar pattern was found for confidence in the correctness of judgments. Overconfidence can be reduced by asking for ‘against’ reasons. Consistent errors of judgment can be traced to, for example: (1) the representativeness heuristic; (2) the gambler’s fallacy; (3) the availability heuristic; (4) hindsight bias; (5) anchoring and adjustment; and (6) averaging. Active open-mindedness can help to guard against some of these violations. When probability judgments are very important, computers can help us as well. (Ch. 12 of Baron, J. 1994 Thinking and deciding 2nd ed.)
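The gambler’s fallacy from the list above is simple enough to check by simulation. This small sketch (my own illustration, not Baron’s) counts what a fair coin does immediately after a run of three tails; if heads were “due”, the rate would exceed one half:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Simulate a long sequence of fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(200_000)]

# Collect the outcome of the flip that follows every run of three tails.
after_run = [flips[i + 3] for i in range(len(flips) - 3)
             if not flips[i] and not flips[i + 1] and not flips[i + 2]]

rate = sum(after_run) / len(after_run)
print(round(rate, 2))  # close to 0.5: the coin has no memory
```

The simulated rate stays near 0.5, which is the normative answer the fallacy denies.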

Hypothesis testing      … concerns the selection of evidence: asking the right questions to find out how much to believe some hypothesis. Detectives and physicians do this all the time. A hypothesis is a proposition that we evaluate, or test, by gathering evidence concerning its truth or, more generally, its probability. Hypothesis formulation is the search for possibilities to test. Good experimental design involves asking questions that enable efficient inferences. When a predicted result is found, there is still the risk of false confirmation. If it is not found, there is the risk of false rejection. Any experiment involves additional assumptions beyond the hypothesis itself. Certain biases can be avoided by the active open-minded consideration of hypotheses other than those we favour. Popper’s theory of the refutation of “bold conjectures” is criticized and contrasted with other theories of scientific advance by John Platt and Imre Lakatos. A good heuristic is to “be sure to consider alternative hypotheses.” Several biases prevent this, including the: (1) congruence bias (overreliance on directly testing a given hypothesis); and (2) information bias (seeking information that is not worth knowing). (Ch. 13 of Baron, J. 1994 Thinking and deciding 2nd ed.)

Correlation theory      … deals with beliefs that concern the correlation between two variables. We tend to perceive data as showing correlations when we already believe that we will find a correlation. Often we are concerned with such relationships because we want to decide whether to manipulate one thing in order to affect another. The normative theory is provided by statistical definitions of correlation. The assumptions behind it do not apply to every situation. Descriptively, many people systematically violate the normative theory. Prescriptively, we may be able to come closer to the normative model by considering all the relevant evidence, including the evidence against what we already believe. Correlation is sometimes confused with causality; a causal claim additionally rules out alternative explanations. Contingency exists if one variable causally affects another. Our perception of contingency is an important determinant of our success at achieving our goals. The judgment of correlation tends to suffer from attention bias, i.e. ignoring counterevidence. In part, this insensitivity may result from our own naïve theories of the relation between beliefs and evidence. (Ch. 14 of Baron, J. 1994 Thinking and deciding 2nd ed.)
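The attention bias in contingency judgment typically means attending to the “present/present” cell of a 2×2 table and ignoring the rest. A minimal sketch with hypothetical counts of my own shows how the normative contingency measure (delta-P, the difference between two conditional probabilities) uses all four cells:

```python
# Hypothetical 2x2 table: treatment vs. recovery.
a, b = 30, 10  # treated & recovered, treated & not recovered
c, d = 15, 5   # untreated & recovered, untreated & not recovered

# Normative contingency, delta-P:
#   P(recover | treated) - P(recover | untreated)
delta_p = a / (a + b) - c / (c + d)
print(delta_p)  # 0.0: no contingency, despite 30 treated recoveries
```

Someone attending only to the 30 treated recoveries would perceive a strong contingency; the full table shows the recovery rate is 0.75 with or without treatment.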

Biases and beliefs      Irrational belief persistence is the major bias in thinking about beliefs of all types, including the strongly held personal beliefs that shape one’s moral, social, and political attitudes. This bias is the main justification for the need to be open to new beliefs and to evidence against our old ones. Ideally, beliefs conform to probability. This does not work for complex beliefs that are formed over a long time. The irrational persistence of belief is one of the major sources of human folly (Bacon 1620). It involves two types of biases: (1) the overweighting of evidence in favour of a belief and the underweighting of evidence against it; and (2) the failure to search impartially for evidence. Some belief persistence is not irrational; it drives some very good scientific research. The amount of belief persistence that is rational is whatever best serves the goals of the thinker. How much explaining away or ignoring of evidence is acceptable? We could consider the primacy effect and the neutral-evidence principle. Determinants of belief persistence include our desire to believe that we are good thinkers and morally good people, external demands of accountability for decisions, and stress. Kitchener, King, et al. identified seven developmental stages concerning truth, reality, knowledge, and justification that people move through as they get older. In stage 7, critical inquiry helps identify better approximations of reality. Finally, three major forms of groupthink are discussed. (Ch. 15 of Baron, J. 1994 Thinking and deciding 2nd ed.)


About Sjon van ’t Hof

Development professional who worked in rural development, tropical agriculture, and irrigation development in Chad, Zambia, Mali, Ghana, Mauritania, Israel, Burkina Faso, Niger, and the Netherlands in capacities ranging from project design and management to information management. Conducted missions to India, China, Kenya, and Bangladesh. Experienced in developing and delivering training in irrigation equipment selection, information literacy, Internet searching, and database searching. Explores systems thinking in relation to international development, education, and management, with an ever stronger focus on the systems approach of C. West Churchman. Knowledgeable in tropical agriculture, project design and development economics, agricultural mechanization, irrigation, plant pathology, environmental degradation and protection, and rural development. Co-authored "Wicked Solutions: a systems approach to complex problems", a book written by Bob Williams and Sjon van 't Hof. It was published in June 2014 and provides a practical way of dealing with wicked problems, i.e. complex, ill-structured, human problem situations. The book will help you design an inquiry and intervention in such messy, wicked situations by guiding you through the steps and stages of a systemic process that addresses your own wicked problem. For more information, see https://csl4d.wordpress.com/ or http://www.bobwilliams.co.nz/Systems_Resources.html