I am finally reading the whole of Michael Lewis's book, The Undoing Project, about the work and friendship of two Israeli psychologists, Amos Tversky and Daniel Kahneman. Their collaboration and friendship seemed unlikely to others because they were so different in personality, but they worked closely, intensely and brilliantly for many years before they broke up in a storm of resentment. Their similarities are also important. Both were descended from Russian Jewish émigrés, were atheists, served in the Israeli army in several wars, and were keenly interested in how the mind works, finding insight through studying human errors. Both were incredibly intelligent and creative; Tversky was more outgoing and happy while Kahneman was more reclusive and described as depressed. Tversky was gifted as a mathematical psychologist and Kahneman as an applied psychologist; Kahneman advised the Israeli army over the years on the training and selection of talent for military specialties. Together and singly they made pioneering contributions to the founding of behavioral economics. Tversky won a MacArthur genius award but died before he could receive a Nobel; Kahneman won the Nobel prize for economics. One of the many who followed their inspiration, Richard Thaler, was just awarded the 2017 Nobel in economics. What I like about their work is that they demonstrate that our rational mind makes mistakes because of cognitive biases, i.e., our rationality is riddled with irrationality. Once again it seems that we can see plenty of truth, none of it absolute, if we look carefully.
The biases they uncovered operate on two levels. The first is ongoing across many situations and the second operates with each framing of a situation. In the first instance they found that people did not respond logically according to a cogent analysis of probabilities. That may be no surprise, but they extended their research to include professional statisticians and found the same biases leading to the same errors, and that is interesting. Many of their experiments involved posing scenarios and offering choices about winning or losing money under conditions of risk or certainty, and I confess this old clinical psychologist found them to be a bit arcane and begging for ecological validity. Still, their results have been shown to be robust and to operate, at least somewhat, in the real world outside of the experimental design.
They described several biases, which they termed heuristics (a question here later) underlying these cognitive errors. One of these is availability, i.e., judgments and decisions are made with the information easily available, and I would add, given the ocean of information in which we moderns are drowning, information that is easily selected and usually in accord with our given beliefs. Another is representativeness, i.e., how prototypical the phenomenon under review is. This matters a good deal because we tend to think we know what will happen or what is going on if some similarity exists between phenomena. Kahneman and Tversky listed several other heuristics about base rates (failing to understand the frequency of categorical occurrences in estimating deviation), sample size (believing small samples are valid), misunderstanding randomness (plenty of patterns to find though not significant), and anchoring (judgments made relative to a starting point), and so on, you get the idea. They, especially Kahneman, also saw the influence of emotions. (Again, this is not news to clinical folk.)
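The sample-size error, at least, can be made concrete with a quick simulation. The numbers below are my own illustration (in the spirit of their well-known hospital question), not an experiment from the book: ask how often a small batch and a large batch of fair coin flips come out 60% or more heads.

```python
import random

random.seed(1)  # reproducible illustration

def extreme_fraction(sample_size, trials=10_000, p=0.5, cutoff=0.6):
    """Fraction of simulated samples in which the observed rate of
    heads reaches the cutoff, for a coin with true rate p."""
    count = 0
    for _ in range(trials):
        heads = sum(random.random() < p for _ in range(sample_size))
        if heads / sample_size >= cutoff:
            count += 1
    return count / trials

# A day with 60%+ heads (or boys born, in their famous example) is
# far more likely in a small sample than a large one.
small = extreme_fraction(15)   # e.g., ~15 births a day
large = extreme_fraction(45)   # e.g., ~45 births a day
print(small, large)  # the small sample hits the extreme far more often
```

Believing that the streak in the small sample "means something" is exactly the sample-size mistake, and the illusory patterns it throws up feed the randomness error as well.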
For the second level they investigated the influence of framing, i.e., how a situation is defined, and found, for example, that if a choice was framed in terms of financial loss, people took greater risks, and that if that same choice, i.e., one with exactly the same outcomes, was framed as a gain, people were risk averse. Again, many cognitive psychologists and pollsters understood this to be the case. Part of Kahneman's and Tversky's impact was based not on their rigorous systematizing and generalizing of their ideas but on the fact that they were entering into the field of economics, where tradition held that people, like the economists themselves of course, acted rationally. That tradition discounts the fact that economic theories fail repeatedly to be predictive, in part because of irrationality in the system and, in larger part I think, because economists are certain when they should be doubtful. (Ah, I hear the whisper of a tale about yet another civilization coming to an end.)
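The logical core of the framing result can be sketched with a toy gamble. The dollar amounts here are my own hypothetical choices, not theirs: a sure option and a risky option of equal expected value, described once in terms of what you keep and once in terms of what you lose.

```python
def expected_value(outcomes):
    """Expected value of a lottery given as (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

STAKE = 1000  # hypothetical amount at stake

# Gain frame: keep $400 for sure, or gamble on a 40% chance to keep it all.
gain_sure  = [(1.0, 400)]
gain_risky = [(0.4, STAKE), (0.6, 0)]

# Loss frame of the very same choice: lose $600 for sure,
# or gamble on a 60% chance of losing everything.
loss_sure  = [(1.0, STAKE - 600)]
loss_risky = [(0.6, STAKE - STAKE), (0.4, STAKE - 0)]

# The final holdings are identical frame to frame, option to option...
assert expected_value(gain_sure) == expected_value(loss_sure) == 400
assert expected_value(gain_risky) == expected_value(loss_risky) == 400

# ...yet subjects tend to take the sure thing in the gain frame
# and the gamble in the loss frame.
print("every option has expected value", expected_value(gain_risky))
```

Since the options are arithmetically identical, any systematic flip in preference between the two frames has to come from the description alone, which is precisely their point.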
As I read along I wondered this about heuristics and framing: are they innate, based upon some neural algorithm or grammar like linguistic syntax, or are they cultural developments like the acquired predispositions of the habitus? Are there individual variations even then? How we frame situations would seem cultural but also affected by personality, e.g., pessimists frame one way, optimists another, reclusive creative Kahneman one way, blithe and logical Tversky another. The judgment that something is a heuristic that serves us well except in key situations is based upon a knowledge of statistics and probability, and these are modern refinements of cognitive operations. It is telling that those whose intellect has been trained in statistics make the same mistakes as those who have not.
The larger issue for me is that we are animals, that our native talents for logic etc. are biological, and that our feelings, however inaccurate they may be in some modern situations, are the evolved basis of our intellect. To understand the embodied mind requires an understanding of our biological roots, how our capabilities are adaptive and maladaptive. Heuristics are both. Consider this speech given by Kahneman in 1974 entitled “Cognitive Limitations and Public Decision Making,” where he said he worried about “an organism equipped with an affective and hormonal system not much different from that of a jungle rat being given the ability to destroy every living thing by pushing a few buttons.” Further, “crucial decisions are made today as thousands of years ago in terms of intuitive guesses and preferences of a few men in positions of authority” and “the fate of entire societies may be sealed by a series of avoidable mistakes committed by their leaders.”
Consider a message Tversky gave to historians, essentially that as they formulate the patterns of history, seeking to explain what, why and how events transpired, their efforts are marked by the same heuristics and biases as any other such efforts. Tversky's and Kahneman's research had shown two more biases to be important in this field. One is that once people form an intellectual product they hold on to it despite evidence to the contrary. The other is that, in hindsight, people believe events were more predictable than they really were. Further, Lewis states that their work countered Santayana's famous dictum, "Those who cannot remember the past are condemned to repeat it," because knowing the past actually contributes to repeating it, i.e., making the same mistakes again. And that makes a lot of sense to me.
My intent here is always to understand how our humanity arises through our biology, hormones, emotions, heuristics and all, but especially our empathy and symbolization. Tversky and Kahneman have little to say about our biology, but their work points to the messiness of our biological selves and contributes importantly, I think, to Monod's ethic of knowledge. Now I live in 2017 America where many citizens and leaders do not understand the fragility of life and society, do not understand the importance of making decisions through a rational decision-making system that takes into account the vulnerability and limitations of our mind, and all too many actively reject an ethic of knowledge. Oops! How has American society come to this (end)? Travel on while still we can.