Musings & Threads. Temperature Rising.
I was among the pioneers in the field of Human Factors. We designed the human/computer interface for all electronic equipment using an allocation model. Basically, we assigned to the computer the functions the computer would do best (memory, storage, retrieval) and to the human the functions we believed the human did best (decision making, problem solving, “big picture” analysis). We were misguided.
Cracks in our thinking began to appear when we discovered Herbert Simon’s Nobel Prize-winning research on satisficing. Satisficing is a decision strategy that allows us to make good decisions quickly. “Satisficers” choose the first good option. For example, a “satisficer” scanning a restaurant menu is likely to choose the first item that appeals to him or her, rather than taking the time to evaluate all the options and select the “best” meal. Satisficing is an extremely efficient strategy, but it doesn’t always yield the optimal solution. It yields a “satisfactory” one.
Subsequent research revealed that we are not particularly good at gathering information either. We make significant errors in perception. These errors occur because of our attention span, memory and sensory limitations, biases, and just plain “laziness.” We inject our own values into perceiving and storing information. There are so many errors that it would be impossible to list them all in this post. Here is a sample: leniency error, where we choose to “go easy” on an individual; halo error, where we ignore contrary data; primacy error, where we give stronger weight to the first information we receive (e.g., a first impression); familiarity, where we prefer the familiar; and priming, which shapes how we perceive information. For those who like to be entertained by perceptual errors, look up the “gorilla experiments” of Chabris and Simons. In this study, subjects were asked to count the number of passes between basketball players, a task that required close attention. During the sequence, someone in a gorilla costume walked through the middle of the game, even passing close to the players. Yet roughly half of the participants never saw the gorilla.
Another serious judgement error we make is a causal one. Most of you probably know that “correlation is not causation,” but our need to make sense of the world biases us toward inferring causality: just because two things happen at the same time doesn’t mean one causes the other. For example, I am a NY Giants football fan, and I have noticed that every time I wear blue the NYG lose, and when I don’t wear blue they win. Do I think that the color of my clothing determines whether my team wins? Of course not; the Giants have had miserable seasons and I wear blue a lot, but the correlation remains virtually perfect. (However, Redskins fans may wish for me to continue wearing blue during football season, just in case.) I see this error a lot in commentary and opinion pieces. For example, I read a commentary that expressed a belief that there was prejudice against Catholics, based on negative behaviors by a group of activists toward a group of Catholic men. These young men were white, wore MAGA caps, and attended a pro-life rally. Since there were many possible causes for the scuffle, inferring that Catholicism was the cause is making the correlation/causation error. In our effort to see a predictable, understandable world, it is our nature to jump to causation.
Stories vs. facts. Another common error we all make is confusing stories with facts. We are natural storytellers, so it is part of our DNA to “fill in the factual gaps” with stories. A simple example occurred at the dog park yesterday: I waved several times to a friend who did not wave back. The fact is that he didn’t wave back; the story becomes “he didn’t see me,” or “he is mad at me,” or “he never liked me,” or “he is going blind,” or “he hates my dogs,” or “he was embarrassed to know me,” etc. We choose the story that fits our belief system and then merge it into the facts. Unfortunately, we need very few facts to create our stories.
It also turns out that we make a lot of judgement errors. Kahneman summarizes decades of research in his popular book, Thinking, Fast and Slow. It is a sobering and damning critique of our capabilities.
If I had to rate the worst of our errors, I would say it is confirmation bias. We form opinions based on our beliefs and then rally data to support them. We then ignore, devalue, or even lie about inconvenient facts that don’t support our beliefs.
Unfortunately, our decision-making limitations (AND WE ALL HAVE THEM!) have contributed to the political chasm that exists today. Gathering supporting data for an opinion without scrupulously analyzing all of the alternatives merely incites anger in those who do the same in forming an opposing position.
Not surprisingly, when we add emotion, such as anger, frustration, or strong religious beliefs, our perceptual, memory, and judgement skills diminish even further.
Our opinion pieces, commentaries, signs, and biased news programs do not move a conversation forward; they only exacerbate our poor decision-making skills.
But we share one trait that can help us, one even stronger than our deficient perceptual and judgement skills: our need for connection, to connect with other people. That need can bring out the better angels of our nature. It can allow us to listen and empathize. It can enable us to reach out to the other side.
We can do this. We can be better than we are now.