When things go wrong, people tend to look for someone to blame, often as soon as possible. It makes us feel better to find the culprit, the ‘bad apple’. We have the opposite tendency when it comes to ourselves. The cognitive dissonance of falling short of our self-image or expectations can be so powerful that we make up stories to cover our failures. And we actually believe them. This happens to judges, lawyers, doctors, nurses, and many other professionals. But it happens less frequently with pilots. Why? It’s all about the systems they work in.
Aviation is one of the few industries that:
- Encourages learning from mistakes on an industry-wide scale.
- Withholds blame until after the accident investigation (usually).
- Uses data and science to confirm or deny assumptions.
- Tests hypotheses through simulation before implementing new procedures.
- Disseminates new knowledge openly and widely.
Matthew Syed calls this Black Box Thinking in his book of the same name. Syed provides examples of how many of our work systems are broken, suffering from closed-loop thinking because they do not allow us to test assumptions and conventional wisdom. One striking example involves five Israeli parole judges whose cases over a ten-month period revealed some shocking statistics. The chances of being paroled just after breakfast were 65%, while the chances of parole just before lunch fell to 0%. The judges did not make the same types of decisions when they were hungry. No one had ever looked at these human tendencies. While we may think that we are rational and professional, we are not. Data and science can help us build systems that are better, safer, and more equitable.
“The evolutionary process cannot function without information about what is working, and what is not. The information can come from many sources, depending on the context (patients, consumers, experiments, whistleblowers, etc.). But professionals working on the ground have crucial data to share in almost any context. Health care, for example, cannot begin to reform procedures if doctors do not report their failures. And scientific theories cannot evolve if scientists cover up data that reveal the weaknesses in existing hypotheses.”
I would strongly recommend this book to anyone in management or anyone interested in improving organizational performance. This is one of the best books I have read in the past few years, along with Gary Klein’s Seeing What Others Don’t. Some of Klein’s recommendations align with the personal knowledge mastery framework, in that people need to seek out new connections in order to gain new insights, as well as put forth ‘half-baked’ ideas in order to test them. Syed has similar recommendations.
“But consider the following questions. Do you fail in your judgments? Do you ever get access to the evidence that shows where you might be going wrong? Are your decisions ever challenged by objective data? If the answer to any of these questions is no, you almost certainly are not learning. This is not a question of motivation or intelligence, but of iron logic. You are like a golfer playing in the dark.”