Predictions and the Problem of Hindsight Bias

In The Improvement Guide, 2nd edition, Langley et al. propose this definition of a Plan-Do-Study-Act cycle:

“To be considered a PDSA cycle, four aspects of the activity should be easily identifiable:

1. Plan: the learning opportunity, test, or implementation was planned and included

  • Questions to be answered

  • Predictions of the answers to the question

  • Plan for collection of the data to answer the questions

2. Do: the plan was attempted. Observations are made and recorded, including those things that were not part of the plan.

3. Study: time was set aside to compare the data with the predictions and study the results.

4. Act: action was rationally based on what was learned.” (pp. 98–99)

It’s good to keep this definition in mind when someone tells you they regularly use PDSA. They may be missing a critical piece that would add power to their tests.

Of course, as the authors say, not every improvement requires the formality of PDSA as described here; nonetheless, “purposeful improvements in large or complex systems will usually require one or more cycles.” (p. 99).

The authors go on to describe why they insist on including predictions in the definition of the Plan step.

They include this reason: “Prevent hindsight bias (‘I knew it all along.’)” (p. 99)

Hindsight Bias and What to Do About It

Ulrich Hoffrage and Ralph Hertwig argue that hindsight bias is inherent to the way humans interact with the world:

“…it is a by-product of two generally adaptive processes: first, updating knowledge after receiving new information; and second, drawing fast and frugal inferences from this updated knowledge.” (Gerd Gigerenzer, Peter M. Todd, and the ABC Research Group, Simple Heuristics That Make Us Smart (Evolution and Cognition), Chapter 9, “Hindsight Bias: A Price Worth Paying for Fast and Frugal…,” Kindle Locations 2583–2585, Kindle Edition.)

Hoffrage and Hertwig go on to cite Baruch Fischhoff’s cogent summary of why hindsight bias is a problem for anyone attempting to learn from experience, including those of us using PDSA cycles:

“When we attempt to understand past events, we implicitly test the hypotheses or rules we use both to interpret and to anticipate the world around us. If, in hindsight, we systematically underestimate the surprises that the past held and holds for us, we are subjecting those hypotheses to inordinately weak tests and, presumably, finding little reason to change them. Thus, the very outcome knowledge which gives us the feeling that we understand what the past was all about may prevent us from learning anything from it.” (Baruch Fischhoff, “For those condemned to study the past: Heuristics and biases in hindsight,” in D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press, 1982, p. 343.)

If Hoffrage and Hertwig are right that hindsight bias is a natural result of human evolutionary history, then we should expect it in you and me just about every time we try to learn from experience.

Conscious prediction, as part of the Plan step, serves as a strong corrective to hindsight bias, which otherwise can severely limit the value we might extract from tests.

So it looks like The Improvement Guide authors have it right: prediction is not a nice-to-have feature of a Plan-Do-Study-Act cycle but vital to getting the most out of learning.

Notes

I discussed prediction related to PDSA in two previous posts:

"Predictions drive deeper learning:  true for pre-verbal infants as well as for you and me" (here) and "It's tough to make predictions especially about the future but it's worth the effort" (here).
