What are Operational Definitions Good For?

W.E. Deming discusses operational definitions in Chapter 9 of Out of the Crisis (MIT Center for Advanced Engineering Studies, Cambridge, MA, 1986).

Until last week, I thought of operational definitions only in terms of measurement. 

In my defense, Deming’s chapter is almost exclusively about measurement. Deming gives multiple examples of customer-supplier agreements and specifications that refer to measurements of physical properties and social phenomena. Later in the chapter, he draws on his knowledge and experience with the Bureau of the Census to explain that there is no true number of inhabitants in a Census count.

With my eyes firmly locked on measurement, I’ve acted as if a product or service is produced off-stage and then arrives for inspection: we need an operational definition to determine how to measure conformance to one or more specifications.   With an operational definition, we can count defective items and measure attributes.

While Deming’s chapter gives examples and details related to measurement, an early paragraph holds a clue to another application:

“An operational definition is one that people can do business with.   An operational definition of safe, round, reliable, or any other quality must be communicable, with the same meaning to vendor as to purchaser, same meaning yesterday and today to the production worker.” (emphasis added, p. 277).

To do a good job, the person doing the work must know the meaning of ‘good’.   In Juran’s formulation of ‘self-control’, a person doing the work needs to know what to do and be able to determine whether the work product is good or not.

Shingo advised that knowledge of goodness should be built into the work, so that the person doing the work can carry out a ‘source inspection’ and catch and repair errors before passing on a product or service that does not meet requirements.

We need operational definitions to communicate the meaning of a good job.

Example from the "What Matters" Project*

Colleagues working in primary care have drafted a guide to engaging patients in "What Matters".  They’ve outlined a sequence of steps—set-up, invitation to conversation, asking questions, summarizing, next steps, and documentation. 

The guide is a good start in developing a standard procedure for care team members. With a standard, supervisors can train new people, and the team using the standard has a foundation for improving its work.

The advice for documentation asks the user to enter "What Matters" information into a longitudinal plan of care section in the electronic medical record.   The information is supposed to be ‘clear, concise, and reflective of the patient’s values/wishes.’ 

In examining a patient's record, how would a care team member or supervisor know whether the documentation is clear, concise, and reflective of the patient’s values/wishes?  More fundamentally, how would a care team member know how to create ‘good’ documentation in the first place?

Start by developing an operational definition for each of the three attributes. 

For example, what is ‘clear’?

Applying Deming’s advice in Chapter 9, this means you need to:

  • Describe the test to determine clarity:  what are the steps, who does them? 
  • Specify the criteria for judgment.
  • Decide whether a documented statement is clear (yes or no).
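
To make these three parts concrete, here is a minimal sketch in Python of what an operational definition of ‘clear’ might look like for a documented "What Matters" entry. The particular test and criteria (short sentences, no unexplained abbreviations) are placeholders invented for illustration; the care team would need to agree on its own.

```python
# A sketch of Deming's three-part structure: a test procedure, a criterion
# for judgment, and a yes/no decision. The criteria below are illustrative
# placeholders, not the What Matters team's actual rules.

def is_clear(documentation: str) -> bool:
    """Hypothetical operational definition of 'clear' for a documented entry.

    Test: apply two checks to the text of the entry.
    Criterion: every sentence has fewer than 25 words, and the entry
               contains no unexplained clinical abbreviations.
    Decision: return True (clear) or False (not clear).
    """
    abbreviations = {"pt", "hx", "dx", "tx"}  # hypothetical list

    sentences = [s.strip() for s in documentation.split(".") if s.strip()]
    if any(len(s.split()) >= 25 for s in sentences):
        return False

    words = {w.strip(",.;:").lower() for w in documentation.split()}
    return not (words & abbreviations)


entry = "Mrs. Jones wants to stay independent at home and keep walking with her sister each week."
print(is_clear(entry))  # True under these illustrative criteria
```

The point is not these particular rules but the structure: a stated test, an agreed criterion, and a yes/no decision that two people can reproduce.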

If we can’t figure out an operational definition, then we don’t have a meaning we can communicate to members of the care team who are charged with documenting "What Matters".

In the "What Matters" example, the three qualities must be met simultaneously.   A transcript or recording of the "What Matters" conversation might meet ‘clear’ and ‘reflective of patent’s values/wishes’ but perhaps not achieve the ‘concise’ quality.

To be concise, the team could set a character limit of 140 or 280 characters, in line with Twitter's limits, perhaps supplemented by a link to a full transcript to satisfy the other attributes.
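
If the team did adopt a character limit, ‘concise’ reduces to an equally simple check. The sketch below assumes a 280-character limit only to echo the Twitter example; the actual number is for the team to agree on.

```python
def is_concise(documentation: str, limit: int = 280) -> bool:
    """Hypothetical operational definition of 'concise'.

    Test: count the characters in the documented entry.
    Criterion: the count does not exceed the agreed limit (280 assumed here).
    Decision: return True (concise) or False (not concise).
    """
    return len(documentation) <= limit


entry = "Wants to stay independent at home; keep weekly walks with her sister; avoid another hospital stay."
print(is_concise(entry))  # True: well under 280 characters
```

Because all three qualities must hold at once, an entry would be accepted only when the ‘clear’ test, the ‘concise’ test, and an agreed test for ‘reflective of the patient’s values/wishes’ all come back yes; that last test will likely remain a human judgment made against written criteria.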

Whether or not you agree with my suggestions, the key to reliable documentation is to have an operational definition for the attributes, agreed to by the people doing the work.   In the case of "What Matters" documentation, it’s a good idea to ask a few patients to weigh in, too.

*I discussed "What Matters" here and here.
