Sunday, March 22, 2009

Where's Waldo? Finding Reliability in Surprising Places

Last Friday, I talked about look-alike, sound-alike (LASA) drug names, throwing out for discussion just one of the system-level risk points that predispose people to err. Remember Waldo, the popular figure in the striped shirt kids have fun trying to find? I think of risk points as "Waldos" because they're often significant stumbling blocks, features hidden in plain sight that undermine the ability to deliver intended care. Without specific measures that make Waldos visible in healthcare, they frequently derail our good intentions.

(Oh, and if I’ve sparked your interest in LASA-related drug errors, you can click here to access a graphics-rich, user-friendly, CE-granting tutorial about this problem. It offers a more complete discussion of LASA problems and prevention strategies. Disclosure alert: I'm a co-author of the piece, but I don’t receive any tangible benefit by sending you over to Medscape to access it.)

But today, I want to circle back to common beliefs about intention and individual performance, and comment on what's known about reliable performance.

If you're like me, you probably get a chuckle when signs like these hit your electronic in-box.

(The "Darwin Awards" are another funny cousin in this family of pass-along e-mails.) These "just do it" calls-to-action are helpful for instilling personal accountability when performance is not overly dependent on a system, like it is with, say, teenagers and curfews.

But imagine you are boarding a commercial airliner, and you see a warning posted in the cockpit that says, "This machine has no brains. Use your own." Are you staying on board? It probably doesn't matter, because the crew isn't likely to!

The experience and competencies your flight crew and air traffic controllers bring to work are not considered sufficient to get a plane off the ground if the plane's brains (think: radar, autopilot, computer) are on the blink. I know this from experience. Last year, I flew between Atlanta and Philadelphia weekly, a routine that was largely uneventful. But I vividly recall one flight that required a return to the gate, de-planing, and re-boarding another aircraft, all because the captain nixed take-off when the mechanics could not explain to his satisfaction why a control-panel light was behaving atypically.

Aviation professionals understand that safety is a function of reliability, a term Wikipedia helpfully explains as the ability to deliver stable, predictable results under ordinary circumstances as well as when hostile or unexpected events arise. Aviation is highly procedure-oriented, and the ability of individuals to perform in extraordinary circumstances, like when a plane lands on the Hudson, likely lies in the strength of the systems that support routine function. The aviation industry routinely adopts tools and technologies that enhance the considerable abilities of individuals to perform reliably, and it shares "Waldos" across the industry whenever the safety-threatening striped shirts are identified.

Human beings, healthcare's most significant output, are far more complex than airplanes. But this fact should not dissuade us from adapting reliability-promoting processes used elsewhere. Deviation from a standard flight plan (or plan of care) for cause, that is, for reasons that enhance the benefit to individual flyers (or patients), makes sense only when deviation is not the norm.

I hope you'll stay safe, come back soon, and fly only in friendly skies!

