Monday, October 5, 2009

Monday morning quarterback: The big 3

Each Monday, I'm going to comment about 3 events that are likely to impact the art and science of patient safety, plus preview what I'm tracking this week.

Here's what caught my eye last week:

1. The Joint Commission (TJC) released information about the 2010 National Patient Safety Goals (NPSG). The number of NPSGs dropped from 20 to 11, with the phased-out goals migrating to regular JC standards chapters. There was widespread perception that the 2010 NPSGs reduce requirements, which on the surface appears to be true. The number of NPSGs is down, and no new goals have been added.

You'll find a variety of views about what these changes mean. Heather Comack (@PSeditor) wrote Joint Commission's 2010 Patient Safety Goals Reduce Requirements, a piece that welcomed the changes, while Mark Graban (@leanblog) shared concerns in A "Step Backward" in Patient Safety. (The comments to Mark's LeanBlog post are worth a read, too.)

The take-away for people interested in risk-reduction?

  • NPSGs call attention to points in care where patients are frequently harmed. It's a good thing when standards are revised to clarify their intent or to make compliance with the intent easier to achieve (provided that "easier" doesn't make the process less effective). And at some point, expectations set forth in NPSGs have to become part of the way front line clinicians "do business" for safer care to become the norm.
  • The science of what makes patients safer hasn't changed, nor is it influenced by where standards are housed or which items are called out on a master list:

      • Processes used to deliver care should make it hard to do the wrong thing (employ barriers).

      • Clinicians should expect and advocate for double-checks (redundancies) at high-stakes junctures.

      • And recovery opportunities--that is, chances to identify errors that have been inadvertently set in motion before patient harm occurs--must abound.

  • As Larry the Cable Guy says, "That's good stuff. I don't care who you are."

2. Two highly respected, in-the-trenches physician leaders in patient safety wrote a Sounding Board piece in the New England Journal of Medicine. In it, Robert Wachter and Peter Pronovost call for greater accountability for compliance with basic patient safety practices (such as hand hygiene). Both are well versed in how system design shapes human performance. Their argument: once systems have been engineered so that good people can do the right thing, it's time to move beyond coaching and counseling when they don't. It is appropriate, they suggest, to impose sanctions and penalties on healthcare workers who exhibit patterns of non-compliance with safety-sensitive expectations. Johns Hopkins explains the thinking here, while Wachter digs deeper on his personal blog, addressing errant physician behavior specifically.

The take-away?

  • Cognitive science upholds the notion that punishment can extinguish undesirable behavior. Not every behavioral choice with the potential to harm a patient stems from a system deficit. But ensuring that sanctions meted out for risky or reckless behavioral choices are fair and based on behavior, not on the status of individuals (or their financial contribution to an organization), will require the wisdom of Solomon to implement.
  • If you are a front line clinician, patient, or patient advocate, this is the time to ask, "How is this organization equipped to evaluate and deal with performance deficits? Is this process transparent and fair?" When it's perceived that people are punished for simple human error or that less powerful people receive harsher punishment than more powerful people, voluntary error reporting is driven underground and the opportunity to learn from mistakes and near misses vanishes.
  • I think Just Culture, an evaluation method guided by well-thought-out, reliable algorithms developed by David Marx (a systems engineer and lawyer), offers the most robust process for dealing with real-world overlaps among system design, system flaws, and behavioral choices. Learning to use Just Culture algorithms requires discipline, but the methodology is sound, aligns with desired outcomes, and stands up to high-profile scrutiny.

3. Finally, Michelle Fabio wrote a guest post at Code Blog about a Kentucky case that appears to be good news for patient safety. You can read more about the nursing student's blog, what she shared, how the university where she was enrolled responded, and how a U.S. District Court judge ruled in What Can Nursing Students Blog About.

I see this ruling as a good thing for patient safety because the case helps establish boundaries between what students may share and what patients have a right to have held in confidence. It's a delicate balance. The "day in the life of" stories that clinicians publish on personal blogs provide a window into healthcare culture, customs, and norms that is not otherwise accessible.

The take-away?

  • Transparency is a good thing. (But, as this case illustrates, transparency may let us peer into really messed-up houses. It may be helpful to remember there are other ways to deal with what's inside than taping up the windows.)

And what I'm watching this week?

I'll be watching for tweets out of the Health 2.0 conference being held in San Francisco. The place appears to be packed with patient advocates and start-up IT folks, a hopeful sign that innovative solutions are incubating. (If you're on Twitter, follow #hcsfbay for a real-time view.)

Creative Commons License
Florence dot com by Barbara Olson is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.