Thursday, October 29, 2009

Why a trick is still a treat

Yesterday morning, I found and posted what I thought was a great video clip, one that seemed to say all the things I am struggling to articulate in an article about healthcare culture and what happens to safety efforts when people don't report errors.

On first pass, the events captured on film jumped out at me, as the old expression "a picture's worth a thousand words" promises they will. An errant SUV, out of control for a split second, crushes two vehicles in the adjacent row of a parking lot. Amazingly, the SUV recovers, backing off of its unlucky neighbors. The video captures a brief latency, during which time one imagines the driver reflecting upon the situation and considering what to do next. Then, the vehicle slinks away, leaving viewers to judge the actions of the driver based upon what we've just seen.

I did. It looked like a vehicular telling of what I recognized, from very early in my career, as unacceptable behavior that might be excused, especially if the driver hadn't made mistakes like this in the past or had a reputation for using the errant vehicle to do good things (like deliver medical supplies to poor people). The owners of the affected vehicles might have had the damage explained away in a "collateral" kind of way: these things happen when one chooses to park in a public lot, plus other disclaimer language, such as what's found in the "limits of liability" fine print on a parking ticket.

Mostly, I saw the lost opportunity to learn what had caused the vehicle to suddenly lose control. How did what appeared to be a routine parking maneuver suddenly turn so sour? In the slinking away, I saw the opportunity to acquire information go missing: information that, if shared, could help others avoid making a similar mistake. Did the driver mistake the brake for the gas pedal? Was he texting at the same time he was trying to park? Or did he wake up that morning and say, "By God, I think I'm going to see if I can dry hump a couple of cars on my way to the dentist?"

But enough about the lessons that could have been learned. There's another, more authentic one for people interested in cultivating a climate that promotes safety, a lesson I figured out when I demanded that my son (a new driver) watch the video with me. It turned out that what I thought I had seen didn't make sense in the third or fourth viewing. The superficial "facts" (visible to everyone who views the incriminating video) don't add up. Post-hit, none of the vehicles exhibits any damage, and the position of the passive vehicles in the aftermath of the event doesn't square with the events that one "sees" happening.

So what this video, still a great learning experience, really illustrates is the importance of moving beyond what we believe is readily apparent when investigating the root cause of error events.

People on Twitter are buzzing about the airliner that overshot the Minneapolis-St. Paul airport last week, with tweets like this being the norm:
Too late now but the #NWA188 pilots implausible story is worse for their careers than the likely truth (Zzzzz)

But this approach (which also came in the form of a tweet) shows a better way to get beyond perceptions and beliefs: "Missed by 150 miles?" And there are cool tools that help front line clinicians become fluent in proactive risk reduction activities, too.

It's fun to speculate about what went wrong when high profile mishaps hit the news cycle or appear to happen right before our eyes. But healthcare leaders who investigate errors and plan risk reduction strategies benefit from using the same methodologies that FAA and NTSB professionals do.

And that's the safety lesson that really jumps out from the tricky little YouTube video.

Happy Halloween!

Wednesday, October 28, 2009

How we respond to error

If this is happening where you work (and I don't mean in the parking lot), your patients are not safe.

Tuesday, October 27, 2009

"Code Boo" at Grand Rounds

Gina at Code Blog is hosting Grand Rounds today. The Halloween theme works for me since I'm always finding something scary to write about. (Check back tomorrow for a piece on the Northwest airliner that overshot Minneapolis by 150 miles because the pilots, ummmm, seem to have lacked sufficient redundancies.)

Saturday, October 24, 2009

God is great, beer is good, and people are crazy

Insanity is doing the same thing over and over again and expecting different results.
- Albert Einstein
Results from a multi-center nursing "time and motion" study show that nurses in acute care settings spend about 35% of their time documenting care, 17% on responsibilities related to medication administration and monitoring, and 21% coordinating care. I've heard Marilyn Chow, one of this study's lead authors, present these data before, and she included them in a presentation given last week in an IOM webinar on the Future of Nursing.
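Here's the quick arithmetic, sketched in a few lines of Python, that makes the mismatch plain (the percentages come from the study as presented; treating the categories as non-overlapping is my own assumption):

```python
# Quick arithmetic on the time-and-motion figures above. The percentages
# come from the study as presented; treating the categories as
# non-overlapping is my own assumption.
documenting, meds, coordinating = 0.35, 0.17, 0.21
everything_else = 1 - (documenting + meds + coordinating)
print(f"Left for everything else, including direct care: {everything_else:.0%}")
# -> Left for everything else, including direct care: 27%
```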

I don't think anyone is particularly happy with these statistics. (Although it remains unclear what patients actually think since high profile evaluations, like this one from US News and World Report, measure nursing care by how mom-like the experience of being cared for is.) Real patients--that is, those who have had the experience of being hospitalized and understand that the circumstances that land them there necessitate far more than a chipper smile and a well-timed fist-bump--might be able to evaluate nursing care using different metrics. But, for now, it appears we're living with "% of patients whose nurses were ALWAYS polite and communicative." Sigh. (Can I just say that when I'm an inpatient, I appreciate polite and communicative behavior on the part of all of my caregivers?)

It's hard to look at Chow's data and not be struck by a significant mismatch between intention and outcome. Surely this is not the best use of valuable, high cost resources.

But what makes Chow's presentation worth studying is that, beyond Slide 4, she gets out of the box, tossing out fresh ideas about how nurses will nurse in the future. And why they should. Plus who will benefit. And how technology will enable it. Review the 11 slides in this presentation for inspiration.

If you think I'm crazy, remember what Einstein said.

Wednesday, October 21, 2009

Car dating & cognitive dissonance at Grand Rounds

Here's a link to SharpBrains, where yesterday's host Alvaro Fernandez brought together Grand Rounds (a forum for medical bloggers) and Encephalon (a forum for people who blog about the brain and mind). Alvaro offers a tongue-in-cheek, "What a nice surprise! Hello. Nice to meet you" to both groups.

The introduction has already been made.

The need to recognize the inherent fallibility of humans (and design systems that are reliable in spite of the predictable faux pas humans make) was articulated nearly a decade ago in the first IOM report, To Err is Human. Alvaro's invitation, his need to suggest that healthcare professionals dip into the cognitive psychology well, is telling. It's surely part of the reason we've yet to post measurable gains in preventing inadvertent medical error.

It occurs to me that when introductions lead to a relationship, it's because both parties perceive a benefit. It's been ten years, and in the U.S., we're still discussing whether tired residents are really as tired as other tired people. And entertaining other intention-oriented ideas, like "Follow the 5 Rights." This suggests cognitive dissonance between the safety paradigm we have and the one we need. Apparently, "we're just not into you," SharpBrains.

Healthcare remains distinguished from other high consequence industries by the degree of personal vigilance we tolerate and rely on. No matter where you or your organization may be on the journey toward improving patient safety, you should agree to a second date with the folks who study the performance parameters of humans.

Applying lessons learned to healthcare workers and the systems used to deliver care is a necessary step in eradicating the public health problem called "medical error."

Sunday, October 18, 2009

Why they have to: Patients and patient safety

Last week, Bob Wachter, a patient safety leader I admire, wrote a post, Can Patients Help Ensure Their Own Safety? More Importantly, Why Should They Have To? As the title suggests, Wachter addresses both the utility of patient participation in safe practices and the necessity of it.

On occasion, these issues make my own hard drive blink. They did most recently when I considered how patient involvement squared with principles used to engineer highly reliable systems while writing From Safe Practices to Safe Patients: The Evolution of a Revolution (published on the Medscape platform last month). At one point, I considered jettisoning the piece, convinced that allowing variability of the magnitude that patients (humans) necessarily introduce to a system couldn't be defended, let alone operationalized.

Wachter seems close to casting patients overboard, too. He rightly points out that the ability to self-advocate varies both between individuals (who possess differing knowledge, abilities, desire, and social support systems) and within one individual across time (subject to things like severity of illness, level of consciousness, and use of medications). Systems engineers (one is quoted in his post) tell us that variability is the enemy of stability. And finding variability in a system and driving it down is what gets these folks out of bed in the morning.

I've wanted to do this kind of "people parsing" on occasion myself.


Who wouldn't like to eliminate the outliers in the patient population we serve? Hypervigilant, distrustful patients can be problematic. At the other end of the self-advocacy continuum are unconscious Jane Does. They, too, interrupt work flows. But eliminating variability in measures that inform patient safety risks treating all patients like the lowest common denominator: the "bar" gets set at the level of the anesthetized patient.

And here's the other problem: Neutralizing patient input in patient safety assumes that the system is sound. That is, it produces reliable results if you just sit back and let the system do its thing.

Wachter does something I like to do: compare the experience of being a passenger on a commercial aircraft to that of being a patient. I travel a lot, enjoy flying, and I'm perfectly happy assuming the safety duties expected of every other passenger on board. I wouldn't think of offering to lend a helping hand to those on the flight deck.

A commercial aircraft crashes 1 time in every 6 million departures. The fitness of the systems used in commercial aviation clearly does not depend upon input from me. I'm okay with saying that if I get booked on the unlucky 1 in 6 million flight, "It's my time." But safety leaders in aviation are not. They continually strive to improve the system, to find ways to drive the incidence of error down, further diminishing the likelihood of 1-in-millions events.
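For the number-minded, here's what that rate means for a single traveler, sketched in Python (the 1-in-6-million figure is the one quoted above; the flight counts are illustrative assumptions):

```python
# What "1 crash per 6 million departures" means for one traveler.
# The rate comes from the paragraph above; the flight counts are
# illustrative assumptions, and departures are treated as independent.
RATE = 1 / 6_000_000

for flights in (100, 1_000, 10_000):
    p = 1 - (1 - RATE) ** flights  # chance of at least one crash
    print(f"{flights:>6,} flights -> {p:.5%} chance")
```

Even a 10,000-flight career leaves the odds vanishingly small, which is exactly why aviation's continued investment in getting better is so striking.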

A preoccupation with making things safer is what distinguishes aviation (and other high consequence industries with reliable safety records) from healthcare. There's no doubt that the "alert" signals engineered into aircraft are easier to read than those built into humans. But that does not diminish the effectiveness of an alert.

I've been a nurse for a long time, and I suspect I share many of Dr. Wachter's feelings about what professionals should do for their patients. We have duty and desire, but, at this point in time, we do not have the means. Wachter is right to call for systems that turn intention into outcome.

But the answer to, "Why should they have to?" is that the safest care won't happen without them.

Friday, October 16, 2009

Safety Nurse's Top 25 Tweeps for Patient Safety: October 2009

I'm happy to share this 2nd, updated list of 25 tweeps who are advancing the science of patient safety through Twitter.

This is not a list of "who's who" in the world of patient safety (although tweets hashtagged #ptsafety will frequently take you to the work of patient safety researchers, clinicians, and exemplars). Rather, it's a list I maintain to help me remember how broad the patient safety stakeholder base is and to keep track of key elements that inform patient safety. (A person or entity must be active in the Twitterverse to make the list.)

If you picked my brain this month, this is who I would tell you to follow:
  1. @ahier Prolific and passionate. Interest in healthcare and IT bumps him into #ptsafety on a regular basis. Forward-thinking, sm early adopter, helps spot "how to."
  2. @alinahsu A systems-thinking, Lean enterprise tweep who finds & RTs #ptsafety sensitive information.
  3. @deadbymistake Visible, helping to keep the issue of med errors in the news. Interesting use of Twitter to sustain investigative reporting efforts.
  4. @dirkstanley Hospitalist, CMIO. Did someone already say, "Your doctor is on?" Say it again. Not deep into #ptsafety tweets but follow him to take a pulse from the frontline.
  5. @ePatientDave His "Give me my damn data" cry sets the bar for pt visibility in demanding access and transparency.
  6. @ecri_anderson Editor of ECRI Institute's risk management & #ptsafety publications. Outreach from a PSO insider.
  7. @hospitalrx Long-time advocate of automation. His mission? "Protecting patients & caregivers one bar code at a time."
  8. @IHIOpenSchool Useful tweets from demonstration project engaging next generation of HC professionals. Outreach may be a tipping point for culture change.
  9. @INQRI Researching and communicating nurses' contributions to safety scaffolding. Frequent #ptsafety sensitive tweets & resources.
  10. @improvementmap Another IHI endeavor. Regular RTs of worthy thoughts & ideas beyond their own portfolio (of worthy thoughts & ideas!)
  11. @ismp1 President of ISMP, a nonprofit, multidisciplinary, drug safety agency. Unflinching advocate, now on a PSO platform.
  12. @JCommission Cautious entry into the Twitterverse. Tweets helpful & often #ptsafety sensitive. More please.
  13. @jfahrni Pharmacist, informatics. Tweets show how change hits the frontline. Exemplary use of the 140 constraint!
  14. @JohnSharp Informatics research. HIT, the great patient safety enabler, now has a participatory healthcare champion.
  15. @JustinHOPE Parent founder of children's patient advocacy org. Shows that perseverance pays. Pros count on her tweets to find high-end #ptsafety info.
  16. @medusesafety Tweets from the American Society of Medication Safety Officers. Leaders in a high-stakes, interdisciplinary milieu.
  17. @midwifeamy Using participatory healthcare to engage women. "If it ain't broke, don't fix it" deserves a voice in #ptsafety. Her blog raises issues that inform safety.
  18. @NPRhealth High-end analysis and links to big pic set-ups that impair health (and #ptsafety). No Happy Meals here.
  19. @paulflevy Hospital CEO who blogs. Models transparency, leader engagement in #ptsafety. Exemplary posts not rare.
  20. @PSeditor Editor HCPro, Inc. Fosters engagement. Nurtures, networks effectively using 2.0 (without shameless self-promotion). Others could emulate.
  21. @quantros Tweets to improve patient safety & reduce medical errors in the US healthcare system from inside a PSO. Always on target.
  22. @SusanCarr Editor, Patient Safety & Quality Healthcare. Models how traditional modes for communicating #ptsafety info can morph. Original tweets always worth a look.
  23. @tully3000 Quality and #ptsafety RN insider willing 2 go outside of the listserv box. Thinker with broad scope of #hc & #hcsm interests. Will be waiting to welcome others.
  24. @writeo 1 of 2 consumer members of OR Pt Safety Commission. Tweets about process & progress of groups like these help others. More, please.
  25. @WSJHealthBlog Obviously, not #ptsafety only, but high profile news informing pt safety is there. Good fodder for systems thinkers.

Tuesday, October 13, 2009

Participatory Healthcare at Grand Rounds

Grand Rounds is a "must visit" place today irrespective of whether you're a consumer, healthcare provider, or have another dog in the fight to improve healthcare. You'll find clear explanations of what "participatory healthcare" is and have a chance to assess how it's emerging.

One thing about participatory healthcare that jumps out at me is how well it aligns with the way I was taught to approach patient care when I was an undergrad nursing student in the mid-'80s. That curriculum also came with a hefty dose of "change management" theory, something that drew disdain from the "where's the beef?" crowd and, unfortunately, didn't change much.

But what does seem to be changing things is the information revolution. Patient access to information, ideas, outcomes, and communication modalities is doing more than just shoring up foundational changes in "how we do things around here" (the easiest way to describe healthcare culture). These changes must occur to make the delivery of healthcare more reliable and safer.

I see patient engagement as transformational, meaning we're likely to get somewhere better as a result of letting patients take the lead for part of the journey. So take a trip to Survive the Journey and see how far we've come.

When you do, you'll find that a number of the people who contributed to the participatory healthcare Grand Rounds appear on the inaugural list of "Top 25 Patient Safety Tweeps" I published last month, among them Dave DeBronkart (epatientDave), Amy Romano (midwifeamy), and John Sharp (JohnSharp). The experience of patients is central to efforts to improve patient safety. So are initiatives and incentives arising from clinicians, organizations, payors, industry partners, regulators, and academics. I'll publish an updated list this Friday, 10/16/09.

I welcome nominations of individuals or organizations from any of these categories for consideration on Safety Nurse's Top 25 Tweeps for Patient Safety list. The entity must have a current, active presence on Twitter. The volume of tweets is less important than the quality of patient safety information that's passed along.

Thanks for participating!

Monday, October 12, 2009

Monday morning quarterback: The Little 2

The world of patient safety did not shake with high-profile standards, policy, or funding changes communicated via Twitter last week. But I found a few nuggets to pass along, things that both promise hope for the future and suggest that past missteps may continue to hobble patient safety.

First, a smile:

Ilene Corina, a patient safety advocate, recounts what she found impressive at the National Patient Safety Foundation's second annual Lucian Leape Institute gala. The changes in medical education previewed in one roundtable Ilene attended suggest that "reward what you value" is beginning to inform medical students' experiences. Changes that would enhance medical students' exposure to patient safety are in draft.

The recommendations put forth in forums such as this one serve as a "pulse check" for major patient safety initiatives. Ilene noted, with some dismay, how slowly patient safety-sensitive changes take hold in high stakes venues, like academia. She's right when she says the public assumes we are much further along, as my experience this weekend revealed.

My brother-in-law, an engineer, nearly choked on his steak Saturday night when I told him there were 4,000 wrong-site surgeries reported in the U.S. last year. (His number-crunching mind grasped the defect quotient this number represented before the meat bolus had cleared his trachea.) So a take-away corollary--learned not in a seminar but in a Houston steakhouse--is this: In the interest of patient safety, don't share disturbing facts when a healthcare consumer has a mouthful.
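For anyone who wants to run the numbers my brother-in-law did in his head, here's a rough sketch (the 4,000 figure is the one I quoted; the annual surgical volume is an assumed, illustrative denominator, not a sourced statistic):

```python
# Back-of-the-napkin defect quotient. The 4,000 wrong-site surgeries is
# the figure quoted above; the annual surgical volume is an assumed,
# illustrative denominator, not a sourced number.
WRONG_SITE = 4_000
ANNUAL_SURGERIES = 50_000_000  # assumption for illustration only

rate = WRONG_SITE / ANNUAL_SURGERIES
print(f"Roughly 1 wrong-site surgery per {round(1 / rate):,} procedures")
print(f"That's {rate * 1_000_000:.0f} defects per million opportunities")
```

An engineer raised on single-digit defects per million would choke on those two lines, too.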

And something that rankles:

Brian Ahier wrote a short must-read piece You don't have to use EHR, in which he quotes Dr. David Blumenthal, National Coordinator of Healthcare IT at HHS. Brian puts the facts out, so I'll simply ask this follow-up question: "Would any other contemporary high consequence industry contemplate operations that bypassed electronic transmission of high-stakes data?"

In healthcare, when we finally get a Model T, we immediately tie an ass to the bumper. Here we go again.

My guess is that Dr. Blumenthal is hand-patting those who have fears, probably legitimate, about the misuse of health data. But he would be better served to act to shore up the security of EHRs. Providers cannot be responsive, given the complexity of modern healthcare, absent data automation. Outcomes will suffer until the data-sink is resolved. People who are hand-patting about security concerns should take a turn alongside the clinical people who hand-pat the families of people who die as a result of lousy communication.

What I'm watching this week:

Grand Rounds, the weekly blog carnival, is taking on participatory healthcare this week. I expect to learn more about the power of patients to shape healthcare and improve safety.

Saturday, October 10, 2009

Participatory Safety

Patient safety is a natural fit with participatory medicine. And not because initiatives that include the word "patient" should seek to involve patients in some nominal, "so glad you could make it" fashion. I don't picture patients manning the Guest Book at the reception when I consider the potential of patients to improve the safety of care.

Patient safety is a scientific discipline, one that seeks to make complex systems work reliably. Systems turn intention into outcome whether you're flying a plane or reconstructing a breast.

Transparency, disclosure, error reporting, and an urge to prevent errors by learning from the mistakes of others are hallmarks of patient safety. People who champion the science of patient safety borrow from cognitive psychology, systems engineering, and human factors, recognizing the inherent fallibility of humans and looking for ways to mitigate the consequences of human error. These are principles patients should know.

Healthcare has suffered from the erroneous perception that good people automatically produce good outcomes. Both patients and providers have had a role in shaping this belief. Since we're all seated at the grown-ups' table, let's get this on it: Healthcare providers are fallible humans. It's not "if" we make mistakes, it's when. What really matters is the consequences of these mistakes, that is, whether they make it to you.

In highly reliable systems, the intended outcome is delivered under both normal circumstances and when conditions destabilize or become hostile. Intended outcomes arise from work processes that build in barriers, redundancies, and lots of opportunities to discover and mitigate errors set in motion before they cause harm. Highly reliable results do not come because the captain of an aircraft is godlike or the engineer at the nuclear power plant was the smartest kid in his class. High reliability comes when competent people:
  • perform within a system designed to accomplish the task at hand,
  • believe that the system could fail, and
  • are empowered to act when a threat, or potential threat, to safety is perceived
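Here's a minimal sketch of the arithmetic behind that list (the catch rates are invented for illustration, and treating the layers as independent is a Swiss-cheese idealization, not a measured fact):

```python
# Why barriers and redundancies pay: an error must slip past every
# layer of defense to reach a patient. Catch rates are invented for
# illustration; independence of the layers is an idealization.
def residual_error_rate(initial_rate, catch_rates):
    """Fraction of tasks where an error reaches the patient."""
    rate = initial_rate
    for catch in catch_rates:
        rate *= (1 - catch)  # only errors missed by this layer continue
    return rate

base = 0.01  # assume 1 task in 100 starts an error
print(f"{residual_error_rate(base, []):.4%}")               # 1.0000% -- vigilance alone
print(f"{residual_error_rate(base, [0.9]):.4%}")            # 0.1000% -- one double-check
print(f"{residual_error_rate(base, [0.9, 0.8, 0.5]):.4%}")  # 0.0100% -- layered defenses
```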
It's fair to say that the 100,000 or so unintended deaths due to medical errors and healthcare-acquired infections that occur in the US each year disqualify our industry from being a highly reliable one. So what does participatory healthcare mean for patient safety?

Tons, but here's one of the most obvious: When a patient is seen as a participant in, rather than the object of, care, the system becomes more stable. At its most basic, patient participation adds a valuable redundancy at high stakes junctures of care (as occurs when a patient confirms identity before blood is drawn, verifies the affected area before a biopsy is underway, or asks a provider, "Have you washed your hands?"). Moving into less concrete domains, patients are uniquely positioned to uncover a wide array of errors that have been set in motion.

Here's an example, one that illustrates how patient engagement prevented a serious warfarin overdose:


I know a lot about this case because it happened to me. I derailed a 17.5 mg overdose of warfarin which had passed through a series of high-end automated barriers, including electronic MARs and bedside bar-code medication administration. (You can read the complete story here.)

The take-away lesson is that the warfarin overdose wasn't averted by any special "insider knowledge" of warfarin or the medication use process that I possessed. My participation came in the form of a question ("Do you usually give someone who is close to having a therapeutic INR a big dose of warfarin?"). The nurse's willingness to believe that a concern raised by a patient merited investigation is what allowed the error to surface.
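For the systems-minded, here's a rough sketch of why the automated barriers were satisfied while the error sailed through (the function names, doses, and thresholds are hypothetical, invented for illustration, not anyone's actual clinical rule):

```python
# Why the bar-code system was satisfied: BCMA verifies the administration
# against the order, not the order against the patient. Every name, dose,
# and threshold below is hypothetical, invented for illustration.
def bcma_check(ordered_dose_mg, scanned_dose_mg):
    """Right patient, right drug, right dose -- as ordered."""
    return scanned_dose_mg == ordered_dose_mg

def patient_question(dose_mg, current_inr):
    """The redundancy I added: does this dose fit the clinical picture?"""
    if current_inr >= 1.8 and dose_mg > 10:  # hypothetical rule of thumb
        return "hold and query the prescriber"
    return "proceed"

print(bcma_check(17.5, 17.5))       # True -- the automated barrier is satisfied
print(patient_question(17.5, 1.9))  # 'hold and query the prescriber'
```

The first check can only confirm that what's given matches what's ordered. It took a human, in this case the patient, to ask whether the order itself made sense.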

From an engineering standpoint, "patient engagement" takes on value beyond its ability to help people understand a plan of care, decide if it's for them, and manage barriers. Engaged patients add a valuable layer of error detection, one that often does not exist if the patient cannot or will not participate in care (which, by the way, is why advocates and surrogates are such important players in patient safety).

To make participatory processes work for patient safety, look for opportunities to engage in safety initiatives at the system level. I maintain Florence dot com as a real-time patient safety primer, a place where both patients and providers learn about the science that informs safest practices. Daily tweets that run here point to information and resources that represent best practices, case reports, exemplars, and stumbling blocks. I hope you'll find helpful information here and let me know when you have a safety-sensitive story to share.

Because before you get to the bedside, you want to be sure you're at the table.

Thursday, October 8, 2009

Patients For a Moment: The Recipe

I'm a day late sharing this, but Patients for a Moment, a blog carnival written by and about patients, is a good place to visit every other Wednesday. This week it's hosted at The Single Gal's Guide to Rheumatoid Arthritis.

Last week I saw Francis Collins, of The Human Genome Project and NIH fame, quoted in a tweet. I like what I think he said, so I'm passing it along:
"If you want to find a cure for cancer, you don't need to research just cancer. The answers may come from somewhere else."
I took this to mean that non-linear approaches to problem-solving have merit. Being something of a non-linear queen myself, I thought this was a good reason to share something wonderful that came from this approach. It's my recipe for The Best Meatloaf You Will Ever Eat. A bit off of the "patient safety" message, for sure, but since I'm giving a nod to the gathering of patients here, I think some comfort food may be in order.

To begin:
  • Purchase more whole wheat buns and white bagels than you mean to. Just before they reach the consistency of refrigerator missiles, freeze them. Some weeks later, defrost them, cut them into cubes the size of a die, and toast them in a 350 degree oven until they are mostly done.
  • Use about half in a recipe, then place the remainder in a medium-sized Gladware container, seal tightly, and cure on the kitchen counter for 2 days.

Shop for:

  • About 1 lb of ground pork. Mine had a "Reduced for quick sale" sticker, but this is optional. (If you don't eat pork, I'm sure that another ground meat product will taste just as good. If your meat doesn't have much fat, add an extra egg.)
  • About 1 lb of ground turkey

Make sure you have:

  • 1 egg
  • some fresh Italian parsley from your garden, your neighbor's garden, a farmer's market, a food co-op, or from where I rescued mine: the bottom drawer of your refrigerator. Chop it pretty fine.
  • 1 small, single-serving-size cup of horseradish in sour cream sauce (usually served along with prime rib). My little container was leftover from take-out my son refused to eat last weekend.
Preheat your oven to 350 degrees.

Wash your hands really well. Place all of these things, in no particular order, into a good-sized mixing bowl, then mix thoroughly. Form into a nice oblong loaf, trying not to let the ends of the loaf be too much narrower than the midsection.

Place the loaf on a cookie sheet you couldn't really afford but felt obligated to buy because you really like the person who was hosting the party. Remember that the cookie sheet turned out to be a really reliable product and remind yourself to say "yes" the next time you're invited to another party. Bake for 60 minutes. Cut the meatloaf in half, and it will be nearly done, juices flowing. Let it stay in the oven for 5 minutes longer if you like it a little drier, or take it out and let it rest for 5 minutes.

Slice, serve, and enjoy!

Tuesday, October 6, 2009

Honor the Game

One of my favorite books is Made to Stick by Chip Heath and Dan Heath, brothers who tell interesting stories about why some things capture our imagination and others leave us cold.

A Made to Stick story that stuck with me is how coaching leader Jim Thompson drove down the incidence of bad behavior in youth sports. Thompson started by recognizing "be a good sport" was an insufficient call to action. It wasn't a strong message, and more importantly, the charge didn't improve the conduct of youth players, coaches, parents, or spectators. So he refocused attention away from the individual to something larger, the game. Thompson called his campaign "honor the game" and illustrated it with some powerful examples, like this one:

Lance Armstrong once slowed during a Tour de France race to give his chief opponent, who had crashed, the chance to get back in the race. As his opponent remounted, Armstrong paused rather than taking full advantage of the lucky break, later noting that he rode better against strong competition. Armstrong wasn't "being a good sport." He was honoring the game.
Redesigning healthcare is about honoring the game, making it possible for the actions of individuals to contribute, in measurable ways, to something larger. The healthcare industry, and more importantly, the processes used to deliver healthcare, are under scrutiny. They should be: more people die in the US every year from preventable medical errors and healthcare acquired infections than die of AIDS, breast cancer, and auto accidents, combined.

"Honoring the game" requires a different style of play than Nike's more familiar call to action, "Just Do It!" (at least in the beginning stages of the race against harm-causing errors). Here's why:

For some time now, system design and human factors experts, dispatched from high reliability industries like commercial aviation and nuclear power, have partnered with healthcare workers to find out what ails us. Early on, industry outsiders recognized something important: When compared with other industries, the systems healthcare workers relied on lacked standard engineering controls, key elements needed to make intention match outcome. (Standard engineering controls include such things as barriers, redundancies, and opportunities to detect and mitigate errors that have been set in motion.)

In commercial aviation, high-stakes tasks that could cause harm if performed incorrectly are never executed by just one person. There's always a double check. In fact, these process checks are mandated by law. Compare this norm with what a nurse (at least in the era when I came of age as a clinician) may be expected to do in a busy Emergency Department: take a verbal order, retrieve a medication from a large cache, calculate the dose, prepare the medication, and administer it to a patient. This process could be the norm irrespective of whether a drug (like Lasix) has a small chance of causing harm if used in error or whether harm is highly likely if an error occurs, as is the case with IV heparin. No barriers, no redundancies, and scant opportunity to detect an error that's been set in motion.
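To make the aviation norm concrete, here's a minimal sketch of a high-alert double-check gate (the drug list and workflow are simplified assumptions on my part, not any organization's actual logic):

```python
# The aviation norm applied to medication use: a high-alert drug cannot
# move forward on one set of hands and eyes. The drug list and workflow
# are simplified assumptions, not any real system's logic.
HIGH_ALERT = {"IV heparin", "IV insulin", "warfarin"}

def administer(drug, verified_by):
    """Require an independent second check before a high-alert drug is given."""
    if drug in HIGH_ALERT and len(set(verified_by)) < 2:
        raise PermissionError(f"{drug}: independent double-check required")
    return f"{drug} given (verified by: {', '.join(verified_by)})"

print(administer("furosemide", ["RN Adams"]))              # lower-harm drug, one check
print(administer("IV heparin", ["RN Adams", "RN Baker"]))  # passes with two checkers
```

Nothing about the gate requires heroics, just a process that refuses to let one tired human be the only thing standing between an error and a patient.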

A commercial pilot would never fly using the type of safeguards most nurses have been taught are reasonable for caring, competent professionals to use and execute flawlessly, even under the most hostile conditions.

Industry comparisons will not take us the whole way on the journey toward reliability. But industry comparisons help dispel myths, some of which healthcare workers may find painful. The good news is that nurses, pharmacists, physicians, and others who work in healthcare are not inherently more error-prone than the professionals who maintain airplanes, fly them, or control air traffic. The bad news is that we're not less prone than others to screw up either. And how much a professional understands or cares about a process, an outcome, or an individual patient may not be as important as many of us intuitively believe.

To honor the game, a player has to have a reasonable chance of being successful. When my son was young, he had a computer baseball game that allowed him to select teams, take the field, and play virtual games. Luke's team always won because he put himself on the team with Sammy Sosa and Mark McGwire. (The opposition in his fantasy game usually had a few bookish kids with their shoelaces untied and, as I recall, a little girl with broken glasses on crutches.)

In the real world, we need to make certain people stand a chance of executing the tasks we expect them to do. This is the reason nurses, and other front line healthcare professionals, should pay attention to system design and speak up when expected outcomes can't be delivered without a work-around. Look for appropriate barriers, redundancies, and opportunities to recover an error set in motion. A good place to start is to think about how you identify patients, have medication orders reviewed, store drugs, and take verbal orders. The "a-ha" moments will follow.

Really. Just do it.

Note: My understanding of human behavior, performance-shaping factors, and system design is highly influenced by the work of David Marx, President of Outcome Engineering and the author of the Just Culture algorithms. Dave and others from OE have generously shared their time and expertise to help me learn more about "the science behind the compliance" in patient safety. I encourage you to visit the Just Culture website and read Dave's book "Whack-A-Mole: The Price We Pay for Expecting Perfection" to learn more.

Monday, October 5, 2009

Monday morning quarterback: The big 3

Each Monday, I'm going to comment about 3 events that are likely to impact the art and science of patient safety, plus preview what I'm tracking this week.

Here's what caught my eye last week:

1. The Joint Commission (TJC) released information about the 2010 National Patient Safety Goals (NPSG). The number of NPSGs dropped from 20 to 11, with the phased-out goals migrating to regular TJC standards chapters. There is a widespread perception that the 2010 NPSGs reduce requirements, which on the surface appears to be true. The number of NPSGs is down, and no new goals have been added.

You'll find a variety of views about what these changes mean. Heather Comack (@PSeditor) wrote Joint Commission's 2010 Patient Safety Goals Reduce Requirements, a piece in which the changes were welcomed, while Mark Graban (@leanblog) shared concerns in A "Step Backward" in Patient Safety. (The comments to Mark's LeanBlog post are worth a read, too.)

The take-away for people interested in risk-reduction?

  • NPSGs call attention to points in care where patients are frequently harmed. It's a good thing when standards are revised to clarify their intent or to make compliance with the intent easier to achieve (provided that "easier" doesn't make the process less effective). And at some point, expectations set forth in NPSGs have to become part of the way front line clinicians "do business" for safer care to become the norm.
  • The science of what makes patients safer hasn't changed nor is it influenced by where standards are housed or which items are called out on a master list:

Processes used to deliver care should make it hard to do the wrong thing (employ barriers).

Clinicians should expect and advocate for double-checks (redundancies) at high-stakes junctures.

And recovery opportunities--that is, the chance to identify errors that have been inadvertently set in motion before patient harm occurs--must abound.

  • As Larry the Cable Guy says, "That's good stuff. I don't care who you are."

2. Two highly respected, in-the-trenches physician leaders in patient safety wrote a Sounding Board piece in the New England Journal of Medicine. In it, Robert Wachter and Peter Pronovost call for greater accountability for compliance with basic patient safety practices (such as hand hygiene). Both Wachter and Pronovost are well-versed in how system design impacts human performance. They're suggesting that when systems have been engineered in a way that makes it possible for good people to do the right thing, it's time to move beyond coaching and counseling when they don't: it's appropriate to impose sanctions and penalties on healthcare workers who exhibit patterns of non-compliance with safety-sensitive expectations. Johns Hopkins explains the thinking here, while Wachter digs deeper on his personal blog, addressing errant physician behavior specifically.

The take-away?

  • Behavioral science upholds the notion that punishment can extinguish undesirable behavior. Not every behavioral choice that has the potential to harm a patient stems from a system deficit. But ensuring that sanctions meted out for risky or reckless behavioral choices are fair and based on behavior, not the status of individuals (or their financial contribution to an organization), will require the wisdom of Solomon to implement.
  • If you are a front line clinician, patient, or patient advocate, this is the time to ask, "How is this organization equipped to evaluate and deal with performance deficits? Is this process transparent and fair?" When it's perceived that people are punished for simple human error or that less powerful people receive harsher punishment than more powerful people, voluntary error reporting is driven underground and the opportunity to learn from mistakes and near misses vanishes.
  • I think Just Culture, an evaluation method guided by well-thought-out, reliable algorithms developed by David Marx (a systems engineer and lawyer), offers the most robust process for dealing with real-world overlaps involving system design, flaws, and behavioral choices. Learning to use Just Culture algorithms requires discipline, but the methodology is sound, aligns with desired outcomes, and stands up to high-profile scrutiny.

3. Finally, Michelle Fabio wrote a guest post at Code Blog about a Kentucky case that appears to be good news for patient safety. You can read more about the student's blog, what she shared, how the university where she was enrolled responded, and how a U.S. District Court judge ruled at What Can Nursing Students Blog About.

I see this ruling as a good thing for patient safety because the case helps establish boundaries between information students may share and what patients have a right to have held in confidence. It's a delicate balance. The "day in the life of" stories that clinicians publish on personal blogs provide a window into healthcare culture, customs, and norms that is not otherwise accessible.

The take-away?

  • Transparency is a good thing. (But, as this case illustrates, transparency may let us peer into really messed-up houses. It may be helpful to remember there are other ways to deal with what's inside than to tape up the windows.)

And what I'm watching this week?

Really looking for tweets out of the Health 2.0 conference being held in San Francisco. The place appears to be packed with patient advocates and start-up IT folks, a hopeful sign that innovative solutions are incubating. (If you're on Twitter, follow #hcsfbay for a real-time preview.)

Saturday, October 3, 2009

Feeling Silly Saturday

I have a second blog that sits on Medscape's platform. It's a forum where I share medication safety strategies with professionals, and I named it On Your Meds: Straight Talk about Medication Safety. Comments left at On Your Meds are always interesting, often enlightening, funny, and sometimes sad. I think what nurses share there says a lot about the fitness of the systems we use to deliver medications.

I joke that I should have a blog called "Get On Some Meds" or "Stay On Your Meds." Apparently, someone has been listening.

Comments at On Your Meds usually pop up in the week I post something new. So I was surprised to receive a series of notices showing new comments on an old post, entitled, hmmmm, "Not exactly the language of love: Words to identify and prevent errors."

Since error-reporting is a guiding tenet of safety engineering, I thought it might be useful to let others know what happens when you include the words "language of love" and "medication" in the most searchable elements of electronic media, which appears to be the error I made. Can you say "erectile dysfunction spam"?

Unfortunately, none of the cheap generic sildenafil and tadalafil spam I cleared the other day was as amusing as this:



Oh, and by the way: This is not a product endorsement. It's advice about managing a blog. I don't give medical advice, and you should consult a healthcare professional for any problem you might have, including an erection lasting longer than 4 hours. You can find standard safety information about tadalafil and sildenafil here and here.

Thursday, October 1, 2009

Change of Shift is changing the world

Change of Shift is up at Emergiblog. Take a look to find out what nurses are talking about.

I included a post I wrote last Sunday about how social media (SM) may be the equivalent of the "big reveal" on a reality TV show, only what we're revealing on blogs and Twitter is the culture of healthcare. SM makes healthcare more transparent by sharing not what should be done, ought to be done, or even what we wish we could have done. It's a vehicle, running at full throttle, that captures what's actually done.

That said, I'm wondering how professionals who blog and tweet feel about their stories being picked up and used to illustrate how well the systems frontline clinicians rely on work (or, in many cases, don't work). The knowledge, attitudes, values, and beliefs of individuals are reflected in the accounts we share. This information is often very personal, and it may reflect on organizations that individuals chose to identify. There's a lot of "hot talk" about respecting the privacy rights of patients in social media, but what about others?

My work focuses on how people perform within a system. The goal is to improve outcomes by improving the fitness of the system, being mindful of the strengths and limitations that humans bring to high-stakes endeavors. But this is not a universal approach, and "blame, shame, and re-train" approaches to performance improvement remain operational (even if they are not explicitly endorsed). For this reason, I chose not to "hot link" the stories I shared, lest it draw unwelcome attention to individuals who simply shared a "day in the life of" account.

I'd love to have some feedback. Am I being overly cautious? Is posting and tweeting a "let the buyer beware" endeavor?
 
Creative Commons License
Florence dot com by Barbara Olson is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.