Tuesday, March 31, 2009

"Here's Your Sign" is not a High-End Risk-Reduction Strategy. Go Figure.

I live in the southern part of the U.S., and, like most of my friends, I like Bill Engvall, one of the funny men on the Blue Collar Comedy Tour. Larry the Cable Guy, not so much, but Engvall cracks me up. If you don't know who he is, here's one of his jokes (and a link that takes you to the lyrical version of "Here's Your Sign"):

A couple of months ago I went fishing with a buddy of mine, and as we pulled his boat into the dock, I lifted up this big ol' stringer of bass.
This idiot on the dock goes, "Hey, y'all catch all them fish?"
"Nope. Talked 'em into giving up."
I like Bill Engvall’s take on the human condition. It helps me make it through the Walmart. But when I go to work, I try to leave Bill behind. Here’s why:

In Chipping Away at Risk, I talked about how professional standards in other industries call for the use of the highest feasible strategies to manage predictable risks, and noted that similar thought processes are not yet considered "the norm" in the healthcare culture. In healthcare, it’s easy to draw from our duty-oriented traditions, falling back on what an “A player” wants to do on a good day rather than what a “B player” produces on an average day. (We will not discuss "C players" today.) But Human Factors research tells us that different, more reliable processes are needed to manage predictable risks that arise when people, processes, and equipment converge: "Hey, y'all catch all them fish?"

So here’s a user-friendly list of risk reduction strategies, one that’s widely used by the safety analysts at the Institute for Safe Medication Practices. The strongest error-reduction strategies are listed first, with the less effective options lower on the list:

  • Fail-safes & constraints
  • Forcing functions
  • Automation & computerization
  • Standardization
  • Redundancies
  • Reminders & checklists
  • Rules & policies
  • Education & information
  • Suggestions to be more careful or vigilant
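
If you like seeing ideas in working form, here's a minimal sketch (my own illustration, not an ISMP tool) of how the rank order can guide selection: list the strategies that are feasible in a given setting, then pick the one that sits highest on the list.

```python
# Illustrative sketch only: encode the rank order above (strongest first)
# and select the highest-ranked strategy that is feasible in a setting.
STRATEGIES = [
    "fail-safes & constraints",
    "forcing functions",
    "automation & computerization",
    "standardization",
    "redundancies",
    "reminders & checklists",
    "rules & policies",
    "education & information",
    "suggestions to be more careful or vigilant",
]

def strongest_feasible(feasible: set[str]) -> str:
    """Return the highest-ranked strategy available in this setting."""
    for strategy in STRATEGIES:
        if strategy in feasible:
            return strategy
    raise ValueError("no feasible strategy identified")

# A unit that can standardize toileting rounds should prefer that over
# relying on reminders or vigilance alone:
print(strongest_feasible({"suggestions to be more careful or vigilant",
                          "reminders & checklists",
                          "standardization"}))  # -> standardization
```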
And three examples showing how these principles look “on the job”:

1. A patient care unit where the primary fall-prevention intervention involves nursing personnel “keeping a close eye on patients at risk of falling” is using a less reliable fall-reduction plan than a unit where nursing vigilance is augmented by standard measures (such as the opportunity to use the bathroom every two hours). This can be predicted because scheduled opportunities to use the bathroom standardize an intervention, while “keep a close eye on them” relies on personal vigilance, a much weaker risk-reduction strategy.
2. A neonatal unit that has a policy stating that only 10 units/mL heparin will be stocked in the unit’s automated dispensing cabinet (ADC) has a less reliable risk-reduction plan in place than a neonatal unit where heparin products undergo bar-code scanning prior to delivery to the unit and prior to being prepared for a given patient. Bar-coding is an automated risk-reduction strategy whose reliability trumps both policy statements and human accuracy in “reading the label.” (A minimal sketch of this kind of check follows the list.)
3. Port-free epidural tubing, especially tubing with distinguishing colors and features, makes patients safer than standard IV tubing because the absence of a port is a constraint that can prevent inadvertent administration of parenteral drugs into the patient’s CNS, a tragic occurrence that regularly happens when well-educated clinicians become distracted.
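
About that second example: the power of bar-coding lies in a check that never gets tired. Here's a minimal sketch of the kind of comparison a scanning system makes; the NDC codes and formulary entries are invented for illustration.

```python
# Hypothetical sketch of a bar-code verification step. The NDC codes and
# formulary entries below are made up for illustration.
FORMULARY = {
    "00000-1111-01": "heparin 10 units/mL",
    "00000-2222-01": "heparin 10,000 units/mL",  # look-alike, 1,000x stronger
}

def verify_scan(scanned_ndc: str, ordered_product: str) -> None:
    """Halt the process unless the scanned product matches the order."""
    product = FORMULARY.get(scanned_ndc)
    if product != ordered_product:
        raise RuntimeError(
            f"MISMATCH: scanned {product!r}; order calls for {ordered_product!r}"
        )

verify_scan("00000-1111-01", "heparin 10 units/mL")    # passes silently
# verify_scan("00000-2222-01", "heparin 10 units/mL")  # would halt the process
```

The machine objects every time, no matter how rushed or distracted the human holding the vial happens to be; that consistency is what puts automation above policy on the list.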

I hope the rank order of risk reduction strategies and the clinical examples give you something useful to consider about mitigating on-the-job risks and how to respond when an error occurs.

Stay safe, find some time to fish, and come back soon!

Saturday, March 28, 2009

Chipping Away at Risk

I had an unusual introduction to Human Factors (HF) engineering when I served on a jury that heard a chipper-shredder mishap in the mid '90s. I'll forgo the tragic details that gave rise to a suit against the manufacturer, and leave you with just the take-away lesson: if a machine's outer casing is shielding a series of free-swinging blades seated on a spinning Ferris wheel-like device, the likelihood of your hand being sucked in from below (at the small, innocuous-looking exit chute) is as great as it is from above (at the larger entry hopper where you toss yard debris). This information becomes even more relevant should you be tempted to free--even from a distance well away from the exit chute--a thorny vine caught around the spinning Ferris wheel device.

During the course of the two-week trial, HF expert witnesses gave our jury a soup-to-nuts education about chipper-shredders, attempting to get us up to speed on design principles, mechanical features, and the professional standards manufacturers must conform to.

It did not come as a surprise to the HF expert witnesses--neither those testifying for the plaintiff nor those testifying for the defense--that the combination of human beings and chipper-shredders had a high potential for tragic outcomes. In fact, a manufacturer's ability to bring a high-hazard product to market, and keep it there, hinges on whether it can be made safe enough to protect people from the predictable mistakes they are likely to make while using it.

While I found the mechanical aspects of chipper-shredder design interesting (even using the information when purchasing a chipper-shredder of my own a few weeks later), I experienced a profound "ah-ha" moment when I realized how differently HF experts evaluated risk and selected risk-reduction strategies, compared to how I did. (At the time, I was an experienced intrapartum nurse and a leader on a 600 births/month maternity service.)

According to HF standards, warnings--even bold ones using pictures with high-contrast color combinations and affixed in strategic locations--are insufficient if a higher-order strategy--like installing a protective grate north of the exit chute--is feasible. Written directions (think: policies and procedures) similarly fall low on the list. Because written directions and warnings have a high failure rate, their best use is in conjunction with risk-reducing strategies that are more likely to work.

I think my jury service occurred in 1996 or 1997, several years before the publication of To Err Is Human, an IOM report that quantified healthcare errors and served as a multi-stakeholder call to action. The subsequent 2001 report, Crossing the Quality Chasm, began to describe specific improvement strategies, previewing successes and borrowing methodologies from human factors-oriented industries, like commercial aviation and nuclear power.

In the mid-'90s, the idea that healthcare workers should do more than "review the policy" and "counsel the individual" was revolutionary. But today, it shouldn't be.

Two years ago, Sean Berenholtz and Peter Pronovost, physicians at Johns Hopkins University and leaders in patient safety research, commented on the interventions selected to prevent recurrence of mistakes in healthcare settings, noting, "Unfortunately, weak interventions predominate and are often the same traditional solutions offered in a new package."1

Next time, I'll share more about a rank-ordering of risk reduction strategies that's used to promote medication safety. If I've piqued your interest, you can find a short case study and critique of the use of low-level risk reduction strategies in a June 2008 Pennsylvania Patient Safety Advisory.

In the meantime, stay safe working in your yard!


1 Berenholtz, S., & Pronovost, P. (2007). Monitoring patient safety. Critical Care Clinics, 23, 659-673.

Thursday, March 26, 2009

Be Where You Are


I had planned to write about chipper-shredder mishaps today, sharing how a random call to jury duty introduced me to the discipline of Human Factors engineering and changed what I believe about people, lawn & garden equipment, and healthcare delivery systems.

Principles from human factors and cognitive psychology are important because they help to determine a rank order for risk reduction strategies. This helps clinicians identify the best strategies for preventing error, and helps administrators make provisions for endorsing, and funding, the risk-reduction strategies most likely to work. These are interesting things to talk about.

But sometimes, you just have to be where you are. Today, I'm providing hospice care to our family's 12-year-old yellow Labrador retriever, Daisy. I'm trying to figure out the best plan of care and reviewing her recently acquired medication list in hopes of finding an explanation for her marked downturn. (Also cleaning up big messes, running the space heater, and trying to maintain a minimal-stim environment for Daisy, who prefers to stay curled up near, and occasionally on, my feet.)

We all regress under stress, I guess, and Daisy's working hard to stay connected. But when I'm "feeling" more than I care to, I become hyper-analytical, hoping to move back to the "thinking" place where I'm more comfortable. So I'll share a few nuggets from my unwelcome journey:

1. You can get medications for your pet at your local pharmacy. In the past, the medications our pets have needed all came from the vet's office, but when your pet moves into a high-octane plan of care, it turns out you can go to the same gas station where you get yourself fueled up. (At least you can do this in the state where I live.)

2. Pet medications are another variable in the look-alike packaging maze. Since this is a process-oriented blog, I'll invite you to look at the photo of Daisy's meds again. Notice how the one that came from the vet's office (on the right) has the distinctive pet silhouettes? I think this helps prevent distracted, stressed-out, and yes, tearful, pet owners from inadvertently taking their pet's medicine. The prescription on the left-hand side came from the real pharmacy, where my family gets our prescriptions filled. You can see that I flagged it to help me see--from a distance and maybe without my glasses on--that these pills belong to the dog. It would be better if all meds dispensed for canine use were placed in bottles like the one on the right.

3. Separate pets' medications from the family's stash. Maybe you are inherently less error-prone than I am, but the consequences of a mix-up could be huge. Separating look-alike containers is a stronger risk reduction strategy than simply trying to "be more careful."

Especially when you're crying.

Monday, March 23, 2009

A Belief Born of Despair

I've been advocating for solutions to medical error that extend beyond what individuals can do (or can reasonably be held accountable for doing) for a long, long time. In the language of cognitive psychology, this means I subscribe to a system approach for modeling and managing human error.

My belief in system approaches did not arise as a result of study, reflection, or facilitated learning, but came in the aftermath of care my son received in a state-of-the-art children's hospital in 1992. Born with a serious but fixable digestive problem, my son--and our family--logged more than half of the first year of his life in the hospital.

I've long since forgotten the litany of things that went wrong that year (although equipment malfunction, wound dehiscence, breastmilk mix-ups, tubes that stayed in too long and tubes that came out before their time return to the forefront of my mind after a cursory search of the blessedly faltering "hard drive" where I store these memories). But I have no trouble recalling an evening when I sat in a rocking chair beside my son's crib, meeting with the institution's risk manager who had been called in from home in the aftermath of yet another inexplicable error. "Can you just tell me," I asked in despair, "why the team of seemingly reasonable human beings you represent is so patently unable to render care that does not--in some way, shape, or form--harm my child?"

She could not.

But others have been able to, and over time, I've found solace in some unexpected places. I'm sharing a link to Human error: models and management, a 2000 commentary by James Reason that appeared in the British Medical Journal. This work remains the de facto starting point for anyone interested in the science of reducing human error.

The process of resolving feelings about what happened in the aftermath of my son's difficult start was complex, and I'm sharing just a part of that journey. It's telling that Reason's words--written years after my son's birth--resonated with me, helping to express what I intuitively knew. I hope they will be helpful to you.

I was first able to give up the idea that "bad" (think: careless, stupid, lazy, inconsiderate, incompetent) people were responsible for all that went wrong by considering the problem logically: it was statistically unlikely that our family would have had the bad luck to bump into a disproportionate number of mal-equipped, mal-intended, or simply "off-their-game" individuals with a frequency that could account for the host of significant mishaps that befell us. This analysis may not come to your mind if you seek care once in a while and have an unsatisfactory encounter or uncover a near-miss. But when you get a data set like the one I had in 1992, you come to realize that some norms, like poor penmanship and ambiguous orders, breed the predictable mishaps that follow.

In my son's case, I ultimately concluded that given the variables of "inpatient days logged" and "complexity of care," he probably experienced about the same number of adverse events as anyone else in his situation would have. Seeing our misfortunes as a series of unacceptable, but common, outcomes helped me get rid of the feeling that my family was being trailed by some dark cloud of bad juju.

The memory of thoughtful words and genuine acts of kindness also helped dispel the notion that errors in my son's care arose largely because of uncaring or negligent people. In our darkest days, following a leak in my son's newly repaired esophagus, the surgeon shared that he prayed for Luke and for our family, expressing his hope for healing, comfort, and restoration of our family life. The rotating resident brigade, whom I unkindly referred to as the "sneakered sycophants," nevertheless tagged my son with some endearing nicknames, a few that we still use today. One of Luke's home care nurses became a godmother. I share the healing power of these moments, not because I think that intending to do the right thing and actually doing something right are the same. They're not. But these moments helped me see that what was lacking in the care my son received simply couldn't be explained by factors under the control of one individual.

I hope you'll return to this discussion ready to explore more about what turns intention into outcomes, what heals without first hurting. Let me leave you with something that always makes me smile:



Sunday, March 22, 2009

Where's Waldo? Finding Reliability in Surprising Places

Last Friday, I talked about look-alike, sound-alike (LASA) drug names, throwing out for discussion just one of the system-level risk points that predispose people to err. Remember Waldo, the popular figure in the striped shirt kids have fun trying to find? I think of risk points as "Waldos" because they're often significant stumbling blocks, features hidden in plain sight that undermine the ability to deliver intended care. Without specific measures that make Waldos visible in healthcare, they frequently derail our good intentions.

(Oh, and if I’ve sparked your interest in LASA-related drug errors, you can click here to access a graphics-rich, user-friendly, CE-granting tutorial about this problem. It offers a more complete discussion of LASA problems and prevention strategies. Disclosure alert: I'm a co-author of the piece, but I don’t receive any tangible benefit by sending you over to Medscape to access it.)

But today, I want to circle back to common beliefs about intention and individual performance, commenting on what's known about reliable performance.

If you're like me, you probably get a chuckle when signs like these hit your electronic in-box.



(The "Darwin Awards" are another funny cousin in this family of pass-along e-mails.) These "just do it" calls-to-action are helpful for instilling personal accountability when performance is not overly dependent on a system, like it is with, say, teenagers and curfews.

But imagine you are boarding a commercial airliner, and you see a warning posted in the cockpit that says, "This machine has no brains. Use your own." Are you staying on-board? It probably doesn't matter, because the crew isn't likely to!

The experience and competencies your flight crew and air traffic controllers bring to work are not considered sufficient to get a plane off the ground if the plane's brains (think: radar, auto-pilot, computer) are on the blink. I know this from experience. Last year, I flew between Atlanta and Philadelphia weekly, a routine that was largely uneventful. But I vividly recall one flight that required a return to the gate, deplaning, and re-boarding another aircraft, events that transpired when the captain nixed take-off because the mechanics could not explain to his satisfaction why a control panel light was behaving in an atypical fashion.

Aviation professionals understand that safety is a function of reliability, a term Wikipedia helpfully explains as the ability to deliver stable, predictable results under ordinary circumstances as well as when hostile or unexpected events arise. Aviation is highly procedure-oriented, and it's likely that the ability of individuals to perform in extraordinary circumstances, like when a plane lands on the Hudson, lies in the strength of the systems that support routine function. The aviation industry routinely adopts tools and technologies that enhance the considerable abilities of individuals to perform in a reliable fashion, and they share "Waldos" across the industry whenever the safety-threatening striped shirts are identified.
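
A little back-of-the-envelope arithmetic (mine, not the aviation industry's, and it assumes the layers fail independently, which real-world checks rarely do) shows why layered defenses produce this kind of reliability:

```python
# Illustrative arithmetic: the chance an error slips past every layer,
# assuming each independent check catches 95% of errors.
def escape_probability(catch_rate: float, layers: int) -> float:
    """Probability an error evades all layers, if failures are independent."""
    return (1 - catch_rate) ** layers

for layers in (1, 2, 3):
    print(f"{layers} layer(s): {escape_probability(0.95, layers):.6f}")
# 1 layer(s): 0.050000  -> 1 error in 20 gets through
# 2 layer(s): 0.002500  -> 1 in 400
# 3 layer(s): 0.000125  -> 1 in 8,000
```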

Human beings, the focus of healthcare's work, are far more complex than airplanes. But this fact should not dissuade us from adapting reliability-promoting processes used elsewhere. Deviation from a standard flight plan (or plan of care) for cause--that is, for reasons that enhance benefit to individual flyers (or patients)--makes sense only when deviation is not the norm.

I hope you'll stay safe, come back soon, and fly only in friendly skies!

Friday, March 20, 2009

LASA: It's Not Just Another Bad Abbreviation

I'm talking about errors associated with look-alike, sound-alike (LASA) drug names today because LASA problems offer concrete examples of risk points that dog clinicians involved in the medication use system. (For purposes of this discussion, my quick-and-dirty working definition of a risk point is "any underlying factor that predisposes to error.")

In the last post, I referred to new research confirming something you probably already know: healthcare professionals struggle with reporting mistakes, and we struggle with the fact that we are fallible when we're involved in errors. When people believe that “bad people” or “good people having a bad day” are individually responsible for most medical errors, it’s easy to see why reporting error and reconciling feelings of personal responsibility become burdensome. But reporting and reconciling become easier when you look for solutions that improve the nature of the process, not the nature of the people. Face it, we’re all going to have a bad day once in a while, and, unfortunately, not all people are good.

(I recently spent a year as the Safe Medication Management fellow at the Institute for Safe Medication Practices. But, as you can read in my last post, I was tripped up by look-alike packaging of hand sanitizer and hand soap a few weeks back, proving yet again that “knowledge” does not trump “process.”)

Human error is typically a by-product of the systems we practice in, and with LASA errors, it’s hard to miss the risk points. The category of LASA-related errors exists because, frankly, drug names are often similar to one another. It's easy to see how words and phrases like "oxycodone and OxyContin" and "NovoLog Mix 70/30 and Novolin 70/30" could be mixed up. Similarities like these regularly give rise to confusion, and yes, error. We see look-alike, sound-alike word confusion in other settings all the time: if you haven't seen "your" erroneously substituted for "you're" recently, you're reading better things than I am! But when word mix-ups have the potential to give rise to medication errors, stronger processes that guard against selecting the wrong one need to be in place.
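
Because similarity between names is measurable, it can even be screened for. Here's a minimal sketch (my own illustration, not how ISMP or regulators actually screen names, and the 0.6 cutoff is arbitrary) that scores name pairs with a simple string-similarity ratio:

```python
# Illustrative sketch: flag look-alike drug-name pairs for human review
# using a generic string-similarity score. The 0.6 threshold is arbitrary.
from difflib import SequenceMatcher

def looks_alike(name_a: str, name_b: str, threshold: float = 0.6) -> bool:
    """True when two names are similar enough to warrant safeguards."""
    score = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return score >= threshold

for a, b in [("oxycodone", "OxyContin"),
             ("NovoLog Mix 70/30", "Novolin 70/30")]:
    print(f"{a} / {b}: flag = {looks_alike(a, b)}")
# Both pairs score above the cutoff -- exactly the kind of similarity
# that warrants strategies stronger than "read the label carefully."
```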

Next time, I'll share data and some easy-to-access resources for preventing LASA errors. Maybe you have an example of a look-alike or sound-alike error to share? (If you do, tell your story in the “comments,” omitting identifying information. On Florence dot com we neither offer medical advice nor violate HIPAA regulations.)

So, good people, stay safe and come back soon!

Wednesday, March 18, 2009

A Picture's Worth One Thousand Words

When I made "patient safety" my business, I stepped away from specialty practice in intrapartum and high-risk antepartum nursing care, a decision that is sometimes difficult to explain. Last week, I wrote about how the perception of patient safety as a warm, fuzzy, intention-based goal can get in the way of actionable work--like workflow analysis, process mapping, and harnessing the power of technology--that delivers efficiencies, reliability, and economies of scale.

Another aspect of the career shift has been the risk of becoming a "glass half-empty" kind of girl, a perpetual naysayer who tells earnest, well-intended, and increasingly cash-strapped healthcare professionals, "Really folks, this is simply not enough. Have you forgotten that medical errors are the 8th leading cause of death in the U.S.?” Last week, Oprah helped me out, hosting the Quaids and reminding us that "every year in the United States, more people die from medical mistakes than from breast cancer, AIDS and car accidents…combined. It's a major, major health issue that will touch almost every single American at one point in our lives."

I’m not a person who sees the glass as half-empty, nor am I an apologist. So I’ll share here what’s helping me to reconcile the irrefutable mismatch between intention and outcome that is healthcare today.

First, it may be helpful to simply acknowledge that errors are very common in healthcare. So common, in fact, that the Agency for Healthcare Research and Quality has endorsed a taxonomy to describe and categorize them. While this may be shocking at first glance, it’s actually good news: using a specific nomenclature to describe events and categorize them is an epidemiologic approach to problem solving. Taxonomies are used in the study of other vexing problems (like breast cancer, AIDS, and car accidents). So, it’s reasonable to expect that similar processes would be used to diminish the incidence of our problem: medical errors.

While the charge “First, do no harm,” may resonate with many clinicians, this is a goal statement, not a process map. “Just Do It!” just doesn’t, well, do it when it comes to solving significant threats to health.

If you visit AHRQ’s Patient Safety Network, you’ll find the error taxonomy is searchable by a variety of categories (for example, “care setting”; “clinical area”; “type of error”). The one I use most often is “approach to patient safety” because this query lets me “connect the dots,” seeing how specific strategies (like “patient hand-offs”) are seated within larger motherships (like “Communication Improvement”). The taxonomy maps the current “method to the madness,” and leaves room for new ideas. (You’ll notice that the labels I apply to each post at Florence dot com often include key words from the patient safety taxonomy.)

Second, everyone makes mistakes. We may not mean to, but we do. There is a strong body of evidence suggesting that in the aftermath of an error, healthcare professionals struggle with what actions to take and how to reconcile their feelings about having been involved in an error. And a recent study in the Journal of Patient Safety suggests frontline clinicians remain conflicted about disclosing, discussing, and reporting error, despite efforts to increase transparency, promote reporting, and look at error in context. (If I were to apply a label to the discussion right now, I’d choose: culture of safety.)

It may be easier to start talking about errors that happen in healthcare settings by talking about errors that didn’t. Take a look at the photo below and see if you can guess what happened when I cooked breakfast at my church a few weeks back.


Yes, I washed my hands with a hand sanitizer product intended to be used without water, an activity that neither cleansed nor sanitized them.

Obviously, I didn’t read the label. A look at my kitchen sink will help you see why:

(In case you can't read the label--something that's difficult to do even here in my kitchen--the little-bitty font just above the green leaves says, "Hand Soap.")

Two distinct products that share similar packaging, similar color, and similar placement: an error-prone set-up in the community. And an error-prone set-up at work.

This is not to say that I think the consequences of mixing up products in a community setting and the consequences of mixing up products (particularly medications or cleaning agents) while on-the-job are equivalent. In fact, it’s precisely because the risk of harm is so much greater when error occurs in a healthcare setting that processes on-the-job need to be far more robust than what we typically use at home.

I hope you’ll come back as this discussion evolves! (Feel free to use the comment section to share your thoughts with me and with each other.) And in the meantime, I hope you'll stay safe!

Next time: LASA: It’s not just another bad abbreviation.

Sunday, March 15, 2009

The Problem with System Problems

You may have heard of Jennifer Thompson-Cannino and Ronald Cotton, authors of the recently published memoir "Picking Cotton," an account of Thompson-Cannino's mistaken identification of Cotton as her rapist and what happened in the aftermath of the tragic error. (Cotton served 11 years in prison before being exonerated by DNA evidence, an event that triggered near-stifling guilt in Thompson-Cannino for her role in his conviction.) I've seen Jennifer and Ronald on 60 Minutes and caught them on several radio interviews, most movingly in a piece recorded for NPR's This I Believe series. (http://www.npr.org/templates/story/story.php?storyId=101469307).

They're sharing their story, a powerful testimony of love, forgiveness, and redemption. Jennifer and Ronald also champion a cause both hold dear: judicial reform, specifically the processes used to gather and present eyewitness testimony.

Thompson-Cannino and Cotton's journey is particularly remarkable because they have undertaken it together. When an accuser apologizes and a victim forgives, each experiences grace that only the other can offer. We are infrequently given a glimpse of healing like this, healing that comes from such a deeply personal place. But what may be easy to lose in the story of their personal triumphs is the role that system deficits played in the tragedy that befell them.

The problem with "system problems," I think, is that they seem to beg for individual accountability, leaving us with the sensation of a debilitating ethical itch that simply cannot be scratched. Expressions like, "Lead, follow, or get out of the way," and "If not you, who? If not now, when?" suggest a general belief that things are under control, or can be brought under control. It's a matter of personal responsibility. Take me to your leader! If good people bring the good times, surely bad people must bring the bad.

So it's particularly interesting to me that Thompson-Cannino and Cotton, two people whose capacity for introspection and personal accountability are remarkably deep, are talking about system problems, looking past "the people" to focus on "the processes."

An analysis of events in the Cotton case reveals that the circumstances that led to Thompson-Cannino's initial identification of the perpetrator predisposed her to identify the most likely attacker, not the attacker. Cues and clues, erroneously, but not maliciously, offered by investigating officers further reinforced Jennifer's perception that she was right. The 60 Minutes piece was told, in part, by another person affected by this tragedy: the detective who investigated the case. A professional who, using the standard operating procedures employed at the time, helped to send an innocent man to prison for 11 years. If you're feeling the beginnings of an ethical itch, it may or may not be helpful to know that more than 75% of individuals convicted of crimes, then later exonerated because of DNA evidence, were convicted with eyewitness testimony that turned out to be erroneous.

DNA now offers us access to highly reliable evidence about what happened to whom, under what circumstances, and where. When it's available, DNA evidence can corroborate or refute eyewitness testimony, making the likelihood of uncovering an objective truth far greater than it has ever been. Even without DNA evidence, the procedures used to identify perpetrators and preserve, but not influence, the memories of victims are now evidence-based and should look different today than they did 25 years ago.

I think it would be fair, a word I use with great caution given what happened to Ronald Cotton, to say that it was not wrong to have used 'standard operating procedure' 25 years ago. But it would be wrong to use 25-year-old standard operating procedures today. It would be wrong to continue to do things in an unreliable way when someone's liberty or someone's life is at stake.

System level analysis offers a way to improve reliability in judicial processes, in aviation, in healthcare. It gives people who are very close to the action--like patients and clinicians--tools to see that everything may not be exactly as it appears on the surface.

That's what we're going to tease apart and talk about here at Florence dot com! I hope you'll come back again. Because it is not wrong to have used standard operating procedure 25 years ago, but it is wrong to use 25-year-old standard operating procedures today.

Next time: This picture paints a thousand words. (But I'm only going to use 500.)



Thursday, March 12, 2009

Patient Safety: Is it really like baseball, hotdogs, apple pie & Chevrolet?

Happy Patient Safety Awareness Week! We're at the tail end of the nation's weeklong campaign designed to spur conversation amongst consumers and providers and promote strategies to make patients safer.

I don't know what the celebrations look like where you are, but most people I know are talking about worrisome first quarter financial projections, rising unemployment, and the possibility of many more people joining the ranks of the uninsured. It's exceedingly tough to be an effective advocate for patient safety these days, a concern reflected in the theme chosen by the National Patient Safety Foundation for their 2009 spring meeting: "Patient Safety in Challenging Times: Now More Than Ever, a Critical Need."

I've found that Wikipedia entries often offer simple words that capture complex ideas, so I tooled over and was happy to find this definition at http://en.wikipedia.org/wiki/Patient_safety:
"Patient safety is a new healthcare discipline that emphasizes the reporting, analysis, and prevention of medical error that often lead to adverse healthcare events."

Thank you, Wiki, for a step in the right direction. Because it seems to me that part of the "patient safety" advocacy problem may be rooted in its warm, fuzzy name. "Patient safety" sounds like a simple "mom and apple pie" issue. I'm for it! You're for it! We're all for it! Who wouldn't be for patient safety?

(Just for fun, picture yourself, a colleague you respect, or your personal physician in a very public forum--let's say "on the news" or "under oath" to set up a visual. Now imagine the question, "So, do you or do you not embrace patient safety?") There really is only one right answer.

But can everyone who can honestly say, "I'm for patient safety," also say, "I'm committed to a process in which real or potential error is reported and deconstructed, where failure points are identified, and, when necessary, to the redesign of work processes to prevent recurrence of an error or event"?

Really being for "patient safety" requires more than good will, a caring heart, or a pledge to "do no harm." In order to diminish inadvertent harm, seasoned clinicians have to examine, and sometimes leave behind, elements of dearly-held, intention-based beliefs and practices. ("The Five Rights" may not make it through process mapping and meticulous work flow redesign.)

In the end, these clinicians, whose valuable expertise arose in a different tradition, will likely adapt processes and methodologies borrowed from high-stakes industries with superior safety records. A new generation of "digital natives" will join our ranks, bringing new norms and new ways of taking on and solving complex problems. Reliability in healthcare will improve. But this is a process, not an event.

To answer, "Yes, I'm for patient safety," healthcare managers, administrators, and payors must embrace the emerging science that informs safety, invest in high-yield products and processes, and nurture widespread culture change. Again, a process, not an event.

Investing in better processes is still investing, and "patient safety" may be a hard line item to defend in a tough economy. So as Patient Safety Awareness Week comes to a close and the inevitable rounds of budget cutting begin, I hope I've left you with a user-friendly assessment tool. Just ask, "Are you really for patient safety?"

Stay safe and come back soon!

Monday, March 9, 2009

Welcome to Florence dot com

I find inspiration, and occasionally wisdom, in unexpected places, and I hope you'll find some here at Florence dot com, a place for people interested in improving healthcare.

A few months ago, I happened upon a documentary about a senior citizens' chorus from Northampton, Massachusetts, called Young@Heart. In it, an octogenarian suffering from congestive heart failure sings a rendition of "Fix You," a Coldplay song about learning from mistakes and fixing broken things; the performance drew critical acclaim for the chorus (plus hundreds of thousands of hits on YouTube and other sites that linked the clip in the fall of last year).

Even if you haven't seen Young@Heart or a clip of "Fix You," I'm certain you know someone like Fred Knittle. He's been in your ER, on your inpatient census, or on your patient roster. And you probably saw him at the Walmart a time or two. To me, Fred Knittle's rendition of "Fix You" says more about healing--and the valor of persevering--than anything I've bumped up against in a very long time. The message hits home, maybe, because Knittle is backed up by his portable oxygen tank as well as his loyal choral compatriots. Or the meaning intensifies because Knittle sings his part and the part of his partner, a fellow chorus member who passed away just days before the piece was filmed. Or maybe it's because Knittle's remarkable baritone croon may not have anything to do with fixing at all.

The thing that I most love about the success Knittle enjoyed is simply that it came at all (he died on New Year's Day in '09). Knittle, if his cheerful optimism and wry accounting are predictive, lived a good life. His obituary shared events and accomplishments that evidence a life well lived: a devoted wife, children and grandchildren, military service, a long career in service to others, a loving, connected community. But it wasn't until Knittle's final years, most probably after receiving a terminal diagnosis, that he returned to a community singing group he loved and produced work that is making people around the world pause for a moment and think, really think, about what it means to care, to heal, to try, and to die.

Renewal coming from a broken place speaks to me. In the healthcare industry, we frequently produce an outcome we did not set out to achieve, making our work, from an engineering perspective, well-intended but not reliable: in the U.S., medical errors are the 8th leading cause of death. Each year, more people die as a result of medical error than die of AIDS or breast cancer. We are broken.

What ails healthcare is not a "one person" or "one profession" problem nor will the fixes be singular. Healthcare professionals, like the Young@Heart chorus, may appear too old, too tired, too exasperated, or too out-of-breath to always look like credible sources of hope. Fred Knittle didn't look the part either. While most performance problems in healthcare are rooted in our systems, that is, how we do business, solutions ultimately rely on what people come to view as important and how we adapt. I hope you'll return regularly to Florence dot com for cues, clues, and commentary about cutting edge trends in patient safety and that you'll find this a good place to share your insight and experience, whether you're a professional, a consumer, or both.

"Lights will guide you home and ignite your bones, and I will try, and fix you." Thank you, Mr. Knittle.
 
Creative Commons License
Florence dot com by Barbara Olson is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.