One of my favorite patient bloggers has a great post up today. Kairol Rosenthal at Everything Changes offers practical advice on what patients can do to help prevent medical error. Check out How Do You Prevent Errors in Your Care?
Campaigns that help patients become more active, more visible in their care are on the rise. The Joint Commission has one I like: Speak Up. The focus of campaigns like these--rightly, I think--is on behaviors patients and caregivers should engage in. But it's interesting to know the science behind the recommendations.
Patients are the last line of defense in a complex system of care. Ideally, errors are prevented (through strong system design and rigorous adherence to the processes designed to avert human error). But the next-best thing to preventing an error is to detect it and mitigate negative consequences before the error reaches and harms a patient. This is why the patient's (or advocate's) voice is so important. They are the last line of defense that can detect an error that's been set in motion. Knowing what to expect helps people recognize when things are not going as expected.
For example, asking "Did you wash your hands?" helps avert a common error of omission set in motion close to the patient. Errors arising close to the point of care are particularly hard to detect. (They stand in contrast to things like a physician's prescribing error, which--while potentially serious--routinely undergoes scrutiny by pharmacists and, in a hospital setting, nurses.) The likelihood of detecting and correcting an upstream error (like a wrong dose or wrong drug prescribing error) is good. The likelihood of detecting a downstream error (like failure to perform hand hygiene) is not.
Many clinicians are working to change cultural norms in their workplaces that discourage patients and professionals from speaking up when they have a concern about how care is unfolding. It's a viable strategy that helps cure a healthcare epidemic.
Kairol thinks they should be wearing a ribbon.
Tuesday, February 9, 2010
Saturday, January 30, 2010
Enough! Hidden Hazards that Impair Safety
This morning my husband, my son, an exchange student and his host brother are holed up in what my mom would call a "no tell mo-tel" 30 miles south of snowy Nashville where I am waiting to greet them with tickets to tonight's Thrashers-Predators game. The boys gave up trying to complete the drive from Atlanta last evening when my husband, who grew up in Canada, said, "Enough." And not a lot more.
Unpredictable things that derail what we intend to do are annoying, and they can be dangerous. Like icy patches under the snow, they're often hidden. Hazards are on my mind today.
On a larger scale, I've been reflecting about hidden places where safety gets derailed ever since a link to an article in the UK hit my Tweet stream last week. The headline and the original tweet used precious characters to say this: "Nurses who overdosed two Heartlands Hospital cancer patients escape punishment by professional body." The re-tweeted version that came to me included these words: "Sometimes sorry is not enough."
Based on the published account of these errors, the tragic events that resulted in the deaths of two patients did not happen because the nurses and physician intended to harm them. Rather, the routine processes they used to provide care failed when subjected to a drug's hidden hazard: safe dosing of the drug involved, amphotericin, depends upon whether the specific product on hand is a conventional or a liposomal formulation.
The Institute for Safe Medication Practices defines a series of routine checks and balances that should be in place in clinical settings where amphotericin is used. Differentiating the two formulations--calling them out in a way that is obvious and unmistakable to all clinicians who prescribe, dispense, and administer drugs with liposomal formulations--is one of the strategies necessary to prevent these errors. It's also worth noting that liposomal formulations of drugs are part of a group designated as "High Alert Medications," so called because they are highly likely to cause grave harm when used in error.
The process the clinicians used the day two people in the UK died failed because it was insufficient to prevent or detect a potentially lethal error that was set in motion. The nurses told the professional board that they were "very sorry," words that seem to have fueled the grief of the families and caused some in the global community to judge them, too.
"I'm sorry," no matter how sincerely felt or expressed, does not restore the dead to the living. That is not the purpose of expressing remorse, nor of accepting an apology. The survivors of a terrible tragedy caused by medical error must be supported in how they choose to proceed as they deal with the unwelcome, life-altering changes such events foist upon them. Survivors must be free to accept or not accept expressions of regret (although many find that sincere apologies by individual clinicians and organizational leaders lessen their burdens).
But how we treat the people at the "sharp end" of a tragic system failure is ultimately a measure of safety culture. And it's a place where good people (including many patient safety experts and healthcare professionals) slip on a hidden hazard. Saying that the nurses involved in this error "escape punishment" suggests they deserve punishment. And "sometimes sorry is not enough" leaves me scratching my head. What would be enough?
- An Ohio pharmacist is prosecuted, convicted, and jailed for criminal conduct following the death of a toddler who received a toxic chemotherapy infusion: Jail Time for a Medication Error – Lessons Learned from a Pharmacy Compounding Error. Is this enough?
- A midwife in the UK hangs herself, believing she is blamed for the death of an infant that followed a failed hand-off of key clinical data. The details appear in Midwife hanged herself thinking she was to blame for baby's death. Enough?
- A new resident physician commits suicide following a medical error. Expectations of perfection and what happens to people when systems are not strong enough to overcome human error are shared in Words fail, a moving tribute written by her colleague. Enough!
Postscript: You can access a recording of the IHI webcast mentioned above by clicking this [Link].
Monday, January 18, 2010
Not perfect
In the past few weeks, I've managed to lose my keys, my iPhone, and my way, predictable lapses known to occur when a person doesn't sleep or shower in the same place often enough. So error, and how to mitigate predictable error, has been on my mind.
To that end, I now possess three identical, well-parsed travel kits: one for home, one for my home away from home, and one for my gym bag. Standardize. Simplify. Before clicking "send," I ask for a second set of eyes on high-stakes transactions (like flight bookings). Independent double checks. Redundancies.
I know what reduces the chances that simple human error will occur or cause major set-backs in many processes. Last week, I was busy putting this knowledge to work for myself.
So it seemed like an odd time for counter-intuitive messages--things that show the benefit of imperfection--to crop up. But on Friday morning, I found myself captivated by a story about a mistake. (You can listen to this recollection, made more special because events weren't carried out as planned, in Story Corps' "When the tooth fairy overbooks, helpers step in," a daughter's precious memory of a father's slip.) And today I found Kent Bottles' interesting piece about why failure is important, which called upon a classic article "Teaching Smart People to Learn." (There's a link to the pdf in Kent's post.)
Trying to find the silver lining in the mistake cloud reminds me of two quotes I used to keep on the bulletin board above my desk: "Experience is what you do get when you didn't get what you wanted" and "Experience helps you recognize when you've made the same mistake twice."
Bon voyage! Safe travels!
Labels: cognitive psychology, human error, Kent Bottles, learning
Sunday, January 17, 2010
AHRQ's PS Net: Spiking the Kool-Aid with Truth Serum
Florence dot com has been on holiday. As I reflect on the hiatus, it's helpful to remember that the real Florence spent the final twenty years of her life in bed. (That's not what I've been doing, but it's information that helps establish expectations.) At the outset of this renewal, there are a few things about patient safety worth reiterating.
Few people are really "new" to patient safety. You become seasoned--recognizing what's gone right and what has or may have gone wrong--as soon as you begin to give or receive healthcare. Patient safety, simply put, is the science of preventing people from being harmed as a result of their need to seek care and how care is provided.
If you're a seasoned healthcare provider but new to the term "patient safety" or uncertain how it captures work you may be familiar with, I recommend viewing the Agency for Healthcare Research and Quality's Patient Safety Network (AHRQ PS Net) site. It's a place where initiatives and approaches--some very familiar to bedside clinicians--are organized and categorized according to "where stuff happens," "how stuff happens," "why stuff happens," and "how to prevent stuff from happening." You get the point.
The site may sound academic, but it's not. Behind the taxonomy and useful glossary are a lot of easy-reads. Web M&M presentations, for example, rival prime time drama. (Just imagine the Kool-Aid in the screenwriter's room at House being spiked with truth serum.)
I hope you'll enjoy the (mostly) real-time dialogue about patient safety (and other things that capture my attention or imagination for a moment or two) this year!