Thursday, April 30, 2009

Change of Shift

The nursing blog carnival, Change of Shift, is up over at CodeBlog. Take a look at posts from blogs written by nurses to get an insider's view of what it's like to try to live, die, heal, and work with dignity in a variety of healthcare settings.

This go-round, Change of Shift asked for submissions about being human, a welcome topic for me since preaching from the book of "I'm human, You're human" is occupying a considerable amount of my time these days.

Gina's post about a day in the ICU should be required reading for anyone who has not yet decided to get on board the "systems-thinking" bus. I think it was Don Berwick, CEO at the Institute for Healthcare Improvement, who said, "Every system is uniquely designed to give you the precise results you are obtaining." Berwick's wry words explain why the U.S. gets nothing-to-boast-about health outcomes at economy-cracking prices: that's what our care delivery system is engineered to produce. Gina's chronicle of advocacy and despair is worth reading more than once.

I'm leaving you with a YouTube link to the song of the day: "Human," by the unfortunately named band, The Killers. (Sometimes, you just can't catch a break! Just ask Gina.)

Monday, April 27, 2009

STLs happen

Acting on an urge to find an outdoor spot to enjoy a warm spring evening, my husband and I recently found ourselves perched on a deck three stories above an old tavern, giving us a bird's-eye view of the old courthouse in the small southern city where Larry Flynt, the publisher of Hustler magazine, was gunned down in 1978. I didn't know this fact, although I've lived in nearby communities for over two decades, but my husband did. This Trivial Pursuit-worthy fact, coupled with an old red pick-up truck that was missing the entire driver's side door (but, happily, not the driver) occupied our conversation until I remembered we were right down the street from the new courthouse (where I had been a jury member when a fellow citizen sued the Troy-Bilt company following a chipper-shredder mishap in the mid-1990s). We were destined for good conversation.
  • Targeting, then deliberately shooting, a fellow citizen with a handgun
  • Driving a motor vehicle absent a key piece of personal protective equipment
  • Getting your hand stuck in the business end of lawn and garden equipment
These are three very distinct ways to screw up. (Normally, I'd use the term err--as in To Err is Human--but, hey, it was Friday night.) My husband, of course, would have preferred to continue talking about Larry Flynt, Hustler, or the guy in the truck without the door. But I was determined to talk human factors engineering, the study of how people, given our capabilities and limitations, can be predicted to perform tasks that involve using machines under real-world conditions.

"Did you know," I asked, "that the probability of a well-trained, motivated, competent person producing an error while performing a routine task is 1 in 2000?"

Erring while doing something you know how to do is a mistake characterized as a slip, trip, or lapse (STL), and STLs are the most common of human errors. Bringing home Coke instead of Diet Coke or Fruit Punch Juicy Juice instead of Cherry Juicy Juice are classic STLs. Environmental factors, such as similar packaging, product placement on the shelf, lighting in the grocery store, and distractions during the selection process, play a role in STLs. Irrespective of intention, STLs happen, and they happen most often to seasoned people performing tasks and activities they are normally adept at doing. If it happens in the grocery store, it can happen at work.

The chipper-shredder mishap at the center of the case I heard during my jury service could be characterized as a slip, trip, or lapse. (The gentleman who brought the suit was raised on a farm and had used heavy equipment since the time he was a young child.) When an experienced, but busy or distracted, nurse attaches a syringe containing viscous liquid, meant for oral administration, to an IV line and inadvertently administers it intravenously, you've heard another account of a slip, trip, or lapse.

I still haven't gotten over the fact that I learned about slips, trips, and lapses while serving on a jury, instead of during the course of my professional training ten years earlier. How helpful it would have been to know that systems could be designed and engineered in order to compensate for mistakes competent professionals can be predicted to make. (At the risk of beginning a rant, let me point out that in the U.S., we landed a man on the moon three decades before oral syringes--devices that are incompatible with IV tubing and thus prevent competent people from having a 1-in-2000 slip that can kill a person they're trying to cure--became commercially available in hospitals.)

I write about medication safety over at Medscape, on a popular blog called On Your Meds. This forum draws tons of comments from front line clinicians. Most use it to talk about what it's like to use existing processes to deliver the meds (and the care) they want to give. A few weeks ago, On Your Meds received over 8,000 hits in the first 24 hours after a piece entitled "Medication Misadventures" was posted.

If you're a front line clinician, I encourage you to check out the hundred or so comments in response to "Medication Misadventures," reflecting on the safety problems others perceive, how your system benchmarks against others, and how well you (and your patients) are protected from predictable slips, trips, and lapses.

If you coach, lead, or manage front line clinicians or have a role in funding the systems (equipment, software, and the support services needed to fully operationalize them), I encourage you to read the comments left in response to "Medication Misadventures." Comments are simply electronic footprints, showing clinicians' perceptions about the fitness of the medication use system. These are people (your people, perhaps) interacting with machines under real-world conditions. How well is your medication use system engineered to account for human factors?

If you sell medication safety, remember to talk about human factors when you come calling. We're not shooting our fellow citizens with a handgun. But we may be tempted to drive an old pick-up, and we regularly have to dislodge pieces of cat brier from the chipper-shredders we use to get the job done.

And if you design equipment or systems that will make the medication use system safer, please get back to work. We need you.
Off to mulch..... stay safe and come back soon!

Friday, April 24, 2009

The average Walmart shopper understands risk reduction!

A few weeks ago, a pediatric nurse shared an intervention that lessened the likelihood of an IV medication error. Her story celebrated nursing advocacy, the ability of professionals on the front line to recognize risk, intervene proactively, and make things safer for patients.

Last week, I went back to this pediatric infusion case, reviewing what a system analysis of the medication use process and seminal medication safety research tell us about risk. The take-away lesson? Errors that originate upstream are more likely to be discovered and corrected before they reach a patient, while errors that originate close to the point of administration are less likely to be detected.


Here's another image prepared using the same seminal medication error data, illustrating more take-away lessons: errors that originate downstream are not only more likely to reach a patient, they are more likely to cause harm when they do.



Source: Leape L, Bates D, Cullen D, Cooper J, Demonaco H, Gallivan T, et al. Systems analysis of adverse drug events. ADE Prevention Study Group. JAMA. 1995;274(1):35-43.

Now, let's look at the details provided in the original post:

  • the patient was 3 years old
  • the care setting was one that routinely cared for pediatric patients
  • the patient was receiving an IV medication
  • the medication infusing was not commonly used
  • the professional staff were not using a drug administration protocol familiar to them
  • no written guidelines for how to prepare the infusion or administer this particular drug were available
  • the medication infusion required titration (the dosing unit provided in the post: mg/kg/hr)

On-duty personnel at the end of the night shift were observed struggling to perform the calculations needed to titrate the prescribed dose (0.84 mg/kg/hr) while adjusting doses in mL/hr increments. Calculations of this nature are possible, but they are complex, error-prone, and more likely to be botched when performed by fatigued workers. It's also high-stakes work that's occurring while the infusion is attached to the patient, about as far downstream as you can go in the medication use process.

A routine medication, dose, and rate check performed by the oncoming nursing team revealed that the medication was being administered as prescribed. A change in the medication concentration advocated by the oncoming nurse simplified the rate and dose relationship, removing the need to perform complex mathematical calculation to titrate the dose at the bedside. Risk was reduced.
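
To make the bedside arithmetic concrete, here's a minimal sketch in Python. The patient's weight and both concentrations are invented for illustration; only the 0.84 mg/kg/hr dose comes from the original post.

```python
# Bedside titration arithmetic, sketched. Weight and concentrations are
# assumptions for illustration; only the 0.84 mg/kg/hr dose is from the post.
weight_kg = 14.0          # assumed weight for a 3-year-old
dose_mg_kg_hr = 0.84      # prescribed dose

# With an arbitrary concentration, every rate change requires this conversion:
concentration_mg_ml = 1.2                                   # assumed admixture
rate_ml_hr = dose_mg_kg_hr * weight_kg / concentration_mg_ml
print(f"{rate_ml_hr:.2f} mL/hr")   # 9.80 -- nothing about it says "0.84 mg/kg/hr"

# With a concentration chosen so that 1 mL/hr delivers 0.1 mg/kg/hr for this
# patient, dose and rate track each other and the mental math disappears:
concentration_mg_ml = 0.1 * weight_kg                       # 1.4 mg/mL
rate_ml_hr = dose_mg_kg_hr * weight_kg / concentration_mg_ml
print(f"{rate_ml_hr:.1f} mL/hr")   # 8.4 -- the dose times ten, readable at a glance
```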

This represents very good work by the team at the bedside, who used tools available to them to reduce error potential. Other positive call-outs include: the setting was one that routinely cared for pediatrics; independent double-checks are part of the clinical culture; staffing was such that a seasoned nurse had time to consider risk-reduction strategies and advocate for change; and the professional culture is described as one that values inter-disciplinary communication and respect.

When I first read this case, though, I was struck by the image of front line clinicians trying to avert disaster, much the same as an airline crew in flight might have to work to solve an emergent in-flight problem. Where was the ground crew, I wondered? How much of the time-robbing, disaster-avoidance described by the nurse at the bedside could have been averted by better "pre-flight" processes? Do solutions implemented by bedside clinicians, using a relatively closed set of variables, yield the strongest possible risk reduction, or does the process become just a little "less risky"?

Administering intravenous medications to a pediatric patient is a high-stakes activity, but it is not a rare one, at least in this setting. (Spoiler alert: You're not going to find a link to a mega-document with a full-blown Failure Modes and Effects Analysis about pediatric medication infusions!)

Instead, let me leave you with a few high-level risk reduction strategies to consider, many that you'll recognize if you practice in a setting where The Joint Commission (TJC) standards frame clinical care. I'm not claiming expertise in TJC standards interpretation nor am I offering advice about what any particular organization should do to minimize risks associated with pediatric medication administration. (You can, however, find some here.)

But I like to think that anything that can be understood by the average Walmart shopper is worth sharing, so I'm closing with a few observations about how this Walmart shopper sees risk-reduction.

Risk-reducing activities are often reflected in TJC standards, and the "science behind the compliance" is often based on failure mode and effects analysis. These are things the "ground crew" should be thinking about to ensure the people on the flight deck have what they need to get the job done:

  • Establish standard concentrations for all IV medications (even the ones not often used). When IV medications are added to an organization’s formulary, they are subject to specific processes (usually under the auspices of a Pharmacy & Therapeutics Committee). These should ensure that standard drug concentrations are defined and incorporated into the tools used by professionals to prescribe, dispense, and administer the medication.

  • IV drug infusions are ideally prepared in a pharmacy. When operational barriers to pharmacy preparation occur--in care settings without 24-hour pharmacy services or in regions where unit-based drug preparation is the standard of care--staff members who admix medications should have access to guidelines specifying the standard concentration along with detailed admixing instructions. Clinicians who administer infusions should have easy access to sanctioned dose-conversion charts.

  • “Smart” infusion pumps, with drug libraries programmed to reflect standard concentrations, make weight-based dosing even simpler. Dose-checking modes remove the need for manual calculations while immediately alerting clinicians--just prior to administration, the last possible discovery point--if an inappropriate dose has been inadvertently programmed. (A minimal sketch of this dose-check logic follows below.)

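Here's what that dose-checking logic boils down to, sketched in Python. The drug entry and its limits are invented for illustration; real drug libraries are built and vetted by each organization.

```python
# Minimal sketch of a "smart" pump dose check. The drug entry and its
# limits are invented for illustration, not taken from any real library.
DRUG_LIBRARY = {
    "exampledrug": {
        "concentration_mg_ml": 1.4,   # the standard concentration
        "min_mg_kg_hr": 0.1,          # assumed lower dose limit
        "max_mg_kg_hr": 1.0,          # assumed upper dose limit
    },
}

def check_program(drug: str, weight_kg: float, rate_ml_hr: float) -> str:
    """Convert the programmed rate back into a dose and test it against limits."""
    entry = DRUG_LIBRARY[drug]
    dose = rate_ml_hr * entry["concentration_mg_ml"] / weight_kg
    if not entry["min_mg_kg_hr"] <= dose <= entry["max_mg_kg_hr"]:
        return f"ALERT: {dose:.2f} mg/kg/hr is outside library limits"
    return f"OK: {dose:.2f} mg/kg/hr"

print(check_program("exampledrug", weight_kg=14.0, rate_ml_hr=8.4))   # OK: 0.84
print(check_program("exampledrug", weight_kg=14.0, rate_ml_hr=84.0))  # ALERT: a 10x slip
```
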
There are other high-end strategies for reducing pediatric drug errors, some in development and some already being used in clinical settings. I encourage you to share yours. As for me, it's Friday, and I'm off to Walmart!

Tuesday, April 21, 2009

Gifts that keep on giving

This week at Grand Rounds, DiabetesMine blogger Amy is celebrating her birthday with stories full of good things. Florence dot com is over there, misbehaving with a story about Ring Dings, a move that might end my short blogging career.

I really liked the post about banana-split jelly beans (and not just because Ring Ding jellybeans might help Paula Poundstone and Michael Pollan bridge the gap between food and edible food-like substances).

In the spirit of Amy's birthday, I'm posting links to things I think are worth passing along to your friends, family, and patients:

  • AHRQ has a few 30-second public service announcements that are worth viewing (if they haven't already shown up at commercial breaks during Grey's Anatomy and House in your ad market). Questioning a healthcare provider the way you'd question a waiter or a cell phone salesman may interfere with some healthcare mojo, but it's a necessary first step in getting people care they need and would actually like to have. I like the third video, in which healthcare workers burst into song to elicit questions from patients. (Even though the spot reminds me, somewhat uncomfortably, of the sign at my hairdresser's that says, "I'm a beautician, not a magician.") The AHRQ folks get an "A" for casting that one.

  • The Institute for Safe Medication Practices' ConsumerMedSafety website has a handy link that allows consumers to log their medications, creating a readily accessible medication list shareable among all providers. The added benefit is that e-mail messages are sent to the consumer should a medication alert, recall, or potential interaction be identified. The service is offered in partnership with iGuard.

Enjoy these, and visit Amy's blog for links to more things that will inspire you to have a great day!

Coming Friday: Reducing risk with IV pediatric infusions. Read Part 1 and Part 2 and get ready to compare your analysis with mine!

Sunday, April 19, 2009

Paula Poundstone thinks Ring Dings make life worth living

I'm a middle-aged woman, a stage in human development that can be cross-indexed to conditions like "I no longer sleep," "I can't see for squat," and "My thermostat is busted." Although I'm mostly aggravated by these changes, I've found a happy byproduct of insomnia: In the early morning hours when I'm not sleeping, I listen to podcasts of interesting programs, interviews, and books that wouldn't otherwise fit into my busy days. (And when I don't find these interesting, they often put me back to sleep.)

A few weeks ago, though, my husband (who has no problem in the "sleeps soundly all night" department) is awakened because the bed is shaking and I'm laughing out loud, listening to an exchange between Paula Poundstone and Michael Pollan on Wait, Wait...Don't Tell Me, NPR's weekly news quiz show.

Michael Pollan is a well-known food and food economics expert, who describes a reasonable approach to eating as, "Eat food. Not too much. Mostly plants." Attempting to spread this message to an untapped constituency, Pollan agreed to be interviewed by Wait, Wait's host, Peter Sagal, and a panel of celebrity comedians. Here's a link to the audio exchange between Paula Poundstone and Michael Pollan. (You'll get a brief ad that supports NPR programming, then cue the recording to 2:20 and listen until 4:40. But do this only when you have time for a good laugh.)

If you can't access the audio, here's the gist of the exchange:

Pollan: It's very hard for people to know what food is these days, because there are so many edible food-like substances competing with food in the supermarket.

Poundstone: One of the things that has made my life worth living is, uh, Ring Dings. Are you going to tell me that's not food?

Pollan: Well, there's a few simple tests to figure out if a Ring Ding is food or not. How many ingredients does a Ring Ding have....

Poundstone: (interrupting, and sounding optimistic) Devils food cake, one. A creamy filling, two. And a rich chocolate outer coating, three. What's the matter with you?

Pollan, after an exchange that suggests Ring Dings are not, in point of fact, food, moves on, offering Poundstone middle-ground advice: there are things that could be characterized as "special occasion food."

Poundstone: And what part of, 'Ring Dings make my life worth living' did you not hear?

The conversation ultimately concludes with an aggressive-sounding Poundstone telling Pollan, "You may know a lot about food, but you don't know the first thing about living, buddy!"

This conversation is instructive for anyone who has ever tried to tell anyone something they simply didn't want to hear. It's a window, an opening that shows how culture influences health, health-defining choices, and why the need for health care in the U.S. looks the way it does. Who knew so much rested on the answer to the question, "Are Ring Dings a special-occasion food or an edible food-like substance?"



I hope you'll stop back at Florence dot com again this week for more commentary about barriers that prevent people from engaging in health-promotion and disease management, plus some useful links to resources for managing them.

Bon appetit!


Friday, April 17, 2009

Risk points across the medication use system

Recently, Nurse Ausmed shared a great post about an intervention that lessened the likelihood of an IV medication error occurring in a pediatric patient. Her take-away lesson was "simplify, simplify," a core principle in safe medication practices.

Earlier this week, I wrote that Nurse Ausmed's online case study would lend itself to a basic exercise in identifying the latent (or upstream) conditions that often lead to error on the front line. Links to ideas about modeling human error (developed by James Reason) and some online pediatric medication safety resources were provided.

Here are my initial thoughts:

To answer the question, "Could the error-prone condition be identified and the potential for patient harm lessened before it reached the front line?" start by considering what the medication use system looks like:

The roles and responsibilities of professionals ("who does what, when, and how") may vary according to practice setting and applicable professional standards of care. Irrespective of setting, however, prescribing, transcribing, dispensing, administering, and monitoring are the units, or nodes, that make up the medication use system.

It's important to pause for a moment and take this in. As individuals, we typically focus on the portion of the system where our own professional duties lie, rather than on the system as a whole, making it difficult to see upstream opportunities.

Now consider the problem of errors in the medication use system:

Errors may originate at any point in the process. In the slide below, the red arrow illustrates how an error that began in the prescribing phase is not picked up, moving through all downstream defenses to reach the patient.


Seminal medication safety research shows the likelihood of catching (and correcting) an error increases the further upstream the error originates. This makes sense since an error in the prescriber's order has the potential to be picked up by the person who dispenses the drug, the person who administers the drug, or the patient. This is why processes like independent double checks and automated clinical decision support are valuable: they make errors and error-prone conditions visible before they reach the patient.
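
A toy model makes the gradient visible. The 50% catch rate per downstream check is purely an assumption for illustration; the qualitative pattern--more downstream checks mean more chances to intercept--is the point the seminal data make.

```python
# Toy model: probability an error reaches the patient, by where it originates.
# The per-check catch rate is an assumption for illustration only.
NODES = ["prescribing", "transcribing", "dispensing", "administering"]
CATCH_RATE = 0.5   # assumed chance each downstream check intercepts the error

for i, origin in enumerate(NODES):
    downstream_checks = len(NODES) - 1 - i
    p_reaches_patient = (1 - CATCH_RATE) ** downstream_checks
    print(f"{origin:>13}: {downstream_checks} downstream check(s), "
          f"{p_reaches_patient:.0%} chance of reaching the patient")
# A prescribing error faces three chances to be caught;
# an administration error faces none.
```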

Unfortunately, errors that originate in the administration phase are highly unlikely to be picked up before they reach the patient. This is why processes at the point of administration should be as simple, standard, accurate, and dependable as feasible. (I think of clinicians who administer medications and the processes they use as I would a flight crew: it's probably not a good idea to expect problems to be solved at 35,000 feet that could have reasonably been resolved on the ground or to use patchy processes to accomplish high-stakes, in-flight tasks.)

Later next week, I'll come back to this topic, using clinical information Nurse Ausmed shared to help identify strategies for preventing IV medication errors in peds that are on the high end of the risk-reduction hierarchy. I hope you'll share the risk-reduction strategies you use when you care for pediatric patients.

Stay safe and come back soon! I've been thinking about something I heard Paula Poundstone say last week, and the next time you check in, you need to be ready to laugh!

Thursday, April 16, 2009

Human Factors

The bi-weekly nursing blog carnival, Change of Shift, is up over at Emergiblog. I love the name "Change of Shift" because the view of healthcare that emerges there, like a Change of Shift hand-off, provides a window into the system we use to deliver care. The realities faced by clinicians on the front line are good reads for anyone interested in understanding the physical and mental challenges involved in providing care, and these accounts show how much relationships matter.

I especially loved a post by Drug Pusher (and not just because I have a keen interest in medication safety!).

The summary of nurses' views about barriers to providing safe care at Better Health is definitely worth reading, especially for people working to develop health IT products. I have an untested hypothesis that goes something like this: Few industries interested in producing reliable results or maintaining strong financial performance would support "humans babysitting machines" to the extent that we currently see in healthcare. Watching "safety" fight with "efficiency" is like watching "healthcare" fight "education" for scarce resources. These are not "pick one" issues. "Nurses dish on communication lapses that harm patients" seems to support my hypothesis.

I hope you'll visit Change of Shift every other Thursday: it's a good place to benchmark practices and keep an eye on emerging trends.

Next up: Deconstructing an error-prone IV medication set-up in a pediatric patient. Review Part 1 here.

Tuesday, April 14, 2009

Grand Rounds

Grand Rounds is up at Pharmamotion with an interesting series of posts about the state of health and healthcare. Anyone looking for exemplars or barriers to the Institute of Medicine's six dimensions of care (Safe, Effective, Timely, Efficient, Patient Centered, and Equitable) will find interesting things to read there today.

I'm reflecting on Nurse Ausmed's post that was included in Grand Rounds because it speaks to realities of the system front line clinicians rely on. In this post, a seasoned nurse reflects on the theme of nursing advocacy, an important component of nursing care. The touching account of how she helps young brothers prepare for the death of their newborn sibling shows the value of individualizing care, of using "teachable moments" that unfold at the bedside, of responding in ways that set the stage for healing that will occur long after clinical care ends. This is the essence of patient and family centered care.

Nurse Ausmed also shares how nurses' advocacy can make medication administration safer. It certainly did in the case she described (having the concentration of an infrequently used medication infusion--being administered to a pediatric patient--changed so that titration in mg/kg/hr would require fewer calculations at the bedside). By recognizing and responding to an error-prone condition, Nurse Ausmed makes visible a potential mistake that had been set in motion and advocates effectively to mitigate it before harm occurs.

Now let me exchange my virtual nurse's cap for a safety engineer's hard hat and invite you to put one on, too. If you're new to systems thinking, and you want to learn how to analyze the error-prone conditions in your workplace, here's a real-time exercise:
  • Re-read the portion of Nurse Ausmed's post entitled, "Simplify, Simplify." (She's provided a significant risk-reduction hint in the title.)

  • Ask yourself: Could this error-prone condition have been identified and lessened before it hit the front line? Nurse Ausmed's efforts were stellar. She's clearly an A player (and she's at the beginning of her shift!). But I'd encourage you to identify upstream interventions--those that could reasonably be undertaken through science-based, interdisciplinary collaboration--that could have lessened the likelihood of error before the infusion reached the patient.

  • Check back on Thursday to see how my analysis aligns with yours.

To help you get started, here are links to a few resources I'm going to use: James Reason's modeling of human error; ISMP's analysis of the events that led to the death of Sebastian Ferrero; cues and clues offered in The Joint Commission's Sentinel Event Alert related to pediatric medication safety; and recommendations from a multi-stakeholder group of experts convened last summer to identify ways to prevent IV medication errors.

Enjoy Grand Rounds this week, and check back here on Thursday to see principles of error-reduction at work!

Saturday, April 11, 2009

Get ready to cancel the launch!

In Lessons from a Sunken Ship, I recounted the story of a 1628 shipwreck that occurred in the aftermath of a failed stability test, a test result known to at least 30 shipbuilders (who nearly put the Vasa underwater during a preliminary test of seaworthiness) and the ranking military leader who observed the aborted test. Signing off on the launch, the ranking officer lamented the absence of the King, apparently the only person with authority to cancel the launch.

This story lends itself to talking about the dangers of rigid hierarchies, and I'll probably return to it at some point, rigid hierarchies having sunk more than a few ships in healthcare. But the Vasa also illustrates principles about the hierarchy of error and harm prevention:

1. Eliminate or prevent mistakes. A better design would have prevented the Vasa from going down.

2. Make mistakes that have been set in motion visible. The ship did not perform as expected when subjected to simulated sea-like conditions. Not launching a ship with dubious stability would have prevented the Vasa from going down.

3. Mitigate the hazard should a mistake occur. Lifeboats prevented some people on the sinking ship from going down.

4. Educate/re-educate about how to manage known hazards. Swimming lessons might have helped some people save themselves.

Healthcare has been criticized for the tendency to bottom feed when it comes to risk reduction, meaning that we tend to rely on risk-reduction strategies low on the hierarchy. This doesn't mean that, as individuals, healthcare professionals don't care about risk or don't want things to turn out well. It simply means that we're more likely to select and implement interventions like "review policy" with individuals who make errors than to examine the underlying factors that allowed frontline workers to err. We spend a lot of energy attempting to teach front line clinicians how to save themselves.

So if we got out of the lifeboats and headed north on the risk-reduction hierarchy, how far could we go and what would the consequences be?

Health and healing are complex, and it's fair to say that we're sailing more than a few badly designed ships. One in seven Americans lacks health insurance. Healthcare disparities are rampant. Patients are older, sicker, and rounder than they used to be. Our system does not incentivize prevention. A better design would avert many crises. But redesign of healthcare--something that appears to be emerging as a national priority--is outside the locus of control of individual clinicians, irrespective of how often or how nobly we face the consequences of the current bad design.

So how can front line clinicians prevent a poorly designed vessel from sinking? One answer is: embrace processes and procedures that make mistakes set in motion visible. Be able to identify emerging practice changes as the higher-level risk-reduction strategies they are. Get ready to cancel the launch!

A bar-code scan reveals a mismatch between ordered medications and a similarly packaged one in the patient’s drawer: you’ve cancelled a launch. A pre-procedure time-out reveals a site-of-surgery discrepancy: you’ve cancelled a launch. Reading back and verifying a telephone order (insulin 50, five-zero, units subcutaneously now) reveals the prescriber on the crackly line said one-five (15), not 50 units of insulin: you’ve cancelled a launch.
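
For the curious, here's a stripped-down sketch of the first of those launch-cancelling checks, a bar-code scan at the bedside. The patient ID and product codes are invented for illustration.

```python
# Stripped-down sketch of a bar-code medication administration (BCMA) check.
# The patient ID and product codes are invented for illustration.
active_order = {"patient_id": "P-102", "product_code": "NDC-0001"}

def scan_check(wristband_id: str, package_code: str) -> str:
    """Compare the wristband and package scans against the active order."""
    if wristband_id != active_order["patient_id"]:
        return "STOP: wrong patient -- launch cancelled"
    if package_code != active_order["product_code"]:
        return "STOP: product does not match the order -- launch cancelled"
    return "Match confirmed: proceed"

print(scan_check("P-102", "NDC-0001"))  # Match confirmed: proceed
print(scan_check("P-102", "NDC-0010"))  # STOP: similar package, wrong product
```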

Cancelling a launch is not as good as preventing mistakes from occurring. But this approach trumps lifeboats and swimming lessons. Right now, healthcare is adopting, occasionally adapting, risk-reduction strategies from other industries, industries more reliable than ours.

The best risk-reduction strategies, in my opinion, are yet to come. As healthcare workers--bright, caring, and competent individuals--come to understand the principles that drive reliable performance, participate in developing highly reliable processes, demand these be vigorously applied, and eventually come of age in an environment where reliability is the norm, it will no longer be necessary to report preventable adverse health events as aggregate data!

See you there!

Wednesday, April 8, 2009

Fishing in a well-stocked pond

Florence has a sister, a new blog on the Medscape site called "On Your Meds: Straight Talk about Medication Safety." I hope you'll take a look, and bookmark the site because On Your Meds is going to host a running commentary on specific strategies for reducing medication errors.

The current post is about high-alert medications, those with a heightened risk for causing harm if used in error. Insulin, chemotherapy, narcotics, drugs with weird dosing schedules, drugs with impossibly narrow therapeutic indices, drugs that result in closure of life-sustaining orifices if halted by mistake..... Let's just say that the drugs on ISMP's High-Alert list have earned their place.

I spent a year studying medication error prevention with ISMP, the nation's foremost experts on the subject. So I know more than the average bird, and often more than I wish I did, about medication errors. But you probably do, too: A medication error is any preventable event that may cause or lead to inappropriate medication use or patient harm while the medication is in the control of the health care professional, patient, or consumer.

People often wonder where I get the stories I use to illustrate key facts about med safety. Med errors can arise anywhere in the medication use process, a complex system (run by human beings) that includes prescribing, dispensing, administering, and monitoring the effects of drugs.

Now consider that in a given week, 82% of adults in the U.S. are taking at least one medication (prescription or nonprescription drug, vitamin/mineral, herbal/natural supplement); 29% are taking five or more. (These stats provide a snapshot of adults in community settings and exclude medications administered to people in hospitals and extended care facilities.) At this point, the stories find me. Or as Larry the Cable Guy might say, "You're fishin' in a well-stocked pond, sister."

Last week, I'm in the locker room at the YMCA, sharing a little more personal space than I'd like. I've just finished cycling, and it looks like the Y member closest to me is preparing for "Twinges," the water class for people with joint disorders. She's chatting with a friend and putting her clothes in a locker. The next thing I know, a bunch of pills, maybe 12, have spewed from the pocket of her balled-up khaki pants. Some hit the bench, some the floor, and a few land in my gym bag. I help her retrieve them, phrases like "drug storage" and "mindfulness" flashing in my brain. She scoops up the last visible ones, examines her catch, re-pockets them, and says to her friend, "Good, I got the yellow one. We can still go to lunch."

I'm working on a project about insulin pens, visiting the manufacturers' Internet homepages and checking out the patient education materials available there. I notice images on a manufacturer's site where hip-looking teens are depicted using their insulin pens as hair accessories. Phrases like "drug storage" and "mindfulness" flash in my brain.

ISMP shares an error analysis in which a patient being treated for angina in a busy Emergency Department receives IV saline instead of IV nitroglycerin. The commonly used nitroglycerin sits next to a similar-looking, but obsolete, glass bottle of 0.9% sodium chloride. The key elements? "Drug storage" and "mindfulness."

Across the continuum, themes repeat. I'll be reflecting more about them here at Florence dot com and at On Your Meds. In the meantime, it may be worth thinking about the utility of high-level risk-reduction tools. Should professionals use the same strategies to manage medication risks that senior citizens at the YMCA do?

Tuesday, April 7, 2009

Grand Rounds

This week, Leslie is hosting an interesting discussion for Grand Rounds, reflecting on the way life used to be. Since my experience has been that some of the best thinking and most creative solutions to tough problems come from non-linear approaches, I was happy to see how many different ways healthcare bloggers took on the challenge to reflect about change.

And kudos to Leslie for linking all the submissions in such a cohesive way!

I'm in between posts, writing more about what sunken ships and old Swedes tell us about safety in healthcare systems today. The story of the Vasa, Lessons from a Sunken Ship, is included at Grand Rounds, along with a short analysis, under "The More Things Change, The More They Stay The Same."

Leslie has given me something to think about as I write the follow-up piece to Lessons from a Sunken Ship. She saw my post as calling for regulatory measures to ensure that the wisdom of the frontline is heard. (Something, I'm afraid, likely to cement my husband's darkening opinion about my thought processes.)

I'm actually more interested in ways to influence culture, ways to promote a culture of safety, that are less linear than regulations and standards tend to be. Our "company manners" will only get us so far. I want to be at the table when only the family is home!

Monday, April 6, 2009

Sign, sign, everywhere a sign.

I keep getting, ummm, signs.

A sign that arrived in an e-mailed joke a few weeks back inspired me to write about high reliability.

Then Bill ("Here's Your Sign") Engvall gave me one, causing me to make mention of the fact that all risk-reduction strategies are not created equal.

And just a few minutes ago, I found this one while checking out the Facebook page of Shelby Caldwell, a talented young photographer:


used with permission

I'm not certain if the abundance of signs means the people at this grocery store really, really, really don't want you to fall (in which case a bag boy with a shovel and some salt might confer better protection) or if they want to warn you that you're about to fall irrespective of which space you park in or which cart queue you pull from.

(It may be helpful to know that where I live snow, especially spring snow, often melts before a "snow removal crew"--ummm, bag boy and shovel--can be mobilized.)

Since I can't say it better than the 5 Man Electrical Band did many years ago, I'll leave you with this:
"Sign, sign, everywhere a sign
Blockin' out the scenery,
breakin' my mind
Do this, don't do that, can't you read the sign"

Friday, April 3, 2009

Lessons from a Sunken Ship

A thing I’ve begun to enjoy about blogging is that it helps me find memories I might otherwise have forgotten and lets me bring people (and pets) into some discussions that might have meaning for you, too. Thanks to all who have sent kind words about Daisy, our dog who is currently receiving hospice care (and recently debuted as the poster child for safe canine medication practices). It’s been a quiet, medicated morning, and I’ve been tooling around the internet and touching base with my parents as I put the finishing touches on the story about ships, Swedes, and safety that I’m sharing today. I hope you’ll enjoy it!

"Those who cannot learn from history are doomed to repeat it."
- George Santayana

On August 10, 1628, only minutes after setting sail on its maiden voyage, the mightiest warship of its time, loaded with a crew of 150, sank in the Stockholm harbor. The Vasa had been commissioned by King Gustavus Adolphus, Sweden's monarch, who was engaged in a fight with the Poles at the time and desperate to seat a crown jewel in his armada. It’s a well-established fact that the King repeatedly tinkered with the vessel’s design while simultaneously demanding its rapid completion. But these were not the only reasons the Vasa sank.

Like all disasters, this one had a host of contributory factors setting up the “perfect storm” that allowed the mighty warship to sink in the Stockholm harbor on a perfectly beautiful, sunny summer day. An abbreviated but insightful root cause analysis can be found on the official Vasa website. For people who are interested in how culture influences safety, as I am, lessons gleaned from the Vasa are particularly valuable.

It’s relatively easy to see how untested innovation, production pressures, and loss of key leadership contributed to the Vasa’s disastrous voyage. But what's really interesting to me is the Vasa’s failed stability test: In the days before the tragic voyage, the ship had undergone a preliminary test of seaworthiness using the stability testing standards of the day. This involved having a gaggle of men from the shipyard, in this case about 30, run back and forth across the ship’s deck while the ship remained moored. The Vasa’s stability test was halted after just three runs--long before a satisfactory result was obtained--to prevent the ship from capsizing at the dock.

Nothing further was done to improve the Vasa’s stability before the ship set sail days later.

This sequence of events means that in the interval between the failed test and the maiden voyage, there were at least 30 rank-and-file shipbuilders who knew, who had to have known, that the ship was destined to sink. Do you wonder what they were saying to each other?

I think this particular piece of information captured my imagination when I toured the Vasa Museum several years ago because I know a little something about Swedish sensibilities, having been raised by a first generation Swedish-American whose family flipped back and forth between Sweden and the U.S. in the early 1900s. Three of my grandparents emigrated from Sweden, and I was born in a small town with a large sub-population of Swedish immigrants. We’re private people, not given to sharing unsolicited advice (although my cousin once observed that if you sought my father’s advice, he would provide such a detailed explanation that even a novice could fix a Corvair). I joke that if my father asks, “How’s that working out for you?” you’re likely doing something that could cost you a finger.

I don’t know if my father’s sensibilities speak to the culture in the Stockholm shipyard in the 1600s, and frankly, it probably doesn’t matter. What does matter, and still matters today, is that the Vasa sank in part because there was no mechanism in place--no recognized, endorsed, or welcomed way--for critical information known by line managers and workers to be heard. My father will help you out, lending his considerable knowledge, time, and skills most generously, but you have to let him know you want to hear from him.

In 2007 (that's 379 years after the Vasa sank, according to the calculator app in my iPhone), researchers studying how best to identify and respond to the healthcare defects that give rise to today's epidemic of adverse events observed,

“There are many sources to identify defects, including patient safety reporting systems, morbidity and mortality conferences, sentinel events, liability claims, and perhaps most powerfully, asking staff how they think the next patient will be harmed.”1
This concept ain’t an iPhone, folks. Just ask my Dad.

Coming next: Later is better than never (more lessons from the Vasa).

1 Berenholtz, S. & Pronovost, P. (2007). Monitoring patient safety. Critical Care Clinics, 23, 659-673.
 
Creative Commons License
Florence dot com by Barbara Olson is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.