A Made to Stick story that stuck with me is how coaching leader Jim Thompson drove down the incidence of bad behavior in youth sports. Thompson started by recognizing that "be a good sport" was an insufficient call to action. It wasn't a strong message, and more importantly, the charge didn't improve the conduct of youth players, coaches, parents, or spectators. So he refocused attention away from the individual to something larger: the game. Thompson called his campaign "honor the game" and illustrated it with some powerful examples, like this one:
Lance Armstrong once slowed during a Tour de France race to give his chief opponent, who had crashed, the chance to get back in the race. As his opponent remounted, Armstrong paused rather than taking full advantage of the lucky break, later noting that he rode better against strong competition. Armstrong wasn't "being a good sport." He was honoring the game.

Redesigning healthcare is about honoring the game, making it possible for the actions of individuals to contribute, in measurable ways, to something larger. The healthcare industry, and more importantly, the processes used to deliver healthcare, are under scrutiny. They should be: more people die in the US every year from preventable medical errors and healthcare-acquired infections than die of AIDS, breast cancer, and auto accidents, combined.
"Honoring the game" requires a different style of play than Nike's more familiar call to action, "Just Do It!" (at least in the beginning stages of the race against harm-causing errors). Here's why:
For some time now, system design and human factors experts, dispatched from high reliability industries like commercial aviation and nuclear power, have partnered with healthcare workers to find out what ails us. Early on, industry outsiders recognized something important: When compared with other industries, the systems healthcare workers relied on lacked standard engineering controls, key elements needed to make intention match outcome. (Standard engineering controls include such things as barriers, redundancies, and opportunities to detect and mitigate errors that have been set in motion.)
In commercial aviation, high-stakes tasks that could cause harm if performed incorrectly are never executed by just one person. There's always a double check. In fact, these process checks are mandated by law. Compare this norm with what a nurse (at least in the era when I came of age as a clinician) may be expected to do in a busy Emergency Department: take a verbal order, retrieve a medication from a large cache, calculate the dose, prepare the medication, and administer it to a patient. This process could be the norm whether the drug, like Lasix, has a small chance of causing harm if used in error, or, like IV heparin, is highly likely to cause harm if an error occurs. No barriers, no redundancies, and scant opportunity to detect an error that's been set in motion.
A commercial pilot would never fly using the type of safeguards most nurses have been taught are reasonable for caring, competent professionals to use and execute flawlessly, even under the most hostile conditions.
Industry comparisons will not take us the whole way on the journey toward reliability. But industry comparisons help dispel myths, some of which healthcare workers may find painful. The good news is that nurses, pharmacists, physicians, and others who work in healthcare are not inherently more error-prone than the professionals who maintain airplanes, fly them, or control air traffic. The bad news is that we're not less prone than others to screw up either. And how much a professional understands or cares about a process, an outcome, or an individual patient may not be as important as many of us intuitively believe.
To honor the game, a player has to have a reasonable chance of being successful. When my son was young, he had a computer baseball game that allowed him to select teams, take the field, and play virtual games. Luke's team always won because he put himself on the team with Sammy Sosa and Mark McGwire. (The opposition in his fantasy game usually had a few bookish kids with their shoelaces untied and, as I recall, a little girl with broken glasses on crutches.)
In the real world, we need to make certain that people stand a chance of executing the tasks we expect them to do. This is the reason nurses, and other front-line healthcare professionals, should pay attention to system design and speak up when expected outcomes can't be delivered without a work-around. Look for appropriate barriers, redundancies, and opportunities to recover an error set in motion. A good place to start is to think about how you identify patients, have medication orders reviewed, store drugs, and take verbal orders. The "a-ha" moments will follow.
Really. Just do it.
Note: My understanding of human behavior, performance-shaping factors, and system design is highly influenced by the work of David Marx, President of Outcome Engineering and the author of the Just Culture algorithms. Dave and others from OE have generously shared their time and expertise to help me learn more about "the science behind the compliance" in patient safety. I encourage you to visit the Just Culture website and read Dave's book "Whack-a-Mole: The Price We Pay for Expecting Perfection" to learn more.