I had an unusual introduction to Human Factors (HF) engineering when I served on a jury that heard a chipper-shredder mishap case in the mid '90s. I'll forgo the tragic details that gave rise to a suit against the manufacturer and leave you with just the take-away lesson: if a machine's outer casing shields a series of free-swinging blades seated on a spinning Ferris wheel-like device, the likelihood of your hand being sucked in from below (at the small, innocuous-looking exit chute) is as great as it is from above (at the larger entry hopper where you toss yard debris). This information becomes even more relevant should you be tempted to free a thorny vine caught around the spinning Ferris wheel device--even from a distance well away from the exit chute.
During the two-week trial, HF expert witnesses gave our jury a soup-to-nuts education about chipper-shredders, attempting to get us up to speed on design principles, mechanical features, and the professional standards manufacturers must conform to.
It did not come as a surprise to the HF expert witnesses--neither those testifying for the plaintiff nor those testifying for the defense--that the pairing of human beings and chipper-shredders has a high potential for tragic outcomes. In fact, a manufacturer's ability to bring a high-hazard product to market, and keep it there, hinges on whether it can be made safe enough to protect people from the predictable mistakes they are likely to make while using it.
While I found the mechanical aspects of chipper-shredder design interesting (I even used the information when purchasing a chipper-shredder of my own a few weeks later), I experienced a profound "aha" moment when I realized how differently HF experts evaluated risk and selected risk-reduction strategies, compared to how I did. (At the time, I was an experienced intrapartum nurse and a leader on a 600-births-per-month maternity service.)
According to HF standards, warnings--even bold ones using pictures with high-contrast color combinations and affixed in strategic locations--are insufficient if a higher-order strategy, like installing a protective grate above the exit chute, is feasible. Written directions (think: policies and procedures) similarly fall low on the list. Because written directions and warnings have a high failure rate, their best use is in conjunction with risk-reduction strategies that are more likely to work.
I think my jury service occurred in 1996 or 1997, several years before the publication of To Err Is Human, an IOM report that quantified healthcare errors and served as a multi-stakeholder call to action. The subsequent 2001 report, Crossing the Quality Chasm, began to describe specific improvement strategies, previewing successes and borrowing methodologies from human factors-oriented industries, like commercial aviation and nuclear power.
In the mid-'90s, the idea that healthcare workers should do more than "review the policy" and "counsel the individual" was revolutionary. Today, it shouldn't be.
Two years ago, Sean Berenholtz and Peter Pronovost, physicians at Johns Hopkins University and leaders in patient safety research, commented on the interventions selected to prevent recurrence of mistakes in healthcare settings, noting, "Unfortunately, weak interventions predominate and are often the same traditional solutions offered in a new package."1
Next time, I'll share more about a rank ordering of risk-reduction strategies that is used to promote medication safety. If I've piqued your interest, you can find a short case study and critique of the use of low-level risk-reduction strategies in a June 2008 Pennsylvania Patient Safety Advisory.
In the meantime, stay safe working in your yard!
1 Berenholtz, S., & Pronovost, P. (2007). Monitoring patient safety. Critical Care Clinics, 23, 659-673.