I read through all five parts of physician Robert Wachter’s article on Medium, “How Medical Tech Gave a Patient a Massive Overdose.” In it, Wachter chronicles the chain of oversights and technology-enabled errors that led a nurse to give a pediatric patient at UCSF Children’s Hospital 38.5 times the proper dose of an antibiotic. The order traveled, roughly, from a resident to a pharmacist to a pharmacy facility to a nurse to the patient. Among the culprits in the trail of errors he describes, five issues stood out to me:
1) All alerts are created equal. The online medication ordering system in this case had built-in alerts that always looked the same, whether double-checking a small deviation (are you sure you want to give 2 pills rather than the usual 1?) or trying to prevent a huge no-no (are you sure you want to give 38.5 pills rather than the usual 1?). That uniformity makes it hard to judge the seriousness of an alert at a glance.
2) There are too many alerts. Wachter notes that, much like the boy who cried “Wolf!”, a system that produces too many alerts (a cacophony of pop-up boxes, flashing lights, blaring sounds, and codes over the loudspeaker) trains people to ignore most of them.
3) A robot isn’t emotionally ruffled by the experience of counting out 38.5 pills. A robot in charge of processing pharmacy orders for inpatients does what it’s told to do. A human, on the other hand, might wonder why all these pills must be counted out, why such a high dose is being ordered, or, at the very least, why the medication isn’t being given in liquid form.
4) User interfaces emphasize default settings. Default settings are great for efficiency: they reduce the number of clicks a doctor needs to place an order. For medication dosing, a doctor can just type in a number, theoretically glance at the default units (like mg/kg), and let the system calculate the final dose. But defaults are dangerous when nobody notices that the units don’t line up. That’s what happened in this story: a number meant as a total dose in milligrams was read against the default mg/kg units, multiplied by the child’s weight, and turned into an order 38.5 times too large (the sketch after this list walks through the arithmetic).
5) Confirmation bias. A wrong dose passes through a chain of at least five people before it reaches the patient, so why wouldn’t it get questioned along the way? The medication ordering and packaging technology is supposed to be designed to prevent errors, so even when it fails, people assume they, not the system, must be the ones who are wrong. They assume that an order that has already been approved by a superior and passed through the computer alert system must be correct. And so even when the last person in the chain (in this case, the nurse giving the little boy his medication) has nagging doubts, she concludes she must be wrong. She doesn’t want to bother her busy colleagues.
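To make the units trap in point 4 concrete, here is a minimal, hypothetical sketch of how a weight-based default can silently multiply a dose. The numbers, function names, and tablet strength are my own illustration, not Epic’s actual logic:

```python
# Hypothetical sketch of a weight-based dosing default (illustrative only).

WEIGHT_KG = 38.5            # illustrative pediatric weight
TABLET_STRENGTH_MG = 160    # assumed strength of one tablet, for illustration

def total_dose_mg(entered_value: float, units: str, weight_kg: float) -> float:
    """Return the total dose in mg for the value a clinician typed in."""
    if units == "mg":
        return entered_value              # the value already is the total dose
    if units == "mg/kg":
        return entered_value * weight_kg  # weight-based: multiply by patient weight
    raise ValueError(f"unknown units: {units}")

# The clinician intends a single 160 mg dose...
intended = total_dose_mg(160, "mg", WEIGHT_KG)      # 160 mg  -> 1 tablet

# ...but the order screen's default units are mg/kg, so the same 160
# is silently reinterpreted as a per-kilogram dose.
actual = total_dose_mg(160, "mg/kg", WEIGHT_KG)     # 6160 mg -> 38.5 tablets

print(intended / TABLET_STRENGTH_MG)  # 1.0
print(actual / TABLET_STRENGTH_MG)    # 38.5
```

The point isn’t the arithmetic; it’s that a single unnoticed default turns a reasonable-looking number into a 38.5x overdose without any visible change on the screen.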
Wachter’s piece is an excerpt from his book, “The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age.” He offers several remedies for the problems this case exposed:
- better alert design (a rough sketch of tiered alerts follows this list)
- frugal use of alerts
- more skepticism when using medical tech
- a culture where questioning decisions is encouraged
- a willingness to share faults in order to learn from them
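On the first of those remedies, better alert design, here is a minimal, hypothetical sketch of what tiering alerts by severity could look like, so that a 38.5x overdose doesn’t present itself the same way as a routine double-dose confirmation. The thresholds and names are my own illustration, not anything from Epic or from Wachter’s book:

```python
# Hypothetical sketch of severity-tiered dose alerts (illustrative thresholds only).

from enum import Enum

class AlertLevel(Enum):
    NONE = "no alert"
    SOFT = "soft alert: dismissible confirmation"
    HARD = "hard alert: requires an explicit override reason"
    BLOCK = "hard stop: order held pending pharmacist review"

def classify_dose_alert(ordered_dose_mg: float, usual_dose_mg: float) -> AlertLevel:
    """Map how far an order deviates from the usual dose onto an alert tier."""
    ratio = ordered_dose_mg / usual_dose_mg
    if ratio <= 1.5:
        return AlertLevel.NONE   # within normal variation
    if ratio <= 3:
        return AlertLevel.SOFT   # unusual; worth a quick confirmation
    if ratio <= 10:
        return AlertLevel.HARD   # probably an error; force a deliberate override
    return AlertLevel.BLOCK      # almost certainly an error (e.g., 38.5x)

print(classify_dose_alert(320, 160))   # AlertLevel.SOFT  (2x the usual dose)
print(classify_dose_alert(6160, 160))  # AlertLevel.BLOCK (38.5x the usual dose)
```

The design choice is simply that the rarest, most dangerous alerts should look and behave differently from the everyday ones, which is the opposite of what the system in Wachter’s story did.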
One of the biggest takeaways for me, however, is that the medical professionals involved in this story knew these were problems all along. Who didn’t know? The designers of the medical technology being used. Wachter writes:
But not every problem can be fixed in-house. Some issues can only be fixed by outside software engineers — in our case, the ones sitting at Epic’s massive headquarters in Verona, Wisconsin. Even then, the company only makes such revisions available to all of their clients in the course of periodic software updates, which come perhaps once or twice a year. Because most health IT systems are not cloud-based, they lack the ability to push out a rapid update, the way we’re all used to doing on our smartphones and tablets.
If more doctors were involved in creating medical technology and had an integral role in updating it, maybe there would be a better body of knowledge and “best practices” for technological design in the medical system. It took an Epic failure (pun maybe intended) to put Wachter on the case of the harmful intersections of medicine and technology and to prompt his observation that healthcare IT is a “double-edged sword.” Of course, getting our physicians to spend time helping to develop and improve medical technology is not the only way to apply creativity to these problems. Wachter writes:
Preventing the next Septra overdose will take efforts that focus on problems far beyond the technology itself, on the other layers of Swiss cheese. For example, the error by the pharmacist owed, at least in part, to the conditions in the satellite pharmacy, including the cramped space and frequent distractions. The satellite pharmacists now work in a better space, and there have been efforts to protect the pharmacist who is managing the alerts from answering the phone and the door.
Externally designed technology determines the path of our internal decisions, despite our best intentions. Doctors who see areas for change, whether in software, alerts, robots, working spaces, or leadership philosophies, need to be encouraged to step forward and make their voices heard.