Sometimes too much is too little

As a Nurse Faces Prison for a Deadly Error, Her Colleagues Worry: Could I Be Next? | Kaiser Health News

(Note: the following is a pretty long quote, but the website says specifically that this article can be republished for free.)

As the trial begins, the Nashville DA’s prosecutors will argue that Vaught’s error was anything but a common mistake any nurse could make. Prosecutors will say she ignored a cascade of warnings that led to the deadly error.

The case hinges on the nurse’s use of an electronic medication cabinet, a computerized device that dispenses a range of drugs. According to documents filed in the case, Vaught initially tried to withdraw Versed from a cabinet by typing “VE” into its search function without realizing she should have been looking for its generic name, midazolam. When the cabinet did not produce Versed, Vaught triggered an “override” that unlocked a much larger swath of medications, then searched for “VE” again. This time, the cabinet offered vecuronium.

Vaught then overlooked or bypassed at least five warnings or pop-ups saying she was withdrawing a paralyzing medication, documents state. She also did not recognize that Versed is a liquid but vecuronium is a powder that must be mixed into liquid, documents state.

Finally, just before injecting the vecuronium, Vaught stuck a syringe into the vial, which would have required her to “look directly” at a bottle cap that read “Warning: Paralyzing Agent,” the DA’s documents state.

The DA’s office points to this override as central to Vaught’s reckless homicide charge. Vaught acknowledges she performed an override on the cabinet. But she and others say overrides are a normal operating procedure used daily at hospitals.

While testifying before the nursing board last year, foreshadowing her defense in the upcoming trial, Vaught said at the time of Murphey’s death that Vanderbilt was instructing nurses to use overrides to overcome cabinet delays and constant technical problems caused by an ongoing overhaul of the hospital’s electronic health records system.

Murphey’s care alone required at least 20 cabinet overrides in just three days, Vaught said.

“Overriding was something we did as part of our practice every day,” Vaught said. “You couldn’t get a bag of fluids for a patient without using an override function.”

Overrides are common outside of Vanderbilt too, according to experts following Vaught’s case.

Michael Cohen, president emeritus of the Institute for Safe Medication Practices, and Lorie Brown, past president of the American Association of Nurse Attorneys, each said it is common for nurses to use an override to obtain medication in a hospital.

Basically it looks to me that from a technical standpoint, this nurse ignored numerous safeguards, but that as a practical matter, the consensus of actual nurses seems to be that these safeguards are routinely ignored, and this error could indeed happen to anyone.

I feel this is a broader problem: safeguards are instituted which are not practical and so are routinely ignored. Another example is disclaimers on financial and legal documents. I was once closing on a mortgage, signing paper after paper that the closer passed over (I knew the general idea of what each was, but didn’t actually read any of them), and I asked the closer whether, in her history of closing mortgages, anyone had ever actually read each paper. She said only once: an earnest young couple closing on their first-ever mortgage. The point being that these documents may have been written with the intention of having people know the information in them, but as a practical matter, reading each one simply won’t be done.

I think this nurse ignoring the various pop-up warnings falls into the same category. Yes, in theory you should read all these pop-up warnings but there are so many of them and you don’t have all day for this procedure, so you get into the habit of ignoring them.

The same goes for various safety measures. You can get people to do some measures if they’re simple and effective, but once they get too cumbersome then people will just disregard them for the most part. (I’ve seen a lot of professionals on construction sites, and in general they absolutely do not conform to the safety measures recommended for their various power tools and the like.)

I think a big part (though not the entirety) of it is considering each item individually. If you were thinking of a mortgage closing with no disclosures at all, and now you were thinking of mandating one disclosure for one important matter, that might make sense. It might be reasonable that people will read one paper. But meanwhile, another well-meaning regulator is mandating a disclosure on another matter, and that lessens the likelihood that either will be read. And so on; by the time you get to 25 disclosures, there’s virtually no likelihood of anyone reading anything.
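To put rough numbers on the cumulative effect, here is a toy model of my own (the figures are invented for illustration, not taken from anywhere): suppose a signer will give the disclosure stack ten minutes of attention in total, and genuinely reading one disclosure takes about three minutes.

```python
# Toy model (my own assumption, not data from the article): a fixed
# attention budget spread over a growing stack of disclosures.
budget_minutes = 10        # total attention the signer will give
minutes_per_paper = 3      # time to genuinely read one disclosure

for n_papers in (1, 5, 25):
    # The signer reads papers until the budget runs out.
    papers_read = min(n_papers, budget_minutes // minutes_per_paper)
    print(f"{n_papers:>2} disclosures -> {papers_read} read "
          f"({papers_read / n_papers:.0%})")
```

With one disclosure, everything gets read; with twenty-five, the fraction actually read collapses, even though each individual disclosure seemed reasonable on its own.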

In sum, the broader point is that you need to consider the practical impact of various precautionary measures, not just in terms of how much you’re imposing on the people you’re subjecting to them, but also in terms of how much they make it less likely that people will conform to these measures at all. The more specific point is that sometimes each individual measure makes sense from a cost/practicality standpoint, but the cumulative effect of numerous such measures undermines all of them.

I’m sure most of us have at one time or another mistakenly deleted an important file because clicking “OK” just becomes an automatic response.

I would need to know whether these types of warnings actually happen in less important situations. Are they in fact regularly dismissing warnings like this? And if so, is that because the system is badly designed, or because they aren’t using it as intended?

The argument here seems sound: she had a lot of chances to notice the problem and didn’t. Many of these were not part of the warnings. The medicine was in a completely different form (which would take longer to make up), and of course there was the label on the medication itself. It doesn’t make much sense that standard procedure would not be to check that the name matched the one being prescribed.

Even if the system is screwed up, it does often take something like this for people to do anything about it. It’s stupid, but until they see deaths and actual possible liability for those deaths, often bad systems continue. For all the money we spend on healthcare, they seem to be the most likely to keep using older technology with proven (but sublethal) flaws. There’s a reason why there’s the joke about them still running Windows XP, for example.

It’s probably true that the system is poorly designed (doesn’t let you search by brand name) and is being taught/used incorrectly (nurses are routinely ignoring warnings).

My idea of reckless homicide in medicine would be performing a surgery while under the influence of drugs, and the patient dying. Not so much administering the wrong drug by accident. Compare with reckless homicide in the context of vehicles, that is, involuntary vehicular manslaughter. If you aren’t DUI or doing something really stupid - something nobody in their right mind would do, because you know it’s risking injury or death, like driving on the sidewalk or into oncoming traffic - it’s probably a misdemeanor charge.

I don’t think her negligence amounts to a felony abuse of an impaired individual. The law (TN Code § 71-6-119) says accidents are not included under that offense, nor should they be.

It’s no joke, let me tell you that.


I have first-hand knowledge of these devices and the need to wade your way through a series of warnings and are-you-sure? reminders. The built-in safety measures are sometimes over the top, but certainly prevent some errors.

But it was not the dispensing machine that administered the medication, it was the nurse. Nursing schools, from day one, repeatedly stress the “five rights” of medication administration (right patient, drug, dose, time, and route). Dispensing machine or not, this crucial step was ignored.

That said, nurses are often so overworked that workarounds are necessary just to get through the day. I feel for this nurse. She did something that countless others have done (no excuse) and had a bad outcome. I suspect she is being harder on herself than the justice system could ever be.


My point has been that these are at odds. To the extent that they’re “over the top”, they get disregarded and fail to “prevent some errors”. If they were less over the top, they would be easier to pay attention to, and would likely prevent more errors.

It really depends on which “practical impact” you are prioritizing.

I currently work with regulatory compliance systems in financial markets. Earlier in my career, I spent a few years at a medical-software company. I’m very familiar with the premise that procedural programmatic hurdles are routinely raised in front of the user, warning them about the impacts of their actions and asking them to confirm their intentions before proceeding.

The implication of the OP, exemplified by the statement above, is that these systematic obstructions are meant to protect the subject of whatever the process is overseeing. For the pharmaceutical cabinet, the patient will be protected from errors by medical staff. In my current work, these verifications protect either the end investor from fraud or negligence or the management company from malfeasance or incompetence on the part of a staffer. And so on.

In my experience, while that may be the common perception, it’s not actually the full implemented reality. The point of a system jumping in front of the user with a warning is not simply to protect the patient or the investor; it’s also, in no small part, designed to protect the provider of the system from any legal liability.

To illustrate: When the fund manufacturer’s product strategy manager wants to change the published investment guidance to disclose different min/max figures for various asset types that will be held by the fund, the relevant control software pops up a warning about internal coordination and potential additional disclosure requirements, and asks the manager to confirm their understanding of the rules and implications. This protects the fund manufacturer from the consequences of publishing inaccurate or misleading guidance and/or failing to comply with its stated investment practices. But it also protects the software vendor, because by capturing and documenting this user action, they can prove that the change in the published material was the conscious responsibility of that user, and not an error by the software.

Funnily enough, in addition to the above, I also worked with my family’s construction business, building houses. So I know about the gap between the power tool’s instruction book and safety warnings, and the way the professionals actually use the gear. In that case, the motivation is equally split. Sure, they don’t really want anyone to lean over the table saw while ripping a board and have it kicked back into their body or face, so they tell you not to do it. But if you do do it, and you get hurt, they are able to point to the warning to (they hope) eliminate any grounds on which you might sue them.

In this context, the fact that a medical professional has to skip past multiple warnings from the drug cabinet every single day, just to do his or her job, may not be “bad design” from the perspective of the cabinet maker, because they can show that the nurse saw the alert and pushed the button anyway. Yes, from the perspective of the patient, “warning fatigue” is definitely a risk escalator, but as long as the responsibility and the legal liability for any negative consequence attaches to the nurse, what is the incentive for anyone to change how things function?

It should be that “it works better for the patient” is sufficient incentive, but in a world governed by cost-cutting and profit-seeking and legal (rather than moral) accountability, that’s rarely part of the calculus. And it explains why these kinds of systems operate like this. Until something truly concrete — meaning money — compels a change, inertia rules.

I think your broad point is correct. If a system requires that warning messages are routinely overridden then the warning messages will be routinely overridden. When actions are routine they quickly become thoughtless. Ideally a system like this would require that warning messages are thoughtfully overridden, but that won’t happen if the same overrides are happening over and over again. A bad system will set people up to fail and in some ways is worse than no system at all.

That said, surely at some point in the process of giving someone medication the name of the medication being given must be compared carefully to the name of the medicine required. It shouldn’t matter what the machine dispenses because the nurse should read the label at some point to confirm it is the right stuff.

Edit: Thinking a bit more about warning systems. One way to require a thoughtful override would be to require the user to type in the essence of the warning. For example overriding a “Paralyzing Agent” warning could require the user to type “Paralyzing” into the system, so they are at least having to bring the essence of the warning into their consciousness before proceeding.
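That “type the essence of the warning” idea can be sketched in a few lines. This is a hypothetical illustration in Python (the function name and behavior are my own invention, not based on any real dispensing-cabinet software):

```python
# Hypothetical "type-to-confirm" override, along the lines suggested
# above. The user must retype the key word from the warning before the
# override is allowed, so a reflexive "OK" click cannot clear it.

def confirm_override(warning_keyword: str, typed: str) -> bool:
    """Clear the warning only if the user retypes its key word,
    forcing the essence of the warning into their attention."""
    return typed.strip().lower() == warning_keyword.lower()

# A "Warning: Paralyzing Agent" alert would clear only when the user
# actually types "Paralyzing" (case-insensitive); anything else fails.
print(confirm_override("Paralyzing", "paralyzing"))  # True
print(confirm_override("Paralyzing", "OK"))          # False
```

The same pattern is familiar from tools that make you type a repository or account name before a destructive action: trivially easy when you mean it, impossible to do on autopilot.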

The cabinet maker has no business relationship with the nurse. They do business with the hospital. The hospital is the one who provides the nurse with the cabinet; the hospital may also be the one that acquiesces to a training policy whereby nurses are told to ignore/bypass procedural safeguards.

Leave the nurse holding the bag and she is powerless to effect change, except by being even more careful not to screw up. Which will never happen at the individual level, because after one screw-up like this your license is revoked, your career is over, and you will most certainly lose the malpractice suit, even without the threat of jail time.

Also, IMO, the “bad design” is less so the warning prompts and moreso the cabinet not recognizing brand names.


I don’t think my statement is contradictory. I think it’s accurate.


I’m a teacher. When I catch students in wrongdoing, “but everyone else is doing it!” is a very common defense. It’s equally valid there as here: That is, not at all.

This nurse definitely did something wrong. Other nurses, and the people who designed the cabinet, and the hospital trainers, and any number of other people, might have also done something wrong. If an investigation finds that they did, discipline them, too. But that doesn’t diminish the nurse’s wrongdoing.

But if your students make an honest mistake, without intending to do wrong, and you realize your rules/instructions/policies/practices may have contributed, I hope you would consider changing those rules/instructions/policies/practices to help your students get it right next time.

I was thinking the exact same thing. If a warning that could mean life or death can be overridden with a simple “click OK” then that’s a design flaw.

IME (as a parent), in that situation, “everyone” doesn’t really mean everyone, and typically just means a selected few that the kid decided to compare themselves to. But if it really does mean everyone, then that changes things quite a lot.

Because kids are being ranked on a relative scale, whether it’s marks and class rank or other forms of achievement. And if the “wrongdoing” is so widespread that anyone who doesn’t engage in it is at a serious competitive disadvantage, then it really does lessen the fault on the student and shift it to the teacher who set up a system which rewards wrongdoers and punishes those who do right.

And similarly in cases like this. If the nature of the nursing job, with its attendant time crunches and pressures, requires that people take these shortcuts in order to meet the requirements of their jobs - staff the required number of patients, do the expected number of procedures, etc. - then it very much lessens the responsibility on the nurse and shifts it to those who set up a system which requires such shortcuts.

Recent studies of medical errors have estimated that errors may account for as many as 251,000 deaths annually in the United States, making medical errors the third leading cause of death.

The moral of this sad story is that it would be very wise of us to practice the best possible personal health care in order to avoid our “Health Repair” medical system as much as possible. I think that’s the best way to protect ourselves from this sort of thing.

I believe that the next best thing is for the person receiving medical care to have as much family support as possible. When medical personnel see that family is involved and attentive to what is going on, it is beneficial to the person receiving care. Taking my own advice in the thread about “Cites”, I did some checking.

How Patient and Family Engagement Benefits Your Hospital

Patient and family engagement improves multiple aspects of hospital performance, including quality, safety, financial performance, patient experiences of care, patient outcomes, and employee satisfaction. Together, the multiple individual benefits of patient and family engagement lead to improved hospital performance.

If the people I love are hospitalized, I’m going to be very involved.

I have some questions regarding information that is in the police report and the hospital’s report.

It looks like the reason for the override was that the order for Versed was not in the Accudose system, and according to the hospital’s report the nurse should have contacted the pharmacy for review before overriding the machine.

My understanding is that the order has to be in the system so that you don’t accidentally take out a drug that wasn’t prescribed for that patient. I think someone earlier said that overriding the machine is common - but if the reason overrides are common was explained, I didn’t understand it.

As far as who’s at fault for what - I think the cabinet should have been able to search by brand name as well as generic. But I have a hard time blaming anyone other than the nurse for the other mistakes - apparently the machine requires a reason for the override to be chosen on one screen, which takes a bit more thought than just clicking “OK”.

She apparently told the police that she was not overtired and that the neuro intensive care unit was not understaffed, so it’s difficult for me at this point to blame the hospital for overworking her.

But even if the nurse doesn’t get any blame for overriding the cabinet, there are other problems that have nothing to do with the cabinet. The other mistakes could have happened even before these cabinets were in use. According to the police report, she said she was distracted by having an unrelated conversation and that she thought it was odd to have to reconstitute the medication. But even though she thought it was odd, she didn’t look at the front of the vial to see the name of the medication, or notice the warning printed in red above the instructions for reconstituting the medication, or the fact that the red cap said “paralyzing agent”. And it seems she immediately left the patient alone rather than monitoring her, even though she was told that the staff performing the scan could not administer the medication or monitor the patient after it was administered.

All of those things could have happened no matter how she got the wrong vial - whether it was the cabinet, or someone else handed her the wrong medication, or she asked the cabinet for the correct medication and whoever loaded the cabinet made an error, or . . . It seems to me that checking the label before administering medication is a very basic step - and if the nurse had just done that, the outcome would have been different.

The OP makes the analogy that all the legal disclosures around buying a house mean that most of those disclosures are not actually read. The analogy here would be more along the lines of not checking that the paperwork you are signing is labeled with the address of the house you are buying.

It may be understandable that someone doesn’t realize they signed away the building contractor’s liability for electrical issues along with all the other forms, but if they end up buying a condo in North Dakota rather than a house in Colorado because of their inattentiveness, then at least some of that is on them. (Especially when doing so means that someone else buys the farm.)

A critical care physician is currently on trial in Columbus, Ohio, facing 14 counts of murder in the death of patients given massive doses of fentanyl. Abuse/misuse of medicine cabinet overrides is an issue in that case.

Guidelines for managing overrides were recently updated in order to address such issues.

There have been grievous outcomes in other situations where automatic warnings were disregarded (e.g., commercial airline flights).

It sounds like a systemic failure if warnings can easily be disregarded when they’re perceived as an inconsequential annoyance.

I just read some articles on that case. IMHO that case seems more like a doctor who was euthanizing patients deliberately rather than a medical error.

Well according to his attorney*, “there’s no such thing as a medical murder case”. :face_with_raised_eyebrow:

*this lawyer successfully defended Casey Anthony and Aaron Hernandez on murder charges, so the case may not be a slam-dunk for the prosecution.
**a defense strategy has been to argue that Husel did nothing wrong, and that others were at fault for not stopping him. :thinking: