Just Culture in Healthcare

A Systems Approach to Decreasing Errors

Who is to blame if a treatment error is made in a hospital or in an ambulance? Healthcare agencies, the legal system, and patients have traditionally held the caregiver accountable when something goes wrong. The assumption is that the person who is trained and licensed to provide care is ultimately responsible for the quality of care provided.

Healthcare professionals as a group tend to agree with this assumption.

When things go wrong, a great deal of blame falls on those who provide the hands-on care, often from their own peers and from themselves.

This isn't unique to healthcare. Plenty of high-performance professions expect perfection from their practitioners. Pilots, for example, have very little room for error, as do soldiers, firefighters, architects, police officers, and many others.

What Is Just Culture?

Despite an expectation of perfection, it's a well-known fact that to err is human. Anyone who has ever misplaced the car keys or left a paragraph out of a midterm essay can attest that errors happen no matter how much we know or how mundane the task.

Mistakes happen to the best of us, but in some cases the consequences of a mistake can be catastrophic. For those whose actions carry such heavy weight, there has to be a way to reduce and mitigate errors.

In healthcare, that approach is often referred to as a just culture.

Benefits

Instead of assigning blame, the just culture approach treats errors as inevitable. There's no way to make humans infallible. Instead, known failure points can be identified, and processes can be engineered to help avoid those mistakes in the future.

It's called a just culture as opposed to a culture of blame, and it represents a change in how errors are perceived and acted upon by an organization. When an organization embraces a just culture, it is more likely to have fewer adverse incidents, and caregivers in that organization are more likely to self-report errors or near misses. Reporting helps policy-makers engineer new systems that address the causes of errors before an adverse incident occurs.

Just culture treats errors as failures in the system rather than personal failures. The idea is that some, if not most, errors can be eliminated by designing a better system. This idea is used every day in many areas.

For example, gas station nozzles and hoses used to be ripped off when drivers forgot to remove them from the tank filler opening before driving away. To combat this extremely expensive error, modern nozzles have a breakaway coupler that allows them to be pulled off the hose without damaging the nozzle or the pump.

Goals

A just culture is intended to reduce adverse patient outcomes by reducing errors, but the concept needs a better name.

Since this idea is labeled just culture, there is a tendency to focus only on treating those who commit errors in a fair or just manner, rather than focusing on the system or the environment in which the error was made.

In most cases, there are contributing factors that can be identified and sometimes removed.

For example, let's look at a scenario that could happen anywhere in the country. A paramedic is sedating a patient during a seizure. The patient suddenly becomes unconscious and unresponsive. The paramedic is unable to wake her and has to provide rescue breaths the rest of the way to the hospital. The patient was accidentally given a higher concentration of medication than she should have been.

When a medication error is made during an ambulance transport, it is tempting to focus on the caregiver who made the error.

Some administrators might start looking at the caregiver's education and experience, compare them with those of other caregivers, and recommend education or retraining as a corrective action. The administrators could consider this approach fair, and an example of just culture, because no disciplinary action is taken against the caregiver.

A better approach is to assume the caregiver is as competent, experienced, and well-trained as his peers. In that case, what would cause anyone in the organization to make the same type of medication error? Looking at the system rather than the individual would lead us to question why there is more than one concentration of the same medication on the ambulance.

System vs. Individual Focus

The intent of the administrators is to reduce the likelihood of a similar medication error happening in the future. Evaluating the system provides more opportunities for improvement than evaluating the individual.

In the case of a medication error made by giving the wrong concentration of a medication, standardizing all ambulances in the system to stock only one concentration of that medication will prevent any paramedic from making the same mistake in the future. By contrast, retraining only the paramedic who made the error merely decreases the chance of one caregiver repeating it.

One way to focus on system improvements rather than zeroing in on individuals is to change the way problems are addressed from the outset. Leaders can ask themselves how to encourage the behavior they want without issuing memos or policies, conducting training, or using discipline.

In a robust just culture, system design is focused on reducing errors before they happen. Reacting to incidents after they occur is necessary, but being proactive is even more important.

Accountability

You might be asking when, if ever, the individual is held accountable for his or her actions. In a just culture, the individual is accountable not for errors per se, but for behavioral choices. 

Consider the paramedic who made the medication error in our example above. Would we ever hold him accountable for the overdose? Yes and no.

First, we would still address the system issues that led to the opportunity for error. Keeping that medication to a single, standard concentration still helps reduce errors.

However, it's important to look at the factors that could have contributed to the paramedic's mistake. Did the paramedic come to work intoxicated? Did he come to work fatigued? Was he using medication from another source, such as the hospital or another emergency vehicle, instead of what his organization provides?

All of these factors could have contributed to the error, and all of them are behavioral choices the paramedic made. He knows if he's ingesting substances that can alter his mental state. He knows if he didn't get enough sleep before his shift began. And he knows if he is using medication that didn't come from his ambulance.

Outcome Bias

An extremely important note about accountability: outcome doesn't matter. If the paramedic gave the higher concentration of medication in error and the patient died, the paramedic should not be held to a higher standard than he would be if the patient lived.

Outcome bias is hard for regulators and administrators to combat in real situations. When an incident is reviewed, it's very likely that the patient's condition is what triggered the review in the first place; in many cases, there is already a bad outcome. It's easy to fall into the trap of "no harm, no foul."

However, if the object of just culture is to decrease incidents that can lead to adverse outcomes, then the outcome of any single event shouldn't matter. For example, let's look at another scenario that happens every day.

A respiratory therapist assisting with a resuscitation in the emergency department forgets to attach a sensor to the patient's endotracheal tube, leaving the team with no way of knowing that the patient has stopped receiving oxygen. A nurse in the room notices the detached sensor and tells the respiratory therapist. She thanks the nurse and attaches the sensor, which alerts the team that the patient is not receiving oxygen. They fix the problem, and the incident is never reported.

No one thinks twice about it because the patient turns out fine. However, if the error is not noticed and the patient goes into cardiac arrest, the incident will lead to a review. That's an example of outcome bias. The error is the same, but one version is considered no big deal while the other is considered an incident worthy of examination.

In a mature just culture, the error would be reported either way. All caregivers would want to identify how the sensor could have been left off. Reporting an error like this would likely surface other, similar errors of omission that could be addressed at the same time. Perhaps the organization would implement a checklist procedure to help catch easily overlooked mistakes like this one.

An organization practicing just culture would not penalize the respiratory therapist for her error, even if it led to the death of a patient. Contributing behavioral choices, however, would be addressed. If the respiratory therapist came to work fatigued or intoxicated, for example, she could be held accountable.
