Special Feature

The Highly Reliable ICU

By Stephen W. Crawford, MD

We who work there can generally agree that the intensive care unit (ICU) can be a hectic environment. The flux of patients can be rapid, and their clinical condition can change quickly. Personnel can be challenged to manage multiple complex and evolving scenarios that include numerous medical treatments and medications in an increasingly technological world. As a result, iatrogenic complications and hospital-acquired infections pose a constant threat to our patients, as well as to the caregivers. The ICU seems to be a microcosm of the problem described in the report of the Institute of Medicine’s (IOM’s) Committee on Quality of Health Care in America, To Err is Human: Building a Safer Health System.1 Adverse events and hospital deaths are common, and a large proportion of those deaths are deemed preventable. Are there organizational approaches we can adopt in the ICU that will create a safer place?

Part of the problem in changing the present working condition may be that we think of the ICU as a system: highly integrated, rational, and organized. However, we behave as though we are the center of an informational network. Rather than being an interdependent part among many in a complex system, we believe we are autonomous players who work within, and direct, the ICU structure. We behave as though our responsibility to safety is fulfilled through personal improvement in medical technique and clinical knowledge. In action, we tend to minimize the role of systematic organizational improvements that require "group-think" to envision.

Karl E. Weick, a University of Michigan professor of organizational behavior and psychology, argues that organizations function best when they act as a single entity.2 In the case of the ICU, improved function directly leads to fewer errors and better system safety. Our failure to view ourselves as components in a larger ICU system decreases our mindfulness of the interactions of diverse people and processes in the ICU. People thinking as a team are better at seeing and acting upon small system errors earlier.

An improved awareness of the fine details can lead to a reduction in errors. Weick described a concept widely used in organizational psychology and management: the "highly reliable organization" (HRO).3 HROs share 2 essential characteristics: They constantly confront the unexpected, and they operate with remarkable consistency and effectiveness.

The ICU easily qualifies on the first characteristic. Do we operate with remarkable consistency and effectiveness? I suspect most do not. I also suspect that we all want to view our ICU as an HRO. Weick’s analysis of HROs offers important lessons. His message: The best way for any organization, and its people, to respond to unpredictable challenges is by building an effective organization that expertly spots the unexpected when it crops up and then quickly adapts to meet the changed environment.

A model that is used to illustrate an HRO in everyday practice is that of naval aviation. Flight operations at sea on an aircraft carrier involve highly dangerous maneuvers, performed with precision and with strict timing around the clock, in a variety of weather conditions, and often under very stressful political situations. These operations on the flight deck are managed for the most part by sailors who are only 19 to 20 years old! If the Navy can instill a culture of safety in these young sailors, why is it so hard to do so in an ICU staffed by highly schooled health care professionals?

There are at least 5 habits of an HRO, and these are not routinely incorporated into the structure and culture of the ICU.4 These are habits our ICUs should cultivate in order to become safer and more efficient:

1. Do not be tricked by successes. HROs are preoccupied with their failures. They are incredibly sensitive to their own lapses and errors, which serve as windows into the vulnerability of their systems. They pick up on small deviations, and they react early and quickly to anything that does not fit their expectations. HROs create climates where people feel safe trusting their gut instincts. They question assumptions and report problems. They quickly review unexpected events, no matter how inconsequential. They encourage members to be wary of success, suspicious of quiet periods, and concerned about stability and lack of variety, both of which can lead to carelessness and errors.

2. Defer to your experts on the front line. There are so many deviations out there, so much dissonance. How do we know what’s really worth paying attention to? The answer: Listen to your experts—the people on the front line, the nurses and therapists.

People at the top, physicians and nurse managers, may think that they have the big picture. More accurately, they have a picture, certainly not the picture, and certainly not bigger in the sense that it includes more data.

The picture that frontline workers see is different. It is drawn from their firsthand knowledge of the unit’s operations, strengths, and weaknesses. What is important about the frontline workers’ view is that these people capture a fuller picture of what the organization faces and what it can actually do. In most cases, they see more chances for bold action than the managers at the top. So it’s better for HROs to allow decisions to migrate to frontline expertise rather than to the top of pre-established hierarchies.

3. Let the unexpected circumstances provide your solution. When something out of the ordinary happens, your stress level rises. The safest prediction for what will happen next is that your perception will narrow; you will get tunnel vision, and you will miss a lot of stuff. You have to be able to resist that dramatic narrowing of cues, because within everything that is happening unexpectedly, you will find what you need for a remedy.

The root cause analysis (RCA) approach is an effective tool for determining remedies from disasters. The RCA is recommended by patient safety organizations, including the Joint Commission on Accreditation of Healthcare Organizations, for sentinel adverse events.

4. Embrace complexity. Medicine is complex, in large part because it is unknowable and unpredictable. In the face of all of this complexity, HROs are reluctant to accept simplification. They understand that it takes complexity to sense complexity.

We all instinctively try to simplify the data that we receive, but there are better and worse simplifications. Better simplifications arise from a deeper knowledge of the environment along with a deeper understanding of the organization and its capabilities. That knowledge and understanding develops when people attend to more things, entertain a greater variety of interpretations, differentiate their ideas, argue, listen to one another, work to reconcile differences, and commit to revisiting and updating whatever profound simplicities they settle on as guidelines for action. A complex organization is made up of diverse people with diverse experience. Its complexity fosters adaptability.

5. Anticipate, but also anticipate your limits. We try to anticipate as much as we possibly can. But we can’t anticipate everything. There’s such a premium on planning, on budgeting. In the face of all that, the notion of resilience has an affirming quality: You don’t have to get it all right in advance.

Good strategy does not rely on anticipation alone. It’s built on a smaller scale, updated more frequently, and driven by actions. It’s not: "Think, and then act." Instead, it’s: "Think by acting." By actually doing things, you’ll find out what works and what doesn’t.

That doesn’t mean you should stop anticipating. But you should add in two subtleties. First, focus your attention on key mistakes that you do not want to make. Second, trust your anticipations, but be wary of their accuracy. You can’t see the whole context that is developing. Your anticipation is probably a reasonable first approximation of what might be happening, but no matter how shrewd you are, it won’t cover some key features.

If safety were really important . . .

The keys to creating an HRO are rooted in the attitudes of the organization’s leaders. We can model our ICUs after the HROs. We can do this by actually relying on frontline (bedside) expertise in our units and allowing (or insisting) that the person with specific expertise make decisions. We can implement clear-cut procedures and policies and constantly train our staffs. Safety drills are an effective means of reinforcing and monitoring training on procedures.

Every ship’s officer and aviator aboard an aircraft carrier believes safety is paramount. The belief is constantly reinforced by actions: repeated training, drills, and accountability. The safety officers and chief petty officers responsible for training the junior enlisted personnel are held accountable to the air boss and commanding officer for the mistakes and lapses of those in their charge. If safety is paramount in our ICUs, what actions do we take that send that clear message to those working at the bedside? The accountability starts with the behavior of the leadership, which must mirror precisely what we expect of the staff. At sea, safety is everyone’s responsibility. In the ICU, the directors and managers have to be part of the team and are every bit as responsible for the safe operation as the person at the bedside.

The culture of medicine does not lend itself easily to creating an HRO. Many physicians, by nature and training, function autonomously and decisively. Allowing the decision authority to migrate to the level of the bedside does not come readily. Moreover, we tend to believe in personal responsibility for training and education. Implementing repetitive training for the ICU staff and then conducting drills is unnatural. However, it is these actions of the ICU leadership that will foster a safer environment.


1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.

2. Weick KE. The reduction of medical errors through mindful interdependence. In: Rosenthal MM, Sutcliffe KM, eds. Medical Error: What Do We Know? What Do We Do? San Francisco, CA: Jossey-Bass; 2002:9.

3. Weick KE, Sutcliffe KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.

4. Hammonds KH. Five habits of highly reliable organizations. Fast Company. May 2002;58:124.

Stephen W. Crawford, MD, Pulmonary Medicine, Naval Medical Center, San Diego, CA, is Associate Editor for Critical Care Alert.