The War on Interruptions, an Excerpt from "Switch: How to Change Things When Change Is Hard"


The following is an excerpt from Chip and Dan Heath’s new book, Switch: How to Change Things When Change Is Hard, which will be released on February 16.

One of the most consistent findings in psychology is that people behave differently when their environment changes. When we’re in a place where people are quiet (church), we’re quiet. When we’re in a place where people are loud (stadiums), we’re loud. When we’re driving and the lanes narrow, we slow down. When they widen, we speed up again. This may seem obvious, but when we try to make change at work, we often make the mistake of obsessing about the people involved rather than their environment. Often the easiest way to drive change is to shape the environment.

This environment-shaping strategy was used in 2006 by Becky Richards, the Adult Clinical Services Director at Kaiser South San Francisco hospital, who was determined to drive down medication errors. On average, nurses commit roughly 1 error per 1000 medications administered. That’s an impressive record of accuracy. Still, given the huge volume of medications delivered at Kaiser South, that error rate led to about 250 errors annually, and a single error can be harmful or even deadly. For instance, if a patient received too much Heparin, a blood thinner, his blood would no longer clot and he could hemorrhage. If he got too little Heparin, he could develop a blood clot that could lead to a stroke.

Richards believed that most errors happened when nurses were distracted. And it was all too easy to get distracted—most traditional hospitals put the medication administration areas right in the middle of the nursing units, which tended to be the noisiest places on the floor. Tess Pape, a professor at the University of Texas who has studied medication errors, said, “Today we admire people for multitasking, we celebrate people who can accomplish many things at once. But when you’re giving out medications is the last time you should be multitasking.”

At Kaiser, no one thought twice about calling out to a nurse who was delivering medication. Worse, nurses felt an obligation to respond when others distracted them. They couldn’t very well tell a surgeon, “Sorry bud, can’t help right now, I’m dealing with medication.” And yet that’s exactly what would need to happen for errors to be reduced.

Ideally, when the nurses were administering medication, they’d work inside a soundproof bubble, like the “Cone of Silence” from Get Smart. Since that solution was architecturally infeasible, Richards came up with the idea of using a visual symbol, something that could be worn by nurses, which would signal to other people, Hey, don’t interrupt me right now.

After considering armbands and aprons, she settled on vests. She called them “medication vests.” Richards scrambled to find someone who could supply her vests: “The first vest we ordered was off the internet. It was really cheesy. Cheap plastic. Bright orange. Be careful what you order off the internet.”

Later, with vests in hand, Richards unveiled the idea to her staff: When you’re administering medication, you’ll put on a medication vest. It’s bright enough that people can see it from down the hall. And all of us, including the doctors, will know that when someone is wearing one of these vests, we should leave them alone.

She selected two units at Kaiser South for a 6-month pilot study of the medication vests, and in July 2006, it began.

Richards quickly encountered a problem with her pilot: The nurses hated the vests. So did the doctors. “Nurses thought the vest was demeaning, and they couldn’t find it when they needed it,” said Richards. “They didn’t like the color. ‘How do you clean it?’ And physicians hated not being able to talk to their nurses when they passed them in the hall.”

The nurses’ written feedback about the pilot was scathing:

– “Oh, so you want to draw attention to the fact we can make a mistake…”
– “You want people to think I have a dunce cap on, that I’m so stupid I can’t think on my feet.”
– “Give me a hard hat and a cone and I can go work for Cal Trans [the state highway department].”

“They were pretty brutal,” said Richards. The reception was so universally poor that Richards was ready to write off the idea and try something else.

Then the data came back.

In the 6-month pilot, errors had dropped 47% from the 6 months prior to the study. “It took our breath away,” said Richards.

Once the data was in, the hatred faded. Impressed by the results, the entire hospital adopted the medication vests, except for one unit that insisted they didn’t need them. Errors dropped by 20% in the first month of the hospital-wide adoption, except for one unit that actually saw an increase in errors. (Guess which one?)

You know you’ve got a smart solution when everyone hates it and it still works—and in fact it works so well that people’s hate turns to enthusiasm. Becky Richards had found a way to use the environment to change behavior.

A similar practice has long been used by the airline industry. Because most aircraft accidents happen during takeoffs and landings—the most hectic and coordination-intensive parts of any flight—the industry has imposed a rule called the “sterile cockpit.” Anytime the aircraft is below 10,000 feet—whether on the way up or the way down—no conversation is permitted, except what’s directly relevant for flying. At 11,000 feet, you can talk about football, your kids, or the loathsome passengers. But not at 9,500 feet.

In another organization, the IT group jointly agreed on a sterile cockpit for their software project. The group had embraced a substantial goal—to reduce new product development time from three years to nine months. In previous projects with tight deadlines, the work environment had become increasingly stressful, and as workers got behind schedule, they’d tend to start interrupting their colleagues for quick help. Managers would wander by regularly to be “statused” on the project. As a result, people were interrupted more and more, and work weeks expanded to 60 and 70 hours as people started showing up on the weekend, hoping to get some work done when they could focus.

The IT group decided to try an experiment—they established “quiet hours” on Tuesday, Thursday, and Friday mornings. The goal was to give coders a sterile cockpit, allowing them to tackle more complex bits of coding without being derailed by periodic interruptions. Even the socially insensitive responded well to the change in the Path. One engineer, previously among the worst interrupters, said, “I always used to worry about my own quiet time and how to get more of it, but this experiment made me think about how I’m impacting others.”

In the end, the group managed to meet its stringent nine-month development goal. The division VP attributed the success to the sterile cockpit quiet hours: “I do not think we could’ve made the deadline without it,” he said. “This is a new benchmark.”

In these disparate environments—cockpits and hospitals and IT workgroups—the right behaviors did not evolve naturally. Nurses weren’t “naturally” given enough space to work without distraction, and programmers weren’t “naturally” left alone to focus on coding. Instead, leaders had to reshape the environment consciously. With some simple tweaks to the environment, suddenly the right behaviors emerged. It wasn’t the people who changed, it was the situation. What looks like a people problem is often a situation problem.