
NUR MISC / Patient Safety: Understanding Adverse Events and Patient Safety


“First, do no harm.” This phrase is one of the most familiar tenets of the health care profession. If you poll a group of health care professionals, it is likely all would say they strive to embrace this motto in their practice. And yet, patients are inadvertently harmed every day in the health care system, sometimes with severe consequences.

Noah Lord was one of these patients. At the age of four, Noah had an operation to remove his tonsils and adenoids to help with chronic ear infections. Although it was a simple outpatient procedure, a series of miscommunications increased his risk for harm:

• Noah’s surgeon did not fully understand Noah’s symptoms or history.
• After the procedure, the care team sent Noah home early, possibly without notifying the surgeon.
• When Noah’s mother called the hospital five times for help, each time she spoke with people who failed to communicate important information, such as critical warning signs.

Noah was at home when he began to bleed profusely from his nose and mouth, where he’d had the surgery. Tanya Lord, his mother, was a lifeguard and knew CPR. She was able to clear his airway and revive him three times, but Noah died before paramedics arrived to help. In the video, she describes how what happened changed her life forever.

Adverse Events Are Common

Although the results are not always as devastating as what happened to Noah, the reality is that adverse events in health care happen all the time. Studies of different health care settings in the United States have found:1

• About 1 in 10 patients experiences an adverse event during hospitalization.
• Roughly 1 in 2 surgeries involves a medication error and/or an adverse drug event.
• More than 700,000 outpatients are treated in the emergency department every year for an adverse event caused by a medication.
• More than 12 million patients each year experience a diagnostic error in outpatient care.
• About one-third of Medicare beneficiaries in skilled nursing facilities experience an adverse event.

The consequences of these adverse events can be physical, emotional, and/or financial.

People Make Mistakes

The simplest definition of patient safety is the prevention of errors and adverse effects to patients associated with health care.1 Patients are inadvertently harmed every day in health care systems, sometimes with severe consequences. And some — but not all — harm to patients is the result of human error. No matter how well-intentioned, well-trained, and hard-working they are, health professionals are human and make mistakes. Even Dr. Don Berwick, the founder of IHI, will tell you about errors he has made.

As Dr. Berwick said, “Almost anyone in that situation stood a substantial risk of making that same error.” It’s true. In fact, as the patient safety field has evolved, it has moved away from the term “medical error,” which tends to overemphasize the role of individuals in causing harm. As it turns out, exploring the real root causes of harm means looking far beyond individual providers; it means taking a close look at the systems in which they work.

The Evolution of Patient Safety

The patient safety movement started as a recognition that health care was causing injury and death to patients in the course of care. Today, the movement encompasses much more: designing health care systems that reliably deliver the best outcomes to patients at the lowest cost.
In 1999, the Institute of Medicine (IOM) released its landmark report, To Err Is Human, which revealed that between 44,000 and 98,000 people died each year in United States hospitals due to medical errors and adverse events.2 It did not identify the main cause of the problem to be reckless or incompetent providers. “Faulty systems, processes, and conditions that lead people to make mistakes or fail to prevent them” were usually to blame for patient harm, according to the report.

Within days of the report, new legislation tasked the US Agency for Healthcare Research and Quality with studying health care quality. Over the next five years, the rate of English-language articles published on patient safety almost tripled.3

Dr. Bob Wachter is a renowned patient safety expert and author. Listen to him describe how patient safety suddenly became a priority, and how the field has evolved since 1999:

Health Care Is a Complex System

Most health care professionals — physicians, nurses, pharmacists, and so forth — are drawn to health care out of a desire to help others. They go through intensive training and are carefully screened for their positions. Some take further instruction — such as this online course — to enhance their education and better prepare for patient care duties. Given the conscientious nature of the typical health care provider and the comprehensive training he or she receives, why is it that so many adverse events occur? There are many answers to that question:

• Technology is evolving rapidly. The practice of modern medicine involves numerous drugs and highly technical equipment. There are more than 10,000 types of medical devices available today.1
• There is not always a clear right answer. The science of medicine is filled with nuance, and what one health care provider or organization feels is good practice, another may not. A 2002 AHRQ review identified more than 120 different systems to rate the strength of scientific evidence.2
• There is rarely enough time. Providers are often caring for a great number of patients, all of whom are unique. A 2012 study of over 13,000 US physicians found more than 40 percent saw more than 20 patients each day.3
• Patients require complex, coordinated care. Multiple caregivers and patient handovers leave room for miscommunication at every turn. One teaching hospital reported 4,000 handoffs daily, for a total of 1.6 million per year.4
• The hierarchical nature of health care can breed disrespectful and abusive behavior. Multiple US studies find more than 90 percent of nurses experience verbal abuse at some point in their careers. Trainees are vulnerable to disrespect and mistreatment as well.5

Steve Spear, DBA, MS, MS, is a Senior Lecturer at the Sloan School of Management and at the Engineering Systems Division at MIT. He has worked with many organizations to help them integrate new technology, and he has seen how it creates new risks along with the rewards:

Health care today relies on a coordinated effort of people working across many different functions, disciplines, and specialties. All these individual components of the system need to function effectively not only on their own, but also together.
Organizational theorists such as James Reason (whom you will learn about in later courses) have described safety as a “dynamic non-event”: Safety is dynamic because it requires “timely human adjustments” and a non-event because “successful outcomes rarely call attention to themselves.”7 In other words, to make “nothing bad happen” requires a lot of good things to be done right.

Blame & Punishment Are Not Solutions

It is human nature to look for someone or something to blame when things don’t go according to plan. Historically, the medical profession has viewed medical errors and adverse events as either an inevitable byproduct of complex care or the result of provider incompetence, often seeking to blame the providers involved in the error. Yet most of the time these situations occurred unintentionally.

You already heard how Dr. Berwick felt after he made a mistake. The next story comes from Dr. Lucian Leape, the founding chairman of the IHI/NPSF Lucian Leape Institute, whom many consider the father of the modern patient safety movement. He shares an experience he had as a pediatric surgeon:

The reality is that patients and family members are not the only people harmed by medical error. Providers suffer, too. Blaming people who are doing their best does not solve the problems that lead to error, and it makes health care less safe. Dr. Leape has often said:

Delivering the right care — for every patient, every time — requires a different way of thinking about error in medicine and a new approach to preventing harm.

Janet is especially busy because one of her colleagues called out sick, and she needs to collect blood samples for four patients. She collects one sample, and before she gets a chance to label it, another patient in an emergency situation needs her help. She leaves the unlabeled specimen on the nurses’ station for a moment. When Janet returns, there is a second unlabeled vial of blood at the nurses’ station. She realizes another nurse was obtaining blood samples and was also called away. Neither nurse knows which vial belongs to which patient.

What about this scenario seems to make an adverse event likely to occur? (Choose all that apply.)
(A) Janet is juggling multiple tasks at once.
(B) There are not enough staff members to keep up with the demands of care.
(C) Janet was interrupted in the middle of a task with inherent room for error.
(D) Janet is not trying hard enough at her job.

More Info: There is no reason to believe Janet is not trying her best to provide safe, efficient care. Otherwise, all of the other factors make an error more likely to occur in this scenario.

Making Systems Safer

It took a terrible disaster for the aviation industry to learn that blame and punishment don’t improve safety. In 1977, a pilot named Jacob Van Zanten made an error that led to the collision of two jumbo jets on a runway in Tenerife, killing more than 500 people in aviation’s deadliest accident. After the crash, industry leaders couldn’t blame the accident on mechanical failure or staff misbehavior, as they usually did, because Van Zanten was a revered pilot. In fact, magazines with his image were onboard at the time.

Audio transcript: This is the worst air traffic collision of all time, the crash of two 747s on the runway at Tenerife, Canary Islands, more than 20 years ago. And the story, briefly, was a KLM 747 was getting ready to take off. It was a foggy morning.
There was a Pan Am 747 at the end of the runway, and before the fog set in so deeply, the KLM cockpit crew saw the Pan Am, but it seemed to be rolling toward a side spur of the runway, and it was logical to believe it was out of the way. But then the fog came in, and they could no longer see the end of the runway. Air traffic control sent a message to the KLM cockpit, and it got garbled; they didn’t hear it correctly. And so — this is from the later report — on hearing this message, the KLM flight engineer asked, “Is he not clear then?” meaning, is the Pan Am 747 not out of the way? The KLM captain, a revered captain who was actually in charge of training all the 747 pilots in KLM’s fleet, didn’t understand him, and the engineer repeated the question, “Is he not clear, that Pan American?” The captain replied with an emphatic yes. And perhaps influenced by his great prestige, making it difficult to imagine an error of this magnitude on the part of such an expert pilot, both the first officer and the engineer made no further objections.

So, in other words, to the best of our ability to reconstruct that scene, two of the three people in the cockpit were not sure that a 747 was not in the way. And yet they green-lighted the plane; they did not object to the captain’s proclamation that the runway was clear, because this captain had such great prestige that the authority gradient was simply too large. Of course, you know what happened next: the plane began lumbering down the runway and emerged from the fog to see the horrifying sight of the Pan Am plane right in front of them. They actually managed to get their nose over the fuselage of the Pan Am, but to do so, their tail dragged across the ground of the runway. The tail got up in the air about 25 to 30 feet, just high enough to slam into the upper deck of the Pan Am. Both planes exploded; 583 people died. And so aviation learned the tragic cost of this kind of hierarchy, where it could be possible that someone could suspect something was wrong and not speak up to power. And they have worked doggedly to decrease that hierarchy, something we have just begun to do in health care.

Instead of blaming individuals, the aviation industry looked at the system as a whole, and learned how flawed processes and problems with the culture set those people up for failure. Since this shift of focus from blame and punishment to system-level improvement, the number of airline accidents resulting in passenger harm has significantly decreased. The industry has continually improved safety by learning from small mistakes, even those that do not cause harm. Health care can and should do the same. In the next video, Dr. Berwick reflects on what he should have done after the mistake he described earlier:

A patient is scheduled for surgery on her left leg. Initially, an intern prepares the patient by marking the correct surgical site on the dressing on the leg. The intern’s teammate removes the dressing, but addresses the problem by marking the surgical site on the patient’s skin. However, he uses a water-soluble marker, and the ink becomes smeared and illegible. The attending surgeon is new and not familiar with the hospital’s marking procedures. Meanwhile, the nurses helping with the surgery are busy preparing the patient and the operating room. Generally, the operating room schedule is tight, and everyone is in a hurry to move the surgery forward.

What type of adverse event in particular is more likely to occur because of the system failures in this scenario?
(A) Improper anesthesia dosing
(B) Retained foreign body after surgery
(C) Wrong-site surgery
(D) None of the above

More Info: Hopefully, the surgeon and nurses will stop to perform site verification before proceeding with the surgery, and move forward with a successful operation. However, regardless of what happens next, a wrong-site surgery is more likely to occur because of a breakdown in the system. Near misses and even small errors are “accidents waiting to happen,” and they represent important opportunities to improve safety.

A Framework for Patient Safety

The Framework for Safe, Reliable, and Effective Care is meant to be used in service of realizing the best outcomes for patients and families. After he made a mistake, Dr. Berwick wished he and his organization could have used what happened to him to make the system safer. “Nowadays, I hope that health care is maturing into a better kind of system for a worker like me — frail, error prone, human,” he said.

The good news is that over the past 20 years, health care organizations have begun to realize and accept that most errors cannot be linked to the performance of the individual, and are instead the result of a series of system failures. Even better news is that these system failures are often preventable. IHI’s patient safety experts have created a framework to help organizations understand the components of safe systems of care, which focuses on two broad areas:1

• Organizational culture, which is the product of individual and group values, attitudes, competencies, and behaviors related to safety
• Learning systems, which measure performance and help teams make improvements

In the remainder of this course and the Open School Patient Safety Curriculum, we will discuss each element of the framework in detail. We hope you will continue learning with us. Because as Dr. Leape reminds us, we have made great progress in safety, but there is lots of opportunity to improve:

Post-Assessment Quiz

According to WHO, in developed countries worldwide, what is the approximate likelihood that a hospitalized patient will be harmed while receiving care?
Your Answer: 10%
According to WHO, in developed countries up to 10 percent of hospital patients may be harmed while receiving care.

Since the publication of To Err Is Human in 1999, the health care industry overall has seen which of the following improvements?
Your Answer: Wider awareness that preventable errors are a problem
More than a decade after the publication of To Err Is Human, there is now wide recognition throughout health care that the number of errors is way too high. Although this awareness has not yet led to consistently lower rates of preventable medical error, progress is being made. Health care organizations have begun to realize and accept that most errors cannot be linked to the performance of individuals, but rather to the systems in which they function.

Safety has been called a “dynamic non-event” because when humans are in a potentially hazardous environment:
Your Answer: It takes significant work to ensure nothing bad happens
The best answer is it takes significant work to ensure nothing bad happens. When things go right in a potentially hazardous environment, nothing bad happens. But in order for this “non-event” of nothing going wrong to occur, a lot of things must be done right.
Thus, safety has been described as a “dynamic non-event.”

To prevent this type of error from recurring in this unit, which of the following is MOST important?
Your Answer: An improved culture of safety and teamwork
Had there been a culture of safety fostering better teamwork, this error may well have been prevented. In this case, when James asked Maria for help, she made him feel bad instead of being a team player. In this type of environment, James may be reluctant to ask for help, even if he is more closely supervised. We can generally assume that health care providers do not want to harm their patients, so the threat of punishment is not the best way to prevent mistakes. Although errors may occur when there is no recognized best practice, in the case of IV fluid replacement, clear recommendations do exist.

Who is likely to be negatively affected by this medical error?
Your Answer: All of the above
The best answer is all of the above. Patients and families are not the only ones affected when a medical error occurs. In this case, James is likely to be devastated, and Maria may be affected as well. Some providers even leave their profession after committing errors leading to a death.

Your Role in a Culture of Safety

What Does a Culture of Safety Look Like?

In the previous lesson, you saw several examples of people who could have spoken up to prevent accidental harm. A culture of safety would have made it easier for caregivers to voice their concerns, and would have made it more likely that others would respond. In a culture of safety, providers discuss errors and harm openly, without fear of being unfairly punished, and with confidence that reporting safety issues will lead to improvement.

Dr. David Bates, MD, MSc, is a Professor of Medicine at Harvard Medical School and a leading researcher in the field of patient safety. Here is what he has learned about building a culture of safety:

In Lesson 1, we shared IHI’s framework, which identifies the following factors that contribute to an organization’s safety culture:1

• Psychological safety: creating an environment where people feel comfortable raising concerns and asking questions and have opportunities to do so
• Accountability: holding individuals responsible for acting in a safe and respectful manner when they are given the training and support to do so
• Negotiation: gaining genuine agreement on matters of importance to team members, patients, and families
• Teamwork and communication: promoting teams that develop shared understanding, anticipate needs and problems, and apply standard tools for managing communication and conflict

In this lesson, we’ll review each of these dimensions of a culture of safety. Two interrelated domains underpin the Framework for Safe, Reliable, and Effective Care: the culture (orange elements) and the learning system (gray elements). In this context, culture is the product of individual and group values, attitudes, competencies, and behaviors that form a strong foundation on which to build a learning system.

Every Person Contributes to Culture

Unfortunately, many professional and social groups — even groups of family and friends — promote the opposite of psychological safety: They don’t support the value of asking questions, seeking feedback, or suggesting innovations. Before he became Director of the Clinical Effectiveness and Evaluation Unit at the Royal College of Physicians of London, Dr. Kevin Stewart was new to his career and the hierarchy of medicine.
He tells the story of how he accidentally hurt a patient while trying to avoid a confrontation with his supervisor:

Every person in a system contributes to its culture. What you do influences the behavior of others, whether you’re a supervisor or the newest staff member. No matter who you are, how you behave toward others will make a difference.

Psychological Safety

Psychological safety is key to reducing the likelihood that a patient will get hurt, as you just saw. But it has other benefits, too, such as innovation and faster adoption of new ideas. Amy Edmondson, a professor at Harvard Business School, is an expert in team performance. In a study of surgical teams, she and co-researchers found that when team members felt comfortable making suggestions, trying things that might not work, pointing out potential problems, and admitting mistakes, they were more successful in learning a new procedure. By contrast, when people felt uneasy acting this way, the learning process was stifled.1 Edmondson explains more about how psychological safety helps groups learn:

Every time you work with your peers, even when you’re not the most senior person in the room, you can model behaviors that promote psychological safety:
• Make yourself approachable.
• Seek to engage all team members.
• Encourage feedback.
• Respond to suggestions.
• Respect and value every team member and his or her input.

A brain surgeon was about to perform a difficult procedure on a high-risk patient. Several of the surgical team members had just met for the first time. The surgeon walked into the room and announced, “Good morning, team. This is a difficult case, and I'm human like everyone in this room. Please speak up if you see me about to make a mistake or have a suggestion to help.” She then went around the room and introduced herself to everyone by her first name.

Is the surgeon showing good leadership?
(A) Yes
(B) No

More Info: By introducing herself, encouraging participation, and valuing everyone’s role, the surgeon helps create an environment in which team members can participate to their full potential and speak up if necessary, to help the group navigate problems that could emerge, especially in a high-risk surgery. The focus on safety is a powerful reminder of the team’s common goal: providing safe, effective care for the patient.2

Accountability

A just culture, initially defined for health care by the lawyer and engineer David Marx, recognizes that competent professionals make mistakes. However, it has zero tolerance for reckless behavior. This distinction has two benefits:
• People know that certain kinds of behavior are not acceptable.
• People know that they won’t be punished for admitting to errors that happen when they’re trying to do the right thing.

Patient safety expert Fran Griffin, RRT, MPA, explains the defining characteristics of a just culture:

As Fran Griffin said, unsafe actions based on reckless decision making are not acceptable. However, culpability can be a difficult line to draw. David Marx recommends distinguishing between three types of human behavior, defined as follows:1
• Human error: inadvertently doing something other than what you should have done. For example, you suddenly notice while driving that you’ve exceeded the speed limit, without intending to do so.
• At-risk behavior: making an intentional behavioral choice that increases risk — without perceiving that heightened risk, or believing the risk is justified.
An example is when consciously driving 72 miles per hour feels safe to you, even though the posted speed limit is 65 miles per hour.
• Reckless behavior: consciously disregarding a visible, significant risk. Reckless behavior, such as driving drunk, involves choosing to put oneself and others in harm’s way.

You promote a culture of safety when you speak up about unsafe acts while recognizing that even competent, well-meaning professionals will make errors.

Take a guess: For which of the above categories do you think Marx recommended disciplinary action?
(A) Human error
(B) At-risk behavior
(C) Reckless behavior
(D) None of the above

More Info: In Marx’s model, he suggests disciplinary action should be reserved only for cases that fall into the third category, “reckless behavior.” For cases classified as “human error,” he recommends system-level improvement. For “at-risk behavior,” he suggests system-level improvement plus coaching.

Algorithms based on Marx’s work can help you assess the best approach when something goes wrong and a patient is harmed. For example, IHI recommends asking five questions:2
• Were their actions malicious? Did they intend to cause harm?
• Was their sensorium intact? Were they thinking clearly?
• Were their actions reasonable and appropriate?
• Were their actions risky, reckless, or unintentional?
• Does the individual have a history of unsafe behavior?

Teamwork and Communication

No matter what role you play in health care, you will be a member of a team, and thus you have a responsibility to communicate effectively and value the contributions of other team members. The processes of health care are too complex to be safely carried out by individual experts who try hard. In Lesson 1, Dr. Lucian Leape told the story of a child who died under his care after no one challenged his clinical judgment. In the next video, he explains what the experience taught him about the importance of teammates:

When patient care teams function as a group of individual experts and don’t take deliberate steps to ensure safety, they can inadvertently increase the risk of error and patient harm. On the other hand, when health care teams function as a collaborative unit with regular two-way communication, they promote safe, reliable, and effective care. During each team interaction, team members know the plan, and there is a dynamic that supports psychological safety.

Negotiation

Given the level of complexity of health care and the importance of many of the daily decisions teams must make, providers need to be able to negotiate effectively to gain genuine agreement. Whenever possible, collaborative negotiation is best. In collaborative negotiation, both parties work together to find a mutually agreeable solution through appreciative inquiry (asking simple questions to gain greater insight into the other person’s needs and interests) and self-reflection (working to understand your own desires and interests).

So far, we have looked at examples of building a culture of safety in behavior toward colleagues. It is just as important to practice the same skills with patients. When it comes to negotiating and collaborating effectively with patients, sometimes it is fear that gets in the way.
Patient safety expert Barbara Balik, RN, shares advice for working more collaboratively with patients in spite of the time pressure virtually all providers feel:

Patients and Families

Another reason to practice strong teamwork and communication skills with patients and families: They can play an important role in improving safety. You met Tanya Lord in the previous lesson. She lost her son, Noah, to harm from the health care system. Before Noah died, Tanya knew something was wrong, but no one would listen:

There are a number of potential roles that patients and families might play in patient safety:1–3
• Helping to identify adverse events.
• Helping to inform clinicians about adverse events they are not aware of.
• Behaving as advocates for their own health.

Dr. Saul N. Weingart, Chief Medical Officer at Tufts Medical Center, shares some of his research findings related to the above:

Post-Lesson Assessment

One hospital CEO insists on including performance data in the hospital’s annual report. “We do very well on most measures, except for one or two, but we put those in anyway,” she says. “We want to hold ourselves accountable.”

Does this practice demonstrate effective or ineffective leadership?
(A) Ineffective leadership: Because results are an important indication of leadership, publicly sharing poor results is an unwise practice.
(B) Effective leadership: Being transparent, even about poor results, is a mark of a good leader.
(C) Ineffective leadership: Leaders are people who have followers, and sharing poor results might cause the leader to lose some followers.
(D) Effective leadership: It is good to share results in the annual report, but the leadership would be even more effective if it shared only the strongest results.

Good leaders know that leaders are highly visible — and they therefore set examples for others. A leader who seeks transparency in her followers must demonstrate the same quality herself.

Use the following scenario to answer questions 2 and 3: At the large multi-specialty clinic in which you work, there have been two near misses and one medical error because various clinicians did not follow up on patient results. Different caregivers were involved each time. When asked why they failed to follow up, each caregiver said he or she forgot.

Based on what you know, how would you classify the caregivers’ behavior?
(A) Human error
(B) At-risk behavior
(C) Reckless behavior
(D) None of the above

The best answer is human error, as there is no reason to believe the caregivers acted with intentional disregard for safety. The fact that multiple people made the same mistake further suggests the problem was due to a poorly designed system rather than at-risk or reckless behavior by individuals.

A nurse who realized that his colleagues weren’t consistently following up on patient results reported the problem to the clinic leadership right away. Which response would be most consistent with a culture of safety?
(A) Transferring the nurse to another clinic
(B) Investigating the problem and seeking systems solutions
(C) Thanking the nurse and asking him to keep quiet about it
(D) Placing the item on the agenda for the leadership meeting next year

The best answer is investigating the problem and seeking systems solutions. An organization must develop a method to surface and learn from defects and harm that occurs to patients. We know that incident reports are one way to learn. They can also be an indicator of the culture of the organization.
That is, the more people are willing to report, the safer they feel.

Why is psychological safety a crucial component of a culture of safety?
(A) Without it, people won’t be interested in improvement work.
(B) It allows people to remove unsafe members of the team quickly.
(C) Without it, patients will not follow their doctors’ advice.
(D) It allows people to learn from mistakes and near misses, reducing the chances of further errors.

In psychologically safe environments, people understand that making mistakes is rarely a sign of incompetence, and that they won’t be judged for discussing mistakes. Because of that, people are able to call out errors – whether their own or others’ – and improve the processes that made the errors possible.

A medical unit in a hospital is in the midst of hiring some new physicians. During an orientation for new employees, a senior leader stands up and says, “We expect that the same rules apply to everyone on the unit, regardless of position.”

Which aspect of a culture of safety does this unit seem to value?
(A) Psychological safety
(B) Accountability
(C) Negotiation
(D) None of these

Holding all employees to the same standards of professional behavior, regardless of position, is an example of accountability.

Your Role in Building Safer, More Reliable Systems

Learning from Adverse Events

If blame and punishment aren’t appropriate responses to adverse events, what are? Consider the following story from Sorrel King, a mother whose initial reaction to a catastrophic health care event was blame and anger — but who ultimately chose another path.

Josie’s parents created a patient safety program at Johns Hopkins Children’s Center, focusing on systems causes of error. “For me, maybe it would have been easier to pin it on that one nurse,” King said. “But it wasn’t her fault; it was the system’s fault.”

By examining the events that led to an error, health care organizations can take a learning approach to responding to error and unintended events. They can look for reasonable system changes to prevent the same problems from happening again. Here’s a reminder of the factors that contribute to the organization’s learning system, which we laid out in Lesson 1:1

• Improvement and measurement: strengthening work processes and patient outcomes using improvement science, including measurement over time
• Transparency: openly sharing information about the safety and quality of care with staff, partners, patients, and families
• Continuous learning: regularly identifying and learning from defects and successes
• Reliability: applying best evidence and promoting standard practices with the goal of failure-free operation over time

In this lesson, we’ll review each of these dimensions of safe, reliable, and effective care. Two interrelated domains underpin the Framework for Safe, Reliable, and Effective Care: the culture (orange elements) and the learning system (gray elements). The quality of the learning system is defined by the ability to self-reflect and identify strengths and defects, both in real time and in periodic review.

How Complex Systems Fail

For anyone who works in health care, this lesson will explain your responsibility to recognize problems and drive improvement in your own work, with the larger system in mind. Steve Spear, whom we introduced in Lesson 1, can offer a quick example of why this is so important:

Dr. David Bates, whom you heard from in the previous lesson, has studied adverse events in health care extensively.
He has found that in every case of error that leads to death or injury, there are ten more errors that have the potential to cause serious harm but for some reason don’t. These errors are weak signals that something is wrong with the system.1 In health care, examples of weak signals could be:
• A nurse inadvertently picks up a multi-dose vial of insulin instead of heparin, but notices right before he injects the insulin into the line.
• A nurse injects the wrong medication into the patient, but it is an antibiotic that doesn’t cause an adverse reaction.

When people in a system sidestep weak signals of potential harm (literally, in Steve Spear’s example) and fail to address safety concerns head on, it’s called a workaround. Workarounds are dangerous because they allow a problem in the system to continue to exist — setting people up to eventually experience failure.

Who’s to blame for Hannah’s fall? Check all that apply.
(A) Tom
(B) Daria and Deepa
(C) Joe
(D) Rachel
(E) Hannah
(F) All of these people
(G) None of these people

More Info: In one sense, all of these people are to blame. You might say:
• “Hannah should have been more careful.”
• “Rachel should have seen the situation for what it was and shouldn’t have interrupted her.”
• “Deepa and Daria are to blame because they stepped over the cord.”
• “Tom was the one who stepped over it in the first place, so the mess that occurred is his fault.”

The fact is, any one of them could have recognized the risk and taken steps to remedy the dangerous situation. But the true cause of Hannah’s fall goes beyond any one person’s actions — and ultimately goes to the very top of the organization. In this imaginary organization, leaders failed to set the expectation that people should recognize and speak up about small safety hazards they observed as part of daily work. That failure led to Hannah’s fall. Unfortunately, in high-risk organizations, the consequences can be much worse.

Reliability

After the Space Shuttle Columbia disintegrated and killed seven crew members, investigators reported: “With each successful landing, it appears that NASA engineers and managers increasingly regarded the [small problems they saw] as inevitable, and as either unlikely to jeopardize safety or simply an acceptable risk.”1

Often, the little problems and workarounds that crop up in the daily routine become so familiar that people start assuming they’re completely normal, a phenomenon called normalizing deviance. Normalizing deviance is a problem because it erodes reliability. Reliability is the ability to successfully produce a product to specification repeatedly. In health care, that “product” is safe, efficient, person-centered care. IHI Executive Director Frank Federico, RPh, explains what makes health care processes reliable:

When you choose not to follow a standard operating procedure, you make systems less reliable and put patients at risk. What should you do instead? If the protocol is unclear, takes too long to follow, or is not the best solution to the problem, speak up. Speaking up is the first step toward learning and improvement.

Improvement and Measurement

You’ve seen now what happens when busy people use workarounds and ignore weak signals. Unfortunately, this is what usually happens.
One study observed that nurses worked around operational failures 90 percent of the time, ignoring weak signals such as the pharmacy sending incorrect medication doses, broken or missing equipment, and supplies being out of stock.1 In the video, Steve Spear explains the opportunity each person has to identify vulnerability in the system and take action to correct it:

On a hectic day at work, imagine you mistakenly hook up a patient’s oxygen supply to compressed room air instead of forced oxygen. You realize and correct the mistake before any harm comes to the patient.

Should you report the error?
(A) Yes
(B) No

More Info: Yes. When people report errors, whether they have negative consequences or not, organizations can learn from them. Your error was a weak signal of a problem in your system — one that didn’t cause harm in the moment, but could very well harm a patient in the future. Many health care organizations use internal voluntary reporting systems for the purpose of capturing data about errors. While these systems can take many forms (e.g., they may be electronic or paper based, anonymous or open), in most cases anyone can complete and submit a voluntary error report at any time.

Transparency

Transparent organizations track performance and have the courage to display their work openly. We will say it one more time: Health care professionals have a responsibility to speak up about weak signals. And organizations have a responsibility to respond. Typically, error reports are sent to the risk management, patient safety, or quality department for review and follow-up, as well as to the manager of the department in which the error occurred, such as to the pharmacy director for medication-related errors. The organizational response should include:
• Acknowledging the issue
• Thanking the individual for reporting it
• Maintaining communication about what is being done to prevent such an issue in the future

On the whole, operational transparency exists when people at all levels can see the activities involved in the learning process: leaders, staff, patients and their families — even other organizations and the community at large. In addition to allowing people to learn, transparency among patients and staff builds trust.

Dr. Michael Leonard, a physician leader for patient safety at Kaiser Permanente, made an error more than a decade into his career as a cardiac anesthesiologist: He accidentally administered the wrong medication and re-paralyzed a patient whose surgery was complete. Here is what Dr. Leonard did after the error, and what he learned from the experience:

Continuous Learning

Conditions that allow organizations to continuously see and solve small problems include:
• The people doing the work must recognize they have a problem. (Meaning, there must be clear standards for what “normal” should be.)
• Someone must be responsible for solving that problem.
• The people doing the work must be able to notify the responsible person in a timely way.
• The responsible person must show up without unfair blame and with a desire to solve the problem collaboratively.
• There must be enough time and resources to solve the problem.
• Feedback loops must provide data back into the various reporting systems to share information and generate insights to prompt action and learning.

The reality of today’s health care environment is that the systems that support patient care are complex and error prone, and most organizations lack a comprehensive method for making them less so.
This is the reason patient advocates like Sorrel King and Tanya Lord continue to tell their stories: Increasingly, more time and focus are being placed on proactive rather than reactive learning, to prevent tragedies like what happened to Noah. The Framework for Safe, Reliable, and Effective Care is designed to guide organizations and providers on their journey of improvement, with patients and families at the center. Other courses in the curriculum will review each interconnected dimension more closely.

Additional Resources

Take the Next Step in Your QI Training: Move from Theory to Action. Every day, people all over the world are making differences on a local level. The Quality Improvement Practicum is a 9-week online course with coaching that helps you lead a small improvement project in your local setting. Whether you want to streamline a process in your system, improve outcomes for your customers, or feel happier in your own daily work, IHI experts will support you on a journey toward meaningful change.

Post-Lesson Quiz

Which of these is a behavior providers should adopt to improve patient safety?
(A) Develop ways to work around broken systems.
(B) Ignore patients’ individual preferences when they disagree with “best practice.”
(C) Follow written safety protocols, even if they slow you down.
(D) Obey your superiors without question.

Safety protocols are in place for a reason, and you should follow them, even if they slow you down. Sometimes there will be a problem with a policy or procedure, in which case you should report it, rather than inventing a “workaround” (a method to circumvent a problem without fixing it). Likewise, you should speak up if you believe any colleague — supervisors included — is threatening patient safety. Part of patient-centered care is respecting patient autonomy, even if it means considering different treatment approaches than what you would normally consider “best practice.”

You’re an administrator at a hospital in a fast-growing suburb. Your hospital has hired three new orthopedic surgeons, including a new chief. These new hires are likely to triple the number of knee replacements done in your hospital. Currently, this procedure is done infrequently, and each time it feels a bit chaotic. As you consider the number of individuals with specialized skills required to execute a safe, effective knee replacement (nurses, surgeons, and anesthesiologists, as well as pre-operative, operating room, and post-operative staff), you realize that this process has the properties of a complex system. A few weeks after the new chief of orthopedic surgery comes on board, she has a moment of inspiration and sketches out a new, radically different way for patients to “flow” through the pre-operative, intra-operative, and post-operative phases. She sends you an email saying that she wants you to meet with her Monday morning to begin implementing it.

Which of the following should you keep in mind as your hospital redesigns the way it handles knee replacements?
(A) Planning by a multidisciplinary team should allow for the development of an excellent, high-functioning system on the first try.
(B) Planning a new complex system for health care delivery has little in common with planning an industrial production process.
(C) How system components are integrated with one another is as important as how well they function independently.
(D) To ensure buy-in, the leader of the design process should be as high up in the organizational hierarchy as possible.
Any complex design process should begin with excellent component processes and materials. But such components will not, by themselves, result in an excellent overall result. How components (and component processes) are integrated is a key to overall outcomes. This is as true for a medical care process as it is for an industrial design process. Even with a committed multidisciplinary team, it is very rarely, if ever, possible to get everything right on the first try. Finding flaws after initial implementation (and opportunities for further improvement) should be expected and embraced. While commitment to innovation, excellence, and continual improvement should be supported from the very top of an organization, the actual leadership of the design process should be at the level that will serve best to engage those who have the deepest knowledge of the workflows and component activities, and can engage the multidisciplinary design team.

Which of the following is typically true of “weak signals”?
(A) They usually result in harm to caregivers or patients.
(B) They are uncommon.
(C) They can combine with other human or environmental factors to result in catastrophe.
(D) They should only be called out by specifically designated individuals within a health care organization.

Weak signals that could be used to identify system deficiencies are common — and usually ignored. This is understandable since, by themselves, such signals do not result in direct harm. It is only when they combine with other factors that harm (and sometimes catastrophe) results. Examples in and out of health care abound, including NASA’s Columbia Space Shuttle disaster, which could have been prevented if the response to such signals had been more robust. Since weak signals occur in daily work at all levels of an organization, each individual must see it as part of his or her job to identify and respond to such signals (or to “escalate” the problem up the hierarchy so that it can be fixed).

The term “normalized deviance” refers to:
(A) Acceptance of events that are initially allowed because no catastrophic harm appears to result.
(B) The standard deviation of a variable in a “bell curve” distribution.
(C) The increase in disturbing song lyrics in modern music.
(D) Innovation based on observing positive outliers in a production process.

Paradoxically, the fact that weak signals do not result in harm is what makes them most dangerous. When a weak signal is ignored (perhaps many times) and no harm results, workers integrate it into their conception of what is normal. Statements like “we always do it that way” may indicate underlying complacency. This acceptance of unsafe, ineffective, or inefficient routines is called normalized deviance.

You meet with the nurse administrator responsible for improvement when issues in the process of care are identified by those on the wards. She listens carefully to your concern, but in the end says she can only try to help improve nursing issues, and not those that extend to pharmacy or transport.

The primary reason your meeting is unlikely to lead to an adequate solution is:
(A) No one is identified as responsible for improvement when abnormalities in the process of care are identified.
(B) The responsible individual belittled the nurse reporting the problem.
(C) The nurse administrator did not have the appropriate span of responsibility to engage the system components needed to solve the problem.
(D) Since things have been going along without a serious adverse event for several months, it appears that the current workaround is effective.

Steve Spear identifies a number of steps needed to fix problems in a production system. They include recognizing abnormalities; having an identified person to call, with the knowledge, attitude, and responsibility necessary to find a solution; and giving workers the time and resources to solve the problem. In the case of health care, this means treating the “system” as well as the “patient.” The challenge here is that even though someone is designated, and that person may have the time to fix how work is done, the nurse administrator may not have the perspective and authority to work across boundaries of specialty, function, and discipline.

The Swiss Cheese Model

Why Did Nora Get an Infection?

Though born prematurely, Nora Boström was a lively toddler with long curly hair when she began having fainting spells at age 3. Doctors in Palo Alto, California, put her on a medicine to help her lungs grow stronger. The medication was delivered by a central line catheter — a tube that went through a vein in her arm and into her heart. Because of the catheter, Nora became infected. She got one central line–associated bloodstream infection (CLABSI), and then three more in a year. Just before her fourth birthday, she died in a hospital in her mother’s arms. No one is sure whether the infections contributed to her death, but the question remains: Why did Nora get these infections?

Understanding the relationship between error and harm is the first step to building safer systems. Doug Bonacum, the former Vice President of Quality and Safety at the Kaiser Permanente health system, shares why this is more important than ever:

Video Transcript: A New Way of Thinking
Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente

With an industry filled with professionals drawn to helping others, who go through intense medical training and are carefully screened for their positions, why is health care so dangerous? The first thing we must appreciate is how complex the practice of medicine has become. Even with all the medical research that has been done over the past hundred years, there is still not a high degree of agreement on what constitutes the best and safest practice. Diagnosis and treatment are often performed under some degree of uncertainty, and medication monitoring, particularly in the outpatient setting, is quite challenging. For our frontline practitioners, there are always new medications, new technologies, new procedures, and new research findings to assimilate. It can be overwhelming. Patients are becoming increasingly complex, and the diversity of the workforce grows at an increasing rate. Providing safe, reliable care has never been more challenging than it is today.

One thing that is clear to all of us working on this issue today is that we can’t solve our patient safety problems by using the same kind of thinking that created them in the first place. To make things right for every patient every day will require a new way of thinking about error in medicine and a new approach to preventing harm. This new way begins with a deeper appreciation for error causation and error prevention.

In this course, we’ll take a look at how human error relates to harm, and what these concepts can teach us about how to improve health care. We’ll start with a helpful way to understand harm: the Swiss cheese model.
The Swiss Cheese Model

In health care, hazards are everywhere — powerful drugs, complicated procedures, and very sick patients are the norm. Serious adverse events are almost always the result of multiple failed opportunities to stop a hazard from causing harm. James Reason, a psychology professor and one of the seminal thinkers in the field of human error, has called this idea the Swiss cheese model of accident causation — meaning, the idea that harm is caused by a series of systemic failures in the presence of hazard.1

In this model, the cheese represents successive layers of defense in your organization’s safety system. For example, to prevent CLABSIs, providers wear sterile gowns, wash their hands, follow checklists, and remove the catheter as soon as possible; in most cases, safety systems such as these prevent hazardous situations from leading to harm. But they don’t always. In the next video, Doug Bonacum will explain the Swiss cheese model, and how the holes in the cheese represent each opportunity for failure within each layer of defense:2

Video Transcript: The Swiss Cheese Model
Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente

Let’s consider the delivery of medicine to be the inherent hazard in our industry. Ingesting a medication, having surgery, being placed on bypass, undergoing dialysis, or receiving radiation, for example, all come with significant health benefits, but they are not risk-free. In addition to the hazards inherent in medicine, accident analysis has revealed that the human contribution to adverse outcomes predominates.

Taking a step back, Reason asserts that the setup for an accident to occur in a system begins with fallible decisions made by top-level leaders. These decisions are then transmitted via line management and ultimately to the point of production, or the point of care in our industry, where so-called preconditions, or qualities of human behavior in production, coexist, including attributes like the level of skill and knowledge of the workforce; the work schedules, technology, equipment, and maintenance programs; and the individual and collective attitudes and motivation of the workforce itself, which creates its culture.

In summary, Reason conceptualized the trajectory of accident opportunity as one which begins with what he calls latent failures at managerial levels. It proceeds with complex interactions as the impact of the management decisions gets closer and closer to the point of care, and is neither stopped nor mitigated by one or more levels of defense that were designed to reduce the risk of harm associated with unsafe acts in the very first place. This is the Swiss cheese model of accident causation. The Swiss cheese analogy here is that all of the holes, or defects, in the various levels of the system align to turn a managerial deficiency into an adverse outcome. The challenge for safety professionals is that because this alignment occurs so infrequently, it is difficult to get the organization to attend to the risks: those that are active and present every day, such as not washing one’s hands, and those that are latent, or lie dormant in the system, such as the choice of where to place hand hygiene dispensers in the first place.

Latent Conditions and Active Failures

Have you ever tried to pull a door open when it can only be pushed open? In this image, the poor design of the door is a latent condition that makes human error more likely to occur:
Dr. Rollin Fairbanks, an Associate Professor at Georgetown University and Director at the MedStar Health National Center for Human Factors, took this photo at his local bagel shop after watching a series of patrons try to pull the door open despite the sign that clearly says “push.” How many errors do you think could be prevented if the door had a horizontal bar instead of a handle?

Video Transcript: Human Factors in Everyday Life
Rollin J. (Terry) Fairbanks, MD, MS, CPPS; Director, National Center for Human Factors in Healthcare, MedStar Institute for Innovation; MedStar Health Attending Emergency Physician, MedStar Washington Hospital Center; Associate Professor of Emergency Medicine, Georgetown University

There are other factors that might lead us to believe, in retrospect, that we should blame the human in the system for this error, such as knowledge that fire code requires the doors in public facilities to push out, and the fact that they came in the door in the first place, which makes us think that they should know how the door operates. So there are many factors that might lead the human in this system to do the right thing when they approach the door. So why did they do the wrong thing so often? Well, they did the wrong thing because the design of the door is inconsistent with known human factors engineering design principles. The design of the door provides very strong cognitive cues that tell the user to pull on the handle; it's a pull handle, after all, and our brains have learned this association over time. The human factors term for this kind of learning is affordance, and this is a skill-based error. As you approach doors, you don't think about the task at hand. Instead, in skill-based mode, you're constantly receiving cues from the environment about what to do. If the design is not savvy from a human factors standpoint, then the cues can lead you to do the wrong thing.

In the Swiss cheese model, the holes in the cheese represent both latent conditions and active failures. Latent conditions are defects in the design and organization of processes and systems — things like poor equipment design, inadequate training, or insufficient resources. These defects are often unrecognized, or just become accepted aspects of the work, because their effects are delayed. Latent conditions lead to active failures, which are easily observed by Dr. Fairbanks or anyone else. Active failures are errors whose effects are seen and felt immediately: someone pushing an incorrect button, ignoring a warning light, or grabbing the wrong medication. In health care, the person on the front line — e.g., the doctor, nurse, pharmacist, or technician — might be the proximal “cause” of the active error, but the real root causes of the error have often been present within the system for a long time. The example shows how a series of contributing factors, including both latent conditions and an active failure, could lead to a medication error.

Swiss Cheese Model: Tenerife Disaster

Monument by Dutch artist Rudi van de Wint, erected in memory of the victims of the Tenerife airport disaster (March 27, 1977), the deadliest air crash in history.

You may recall this case from PS 101: Introduction to Patient Safety, where we discussed an accident that led the aviation industry to rethink blame as a response to error. Now, let’s look at how the accident happened — and how it illustrates the Swiss cheese model of harm.
Video Transcript: Learning from the Tenerife Disaster: Latent Conditions
John Nance, pilot, aviation analyst for ABC News, and advocate for patient safety

Jacob, this particular day, is a very upset guy. The reason he's upset is because he's had to divert to a place he didn't want to go. See, the chief pilot has to fly every now and then, just like the director of the medical staff has to stay current, and he's gotten out and gone in a 747. He's taken this charter down to the Canary Islands, and he's going to turn around, pick up another group of passengers, take them back to Amsterdam, then he'll get back to the office and somebody else will fly the airplane on. Except it's not working that way. Murphy has gotten in the works and there's been a bomb threat at Las Palmas, which is the main airport of the Canary Islands, so he's had to divert with several other airplanes to another airfield with a slightly higher altitude, with only one runway — it's kind of short, and it's fog-bound today. It's not really fog; it's clouds blowing across the runway. But it's just the same. He's had trouble getting his fuel. He's had trouble getting out of there, and he's about to run out of the thing we call "crew duty time." This is the maximum amount of time that an air crew may remain on duty before we've got to put them to bed to get them some rest. I know this is a completely foreign concept to health care. Anyway, Jacob's an upset guy this day, because he's finally gotten his airplane started. He has 10 minutes to get this huge 747 to the end of this fog-bound runway and get off the ground without running out of crew duty time. And here's the penalty if he runs out. He's got to put everybody to bed at Las Palmas, buy $30,000 worth of hotel rooms and delays, and it's going to be very embarrassing. He wants to go, and as they get the airplane down the field, they have to taxi down the runway about halfway, because there are no taxiways stressed for a 747 in the first part. Then he has to turn around, get on the taxiway, come to the end, turn around, line up with the runway; they can only see about 300 yards into the fog. And as they line up, the first officer, who was very senior at KLM but very junior to Jacob Van Zanten, has never flown with Jacob before, and the second officer, who's very senior at the airline as a second officer and flight engineer, but very junior to the first officer, who's very junior to the captain … you get the hierarchy. As they get in position, the first officer and co-pilot sees the captain's right hand coming forward on the throttles with these four huge JT9D engines, 50,000 pounds of thrust apiece, and he knows that they don't have a clearance to take off. On top of that, they don't have what's known as an air traffic control clearance to actually go over to Las Palmas, and he turns toward his commander with wide eyes and says, "Sir, we don't have a takeoff clearance." And Jacob Van Zanten pulls the throttles back, and — in the inimitable fashion that all of us who qualify as airline captains learn — he says, "I knew that. Get the clearance." The first officer punches the button, talks to the tower, asks for the clearance. The clearance is read to him, he reads the clearance back. We have a little linguistic disconnect here, because the guys in the KLM cockpit speak Dutch, that's their native language, and yet they're communicating by radio in a thing we call "aviation English," which is kind of a stylized version of English.
The fellow the tower speaks Spanish because this is a Spanish possession. He is trying to communicate in another language, aviation English, and there is even an air crew on the field moving around out there, who, according to my British friends, don't speak English at all — Pan-American. I'm told we speak American — we don't speak English. At any rate, there is a linguistic disconnect, so when the first officer is finishing his read-back and notices with increasing horror that the captain's hand is once again coming forward on the throttles, they have now the air traffic control clearance but they still don't have the physical clearance to take off. These are two separate clearances required. By the way, this guy is not a dummy, this is not a dumb individual, this is you or me sitting in that right seat. This is everybody in this room who has ever been in a position to see a senior individual doing something for the second time, and you got by with it first. You were well-treated when you pointed it out the first time. Do you really want to tell them again that they're fouling up? He would like to find another way to do it, so the first officer, knowing that the captain is starting the takeoff roll without permission, keeps his finger on the transmit button and says, “And we are at takeoff, KLM 1422.” The problem with this is that it doesn't make sense in aviation English. We are at takeoff, but as we all do as human beings, we fill in the blanks. Don’t we? You expect to hear something, and it's close, so you just go ahead and fill in the blanks. We are at takeoff, we are in take-off position. Yeah, yeah, that’ll work, that's what he means, we're going to take off position. However, there's something wrong, and the tower controller is not really satisfied with this, and the controller presses his transmit button and says “OK. Stand by, KLM. I will call you.” But before he can get the second part of that phrase out, somebody else who's worried transmits and the two transmissions cancel themselves out and the only thing heard in the headsets for the KLM crew is the word “OK.” “Set power!” says Jacob. “We go!” Now the first officer's attention is entirely skewed to serving his commander, as this big jet begins to roll forward into the fog. Five knots. Ten knots. Twenty knots. There’s another radio transmission out there someplace on the air patch, and the first officer and the captain are too busy with takeoff roll to really pay attention to it, but the second officer/flight engineer, the guy who sits sidesaddle, hears this, and it worries him, and he leans forward at 35, 40 knots, and says, “Is he not clear then? That Pan-Am?” Fifty knots. Fifty-five knots. Sixty knots. “What?” says Jacob. “Huh?” says the first officer. Sixty-five knots. Seventy knots. Now in a more timorous voice, second officer leans forward and says, “Is he not clear then? That Pan-Am?” Seventy-five knots. Eighty knots. “Yes!” says Jacob, angrily, unhappy to be interrupted in the middle of his take-off roll. “Yes!” echoes the first officer, and the second officer sits back, shuts up, and says no more. Ninety knots. One hundred knots, 105, 110, 112 knots. They finally come out of the wall of fog and they can see ahead. What they can see is the worst thing an airline captain could possibly imagine. Another airplane sideways on the runway, right in front of them. Pan-Am had not left the runway yet. 
Van Zanten pulls the yoke into his chest as hard as he can, the airplane's nose comes off the ground, and it beds the tail in the concrete, making 50 yards of sparks. As the big bird begins to lift off the ground, it's 25 knots too slow, and yet he leaps it off the ground, and for a moment it looks like he's going to make it. The huge Pan-Am logo slides past the left side of the peripheral vision of the cockpit. The nose gear passes safely over the back of the Pan-Am, but the body gear and the wing gear don't make it. They bite through the back of the structural integrity of the other 747. Its wings fall to the ground in flames. It comes apart. KLM's undercarriage rips away from it, and it falls back to the runway in flames, and within 30 seconds, 572 human beings lose their lives.

Which of the following factors do you think contributed to the crash? [Select all that apply.] (A) Foggy weather that limited visibility (B) Stress and distraction on the part of the pilot (C) Incompetent staff (D) Time pressure due to crew duty hours (E) Language differences (F) Miscommunication between the two planes and the control tower (G) Problems with the radio transmission (H) A hierarchical culture that limited the junior crew's willingness to speak up (I) Mechanical problems

More Info The only factors that didn't contribute to the Tenerife disaster were mechanical problems and staff incompetence. The Spanish Accident Board that investigated the crash found that human error on the part of the captain was the proximal cause of the accident. Captain Van Zanten took off without clearance, and ignored several warnings that he did not have clearance — this was the active failure in the Swiss cheese event. But this accident also laid bare many latent conditions that made the system unsafe: a lack of a clear communication protocol for takeoff, stress from the urgency of the takeoff, and a hierarchical culture that made it difficult for the co-pilot to speak up about the error. These conditions in combination allowed the active error to occur, and led to disaster.

Which of the following would be an effective solution to help prevent the same tragedy from happening again? (A) Stop flying on foggy days. (B) Train people about safety culture to help them speak up in a hierarchical system. (C) Fire the individuals who failed to deliver the message that the runway wasn't clear for takeoff. (D) Standardize communication about clearance for takeoff. (E) Hire more competent pilots and flight crews.

More Info After the Tenerife disaster, the aviation industry began to look at safety as a property of a system. Instead of blaming the individuals involved in the crash, it devised systems solutions, such as standardizing the terms to communicate about clearance and training staff to communicate openly about safety issues, regardless of hierarchy (answers B and D). Aviation will always exist in a hazardous environment; changing conditions in the system that allowed the hazardous environment to cause a disaster — i.e., filling the holes in the cheese (opportunities for processes to fail) and adding more slices (layers of defense) — is the best way to prevent a recurrence of the same tragedy.

Bad Systems, Not Bad People

To prevent harm, we must study and apply the science of improvement. Because of the known risk for CLABSIs, many health systems have put in place multiple protocols ("layers of cheese") to keep patients safe from infection.
Yet, even with all these layers of defense, sometimes bacteria still enter the patient's bloodstream — a classic Swiss cheese event. So what is the result of this Swiss cheese event? Is the patient always harmed, like Nora Boström, the patient we described on page 1? No. In fact, there are several possible outcomes:
. The insertion site may become infected, so the providers move the catheter and treat the patient with antibiotics (mild and temporary harm).
. The patient may develop a CLABSI, a complication that causes death in roughly one out of five cases (serious harm, possibly permanent).
. The patient may have no reaction at all, because the human body is resilient and has its own defenses. In such cases, no one may even realize there has been an error.

Think of the relationship between harm and error like this:1 Sometimes, errors will occur without causing any harm. And sometimes harm will occur when no one can pinpoint an error — which does not diminish the harm the patient experiences. As we take a closer look at error and harm individually in the next two lessons, exploring the different scenarios represented by the diagram above, remember the powerful implication of the Swiss cheese model: If you want to prevent harm, instead of telling people to be more careful, you need to improve the systems in which they work. As Doug Bonacum urges us all to remember, errors are events whose causes can be identified and mitigated:

Video Transcript: Making Systems Safer
Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente

In his landmark article, Error in Medicine, published in JAMA in 1994, Dr. Lucian Leape concluded that errors were common, their causes were relatively well known, and while many errors were caused by areas that relied on weak aspects of cognition, systems failures were at the root of most medical errors. The way that I've interpreted this statement is that errors are events whose causes can be identified and mitigated. Let me say that again: errors are events whose causes can be identified and mitigated. That means when you review a close call or a preventable adverse outcome, and you conclude that the cause of the event was human error, you have failed both the patient and family, as much as you have failed the individual practitioner involved in the event. The error wasn't the cause; the error was an event whose cause can be identified and mitigated. Furthermore, once an error occurs and it reaches the patient, reliable systems typically have methods in place to detect and further mitigate it before it causes significant harm. That means our plan for building reliable systems includes a plan for handling how they would predictably and reasonably fail, and what our systems are going to do in response.

Post-Lesson Quiz

Use the following scenario to answer questions 1–2: Nearing the end of her 18-hour work shift, a resident sees a patient with extremely high blood glucose levels. She writes the patient a prescription for insulin; however, in her exhaustion, she closes her "U" (for "units"), and it looks more like an extra zero. As a result, the pharmacist dispenses an insulin dose that's ten times stronger than the patient needs. Which of the following is a latent unsafe condition in the system that contributes to the resident's error? (A) Long work schedule (B) Fatigue (C) Inadequate training (D) None of the above The correct answer is the resident's long work schedule.
Latent conditions are flaws in the design of systems that create opportunities for error. To prevent this problem from happening again, which of the following would be the best course of action? (A) Punish the resident and the pharmacist for their careless actions. (B) Require both the resident and the pharmacist to take additional training. (C) Develop a system that prevents messy handwriting from causing miscommunication that leads to error. (D) Ensure that no prescribing physician is ever tired or distracted. The best answer is to develop a system that prevents messy handwriting from causing confusion that leads to error. For example, the organization could switch to an electronic ordering system. Mandating additional training and/or punishing the resident and pharmacist for an unintentional error won’t prevent them or anyone else from making the same mistake in the future. Providers are human beings, and there will always be days when they’re tired or distracted. “Latent errors” are best defined as: (A) Defects in the design and organization of processes and systems. (B) Errors in patient care that don’t ever result in harm and thus go undetected. (C) Mistakes in patient care that providers fail to report due to fear of punishment. (D) Errors in patient care that cause immediate adverse effects. Latent errors are defects in the design or organization of processes and systems. These insidious errors can go unnoticed or ignored, but in time are likely to result in patient harm — or, a so-called “active error” in care. For example, operating on the wrong surgical site is an active error with immediate effects; however, any number of latent errors in surgical processes can contribute to a wrong-site surgery. Use the following scenario to answer questions 4–5: Two women — one named Camilla Tyler, the other named Camilla Taylor — arrive at a particularly busy emergency department at about the same time. Ms. Tyler needs a sedative, and Ms. Taylor needs an antibiotic. The doctor orders the medications, but mixes up the patients when filling out the order sheets. The pharmacist dispenses the medications as ordered, and the nurse administers an antibiotic to Ms. Tyler and a sedative to Ms. Taylor. What is the active error in this scenario? (A) The forms are completed by hand at the same time for different patients. (B) The nurse administers an antibiotic to Ms. Tyler and a sedative to Ms. Taylor. (C) The emergency department is particularly busy. (D) The pharmacist doesn't notice that the order sheets are incorrectly filled out. The active error is the human error that led to patient harm. In this case, it’s the nurse administering an antibiotic to Ms. Tyler and a sedative to Ms. Taylor. What is one of the latent errors in this scenario? (A) The emergency department is particularly busy. (B) The nurse administers an antibiotic to Ms. Tyler and a sedative to Ms. Taylor. (C) The forms are completed by hand at the same time for different patients. (D) The two patients in this case have very similar names. Latent errors include any systemic problems that allowed the potential for an active error to occur and lead to patient harm. In this case, the fact that the forms were completed by hand at the same time for different patients turned out to be a latent error. The busy department and patients with similar names were not errors; they are just inherently challenging qualities of the system. 
Understanding Unsafe Acts Introduction to Unsafe Acts Let’s reconsider James Reason’s Swiss cheese model that we studied in the previous lesson and look again at the story of Nora Boström, the 3-year-old girl who suffered from four catheter-associated bloodstream infections (CLABSIs). As we mentioned before, placing a central line (a tube that goes into a vein connected to the heart) is inherently hazardous. That’s why, in 2001, a critical care specialist named Peter Pronovost decided to create a checklist of steps to help prevent infections from this invasive procedure. Today, many health systems use an adapted version of this checklist. Each step on the list represents a layer of defense (a “slice of cheese” in James Reason’s analogy) against the hazard of infection from the catheter. But even with all these layers of defense, there is still opportunity for error to occur, and possibly harm the patient: Steps to Prevent CLABSIs Wash hands with soap or alcohol before placing the catheter. Maintain a sterile field. Remove the catheter as soon as it’s no longer needed. Avoid placing the catheter in the groin. In this lesson, we will continue our discussion of the Swiss cheese model by exploring how unsafe acts help errors slip through an organization’s safety system (as “holes in the cheese”), and contribute to harm in the health care system. You will see how just trying to be perfectly adherent to protocol is not a rational or effective approach to prevent patient harm — because the most well-meaning of providers still make errors. Classifying Unsafe Acts The patterns of errors that occur in health care are no different from those that occur in any other setting; you might fail to wear proper footwear at home for the same reasons you fail to wear protective clothing at work. So what exactly do we mean when we say “unsafe act”? In his groundbreaking book Human Error, James Reason defines an unsafe act as “an error or a violation committed in the presence of a potential hazard.”1 According to Reason, unsafe acts may be categorized as either violations, when the person deliberately deviates from known rules, or errors, if the act is not a violation. Errors may be further categorized as slips, lapses, and mistakes. Click the image to explore each category: Doug Bonacum, an expert in systems thinking and design, will give an example of each type of error as defined by James Reason: Video Transcript: Classifying Unsafe Acts Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente Most preventable harm to patients receiving health care today is caused by unsafe acts of the very practitioners who are trying to help them. I like to think about this as human error but in need of system solution. Unsafe acts may be categorized as either human error such as the slips, lapses, and mistakes, or procedural violations such as the slow insidious drift from safe practice over time or, quite frankly, it can be the blatant disregard for important safety rules. As described by Jim Reason in his book entitled, Human Error, errors occur because of one of two main types of failures: either actions do not go as intended, or the intended action is a wrong one. The former situation is a so called error of execution and could be fully described either as being a slip, if the action is observable or a lapse if it is not. An example of a slip is the inadvertent selection of the wrong medication from a dropdown menu in an automated medical record. 
An example of a lapse might be forgetting to implement a physician order in the desired time window. A mistake on the other hand, is a failure of planning. That is, the plan is wrong. This can be either rule-based, because the wrong rule was applied, or knowledge-based because the clinician does not take the correct course of action. An example of a rule-based mistake might be getting a diagnosis wrong and then embarking on an inappropriate treatment plan based upon a known set of rules on how to proceed. A knowledge-based mistake tends to occur when clinicians are confronted with for them what might be a new or novel situation. Keep in mind that we’re not just teaching you terms; we are providing you with a framework by which to understand how medical errors happen and how to prevent them. The term “medical error” is slightly misleading because the patterns of errors that occur in health care are no different from those that occur in any other setting when we are distracted, in a hurry, or just plain forgetful. Consider the following scenario: A nurse, Anila, oversleeps and is running late to work. As she rushes through her morning routine, she forgets she promised to call her mother to confirm plans for later in the day. Anila makes it out the door in record time and decides to drive toward the highway, thinking it will be faster than her usual route to the hospital — only to find traffic is backed up for miles! At last, Anila makes it to work. As she hurries inside, she finally remembers to call her mom. But in her hurry, she accidentally calls her boyfriend instead. “Three errors and I haven’t even seen a patient yet,” she thinks. Which type(s) of error did Anila make? (Select all that apply.) (A) Slip (B) Lapse (C) Mistake (D) None of the above More Info Anila made all three types of errors. First, she experiences a lapse, forgetting she promised to call her mother to confirm plans for later in the day. Then, she decides to drive toward the highway, thinking it will be faster than her usual route to the hospital. But she quickly realizes her error in planning (mistake) because traffic is backed up for miles. Finally, Anila knows she wants to call her mom, but makes an observable error of execution (slip), and accidentally calls her boyfriend instead. A Closer Look at Violations Although they are deliberate departures from the rules, violations are not necessarily the result of deviant behavior or intended to cause harm. People often do not fully recognize the risks they are taking or feel the risks are justified. In the scenario you just analyzed, you can imagine Anila breaking a couple of traffic laws because she was in a hurry to get to work, where there were patients waiting who needed her help. Doug Bonacum puts it this way: “If to err is human, then — one may argue — so, too, is to violate”: Video Transcript: Violations Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente If to err is human, then one may argue, so too is to violate. Just observe your own driving habits and those of other cars on the road and this notion becomes very clear. Driving above the speed limit, talking or texting on a cell phone, rolling through stop signs, speeding up at yellow lights, and tailgating are all violations, and all are as alarmingly clear and present as the day. 
And Reason notes that while errors may be defined in relation to the cognitive processes of the individual, violations or deliberate deviations from safe practice may be defined with regard to a social context and behavior that is governed by things like operating procedures, codes of conduct, values, rules, norms, and the like. Both errors and violations can be, and often are, present in the same action sequence. I conceptualize this as the perfect storm of preventable patient harm. Or if you were still stuck on the driving example, the perfect storm of automobile accidents. In the previous course, PS 101: Introduction to Patient Safety, we introduced the foundational ideas of David Marx, a lawyer and engineer who developed the concept of just culture; you learned that Marx drew a line between at-risk behavior and reckless behavior, and he recommended disciplinary action for the latter only. On the next page, we’ll give you a tool to help you apply this theory in health care. A 52-year-old man with a history of ulcers and bleeding in his gastrointestinal tract as a result of taking ibuprofen visits his primary care doctor with a running injury. After examining him, the physician tries to prescribe ibuprofen to treat his condition. The medication order entry system issues an alert — the 25th one that day — and the physician ignores the alert without reviewing the patient’s medical record, thinking the alert is likely to be another “false alarm.” Behind on his schedule, he chooses to override the alert and prescribe the ibuprofen. After taking the medication, the patient develops bleeding in his gastrointestinal tract and has to be admitted to the hospital. What type of unsafe act, if any, is represented in this case example? (A) Slip (B) Lapse (C) Mistake (D) Violation (E) None of the above More Info The physician who ignored the medication order entry system alert without reviewing the patient’s medical record made a conscious decision not to follow standard safe practice, succumbing to the time pressures of his busy schedule and convincing himself that it was another false alarm. Although it’s easy to see why the violation occurred, that is what it is because he deliberately deviated from the rules. A 28-year-old, obese woman goes to a clinic complaining of severe calf pain that keeps getting worse. She tells her primary care physician that she thinks the pain is due to the new shoes she bought and her new commitment to walking and exercising more. She has no history of leg trauma, and her only medication is a birth control pill. After examining her, the physician does not see anything unusual, and prescribes ibuprofen and muscle relaxants. A week later, the patient has a heart attack and is unable to be resuscitated. A post-mortem examination reveals a massive blockage in the artery that passes through her lung. What type of unsafe act, if any, does this case demonstrate? (A) Slip (B) Lapse (C) Mistake (D) Violation (E) None of the above More Info The physician who assessed the patient made an error in judgment: a mistake. The patient’s obesity and use of birth control medication were both risk factors for deep venous thrombosis (blood clots). The doctor knew this, but when the patient attributed her pain to exercise, the physician almost immediately concluded the patient had strained her calf muscle, similar to many other patients who increase activity levels quickly. Because the doctor had the right knowledge but applied it incorrectly, this was a rule-based mistake. 
(In comparison, knowledge-based mistakes are when the person does not have the necessary knowledge to respond to the situation correctly.) This example involves the concept of heuristics — cognitive shortcuts that facilitate rapid decision-making but make people vulnerable to common mistakes. We will explore heuristics in the next course, PS 103: Human Factors and Safety. Blame vs. Accountability Although blaming and punishing individuals for errors and unintended events are not appropriate responses, that does not mean individuals should not be accountable for their actions. It is critical that each organization define blameworthy events and actions that will be handled or dealt with using administrative or human resource systems. A common definition of blameworthy events includes events that are the result of criminal acts, patient abuse, alcohol or substance abuse on the part of the provider, or acts defined by the organization as being intentionally or deliberately unsafe.1 James Reason’s decision tree for determining culpability for unsafe acts, and other tools like it, help health care professionals distinguish accidents and honest mistakes from behavior that is truly reckless. For example, the substitution test asks whether three other individuals with similar experiences would take the same action in the same situation.2 Use the decision tree to answer the questions below. (Click the image to enlarge it.) A physician attends a luncheon at a restaurant near her work, and consumes several alcoholic beverages. Once at work, she inadvertently drops equipment on the floor several times during a procedure, presenting an infection control risk. She yells at the staff members who remove the contaminated items. Should she be held accountable for her actions? (A) Yes (B) No More Info Most likely yes: The physician is deliberately impaired, placing the safety of the patient at risk. As you see on the decision tree above, she abused a substance and showed malevolence toward staff by yelling at them. Note that if a medical condition is present, such as addiction, that should be taken into account. In the middle of a long and busy day, a dedicated obstetrics nurse is providing medical care and emotional support to a 16-year-old woman having her first baby. Thinking she is giving the patient an IV antibiotic, she instead accidentally administers a local anesthetic, which is sitting on the same counter, housed in similar packaging. The patient dies within minutes from an adverse reaction. Should the nurse be punished for this error? (A) Yes (B) No More Info Blaming and punishing the nurse would not address the real issues that led to the error and the patient’s death — system failures outside the control of the nurse: . The hospital encouraged nurses to work back-to-back shifts. . Medications were labeled in a similar way and sitting next to each other. Furthermore, punishing the nurse may cause other clinicians to feel afraid of punishment, which might discourage them from reporting errors and unsafe conditions, making the system less safe overall. 
Accommodating the Human Condition

James Reason said providers (at the "sharp end") are often set up to fail by factors that reside in the local workplace and the organization at large ("blunt end").1 Expecting providers to be perfect is not a rational or effective approach to preventing patient harm; human errors and violations will always occur, and blaming and punishing well-meaning individuals will do nothing to prevent the vast majority of unsafe acts. Here is Doug Bonacum again, to explain how system-level solutions must address the root causes of problems, most often embedded in the organization's processes and culture:

Video Transcript: Mitigating Unsafe Acts
Doug Bonacum, CPPS; Vice President, Quality, Patient Safety, and Resource Stewardship, Kaiser Permanente

In addition to his contribution to safety science with the Swiss cheese model of accident causation, Reason used a simple triangular shape to present the principal stages involved in the development of an organizational accident. This model has three levels or sections. At the top of the triangle is the person or unsafe act, and what he called the sharp end of care. It's where the rubber really meets the road. The local workplace, or the error- and violation-producing conditions that impact the workers' behavior, is in the middle of this model. And the organization itself is represented as the base, or the foundation upon which all of this rests. It is sometimes called the blunt end. This simple structure links the various contributing elements of an adverse outcome into a coherent sequence that runs bottom up in causation and top down in investigation. Systems consist of a multitude of complex processes, all of which need to be considered as important context when an error occurs. Until you fix the system, the same error is just as likely to occur again.

Implementing a "systems approach" to addressing medical error means:
. Focusing largely on the conditions under which individual providers and care teams work
. Redesigning workflow and adding defenses to avert errors
. Minimizing the conditions that lend themselves to violations
. Putting mechanisms in place to mitigate unsafe acts that may inevitably occur

As opposed to other failed approaches we've discussed, this approach can be quite successful in preventing medical error and making patients safe.

Your Turn

Focusing again on the prevention of CLABSIs, you should now recognize how each deviation from the checklist at the beginning of the lesson represented a different type of unsafe act. Can you identify the slip, lapse, violation, and mistake?
The physician skips hand-washing because she's in a hurry and just washed her hands in the previous patient visit. More Info This is a violation.
The nurse accidentally brushes the sterile tube against a non-sterile surface. More Info This is a slip.
The physician forgets to remove the catheter when it's no longer needed. More Info This is a lapse.
The resident physician does not have sufficient training to perform a subclavian placement for the line. More Info This is a knowledge-based mistake.

Systems Solutions for CLABSIs

To help mitigate the risk of unsafe acts, in addition to outlining the steps an individual provider should take in the form of a checklist, many hospital systems implement additional measures to help make sure staff follow each of the procedures every time they insert a central line: time-outs and double-checks, for example.
Together, all of these efforts to "fill the holes in the cheese" make a difference, and many hospitals have reduced rates of CLABSIs significantly — even down to zero. Occasionally, however, Swiss cheese events still happen. Let's suppose a physician is placing a catheter and the following sequence of events occurs: Latent failures, such as distractions and a fear of speaking up, led to an error even with many layers of defense in place. Of note in this scenario: No one recorded the error, which was almost invisible. What is the learning? In the final video of this lesson, Dr. David Bates, another leader of the patient safety movement, will explain why CLABSIs are an excellent example of why health care workers need to focus less on error, and more on harm. This is exactly what we will do in the next lesson.

Video Transcript: Shifting the Focus to Harm
David W. Bates, MD, MSc, Senior Vice President and Chief Innovation Officer for Brigham and Women's Hospital

Everyone makes errors all the time. It's sort of a part of the human condition. Even someone who tries their very best will still make some errors. We move from focusing on error to focusing more on harm in part because it's become clear that for many things that we thought were not preventable, we can actually prevent a very large proportion. The classic example is CLABSI, or catheter-related bloodstream infections, where at one time we thought that was just a complication of having a central venous catheter in place, but now we know that if you follow a set of procedures, or bundle, you can essentially eliminate those infections. We've had similar findings for a number of other areas, so I think it makes much more sense to focus primarily on harm.

Post-Lesson Quiz

According to James Reason, by definition an "unsafe act" always includes: (A) A potential hazard (B) Harm to one or more patients (C) One or more mistakes (D) All of the above James Reason calls unsafe acts errors or violations committed in the presence of a potential hazard. Errors can be further divided into slips, lapses, and mistakes. They may or may not actually result in harm, but the potential for harm is present.

Use the following scenario to answer the next question. Anita, a nurse practitioner, is seeing Mr. Drummond in clinic. Mr. Drummond is a 57-year-old man with diabetes and chronic kidney disease. Having kept up on the literature, Anita is aware that tightly controlling his diabetes can slow the progression of his renal disease. She discusses her plan to increase his dose of glargine (long-acting insulin) by 12 units per day with one of the family physicians in the clinic, who agrees. At the end of the day, as she is working on her documentation, she realizes she never told Mr. Drummond to increase his insulin dose. This is an example of what type of error? (A) Lapse (B) Mistake (C) Slip (D) Error of planning (E) Violation Anita had a memory failure, which is a classic lapse. She understood what should be done and created a good plan. She even discussed it with a co-worker. However, in the midst of a busy clinic schedule she likely got distracted and forgot to implement the plan.

Use the following scenario to answer the next question. Roger, a pharmacist in a hospital, is working in the discharge pharmacy filling medications for patients who are going home. He sees a prescription for ciprofloxacin, an antibiotic, and he asks his pharmacy technician Mike to fill it quickly, as the patient is waiting and anxious to leave.
Mike checks the shelves and sees they are out of ciprofloxacin, but they do have levofloxacin (an antibiotic in the same class that covers most, but not all, of the same types of infections). Mike knows he should usually check with the prescribing physician before making a substitution. However, in the interest of efficiency in this particular case, Mike deems it OK to go ahead. He substitutes the medications. This is an example of what type of unsafe act? (A) Mistake (B) Slip (C) Lapse (D) Error of planning (E) Violation This is a violation because Mike made a deliberate decision to disregard standard procedure when he changed antibiotics for this patient without the prescriber's authorization. This change may result in harm if the levofloxacin does not treat the organism and site of infection.

Which of the following is the most significant advantage of shifting to a systems view of safety within health care? (A) It is easier to identify and remove people who are unsafe (B) It allows us to change the conditions under which humans work (C) It prevents human mistakes (D) It allows us to view unsafe acts as violations (E) All of the above Having a systems view of health care allows us to change the conditions under which humans work by recognizing that humans are not perfect and systems have a significant role to play in safety. This view is applicable in all patient care settings, as all care settings these days are complex.

Use the following scenario to answer the next question. At University Hospital, the rate of Clostridium difficile colitis has doubled during the past year. After reviewing the data, the hospital's senior leaders conclude that this is due to poor hand hygiene on the part of the staff, even though they have a clear hand washing policy in place and don't believe most staff are intentionally disregarding the policy. They decide to start a hand washing campaign and post signs all over the hospital reminding providers to wash their hands. What type of error is this intervention best designed to address? (A) Mistake (B) Slip (C) Lapse (D) Error of planning (E) Violation Signs and other reminders are good strategies for addressing lapses, specifically memory failure, which is what the leadership believes is generally happening in this hospital. While these types of campaigns may also address violations or mistakes, they are generally less successful in these areas.

A Closer Look at Harm

Focusing on Harm (Not Errors)

Throughout this course, we've discussed 3-year-old Nora Boström, who had four central line–associated bloodstream infections (CLABSIs) in the year before she died. Not long ago, hospitals accepted a small number of CLABSIs as inevitable. If the hazard of a catheter so close to a patient's heart occasionally resulted in harm, people thought it was an unfortunate but justifiable side effect of life-saving medical care. In reality, health systems had recommendations for proper handling of central lines, but no one had designed an effective safety system to ensure providers followed every recommended step every time. This changed in 2003, when intensive care units (ICUs) in Michigan, USA, began using a checklist of the steps that help prevent infections. Beyond the checklist, they made changes to help providers follow the steps reliably. Within three months of implementation, the infection rate in Michigan's ICUs dropped 66 percent. Many cut their infection rate to zero, and sustained the results for years.1
Dr. Bob Wachter, a leading expert on patient safety from the University of California, San Francisco, explains how the CLABSI example helped change the way providers thought about human error and patient harm. People began to recognize that (1) many of the complications of care they once thought were inevitable were actually preventable and (2) many occurrences of patient harm are not easily tied to provider error:

Video Transcript: Errors Versus Harms
Robert M. Wachter, MD; Associate Chair, Department of Medicine, University of California San Francisco

In the early years, we talked about and thought about errors mostly; we thought about, you know, trying to decrease the number of errors. As this graphic from my book Understanding Patient Safety illustrates, we're thinking now more about adverse events and harms rather than errors, which tend to be a little more blame-associated, and a little more finger-pointing to individual providers. And we've recognized that many harms that occur in the health care setting, things like venous thrombosis, or falls, or decubitus ulcers, or certain health care-associated infections, don't relate to errors per se in the way we've traditionally thought about errors, but they are preventable adverse events for which we know certain strategies that, if adhered to religiously, would decrease the rate of those preventable adverse events. So that tends to be the way we're thinking about it more and more. You see there are certain adverse events that are not preventable, at least in today's thinking, although it's important to recognize the line between those two can shift, and central line-associated bloodstream infections are a perfect example where we would've said ten, fifteen years ago that most of them are adverse events and kind of acts of God, and there's not much you can do (bad things happen to sick people in ICUs), and we now know, through the use of checklists and associated culture-changing strategies, that most of those adverse events can be prevented. As you see, within the category of preventable adverse events, there is a category of negligent events, but it's a relatively small percentage of the category, one that gets overemphasized by the malpractice system but is really not the crux of the problem as we focus on improving patient safety.

Defining Harm

So what, exactly, is harm? Is it anything bad that happens to patients in the course of their care? There are many definitions of harm, so let's start by taking a close look at one way IHI has defined harm, for the purpose of measuring it: Harm is "unintended physical injury resulting from or contributed to by medical care that requires additional monitoring, treatment, or hospitalization, or that results in death."1 Note that the definition, which comes from the IHI Global Trigger Tool for Measuring Adverse Events, includes these important criteria:
. Medical care (but not the absence of care) causes the harm.
. The harm leads to additional care (for example, more time in the hospital).
. The harm is physical, as opposed to psychological.

In using the IHI Global Trigger Tool, researchers have found that people identify at least 10 times more adverse events than are recorded in hospitals' voluntary reporting systems.2 However, as Dr. Wachter points out, there is still plenty of harm that trigger tools overlook:

Video Transcript: Measuring Harm
Robert M. Wachter, MD; Associate Chair, Department of Medicine, University of California San Francisco

Of course, there's a tendency to focus on what we can measure; those are things that are easier for clinicians and patients to see, easier for policy makers to create reporting and payment policies around. That's fine, it's natural, but there's something that happens that you as health care professionals and safety professionals need to be aware of, which is a phenomenon in which we focus unduly on measurable safety hazards and less on ones that are harder to measure but perhaps more important. Diagnostic errors, for example, get very little respect. You would think, if you read the medical literature, that central line-associated bloodstream infections are a more important patient safety hazard than diagnostic errors, which turns out to be completely untrue. But we can measure and we know how to prevent many cases of central line infections. That's great, and it's a success story. But diagnostic errors are the more important safety hazard. That means we have to focus more on trying to understand how to measure them, but it also means that if you develop a safety program, it's worth making sure that you're putting some bandwidth and resources into preventing diagnostic errors even though they are more difficult to measure. We're getting better at measuring things; we have trigger tools, active surveillance, and new uses of the electronic health record. It's also important as you build measurement systems to recognize that they each provide a unique lens into your safety hazards, but no one of them provides a perfect lens. Specifically, incident reporting systems provide useful information for those of us trying to improve patient safety, but they really are only a single lens: they tend to capture errors reported more by nurses, and doctors tend not to focus on diagnostic errors, for example. And so, as Kaveh Shojania has written, it's like the old story with the man and the elephant: the part that you're feeling or seeing gives you a certain view of the whole, but not the whole view, so you need to work hard to look at the entire picture in order to get an accurate and robust view of patient safety.

Your Turn

This quick exercise will check your understanding of the types of harm that are more and less likely to be observed and recorded. Choose whether each of the following cases represents harm according to the definition from the IHI Global Trigger Tool.

A nurse administers too much of a sedative because he misinterprets an order, but the patient doesn't feel different. (A) Harm (B) No Harm More Info The error didn't lead to physical injury that required additional care, so this definition would not classify it as harm. Nevertheless, the nurse should report this event so that the system can improve. Although the error didn't cause harm in this instance, it may contribute to harm in the future.

A surgeon mistakenly operates on the wrong foot, and the patient has to undergo a second surgery. (A) Harm (B) No Harm More Info This event would be an example of harm because it caused a physical injury that required more care (the second surgery).

A doctor tells a patient that she is HIV-positive because of a false lab result, but corrects the mistake before there is any impact to her care. (A) Harm (B) No Harm More Info While the patient might certainly consider this harm, it would not count as harm under the IHI Global Trigger Tool definition. The suffering is psychological rather than physical.
A patient's test result shows she may have a blood clot, but the primary care clinic doesn't order the next test to confirm. In the meantime, the blood clot causes the patient to have a stroke. (A) Harm (B) No Harm More Info The IHI Global Trigger Tool definition only includes harms caused by active medical care (also known as errors of commission), not the absence of care, as in this scenario. This definition also excludes diagnostic error (when providers reach the wrong diagnosis).

(Re)Defining Harm

In the United States, medical bills can force families to deplete their savings and lose their homes. If we don't consider something harm, we're not likely to try to prevent it — just as providers once accepted a small number of CLABSIs as an unavoidable complication of care instead of asking themselves how to prevent them. Here are a few examples of types of injury that could be considered harm, especially from the patient's perspective:
• Errors of omission: Most definitions of harm have focused on errors of commission — i.e., something that health care providers did that resulted in harm. For example, a patient suffered a stroke after receiving too much of an anticoagulant. But what if the stroke occurred after providers failed to do something, such as follow up on important test results?
• Psychological harm: What about all those experiences that don't require additional medical care, but injure patients psychologically? Is it harmful when a provider rudely wakes a patient up in the night for no medical reason, when her privacy is violated, or when a provider callously delivers a cancer diagnosis? It's difficult to define and measure, but every provider has the power to alleviate a patient's emotional suffering by showing empathy.
• Financial harm: In the United States, medical bills are the leading cause of personal bankruptcy.2 Harm from health care can force people to take time off of work and deplete their savings. Many people put off needed services because they can't afford them.

After knee replacement surgery, Rosie Bartel contracted a methicillin-resistant Staphylococcus aureus (MRSA) infection. In the video, she describes the extent of the harm she continues to experience:

Video Transcript: A Series of Losses
Rosemary Bartel, Patient Safety Advocate

I cannot walk at this point because I have an amputated leg. And because they had to amputate it so short, there's no guarantee I'm going to get a prosthesis. And even at this point, I have to have a vac on this leg because there's still some infection in there and they're trying to clean that out. So at least two months yet before it may even heal well enough to even consider a prosthesis. My surgeon is saying to me, don't count on it. We couldn't stay living in the home we were living in because the doorways weren't wide enough, the rooms weren't big enough to move around in, so we had to move. It affected us financially. It caused us to make decisions in our life that we never would have made at this time in our life. We went from owning a home to renting because we had to look at expenses even with insurance. People say you're one illness away from bankruptcy; well, that's pretty true. That's pretty true when you have something devastating happen to you and it lasts for three years or more.

Preventing Harm

The notion of "preventable" harm changes as scientific knowledge and health systems evolve.
To appreciate just how quickly the preventability of harm changes, consider these examples from recent medical history:
• When chemotherapy for cancer first emerged in the 1980s, the idea was that "if a little is good, then more is better." Today, oncologists know that low doses of chemotherapy are often just as effective as high doses, require less medical care to manage side effects, and cause less harm to patients.1
• Similarly, surgeons used to regularly perform radical mastectomies on women with breast cancer, removing the entire breast, including the nipple. Today, most women prefer less invasive procedures, which conserve the breast with comparable success.2
• Antibiotics are one of the most life-saving medical innovations of all time. But excessive use of these drugs, including for viral infections where they have no effect, has caused an epidemic of resistance. Today, public health efforts are targeting the public and doctors to try to minimize the unnecessary use of antibiotics.
• Patients on ventilators are usually heavily sedated, and they often become delirious and lose strength from weeks of bed rest. Some patients even show signs of post-traumatic stress disorder after leaving the ICU. Today, innovators in critical care are rethinking how much sedation — and what kind — is really necessary in the ICU. They've found that if patients come off sedation sooner, they can actually get up and move — which greatly improves their recovery.

To improve our health care system from the perspective of patients and families, you can see how we have to expand our definition of harm. As IHI's patient safety experts have said, "Today's 'unpreventable events' are only an innovation away from being preventable."3 In some cases, the innovation required is a new or better treatment or medical device. In other cases, it's a new process or system. When it came to improving pain management for patients with sickle cell disease at Boston Medical Center, reducing harm to patients required a new process and a new way of thinking about harm. Chief Quality Officer Dr. James Moses tells the story:

Video Transcript: Improving Pain Management at BMC
James Moses, MD, MPH; Medical Director of Quality Improvement, Boston Medical Center

There's just been a long historical disparity for the sickle cell population that is wrapped in race and socio-economic status. These are patients who basically have their red blood cells sickling within their microvascular system, causing pain and kind of morbidity throughout every organ system that has small vessels. And they have objective and valid reasons to have pain episodes, and yet our health care system over time has really mitigated and invalidated these pain episodes, largely because of provider bias and health care bias and [the patients'] need for opioid medication to treat their pain. So we did an improvement project tied to improving the time to pain management for these patients, really from start to finish in our pediatric ED. And we achieved success — yes, through some concrete tools of a checklist and an algorithm and a pain medication calculator — but those were really just tools to get to where the meat of the problem was, which was kind of the bias of the ED in not realizing that this pain of a 10 out of 10 should be treated as an emergency. And what I've really gotten a lot of reward from out of that project was seeing the emergency department staff change their lens through learning.
We were able to bring them on a journey where they were able to see these patients and their complaints in a different light.

Building Safer Systems

What will be the next way that health care prevents harm to patients? You can be part of this work — even through small steps such as treating patients with kindness. In this lesson, we've described how the patient safety field has broadened its focus from reducing error alone to encompass efforts to prevent harm. This makes sense for a few reasons. As you have learned in this course:
• Most harm in today's health system isn't primarily caused by the unsafe acts of frontline providers, but by larger systems issues: latent unsafe conditions, too few layers of defense against hazards, and the presence of unnecessary hazards.
• Harm can occur when there is no clearly identifiable error, but that does not change the impact on the patient.
• The notion of "preventable" harm changes as scientific knowledge and health systems evolve, and experts encourage providers to take a broad view of what harm is.

With a new understanding of the relationship between error and harm, health care systems have learned to prevent many harms that used to be considered unavoidable complications of care — CLABSIs are just one example. By studying how people perform under different circumstances, providers can design better systems, which help people navigate safely through a hazardous workspace. This area of study, involving many disciplines such as anatomy, physiology, physics, and biomechanics, is called human factors engineering. It will be the subject of our next course.

What intervention helped prove that central line–associated bloodstream infections (CLABSIs) were preventable consequences of care? (A) A new guideline that required all staff to wash their hands with alcohol and soap before inserting a catheter (B) A checklist of evidence-based practices applied consistently and collectively every time a catheter is used (C) A new device that no longer required catheters to inject medications and draw blood (D) A new standard that encouraged providers to take patients off ventilators sooner The correct answer is B, a set of evidence-based practices applied consistently and collectively. Reductions in CLABSIs are due to this improvement of the system for placing catheters, not technical innovation or isolated guidelines. Ventilators aren't related to bloodstream infections.

What is one reason that patient safety has shifted to work on reducing harm in addition to preventing errors? (A) Human error has become less common in health care. (B) Harm is more preventable than providers once thought. (C) Identifying errors rarely leads to improvement. (D) Patients are only concerned about errors that cause harm. The correct answer is B, that harm is more preventable than providers once thought. A good example is central line-associated bloodstream infections (CLABSIs), a small number of which were once thought to be an inevitable complication of life-saving health care. Providers realized that it was actually possible to almost eliminate central line infections through improvement efforts including a checklist to ensure all precautions were taken every time. Human error is still a big problem in health care, but reducing error is not the only way to reduce harm. Identifying errors is an important part of improvement, because it allows health care systems to improve unsafe conditions before they cause harm.
Patients care about errors, in addition to harm, because errors undermine trust in the health care system.

Which of the following is included in the IHI Global Trigger Tool definition of harm?

(A) Psychological harm such as a miscommunication about a diagnosis
(B) Financial harm from expensive medical bills
(C) The absence of needed care that contributes to harm, such as missed treatment for hypertension that leads to a stroke
(D) Physical injury caused by medical care that triggers additional care

The correct answer is D, physical injury. Some patient safety leaders want to expand the definition of harm to include the other types listed in this question, such as financial harm, psychological harm, and so-called errors of omission, but those are not included in the IHI Global Trigger Tool definition of harm.

The Swiss cheese model of accident causation illustrates what important concept in patient safety?

(A) Unsafe acts (including errors and violations) are the most important cause of harm to patients.
(B) Both latent unsafe conditions and active failures (unsafe acts) contribute to harm.
(C) Harm results when the layers of defense in a system fail to prevent a hazard from reaching a patient.
(D) B and C

The correct answer is D. The Swiss cheese model illustrates how a hazard results in harm by passing through the many "holes" in a system's layers of defense; the holes represent latent unsafe conditions and active failures (unsafe acts) that allow the hazard to reach the patient. By understanding the Swiss cheese model, we can see that safety is the result of a system, not just the acts of individual providers.

Why do some patient safety leaders believe the definition of harm should be broader than the definition in the IHI Global Trigger Tool?

(A) Because health care systems have eliminated all harms included in the current definition
(B) Because expanding the definition of harm would make it easier to measure
(C) Because health care systems should work to prevent more types of harm than the current definition includes
(D) Because health care providers aren't usually concerned about harms such as psychological injury

The correct answer is C, because health care systems should work to prevent more types of harm than the current definition includes. Leaders in the patient safety field argue that health systems should define harm more expansively, because the definition determines the scope of improvement work.
