Organizational Change in the Face of Highly Public Errors—II. The Duke Experience
Frush K. Organizational Change in the Face of Highly Public Errors—II. The Duke Experience. PSNet [internet]. Rockville (MD): Agency for Healthcare Research and Quality, US Department of Health and Human Services; 2005.
Perspective
Editor's Note: In February 2003, 17-year-old Jesica Santillan died at Duke University Medical Center after receiving a heart-lung transplant from a donor with an incompatible blood type. As with the Dana-Farber experience, the death made headlines around the world and devastated the leaders and providers at Duke, one of the world's preeminent academic medical centers. We asked Dr. Karen Frush, a pediatrician who became Duke's Chief Patient Safety Officer in November 2004, to describe the impact of this case and the changes it engendered.
Patient safety and high-quality medical care have long been priorities at Duke University Hospital and Health System. Physicians, nurses, and other providers who care for patients at Duke are hard-working, intelligent, and well-trained individuals, committed to delivering safe and compassionate care. However, in spite of all our good intentions and dedicated efforts, a tragic and devastating transplant mismatch occurred in February 2003, leading to the death of a 17-year-old girl.
Devastating is, in fact, an appropriate word to describe the impact of this event on many individuals at Duke. Providers and administrators alike speak of a personal sense of great sadness, disappointment, frustration, and failure when reflecting on the transplant and its tragic outcome. How could a team of highly qualified and dedicated individuals, in an institution known for its commitment to excellence, make such a terrible mistake?
As Duke physicians, nurses, and hospital leaders tried to find meaningful lessons in the aftermath of this medical error, it became clear that the first step was to begin to understand what decades of research (mostly from outside of health care) have shown: most errors are made by good but fallible people, working in imperfect systems.(1) What has become painfully clear since November 1999, when the Institute of Medicine (IOM) report (2) on medical errors was published, is that the American health care system is, indeed, challenged and imperfect. This system, which served Americans well in the early part of the twentieth century, has changed dramatically during the past several decades. Physicians and nurses now have more potent medications to administer to patients, more complex and technologically advanced equipment to diagnose disease or deliver treatment, and much more information to process. However, all of these innovations create more opportunities for error, unless they are accompanied by a far more robust infrastructure to manage the new levels of complexity.
Leaders in patient safety often point to several other industries that, like health care, are highly complex, high risk, and high tech, yet seem to be "models for safety." In these industries, safety has been "built in" to the system so that mistakes made by workers are caught or intercepted, and harm is often prevented. For example, the aviation and nuclear power industries have been developing, evaluating, and refining safety strategies for more than 50 years, establishing impressive safety records.(3) In contrast, "systems thinking" remains in its infancy in health care.
In the past 2 years, we have seen a movement toward a hospital- and health system–wide focus on and approach to "systems thinking" related to safety at Duke. A Patient Safety and Quality Assurance Committee of the Health System Board of Directors was appointed to provide oversight and guidance from the highest level of leadership. This group endorsed the formation of a Patient Safety Council, an interdisciplinary group of physicians, nurses, hospital executives, and others, charged with defining and implementing appropriate policies and procedures, standards, technologies, and educational activities to improve safety. To date, the Council's efforts have led to several major initiatives, including a computerized physician order entry system, a web-based reporting system, and an electronic automated surveillance system for detection of potential adverse drug events. Work is also under way to create an outpatient electronic medical record and systems that cue providers about important missing results. A "dashboard" of safety and quality indicators has been developed to help track performance against national benchmarks and internal historical patterns.
As we have begun to implement new strategies and technologies to improve safety, we also recognize the need to promote a culture of safety throughout the health system. We are striving to develop a culture that encourages safety awareness and reporting and stresses the shared responsibility of all Duke employees and staff to reduce risks. A number of unit-based, interdisciplinary safety teams have been created across the hospital. These teams meet regularly to discuss safety issues, prioritize safety concerns, develop and implement risk-reduction strategies, and monitor results, with the help and support of middle-management safety teams. We have also involved patients in our safety efforts by conducting safety walk rounds.(4) During these rounds, physician leaders and hospital executives join local safety teams in talking with patients and families at the bedside about safety issues and concerns.
As mentioned, a web-based voluntary reporting system has been implemented, allowing care providers to easily and confidentially report concerns related to the safety of patients. Once reported, events are analyzed and information is fed back to the reporting provider, as well as to safety teams in units or areas that may be affected. Through these efforts, we are trying to develop a culture in which adverse events and near misses are openly identified, reviewed, and responded to in a manner that allows us to share lessons learned.
Although many of these efforts are in their early stages and we still have much to do and to learn, we have nonetheless moved forward in the journey toward a safer system of care. We realize this is a long journey; the transition to an optimal culture of shared responsibility and accountability does not happen overnight. Yet we remain committed to this end.
We are often reminded that our tragic transplant mismatch was a nationally recognized medical error, one that clearly catalyzed our efforts and helped to generate sufficient resources to support our mission to improve patient safety. An obvious question follows: Can one generate such resources and make patient safety a priority without first experiencing a cataclysmic error?
We hope that the answer is yes, and, toward that end, we share our story with colleagues across the country at every opportunity. While we've heard from many whose programs have made significant changes as a result of our experience, we've also heard from others whose institutions have maintained the status quo, based on the mistaken premise that "mistakes like that could never happen here." This is truly unfortunate, since it is both wrong and shortsighted: the lessons learned from catastrophic events come at great cost, making them far too valuable to be lost as a result of complacency. We will continue to encourage colleagues to speak up and share their stories—with our experience or their own in mind—and take the lead in improving patient safety within their own organizations. We have come to believe that creating a safe system of health care depends on this type of cooperative leadership.
Karen Frush, MD
Chief Patient Safety Officer, Duke University Health System
Medical Director, Pediatric Emergency Medicine
References
1. Wachter RM, Shojania KG. Internal Bleeding: The Truth Behind America's Terrifying Epidemic of Medical Mistakes. New York, NY: Rugged Land; 2004.
2. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Institute of Medicine, Committee on Quality of Health Care in America. Washington, DC: National Academy Press; 1999. Available at: http://books.nap.edu/books/0309068371/html/
3. Leape LL. Error in medicine. JAMA. 1994;272:1851-1857.
4. Frankel A, Graydon-Baker E, Neppl C, Simmonds T, Gustafson M, Gandhi TK. Patient Safety Leadership WalkRounds. Jt Comm J Qual Saf. 2003;29:16-26.