
Errors and Near Misses: What Health Care Could Learn From Aviation

Carl Macrae, PhD | December 1, 2016 

Macrae C. Errors and Near Misses: What Health Care Could Learn From Aviation. PSNet [internet]. Rockville (MD): Agency for Healthcare Research and Quality, US Department of Health and Human Services. 2016.

Perspective

Some of the most urgent challenges in patient safety concern how to identify, understand, and act on the early signs of emerging problems before those risks cause harm to patients. One important approach is to learn from minor errors and near-miss incidents, such as when a doctor records the wrong dose in a prescription but a pharmacist notices and corrects it. These brief encounters with risk are a normal part of all organized human activity and provide valuable opportunities to improve safety.(1,2)

Health care systems have mainly attempted to learn from errors and near-miss events through incident reporting systems, inspired by successes in other safety-critical industries, primarily aviation.(3,4) Perhaps the most powerful ambition was that health care would ultimately learn from failures the way aviation does, where the discovery of a faulty orange wire in one aircraft can prompt rapid action across the entire airline industry.(5)

This aspiration has not yet been realized (6) in health care, partly because key principles were lost or deemphasized as they were adapted from the aviation industry.(7) It is therefore instructive to look again at what insights might be gained from the approaches used in aviation to learn from errors and near-miss events. Some of the most useful insights concern three areas: how to create a just culture that is conducive to detecting early signs of failure; how minor errors and disruptions can be systematically investigated and understood; and how improvement actions can be organized and implemented to ensure safety is enhanced.

Identifying errors and near-miss events can be challenging. There are two broad strategies for detecting early signs of failure. One draws on the collective intelligence of staff to notice and report problems, for instance through incident reporting systems. The other relies on ubiquitous monitoring technologies that automatically flag unusual events, such as the flight data recording, or "black box," systems in commercial aircraft; similar systems have recently been trialed in operating rooms.(8) Both approaches are complex and challenging to implement effectively. Critically, both depend on a culture of safety in which staff feel comfortable providing information about errors and safety issues. Although there is general agreement about the benefits of such a culture, many health care settings have been unable to achieve it: staff are often fearful of the consequences of reporting errors and sometimes go so far as to actively hide evidence of them.(9)
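To make the second strategy concrete, the sketch below illustrates the kind of exceedance detection used in flight data monitoring: recorded parameters are replayed against predefined operating limits, and any sample outside its limits is flagged for review. This is a minimal illustration only; the parameter names, thresholds, and data layout are hypothetical, and real monitoring programs apply far richer, context-dependent logic.

    # Exceedance detection over recorded parameters: any sample outside its
    # defined operating range is flagged as an event for later review.
    # All parameter names and limits below are hypothetical illustrations.
    THRESHOLDS = {
        "descent_rate_fpm": (0, 1000),    # feet per minute on final approach
        "approach_speed_kt": (120, 140),  # knots on final approach
    }

    def find_exceedances(samples):
        """Return every (timestamp, parameter, value) outside its limits."""
        events = []
        for timestamp, parameter, value in samples:
            low, high = THRESHOLDS[parameter]
            if not low <= value <= high:
                events.append((timestamp, parameter, value))
        return events

    # Replaying three recorded samples flags the single out-of-range one.
    recorded = [
        (1021, "descent_rate_fpm", 850),
        (1022, "descent_rate_fpm", 1350),  # exceeds the 1,000 fpm limit
        (1023, "approach_speed_kt", 132),
    ]
    print(find_exceedances(recorded))  # [(1022, 'descent_rate_fpm', 1350)]

The same basic pattern, applied to continuously recorded clinical or device data, is broadly the idea behind the operating room "black box" experiments cited above.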

A culture of safety requires negotiating a delicate balance of rights and responsibilities, embedded in both cultural norms and legal protections. Aviation has spent decades establishing a just culture of safety, with careful agreements regarding what safety data is collected, who has access to it, and how it may be used. Within individual airlines, staff are assured that all safety data will be held by an independent safety unit and will not be used for disciplinary or punitive purposes unless events indicate recklessness or other wrongdoing.(10) These protections are replicated across the entire industry. For example, European just culture legislation that took effect in November 2015 extended a range of legal protections around the appropriate use of aviation safety data.(11) Such just culture principles remain patchily developed across many health care systems. Without this clarity and consistency, the very first step of learning from error (noticing and sharing that one has occurred) may not happen.

Analyzing and understanding minor errors and near misses can be difficult. The core purpose of analyzing errors is to identify weaknesses in systems and practices.(12) One practical difficulty is resolving the inherent ambiguity of near-miss events: does the fact that there was ultimately no harm indicate that systems worked effectively and are safe, or that systems are worryingly risky and degraded? A range of sophisticated methods has been developed to analyze the structure and causes of error, and these have been applied across different health care contexts.

But beyond the technical analysis lies the more challenging judgment of what constitutes an acceptable level of reliability and risk in any given situation. This question arises in every risky industry. What is deemed an acceptable level of risk for a fighter jet or a shuttle launch is not appropriate for a commercial passenger flight. Equally, what is acceptable in an emergency department during a mass casualty event may not be appropriate in a community pharmacy. Determining the acceptable level of risk in each specific context, along with the safety strategies appropriate to achieve it, remains urgently needed work in health care.(13)

One of the key insights from aviation comes from examining how the most serious accidents and incidents are responded to, which has shaped the way minor events and near misses are now investigated. The most serious air accidents are rigorously investigated by an independent investigation body that works to understand the system-wide causes and issues recommendations to any relevant organization, from regulators to equipment manufacturers to training providers. Equally, investigations into minor errors or incidents within an individual airline now routinely consider interactions that span the aviation system: an event may be passed to the relevant equipment manufacturer to examine its contribution, and input is routinely requested from other service providers such as an air traffic control organization, a regional airport, or a maintenance provider.

Most health care systems, by contrast, struggle to conduct integrated investigations that span multiple organizations. But change may be coming: in 2016 the English National Health Service launched a new Healthcare Safety Investigation Branch modeled on air accident investigation bodies.(14,15) This body has the potential to demonstrate the kind of integrated, systematic investigation that many health care systems might learn from.

The ultimate purpose of analyzing errors and near-miss events is to improve safety. One of the most striking differences between health care and aviation is health care's bias toward collecting and analyzing large quantities of incident data, compared with aviation's bias toward prompt investigation and action. Learning and improvement is a contact sport: it requires people to actively reflect on their own practices and work collaboratively to reorganize systems. This participative approach is built into aviation incident investigation.(16) Investigations are conducted in close collaboration with those involved in an event and with those who must make changes to improve safety.

In aviation, improvements can be rapidly integrated into practice thanks to the close coupling of activities such as simulation training with the continual updating and use of standardized procedures for routine operations. It is common for aviation investigation reports to describe changes that have already been made following an event, rather than to issue recommendations for future action. This is because incident investigation is closely integrated with active processes of reflective inquiry and improvement rather than being a passive process of analysis. The recent development of the "RCA2: Root Cause Analysis and Action" model of health care investigation (17) represents an important and much-needed reorientation toward improvement as the ultimate objective.

Setting aside the radical differences between the work of treating patients and transporting passengers, some of the greatest insights to be gained from aviation reflect the integrated infrastructure that supports open and honest inquiry into errors, investigation of systems, and system-wide improvement. Many health care systems have already made great strides forward in the complex task of improving patient safety. One of the next priorities should be to create integrated systems of safety analysis and improvement that recognize the complexities of the social, technical, and cultural processes needed to learn from the past and improve the future.

Carl Macrae, PhD
Senior Research Fellow, Risk and Safety Group
Department of Experimental Psychology
University of Oxford, UK

References

1. Carthey J, de Leval MR, Reason JT. Institutional resilience in healthcare systems. Qual Health Care. 2001;10:29-32.

2. Hollnagel E, Braithwaite J, Wears RL. How to make health care resilient. In: Resilient Health Care. Hollnagel E, Braithwaite J, Wears RL (eds). Boca Raton, FL: CRC Press; 2013. ISBN: 9781409469780

3. Kohn L, Corrigan J, Donaldson M, eds. To Err Is Human: Building a Safer Health System. Washington, DC: Committee on Quality of Health Care in America, Institute of Medicine. National Academies Press; 1999. ISBN: 9780309068376.

4. Donaldson L. An Organisation with a Memory: Report of an Expert Group on Learning from Adverse Events in the NHS Chaired by the Chief Medical Officer. London, England: The Stationery Office; 2000.

5. Donaldson L. When will health care pass the orange-wire test? Lancet. 2004;364:1567-1568.

6. Mitchell I, Schuster A, Smith K, Pronovost P, Wu A. Patient safety incident reporting: a qualitative study of thoughts and perceptions of experts 15 years after "To Err Is Human." BMJ Qual Saf. 2016;25:92-99.

7. Macrae C. The problem with incident reporting. BMJ Qual Saf. 2016;25:71-75.

8. Bowermaster R, Miller M, Ashcraft T, et al. Application of the aviation black box principle in pediatric cardiac surgery: tracking all failures in the pediatric cardiac operating room. J Am Coll Surg. 2015;220:149-155.

9. Kirkup B. The Report of the Morecambe Bay Investigation. London, UK: The Stationery Office; 2015. ISBN: 9780108561306.

10. Macrae C. Close Calls: Managing Risk and Resilience in Airline Flight Safety. Basingstoke, UK: Palgrave Macmillan; 2014. ISBN: 9780230220843.

11. Regulation (EU) No 376/2014 of the European Parliament and of the Council of 3 April 2014 on the reporting, analysis and follow-up of occurrences in civil aviation. OJ. 2014;L122:18-43.

12. Vincent CA. Analysis of clinical incidents: a window on the system not a search for root causes. Qual Saf Health Care. 2004;13:242-243.

13. Vincent CA, Amalberti R. Safer Healthcare: Strategies for the Real World. New York, NY: SpringerOpen; 2016. ISBN: 9783319255576.

14. Macrae C, Vincent C. Learning from failure: the need for independent safety investigation in healthcare. J R Soc Med. 2014;107:439-443.

15. Report of the Expert Advisory Group: Healthcare Safety Investigation Branch. London, UK: Parliament; May 2016.

16. Macrae C. Learning from patient safety incidents: creating participative risk regulation in healthcare. Health Risk Soc. 2008;10:53-67.

17. RCA2: Improving Root Cause Analysis and Actions to Prevent Harm. Boston, MA: National Patient Safety Foundation; 2015.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report’s contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.