
Artificial Intelligence and Diagnostic Errors

January 31, 2020 

Hall KK, Fitall E. Artificial Intelligence and Diagnostic Errors. PSNet [internet]. Rockville (MD): Agency for Healthcare Research and Quality, US Department of Health and Human Services. 2020.

Definition of Artificial Intelligence

The definition of artificial intelligence (AI) used in healthcare is broad, with no true consensus among experts. As a general concept, however, AI refers to a computer applying human intellectual characteristics to problem solving, namely the ability to reason, to make generalizations, and to learn from previous experience.[1],[2] AI is a term that applies across numerous technologies, including machine learning, natural language processing, rule-based expert systems, physical robots, and robotic process automation.[3] In healthcare, AI is currently applied in diagnostics, population health management, patient engagement, promotion of patient adherence, and administrative activities.3 However, the focus of this Perspective is on the use of AI in healthcare diagnostics, specifically imaging.

Current Use of Artificial Intelligence in Diagnostic Imaging

Medical imaging is one of the most promising areas for the application and innovative use of AI. The use of AI in radiology has the potential to improve the efficiency and efficacy of medical imaging. It may also alleviate some of the burden and burnout experienced by radiologists who feel overwhelmed by the growing volume of imaging studies and unable to devote sufficient time to providing meaningful, patient-centric care.

AI can support the diagnostic imaging process at several points, including acquiring the image, processing the image, interpreting the findings, determining follow-up care, and selecting appropriate data storage.2 When an imaging study is conducted, AI can improve the quality of the image captured. AI systems can detect at the time of imaging whether the quality of the data acquired is optimal for analysis and alert radiologists should additional scans be necessary. Automated protocols can also ensure that no necessary components of the scan are overlooked during the examination and that all required images are captured.[4] Further, AI systems can learn the features of a high-quality image, apply computational strategies to increase the odds of producing such an image, and automatically compensate for distortions.4,[5],[6] As a result, AI use during image capture can optimize staffing, reduce scanner time, and decrease the radiation dose delivered to the patient.2
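To make the acquisition-quality step concrete, the sketch below shows one way an automated quality gate might work. It is a minimal illustration, assuming a simple sharpness heuristic (variance of the Laplacian) in place of a trained quality model; the threshold and function names are hypothetical.

import numpy as np
from scipy import ndimage

def image_quality_score(image: np.ndarray) -> float:
    # Crude sharpness proxy: variance of the Laplacian of the image.
    return float(np.var(ndimage.laplace(image.astype(np.float64))))

def needs_rescan(image: np.ndarray, threshold: float = 0.1) -> bool:
    # Flag the acquisition for repeat imaging if quality falls below the (illustrative) threshold.
    return image_quality_score(image) < threshold

rng = np.random.default_rng(0)
sharp = rng.random((256, 256))                            # stand-in for a crisp acquisition
blurry = ndimage.gaussian_filter(sharp, sigma=5)          # stand-in for a degraded acquisition
print("Sharp scan needs rescan:", needs_rescan(sharp))    # expected: False
print("Blurry scan needs rescan:", needs_rescan(blurry))  # expected: True

In practice such a check would run at the scanner so the technologist can repeat a degraded series while the patient is still present, rather than after the radiologist opens the study.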

Once the image has been captured, AI can support its analysis, an area of rapid growth.3 AI algorithms learn patterns from large sets of images and then use pattern recognition to identify abnormalities, whether by flagging apparently abnormal findings for review or by directly identifying masses and fractures.2,3 AI may be particularly beneficial with imaging modalities that produce a large number of images per study, such as MRI, because an electronic system can efficiently review far more images than would be feasible for an individual provider. AI can then support diagnosis and treatment decision making by facilitating integration of the imaging results into the patient's electronic medical record, where the image can be used alongside clinical data and medical history in computer-aided diagnosis. In some instances, AI may even predict which treatment protocols are most likely to be successful.3
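As a rough illustration of the pattern-recognition step, the following sketch trains a classifier on synthetic "slices" and flags any new image whose predicted probability of abnormality is high. The data, model choice, and decision threshold are illustrative assumptions, not a description of any deployed product.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def make_slice(abnormal: bool) -> np.ndarray:
    # Simulate a 32x32 image; "abnormal" slices contain a bright blob standing in for a mass.
    img = rng.normal(0.0, 1.0, (32, 32))
    if abnormal:
        img[12:20, 12:20] += 3.0
    return img

# Labelled training set: 1 = abnormal, 0 = normal.
X = np.array([make_slice(i % 2 == 1).ravel() for i in range(400)])
y = np.array([i % 2 for i in range(400)])
model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new slice and route it: high probability -> flag for priority radiologist review.
new_slice = make_slice(abnormal=True).ravel().reshape(1, -1)
prob = model.predict_proba(new_slice)[0, 1]
print(f"Predicted probability abnormal: {prob:.2f}")
print("Flag for review" if prob > 0.5 else "Routine queue")

Real systems typically use deep convolutional networks trained on curated clinical images, but the workflow is the same: score each image, then prioritize or flag studies for the human reader.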

Once the imaging study has been conducted, AI systems can help ensure continuity in provider communication and patient care. For example, AI can review patient records to ensure that an imaging diagnosis is consistent with the radiology report and that there is an associated treatment plan, alerting providers to any discrepancies.4 This helps ensure that findings from radiology reports are addressed expeditiously and can avoid unnecessary patient return visits.
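A simplified sketch of such a reconciliation check appears below. The record structure, finding keywords, and alert logic are hypothetical; a production system would work from coded report data rather than keyword matching.

from dataclasses import dataclass, field
from typing import List

# Findings that, under this illustrative rule set, require a documented follow-up plan.
FOLLOW_UP_FINDINGS = {"pulmonary nodule", "incidental mass", "suspicious lesion"}

@dataclass
class PatientRecord:
    report_findings: List[str]
    follow_up_orders: List[str] = field(default_factory=list)

def unreconciled_findings(record: PatientRecord) -> List[str]:
    # Return findings from the radiology report that have no associated follow-up order.
    orders_text = " ".join(record.follow_up_orders).lower()
    return [f for f in record.report_findings
            if f.lower() in FOLLOW_UP_FINDINGS and f.lower() not in orders_text]

record = PatientRecord(
    report_findings=["pulmonary nodule", "degenerative changes"],
    follow_up_orders=["routine labs"],
)
for finding in unreconciled_findings(record):
    print(f"ALERT: '{finding}' in the radiology report has no documented follow-up plan.")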

Moving Beyond Imaging

As the technology supporting AI and the sophistication of its applications continue to advance, the role AI plays in imaging diagnostics will likely expand. For example, with improvements in image analysis, systems may be used to autonomously triage patients for review by a radiologist.[7] Additionally, as predictive algorithms become more advanced and adaptive, the role of AI in the review of both pathology and radiology images will grow.3 AI can also be expected to play a more direct role in recommending treatment protocols.

Risks Associated With the Implementation of AI

AI in imaging has already demonstrated considerable potential to improve patient safety by enhancing imaging processes, aiding physician diagnosis, and minimizing discrepancies. However, several ethical concerns directly related to patient safety must be addressed as the use of AI becomes more pervasive and plays a greater role in patient diagnosis. The first concerns evaluation of the technology: what level of accuracy is required, and conversely, what percentage of misses is acceptable, before AI can substitute for review and decision making by a human? Establishing a standardized benchmark for what constitutes “good enough” in AI products may be beneficial both for Food and Drug Administration (FDA) approval processes and for guiding use within facilities.
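As a simple illustration of what such a benchmark might involve, the sketch below computes sensitivity, specificity, and miss rate from hypothetical validation counts and compares them against an assumed performance bar. The counts and thresholds are illustrative only and do not reflect any FDA standard.

def evaluate_reader(tp: int, fp: int, tn: int, fn: int,
                    min_sensitivity: float = 0.95, min_specificity: float = 0.85) -> bool:
    sensitivity = tp / (tp + fn)   # share of true abnormalities the system catches
    specificity = tn / (tn + fp)   # share of normal studies it correctly clears
    miss_rate = fn / (tp + fn)     # the "percentage of misses"
    print(f"Sensitivity {sensitivity:.3f}, specificity {specificity:.3f}, miss rate {miss_rate:.3f}")
    return sensitivity >= min_sensitivity and specificity >= min_specificity

# Hypothetical validation results on 1,000 studies (200 abnormal, 800 normal).
if evaluate_reader(tp=180, fp=60, tn=740, fn=20):
    print("Meets the illustrative benchmark.")
else:
    print("Falls short of the benchmark; human review remains primary.")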

This first consideration leads directly to the second question: accountability. Should the use of AI directly or indirectly lead to misdiagnosis and improper treatment recommendations, who is (and who should be) held at fault? Similarly, should a physician opt not to use available AI and the patient be misdiagnosed, is the physician accountable for that decision? In either instance, is it possible to prove that using or not using AI would have led to a different result for the patient?

Finally, while AI is intended to reduce diagnostic errors, there is a risk that its use will introduce new types of errors, several of which are detailed in a 2019 analysis by Challen et al.7 One example is error resulting from discrepancies between the data used to train an AI system and the real-world clinical scenario, a mismatch driven by the limited availability of high-quality training data. AI systems are not as equipped as humans to recognize when a relevant change in context or data undermines the validity of learned predictive assumptions; they may therefore apply a programmed assessment methodology inappropriately, resulting in error. Another example is insensitivity to potential impact: AI systems may not be trained, as humans are, to ‘err on the side of caution’. While that human tendency can produce more false positives, it may be appropriate when the alternative is a serious safety outcome for the patient.7
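One pragmatic safeguard against the training/deployment mismatch described above is to monitor whether incoming data still resemble the data the model was trained on. The sketch below does this with a simple two-sample test on summary statistics; the data, threshold, and response are illustrative assumptions, not a method drawn from the cited analysis.

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Pixel-intensity summaries from the training data (assumed to be available).
training_intensities = rng.normal(loc=0.0, scale=1.0, size=5000)

# Incoming studies from, say, a new scanner with a different intensity profile.
deployment_intensities = rng.normal(loc=0.6, scale=1.3, size=500)

result = ks_2samp(training_intensities, deployment_intensities)
if result.pvalue < 0.01:
    print(f"Distribution shift detected (KS statistic {result.statistic:.2f}); "
          "route these studies for human review before trusting model output.")
else:
    print("Inputs resemble the training data; automated assessment may proceed.")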

Conclusion

Despite the potential of AI in diagnostic imaging, in the short term it is most likely to complement rather than replace traditional approaches used by radiologists. With many unanswered questions associated with the use of AI and concerns regarding the introduction of new patient safety risks, AI will continue to serve as an adjunct to, rather than a substitute for, the radiologist. However, appropriately incorporated, AI has the potential to alleviate some of the workflow burden experienced by radiologists and allow them to spend more time on other aspects of their role in caring for the patient, including providing emotional support and guidance, performing interventional procedures, and participating in multidisciplinary clinical team patient safety initiatives.2

Kendall K. Hall, MD, MS
Managing Director, IMPAQ Health
IMPAQ International
Columbia, MD

Eleanor Fitall, MPH
Research Associate, IMPAQ Health
IMPAQ International
Washington, DC

References

[1] Bali J, Garg R, Bali RT. Artificial intelligence (AI) in healthcare and biomedical research: Why a strong computational/AI bioethics framework is required? Indian J Ophthalmol. 2019;67(1):3-6.

[2] Pesapane F, Codari M, Sardanelli F. Artificial intelligence in medical imaging: threat or opportunity? Radiologists again at the forefront of innovation in medicine. Eur Radiol Exp. 2018;2:35.

[3] Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019;6(2):94-98.

[4] Souquet J. AI is transforming diagnostic imaging. beckershospitalreview.com. https://www.beckershospitalreview.com/healthcare-information-technology/ai-is-transforming-diagnostic-imaging.html. Published December 3, 2018. Accessed December 19, 2019.

[5] Davoudi N, Deán-Ben XL, Razansky D. Deep learning optoacoustic tomography with sparse data. Nat Mach Intell. 2019.

[6] Improving the quality of medical imaging with artificial intelligence. nih.gov. Published July 2, 2018. Accessed December 19, 2019. 

[7] Challen R, Denny J, Pitt M, et al. Artificial intelligence, bias and clinical safety. BMJ Qual Saf. 2019;28(3):231-237.

This project was funded under contract number 75Q80119C00004 from the Agency for Healthcare Research and Quality (AHRQ), U.S. Department of Health and Human Services. The authors are solely responsible for this report’s contents, findings, and conclusions, which do not necessarily represent the views of AHRQ. Readers should not interpret any statement in this report as an official position of AHRQ or of the U.S. Department of Health and Human Services. None of the authors has any affiliation or financial involvement that conflicts with the material presented in this report.