A Small Percentage of Women Trust AI Alone to Interpret Mammograms

Published Date: April 21, 2025

Recent research published in Radiology: Imaging Cancer indicates that only about 4% of women are comfortable with artificial intelligence serving as the sole reader of their mammograms.

AI technology has been rapidly adopted by hospitals and radiology practices to aid in breast cancer detection, often with little input from patients.

Researchers at the University of Texas Southwestern Medical Center in Dallas recently conducted a study to gauge consumer attitudes toward AI-assisted software in medical imaging. The survey, which included 518 women, found that while most participants were not in favor of AI performing solo interpretations, approximately 71% indicated they would be comfortable with AI acting as a second reader.

"Patient perspectives are crucial because successful AI implementation in medical imaging depends on trust and acceptance from those we aim to serve. If patients are hesitant or skeptical about AI's role in their care, this could impact screening adherence and, consequently, overall healthcare outcomes," stated study author Basak E. Dogan, MD, director of breast imaging research at UT Southwestern.

Dogan and colleagues administered their 29-question survey between February and August 2023 to all women undergoing screening mammography at their institution. Most respondents were between the ages of 40 and 69 (about 73%), were college graduates (67%) and identified as non-Hispanic white (51%). Only 23 of the 518 patients surveyed said they were comfortable with AI as a solo interpreter, while 368 preferred the technology to be used as a second reader.


If an AI-reported screening result came back abnormal, 89% of women said they would want a radiologist to review it before a follow-up appointment was scheduled, compared with about 51% who wanted AI to review a radiologist-initiated recall. Higher educational attainment and greater knowledge of AI were both associated with greater acceptance of the technology. Race also played a role: Latino and Black patients expressed more concern about AI bias than white participants.


"These results suggest that demographic factors play a complex role in shaping patient trust and perceptions of AI in breast imaging,” Dogan said in the announcement.

Medical history also appeared to influence individuals’ trust in AI, “emphasizing the need for personalized AI integration strategies.” For instance, patients who had a close relative diagnosed with breast cancer were more likely to request additional reviews, yet they showed a high degree of trust in both AI and radiologist reviews when a mammogram came back normal.

“Our study shows that trust in AI is highly individualized, influenced by factors such as prior medical experiences, education and racial background,” Dogan added. “Incorporating patient perspectives into AI implementation strategies ensures that these technologies improve and not hinder patient care, fostering trust and adherence to imaging reports and recommendations.”
