AI Model Matches or Surpasses Humans in Detecting Cancer in Real-Time Specimen Mammography
Using artificial intelligence (AI) to detect cancer in real time during breast cancer surgery may soon become a reality, thanks to a study conducted by researchers at the University of North Carolina (UNC) School of Medicine. The study found that an AI model developed by the researchers performed as well as, or even better than, human doctors at predicting whether cancerous tissue had been fully removed during surgery.
During breast-conserving surgery, it is crucial to remove all cancerous tissue to prevent the cancer from recurring. Surgeons therefore examine the outer edges of the removed tissue, known as margins, to confirm they are negative, meaning no cancer cells are present at the edge. Specimen mammography, which provides immediate feedback and can be performed in the operating room, is commonly used for this purpose. However, its accuracy can be inconsistent, and patients may need additional surgery if cancer cells are detected at the margin later.
To address this issue, the UNC researchers developed an AI model that analyzes mammograms of the removed tissue in real time to determine whether the cancerous tissue has been fully removed. The model was trained on 821 specimen mammography images paired with the corresponding final specimen reports from pathologists. The researchers also supplied the model with patient and tumor characteristics, such as age, race, tumor type, and tumor size.
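The article does not describe the model's architecture, but a common pattern for this kind of task is to combine a convolutional image encoder with a small network for the tabular clinical features and train the fused model against the pathology labels. The PyTorch sketch below illustrates that pattern only; the class name, backbone choice, and feature dimensions are assumptions, not details from the UNC study.

```python
# A minimal sketch, assuming one common way to fuse an image encoder with
# tabular clinical features. The UNC study's actual architecture is not
# described in the article; everything named here is an illustrative assumption.
import torch
import torch.nn as nn
from torchvision import models

class MarginClassifier(nn.Module):
    def __init__(self, num_tabular_features: int = 4):
        super().__init__()
        # CNN backbone as the specimen-image encoder (pretrained weights
        # could be loaded here; omitted to keep the sketch self-contained).
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()  # expose the 512-dim image embedding
        self.image_encoder = backbone
        # Small MLP for patient/tumor features (e.g., age, race, type, size).
        self.tabular_encoder = nn.Sequential(
            nn.Linear(num_tabular_features, 32),
            nn.ReLU(),
        )
        # Fused features -> probability that the margin is positive.
        self.head = nn.Linear(512 + 32, 1)

    def forward(self, image: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_encoder(image)      # (B, 512)
        tab_feat = self.tabular_encoder(tabular)  # (B, 32)
        logit = self.head(torch.cat([img_feat, tab_feat], dim=1))
        return torch.sigmoid(logit).squeeze(1)    # (B,) margin probability

# Example forward pass with random tensors standing in for real inputs.
model = MarginClassifier()
probs = model(torch.randn(2, 3, 224, 224), torch.randn(2, 4))
```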
The results of the study showed that the AI model had a sensitivity of 85%, a specificity of 45%, and an area under the receiver operating characteristic curve (AUROC) of 0.71. Sensitivity is the proportion of truly positive margins the model correctly flags, while specificity is the proportion of truly negative margins it correctly identifies. AUROC summarizes a model's discriminative performance across all decision thresholds, with values closer to one indicating better performance.
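To make these definitions concrete, the snippet below shows how sensitivity, specificity, and AUROC are typically computed from predicted probabilities and ground-truth labels using scikit-learn. The labels and scores are toy values for illustration, not data from the study.

```python
# Illustrative only: toy labels and scores, not data from the UNC study.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical ground-truth margin labels (1 = positive margin) and
# model-predicted probabilities for a handful of specimens.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])
y_prob = np.array([0.90, 0.60, 0.70, 0.40, 0.30, 0.55, 0.80, 0.20])
y_pred = (y_prob >= 0.5).astype(int)  # decision threshold chosen for illustration

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)           # true positive rate
specificity = tn / (tn + fp)           # true negative rate
auroc = roc_auc_score(y_true, y_prob)  # threshold-independent summary

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, AUROC={auroc:.2f}")
```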
Comparing the model's accuracy with published figures for human interpretation of specimen mammography, the researchers found that it performed as well as, if not better than, human readers: previous studies have reported sensitivities of 20% to 58% and AUROCs of 0.60 to 0.73.
The AI model proved particularly effective at discerning margins in patients with higher breast density, a setting that is challenging for human readers because both dense breast tissue and tumors appear bright white on mammograms, making healthy tissue difficult to distinguish from cancerous tissue.
The researchers believe that their AI model could be particularly useful in hospitals with limited resources, where access to specialist surgeons, radiologists, or pathologists might be restricted. By providing immediate feedback and support, the model could assist surgeons in making well-informed decisions during surgeries, potentially reducing the need for additional procedures and benefiting patients.
As the AI model is still in its early stages, the researchers plan to further train it using more specimen mammography images to improve its accuracy in identifying margins. Before its implementation in a clinical setting, the model will need to be validated through additional studies.
The development of AI models like this one showcases the potential for artificial intelligence to support healthcare professionals in making critical decisions. By leveraging computer vision technology, AI models can enhance doctors’ and surgeons’ capabilities, potentially improving patient outcomes and reducing the need for repeat surgeries.
In conclusion, the UNC researchers’ AI model has demonstrated its ability to match, and at times surpass, human doctors in real-time detection of cancerous tissue during breast cancer surgery. While more research is needed, this study provides promising evidence for the integration of AI in improving surgical outcomes and enhancing patient care.