Keywords: Diagnosis/Prediction, Radiomics, Explainability, Analysis/Processing, Cancer, Machine Learning/Artificial Intelligence, Prostate, Software Tools
Motivation: Clinical use of computer-aided diagnosis systems for prostate cancer is currently hindered by their internal complexity. Explainability tools can give insight into the functioning of these machine learning (ML) models.
Goal(s): Our goal was to supplement the predictions of an MRI radiomics-based ML model for prostate cancer detection with explanations based on clinical concepts currently used in radiological assessment.
Approach: We clustered correlated MRI radiomics features into groups representing the clinical concepts underlying the PI-RADS system. We then used SHAP analysis to quantify the contribution of each concept to every predicted lesion.
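The two steps of this approach can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: features are grouped by hierarchical clustering on correlation distance (1 − |r|), and per-feature attribution values are then summed within each group to yield concept-level contributions. The random attribution values below are a placeholder for real SHAP output, and the threshold of 0.5 is an assumed cut.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# Synthetic stand-in for radiomics features: 100 lesions x 6 features,
# built so that features 0-2 correlate (one latent concept) and 3-5 correlate.
base_a = rng.normal(size=(100, 1))
base_b = rng.normal(size=(100, 1))
X = np.hstack([base_a + 0.1 * rng.normal(size=(100, 3)),
               base_b + 0.1 * rng.normal(size=(100, 3))])

# Step 1: cluster features by correlation distance (1 - |r|).
corr = np.corrcoef(X, rowvar=False)
dist = 1.0 - np.abs(corr)
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="average")
labels = fcluster(Z, t=0.5, criterion="distance")  # assumed threshold

# Step 2: aggregate per-feature attributions into per-concept contributions
# by summing within each cluster (random values stand in for SHAP output).
shap_values = rng.normal(size=X.shape)  # placeholder attributions
concepts = {c: shap_values[:, labels == c].sum(axis=1)
            for c in np.unique(labels)}
```

Each entry of `concepts` is then one per-lesion contribution score per clinical concept, which can be reported alongside the model's prediction.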
Results: Explanations grounded in clinical concepts provide interpretable insight into the ML model's lesion-level predictions.
Impact: Our machine learning pipeline combines accurate prostate cancer detection on MRI with intrinsic explainability, potentially resulting in an easier integration into clinical use.