Abstract #1387

Prediction of IDH Status Using Hierarchical Attention-Based Deep 3D Multiple Instance Learning

Qinqin Xie1, Yuxia Liang2, Yu Shang3, Jin Wang1, Ming Zhang2, and Chen Niu2
1Xi'an Jiaotong University, Xi'an, China, 2The First Affiliated Hospital of Xi'an Jiaotong University, Xi'an, China, 3School of Future Technology, Xi'an Jiaotong University, Xi'an, China

Synopsis

Keywords: Diagnosis/Prediction, Brain

Motivation: Accurate prediction of isocitrate dehydrogenase (IDH) mutations from multimodal MRI remains challenging due to tumor heterogeneity.

Goal(s): We propose a Hierarchical Attention-Based Multiple Instance Learning (HAB-MIL) framework for preoperative IDH prediction.

Approach: The method extracts features associated with IDH mutations and incorporates positional encoding to strengthen spatial position cues. It then applies attention-based gated pooling over 3D instances to better relate instance-level features to the patient-level label.
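
The gated pooling step might resemble the following PyTorch sketch of gated attention-based MIL pooling; the class name, layer sizes, and backbone feature dimension are assumptions, as the abstract does not specify HAB-MIL's exact architecture:

import torch
import torch.nn as nn

class GatedAttentionPooling(nn.Module):
    # Aggregates a bag of 3D-instance embeddings into one bag-level embedding.
    def __init__(self, dim=512, hidden=128):
        super().__init__()
        self.V = nn.Linear(dim, hidden)   # content branch (tanh)
        self.U = nn.Linear(dim, hidden)   # gating branch (sigmoid)
        self.w = nn.Linear(hidden, 1)     # scalar attention score per instance

    def forward(self, h):                 # h: (num_instances, dim)
        scores = self.w(torch.tanh(self.V(h)) * torch.sigmoid(self.U(h)))
        a = torch.softmax(scores, dim=0)  # weights sum to 1 over the bag
        return (a * h).sum(dim=0), a      # bag embedding + weights for inspection

# Usage: features from a 3D CNN backbone, one bag of 32 instances per patient.
pool = GatedAttentionPooling(dim=512)
bag_embedding, attention = pool(torch.randn(32, 512))

The returned attention weights are what make this pooling interpretable: high-weight instances mark the 3D regions that contribute most to the patient-level prediction.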

Results: Our method achieved an AUC of 91.1% and an accuracy of 93.7% on the TCIA dataset, outperforming state-of-the-art methods. Grad-CAM was used to visualize the regions driving the model's predictions.
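
For the visualization step, a minimal hook-based Grad-CAM sketch in plain PyTorch is given below; the function, layer choice, and tensor shapes are assumptions rather than the authors' exact implementation:

import torch

def grad_cam(model, layer, x, class_idx):
    # Capture the target layer's activations and their gradients via hooks.
    feats, grads = {}, {}
    h1 = layer.register_forward_hook(lambda m, i, o: feats.update(a=o))
    h2 = layer.register_full_backward_hook(lambda m, gi, go: grads.update(g=go[0]))
    model.zero_grad()
    model(x)[0, class_idx].backward()     # assumes output shape (N, num_classes)
    h1.remove(); h2.remove()
    # Channel weights: global average of gradients over the D, H, W axes.
    w = grads["g"].mean(dim=(2, 3, 4), keepdim=True)
    cam = torch.relu((w * feats["a"]).sum(dim=1))   # weighted sum of feature maps
    return cam / (cam.max() + 1e-8)                 # normalize to [0, 1]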

Impact: The dynamic attention mechanism in HAB-MIL effectively captures tumor-related features for IDH prediction while fully leveraging tumor positional information. This improves the model's interpretability, offering more valuable support for clinical diagnosis.
