Abstract #0712

Deep learning-based stroke segmentation and patient outcome prediction

Hae Sol Moon1, Lindsay Heffron2, Ali Mahzarnia3, Barnabas Obeng-Gyasi3, Matthew Holbrook3, Cristian T. Badea1,3, Wuwei Feng4, and Alexandra Badea1,3,4,5
1Biomedical Engineering, Duke University, Durham, NC, United States, 2Orthopaedic Surgery, Duke University School of Medicine, Durham, NC, United States, 3Radiology, Duke University School of Medicine, Durham, NC, United States, 4Neurology, Duke University School of Medicine, Durham, NC, United States, 5Brain Imaging and Analysis Center, Duke University School of Medicine, Durham, NC, United States

Synopsis

Keywords: Multimodal; Data Analysis; Deep Learning; Segmentation; Lesion Load; Stroke

We compared the ability of 2D and 3D U-Net convolutional neural network (CNN) architectures to segment ischemic stroke lesions and predict patient outcome using single-contrast (DWI) and dual-contrast (T2w FLAIR and DWI) images. The predicted lesion segmentation metrics and lesion location relative to the corticospinal tract correlated with post-stroke patient outcome as measured by the National Institutes of Health Stroke Scale (NIHSS). The 2D multi-modal CNN achieved the best results, with a mean Dice of 0.74. The highest correlation was between weighted lesion load and both baseline and 90-day NIHSS (80%, p<0.001). Our results support that multi-contrast MR helps automate lesion segmentation and predict post-stroke outcomes.
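To illustrate the dual-contrast input described above, the following minimal PyTorch sketch shows a two-level 2D U-Net taking FLAIR and DWI stacked as two input channels. This is an assumed toy implementation, not the authors' network: the class name TinyUNet2D, the channel widths, and the depth are hypothetical, and a real segmentation model would use more levels plus normalization.

# Minimal sketch (not the authors' code): a tiny 2D U-Net whose
# two input channels carry a FLAIR slice and a DWI slice.
import torch
import torch.nn as nn

def block(cin, cout):
    # Two 3x3 convolutions with ReLU, the standard U-Net building block.
    return nn.Sequential(
        nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))

class TinyUNet2D(nn.Module):
    """Two-level U-Net; hypothetical widths/depth for illustration only."""
    def __init__(self, in_ch=2, out_ch=1):  # in_ch=2 -> FLAIR + DWI channels
        super().__init__()
        self.enc1, self.enc2 = block(in_ch, 32), block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.up = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec = block(64, 32)  # 64 = 32 upsampled + 32 skip channels
        self.head = nn.Conv2d(32, out_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)                      # (B, 32, H, W)
        e2 = self.enc2(self.pool(e1))          # (B, 64, H/2, W/2)
        d = self.dec(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d)                    # logits; sigmoid -> lesion mask

# Usage: stack co-registered contrasts as channels (H, W must be even here).
# x = torch.stack([flair_slice, dwi_slice], dim=0).unsqueeze(0)  # (1, 2, H, W)
# logits = TinyUNet2D()(x)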

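The Dice score and weighted lesion load mentioned above can be computed from binary masks. The sketch below assumes NumPy arrays on a common grid and uses SciPy's pearsonr for the NIHSS correlation; weighting by a corticospinal-tract probability map is one plausible definition, since the abstract does not spell out the exact weighting scheme.

# Minimal sketch (not the authors' code): Dice overlap and a simple
# weighted lesion load, assuming co-registered binary NumPy masks.
import numpy as np
from scipy.stats import pearsonr  # assumed choice; the abstract names no test

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def weighted_lesion_load(lesion: np.ndarray, cst_prob: np.ndarray) -> float:
    """Lesion voxels weighted by a corticospinal-tract probability map
    (an assumed definition; the exact weighting is not given)."""
    return float((lesion.astype(float) * cst_prob).sum())

# Example: correlate per-patient weighted lesion load with NIHSS scores.
# loads = [weighted_lesion_load(m, cst) for m, cst in patient_masks]
# r, p = pearsonr(loads, nihss_scores)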