Abstract #4926

QSM Inversion Through Parcellated Deep Neural Networks

Juan Liu1, Robin Karr2, Brad Swearingen2, Andrew Nencka1,2, and Kevin Koch1,2

1Joint Department of Biomedical Engineering, Marquette University and Medical College of Wisconsin, Milwaukee, WI, United States, 2Department of Radiology, Medical College of Wisconsin, Milwaukee, WI, United States

Quantitative Susceptibility Mapping (QSM) can estimate tissue susceptibility distributions and reveal pathology in conditions such as Parkinson's disease and multiple sclerosis. QSM reconstruction is an ill-posed inverse problem due to a mathematical singularity of the requisite dipole convolution kernel. State-of-the-art QSM reconstruction methods either suffer from image artifacts or long computation times. To overcome the limitations of these existing methods, a deep-learning-based approach is proposed and demonstrated in this work. 200 QSM datasets were used to compare current QSM reconstruction methods (TKD, closed-form L2, and MEDI) with the proposed deep-learning approach using visual scoring of streaking artifacts and image sharpness. The results of this multi-reader study showed that the deep-learning solution can produce QSM images with improved scores for both streaking artifacts and image sharpness while providing an almost instantaneous inversion through neural-network inference.
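To illustrate the dipole-kernel singularity and the TKD baseline the abstract compares against, here is a minimal sketch of thresholded k-space division (TKD) in NumPy. The kernel formula D(k) = 1/3 − kz²/|k|² (B0 along z) and the thresholding idea are standard; the function names, the voxel-size handling, and the default threshold of 0.19 are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0)):
    """Unit dipole kernel D(k) = 1/3 - kz^2/|k|^2 in k-space, B0 along z.

    D vanishes on the conical surface kz^2 = |k|^2 / 3 -- the singularity
    that makes the QSM inverse problem ill-posed.
    """
    kx, ky, kz = np.meshgrid(
        np.fft.fftfreq(shape[0], voxel_size[0]),
        np.fft.fftfreq(shape[1], voxel_size[1]),
        np.fft.fftfreq(shape[2], voxel_size[2]),
        indexing="ij",
    )
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        d = 1.0 / 3.0 - kz**2 / k2
    d[k2 == 0] = 0.0  # DC term is undefined; zero by convention
    return d

def tkd_inversion(phase, voxel_size=(1.0, 1.0, 1.0), threshold=0.19):
    """Thresholded k-space division: clip |D| near the singular cone.

    `threshold=0.19` is a commonly cited choice, used here as an assumption.
    """
    d = dipole_kernel(phase.shape, voxel_size)
    # Where |D| is below the threshold, replace 1/D by a clipped value
    # (sign(D)/threshold); elsewhere divide directly.
    d_inv = np.where(
        np.abs(d) < threshold,
        np.sign(d) / threshold,
        1.0 / np.where(d == 0, np.inf, d),  # guard avoids a divide warning
    )
    chi_k = np.fft.fftn(phase) * d_inv
    return np.real(np.fft.ifftn(chi_k))
```

Because the clipping distorts k-space content near the cone, TKD reconstructions show the streaking artifacts the abstract's reader study scores; the proposed network replaces this explicit inversion with a learned one.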

