Abstract #0509

DeepCEST 3T: Robust neural network prediction of 3T CEST MRI parameters including uncertainty quantification

Felix Glang1, Anagha Deshmane1, Sergey Prokudin2, Florian Martin1, Kai Herz1, Tobias Lindig3, Benjamin Bender3, Klaus Scheffler1,4, and Moritz Zaiss1,5
1Magnetic Resonance Center, Max Planck Institute for Biological Cybernetics, Tübingen, Germany, 2Department of Perceiving Systems, Max Planck Institute for Intelligent Systems, Tübingen, Germany, 3Department of Diagnostic and Interventional Neuroradiology, Eberhard Karls University Tübingen, Tübingen, Germany, 4Department of Biomedical Magnetic Resonance, Eberhard Karls University Tübingen, Tübingen, Germany, 5Department of Neuroradiology, University Clinic Erlangen, Erlangen, Germany

Analysis of CEST data often requires complex mathematical modeling before contrast generation, which can be error-prone and time-consuming. Here, a probabilistic deep learning approach is introduced to shortcut conventional Lorentzian fitting analysis of 3T in vivo CEST data by learning from previously evaluated data. It is demonstrated that the trained networks generalize to data of a healthy subject and a brain tumor patient, providing CEST contrasts in a fraction of the conventional evaluation time. Additionally, the probabilistic network architecture enables uncertainty quantification, indicating whether predictions are trustworthy, which is assessed by perturbation analysis.
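The abstract does not specify the network's loss function, but a common way to obtain per-voxel uncertainty from a probabilistic regression network, as described above, is to have it predict a mean and a log-variance for each output parameter and train with a heteroscedastic Gaussian negative log-likelihood. The sketch below illustrates that loss in plain NumPy; the function name and the Gaussian-output assumption are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_nll(y_true, mu, log_var):
    """Heteroscedastic Gaussian negative log-likelihood.

    The network outputs a mean (mu) and a log-variance (log_var)
    per target parameter. Minimizing this loss trains mu toward the
    target while the predicted variance grows where the data are
    ambiguous, yielding a per-prediction uncertainty estimate.
    Predicting log-variance keeps the variance positive by design.
    """
    sq_err = (y_true - mu) ** 2
    return 0.5 * np.mean(np.exp(-log_var) * sq_err + log_var)

# A perfect prediction with unit variance gives zero loss:
print(gaussian_nll(np.array([1.0]), np.array([1.0]), np.array([0.0])))  # 0.0

# For an erroneous prediction, admitting higher variance (uncertainty)
# lowers the loss, which is what lets the network flag untrustworthy voxels:
err_confident = gaussian_nll(np.array([0.0]), np.array([2.0]), np.array([0.0]))
err_uncertain = gaussian_nll(np.array([0.0]), np.array([2.0]), np.array([2.0]))
print(err_uncertain < err_confident)  # True
```

The log-variance term penalizes the network for claiming high uncertainty everywhere, so well-predicted voxels keep small variance while problematic ones (e.g. perturbed or out-of-distribution inputs) are flagged with large variance.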

