Abstract #4478

Neural Bloch-McConnell fitting (NBMF): unsupervised test-time learning of clinical semisolid MT/CEST MRF reconstruction

Alex Finkelstein1, Nikita Vladimirov1, Simon Weinmüller2, Moritz Zaiss2,3,4, and Or Perlman1,5
1Department of Biomedical Engineering, Tel Aviv University, Tel-Aviv, Israel, 2Institute of Neuroradiology, University Hospital Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), Erlangen, Germany, 3Magnetic Resonance Center, Max-Planck-Institute for Biological Cybernetics, Tübingen, Germany, 4Department of Artificial Intelligence in Biomedical Engineering, Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), Erlangen, Germany, 5Sagol School of Neuroscience, Tel Aviv University, Tel-Aviv, Israel

Synopsis

Keywords: CEST / APT / NOE, Molecular Imaging, AI, Deep Learning, Unsupervised Learning, Bloch-McConnell, Differentiable Physics

Motivation: MRF-based quantification of semisolid MT/CEST proton exchange requires computationally demanding dictionary synthesis and matching. Recently reported unsupervised learning alternatives were incompatible with pulsed clinical CEST saturation and multi-pool imaging.

Goal(s): To develop a training-set-free MRF reconstruction method that learns directly from the acquired data via pulsed-saturation-compatible physical modeling.

Approach: A differentiable multi-pool Bloch-McConnell simulator was designed and embedded within a test-time learning framework. Validation was performed using L-arginine phantoms and a human subject at 3T.
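
The sketch below illustrates the core idea behind this approach, under simplifying assumptions that are not part of the abstract: a differentiable two-pool Bloch-McConnell simulator written in JAX (with optax for optimization), driven by continuous-wave rather than pulsed saturation, and fitted to a single voxel's signal trajectory by plain gradient descent at test time rather than through the authors' learning framework. All sequence, tissue, and exchange values, as well as the acquisition schedule and parameterization, are illustrative placeholders, not the NBMF protocol.

```python
# Minimal illustrative sketch (not the authors' implementation): a differentiable
# two-pool Bloch-McConnell simulator in JAX, fitted to one voxel's "measured"
# trajectory by gradient descent at test time. Assumes CW saturation, a single
# CEST pool, no inter-frame readout/recovery, and illustrative parameter values.
import jax
import jax.numpy as jnp
import optax

jax.config.update("jax_enable_x64", True)
GAMMA = 267.522  # proton gyromagnetic ratio, rad s^-1 uT^-1

def expm_taylor(A, n_squarings=14, order=12):
    """Differentiable matrix exponential via scaling-and-squaring + Taylor series."""
    A = A / (2.0 ** n_squarings)
    E, term = jnp.eye(A.shape[0]), jnp.eye(A.shape[0])
    for k in range(1, order + 1):
        term = term @ A / k
        E = E + term
    for _ in range(n_squarings):
        E = E @ E
    return E

def bm_matrix(fb, kb, r1a, r2a, r1b, r2b, dwa, dwb, w1):
    """Augmented 7x7 Bloch-McConnell generator for state (Max,May,Maz,Mbx,Mby,Mbz,1)."""
    ka = fb * kb  # water->solute pseudo-first-order rate from mass balance
    A = jnp.zeros((7, 7))
    A = A.at[0, 0].set(-r2a - ka); A = A.at[0, 1].set(dwa);       A = A.at[0, 3].set(kb)
    A = A.at[1, 0].set(-dwa);      A = A.at[1, 1].set(-r2a - ka); A = A.at[1, 2].set(w1); A = A.at[1, 4].set(kb)
    A = A.at[2, 1].set(-w1);       A = A.at[2, 2].set(-r1a - ka); A = A.at[2, 5].set(kb); A = A.at[2, 6].set(r1a)
    A = A.at[3, 0].set(ka);        A = A.at[3, 3].set(-r2b - kb); A = A.at[3, 4].set(dwb)
    A = A.at[4, 1].set(ka);        A = A.at[4, 3].set(-dwb);      A = A.at[4, 4].set(-r2b - kb); A = A.at[4, 5].set(w1)
    A = A.at[5, 2].set(ka);        A = A.at[5, 4].set(-w1);       A = A.at[5, 5].set(-r1b - kb); A = A.at[5, 6].set(r1b * fb)
    return A

def frame_signal(theta, b1_ut, offset, tsat, fixed):
    """Normalized water Mz after CW saturation of duration tsat (one MRF frame)."""
    fb, kb = theta[0] * 1e-3, theta[1] * 1e2  # scaled unknowns -> physical units
    r1a, r2a, r1b, r2b, shift = fixed
    A = bm_matrix(fb, kb, r1a, r2a, r1b, r2b,
                  dwa=offset, dwb=offset - shift, w1=GAMMA * b1_ut)
    m_eq = jnp.array([0., 0., 1., 0., 0., fb, 1.])  # thermal equilibrium, M0a = 1
    return (expm_taylor(A * tsat) @ m_eq)[2]

# Hypothetical MRF schedule: varied saturation powers at a fixed 3 ppm offset (3T)
b1_sched = jnp.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])   # uT
offset = 2 * jnp.pi * 3.0 * 127.7                                  # rad/s
fixed = (1 / 1.3, 1 / 0.07, 1 / 1.0, 1 / 0.01, offset)             # R1a,R2a,R1b,R2b, solute shift
simulate = jax.vmap(lambda th, b1: frame_signal(th, b1, offset, 2.0, fixed), (None, 0))

def loss(theta, measured):
    return jnp.mean((simulate(theta, b1_sched) - measured) ** 2)

# "Measured" trajectory: synthesized here with known parameters purely for demonstration
theta_true = jnp.array([3.0, 3.0])        # fb = 0.003, kb = 300 s^-1 (illustrative)
measured = simulate(theta_true, b1_sched)

theta = jnp.array([1.0, 1.0])             # crude initialization
opt = optax.adam(5e-2)
opt_state = opt.init(theta)

@jax.jit
def step(theta, opt_state, measured):
    grads = jax.grad(loss)(theta, measured)   # gradients flow through the simulator
    updates, opt_state = opt.update(grads, opt_state)
    return optax.apply_updates(theta, updates), opt_state

for _ in range(1000):                      # unsupervised test-time fit of this voxel
    theta, opt_state = step(theta, opt_state, measured)
print("fitted fb, kb:", theta[0] * 1e-3, theta[1] * 1e2)
```

In this toy two-parameter setting a few hundred Adam steps typically suffice; the key design choice it illustrates is that, once the spin-physics model is differentiable, the acquired data themselves supply the training signal, so no simulated dictionary or labeled training set is needed.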

Results: The method enabled quantitative MT/CEST reconstruction in ~1 minute. The resulting maps were highly correlated with the ground truth in vitro (Pearson's r > 0.95). In vivo, semisolid volume fractions were in agreement with MRF-based maps (r ≈ 0.8).

Impact: A one-stop-shop for semisolid MT and CEST MRF reconstruction was developed, enabling training-set-free, rapid quantification of exchange parameters on clinical scanners. This accessible approach could allow a variety of Bloch-fitting applications to benefit from deep learning through differentiable spin physics.
