Abstract #0330

Unsupervised physics-informed deep learning (N=1) for solving inverse qMRI problems – Relaxometry and field mapping from multi-echo data

Ilyes Benslimane1, Thomas Jochmann2, Robert Zivadinov1,3, and Ferdinand Schweser1,3
1Buffalo Neuroimaging Analysis Center, Department of Neurology, Jacobs School of Medicine and Biomedical Sciences, University at Buffalo, The State University of New York, Buffalo, NY, United States, 2Department of Computer Science and Automation, Technische Universität Ilmenau, Ilmenau, Germany, 3Center for Biomedical Imaging, Clinical and Translational Science Institute at the University at Buffalo, Buffalo, NY, United States

Modeling the non-linear relationship between the Magnetic Resonance (MR) signal and its biophysical sources is computationally expensive and unstable with conventional methods. We developed an unsupervised physics-informed deep learning algorithm that quantifies MR parameters from multi-echo GRE data in a single computational pass. The algorithm produced accurate B0 field maps and R2* maps without phase-wrapping artifacts and with the expected contrast variations. The success of this network demonstrates the feasibility of physics-informed quantitative MRI (qMRI) without the ground-truth training data typically required by similar networks. This tool could provide fast and comprehensive tissue characterization in qMRI.
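
The synopsis describes fitting a physical signal model directly to a single subject's (N=1) multi-echo data instead of training against ground-truth parameter maps. The sketch below illustrates one way such a physics-informed, self-supervised fit could look, assuming a standard mono-exponential multi-echo GRE signal model S(TE) = S0·exp(−R2*·TE)·exp(i·2π·ΔB0·TE). The PyTorch framework, network architecture, echo times, and all variable names are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' implementation): self-supervised,
# physics-informed estimation of S0, R2*, and dB0 from multi-echo GRE data.
# Assumes mono-exponential decay with linear phase evolution:
#   S(TE) = S0 * exp(-R2* * TE) * exp(i * 2*pi * dB0 * TE)
import math
import torch
import torch.nn as nn

echo_times = torch.tensor([0.005, 0.010, 0.015, 0.020, 0.025, 0.030])  # seconds (illustrative)

class ParamNet(nn.Module):
    """Maps multi-echo complex data (real/imag per echo) to S0, R2*, dB0 per voxel."""
    def __init__(self, n_echoes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_echoes, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 3),  # outputs: S0, R2* (1/s), dB0 (Hz)
        )

    def forward(self, x):
        out = self.net(x)
        s0 = torch.nn.functional.softplus(out[:, 0])   # non-negative magnitude
        r2s = torch.nn.functional.softplus(out[:, 1])  # non-negative decay rate
        db0 = out[:, 2]                                 # field offset in Hz
        return s0, r2s, db0

def forward_model(s0, r2s, db0, te):
    """Complex multi-echo GRE signal predicted by the physical model."""
    decay = s0[:, None] * torch.exp(-r2s[:, None] * te[None, :])
    phase = 2.0 * math.pi * db0[:, None] * te[None, :]
    return decay * torch.exp(1j * phase)

def self_supervised_loss(measured, s0, r2s, db0, te):
    """Data-consistency loss: no ground-truth parameter maps are required."""
    predicted = forward_model(s0, r2s, db0, te)
    return torch.mean(torch.abs(predicted - measured) ** 2)

# Toy training loop on simulated voxels (stand-in for one subject's GRE data).
model = ParamNet(n_echoes=len(echo_times))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
measured = forward_model(torch.full((1024,), 0.8),          # S0
                         torch.rand(1024) * 50.0,           # R2* in 1/s
                         (torch.rand(1024) - 0.5) * 100.0,  # dB0 in Hz
                         echo_times)
inputs = torch.cat([measured.real, measured.imag], dim=1)
for step in range(200):
    optimizer.zero_grad()
    s0, r2s, db0 = model(inputs)
    loss = self_supervised_loss(measured, s0, r2s, db0, echo_times)
    loss.backward()
    optimizer.step()

Because the loss only enforces consistency between the forward-modeled and measured echoes, such a network can be trained on a single subject's data and, once trained, returns S0, R2*, and ΔB0 estimates in one pass.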


Keywords