Abstract #1766

Non-central chi likelihood loss for quantitative MRI from parallel acquisitions with self-supervised deep learning

Christopher S Parker1, Daniel C Alexander1, and Hui Zhang1
1Centre for Medical Image Computing, University College London, London, United Kingdom

Synopsis

Keywords: AI Diffusion Models, Quantitative Imaging, parallel imaging, apparent diffusion coefficient, IVIM, parameter estimation

Motivation: The distribution of reconstructed MRI signals, used as input for quantitative MRI with self-supervised deep learning, depends on the number of receiver coils. Current loss functions do not account for this, leading to bias.

Goal(s): Develop a non-central chi likelihood (NLC) loss that accounts for the distribution of MRI measurements in the common scenario of parallelised acquisitions.

Approach: Implement and evaluate the NLC loss, comparing its performance against the MSE and Rician likelihood losses in simulated data.
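The abstract does not give implementation details, but the core of such a loss follows from the non-central chi density of sum-of-squares reconstructed magnitudes. A minimal sketch, assuming an NLC loss of the standard form (negative log-likelihood with measured magnitudes, model-predicted amplitudes, per-channel noise level sigma, and coil count N; the function name and signature are hypothetical):

```python
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function


def nlc_loss(meas, pred, sigma, n_coils):
    """Negative log-likelihood of the non-central chi distribution.

    meas    : measured magnitude signals (sum-of-squares reconstruction)
    pred    : noise-free signal amplitudes predicted by the model
    sigma   : per-channel noise standard deviation
    n_coils : number of receiver coils (n_coils = 1 recovers the Rician case)
    """
    m = np.asarray(meas, dtype=float)
    v = np.asarray(pred, dtype=float)
    z = m * v / sigma**2
    # log I_{N-1}(z) computed via the exponentially scaled Bessel function
    # ive(nu, z) = I_nu(z) * exp(-z), which avoids overflow at large z.
    log_bessel = np.log(ive(n_coils - 1, z)) + z
    log_pdf = (n_coils * np.log(m)
               - np.log(sigma**2)
               - (n_coils - 1) * np.log(v)
               - (m**2 + v**2) / (2 * sigma**2)
               + log_bessel)
    return -np.sum(log_pdf)
```

In a self-supervised setting this loss would replace the MSE between measured and model-predicted signals, so the network's parameter estimates are fitted under the correct noise model rather than an implicit Gaussian one.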

Results: The NLC loss improves performance compared with the Rician likelihood and MSE losses for the mono-exponential ADC model in simulated data.
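The abstract does not specify how the simulated data were generated, but a typical setup draws independent complex Gaussian noise per coil channel and combines the channels by sum-of-squares, which yields non-central chi distributed magnitudes. A sketch under those assumptions, for the mono-exponential ADC model S(b) = S0·exp(-b·ADC) (all names and parameter values here are illustrative, not taken from the abstract):

```python
import numpy as np


def simulate_sos_adc(b_values, s0, adc, sigma, n_coils, n_samples, rng):
    """Simulate sum-of-squares magnitudes for a mono-exponential ADC model.

    Each coil channel receives independent complex Gaussian noise with
    standard deviation sigma; the sum-of-squares combination then follows
    a non-central chi distribution with 2 * n_coils degrees of freedom.
    """
    # Noise-free signal for each sample and b-value: S0 * exp(-b * ADC)
    clean = s0 * np.exp(-np.outer(np.ones(n_samples), b_values) * adc)
    # Split the noise-free amplitude evenly across coils; any split with
    # the same total power gives the same magnitude distribution.
    per_coil = clean / np.sqrt(n_coils)
    shape = (n_coils, n_samples, len(b_values))
    real = rng.normal(per_coil, sigma, size=shape)
    imag = rng.normal(0.0, sigma, size=shape)
    return np.sqrt(np.sum(real**2 + imag**2, axis=0))
```

A quick sanity check on such simulations is the second moment E[M²] = ν² + 2Nσ², where ν is the noise-free amplitude: the noise floor grows with the number of coils, which is precisely the effect the NLC loss is designed to model.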


Impact: The NLC loss permits fast inference of parameters from MRI signals reconstructed from parallelised acquisitions and may reduce bias compared to the Rician and MSE losses. The NLC loss is widely applicable given the prevalence of parallelised MRI acquisitions.

