Abstract #2812

Deep Learning-Assisted Joint Estimation for 3D Retrospective Motion Correction: An In-Vivo Validation

Brian Nghiem1,2, Zhe Wu1, Sriranga Kashyap1, Lars Kasper1,3, and Kâmil Uludağ1,2
1BRAIN-To Lab, University Health Network, Toronto, ON, Canada, 2Department of Medical Biophysics, University of Toronto, Toronto, ON, Canada, 3Toronto Neuroimaging Facility, Department of Psychology, University of Toronto, Toronto, ON, Canada

Synopsis

Keywords: Motion Correction, Neuroimaging, AI

Motivation: Data-driven retrospective motion correction methods currently suffer from limited robustness and long runtimes, challenges that can be addressed by combining deep learning- and physics-based methods.

Goal(s): To validate a novel deep learning-assisted joint estimation algorithm on real motion-corrupted 3D MRI data.

Approach: Motion-corrupted 3D MRI data were acquired from 4 healthy volunteers. The performance of the proposed method was compared against a state-of-the-art deep learning method and a physics-based method.

Results: The proposed method outperformed the deep learning- and physics-based methods, yielding better image correction and converging faster.

Impact: Having demonstrated that it can salvage real motion-corrupted data without special hardware and with only minimal sequence modifications, the proposed retrospective motion correction method can be adopted into clinical practice as an alternative to rescanning.
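
The synopsis does not detail the joint-estimation algorithm itself. As a purely generic illustration of what physics-based joint estimation of an image and motion parameters involves, the sketch below alternates between an image update and a per-shot motion update for a toy 1D, single-coil, fully sampled acquisition in which each shot carries an unknown translation. The realness constraint, the grid-searched translation model, and all variable names are assumptions made for this toy; the sketch is not the authors' deep learning-assisted method.

```python
# Minimal sketch of physics-based joint estimation for retrospective motion
# correction (toy 1D problem); not the authors' deep learning-assisted method.
import numpy as np

rng = np.random.default_rng(0)
n, n_shots = 128, 8
k = np.fft.fftfreq(n)                                # k-space coords, cycles/sample
shots = np.array_split(np.arange(n), n_shots)        # k-space samples acquired per shot

# Ground-truth real-valued object and per-shot translations (in samples)
x_true = np.zeros(n)
x_true[40:90] = 1.0
x_true[55:70] = 2.0
d_true = rng.uniform(-3, 3, n_shots)

# Simulate motion-corrupted k-space: a shift d multiplies k-space by exp(-2j*pi*k*d)
X_true = np.fft.fft(x_true)
y = np.empty(n, dtype=complex)
for s, idx in enumerate(shots):
    y[idx] = X_true[idx] * np.exp(-2j * np.pi * k[idx] * d_true[s])

# Alternating minimization of the data-consistency cost ||A(d) x - y||^2
# over a real-valued image x and the per-shot shifts d
d_est = np.zeros(n_shots)
cand = np.linspace(-4.0, 4.0, 801)                   # candidate shifts for grid search
for it in range(10):
    # Image update: with unit-modulus phases, the real-valued least-squares image
    # is the real part of the inverse FFT of the phase-corrected data
    X_corr = np.empty(n, dtype=complex)
    for s, idx in enumerate(shots):
        X_corr[idx] = y[idx] * np.exp(+2j * np.pi * k[idx] * d_est[s])
    x_est = np.real(np.fft.ifft(X_corr))
    X_model = np.fft.fft(x_est)

    # Motion update: for each shot, pick the shift that best explains its data
    # given the current image estimate (exact minimization over the grid)
    for s, idx in enumerate(shots):
        phases = np.exp(-2j * np.pi * np.outer(cand, k[idx]))
        resid = phases * X_model[idx] - y[idx]
        d_est[s] = cand[np.argmin(np.sum(np.abs(resid) ** 2, axis=1))]

    cost = sum(
        np.sum(np.abs(np.exp(-2j * np.pi * k[idx] * d_est[s]) * X_model[idx] - y[idx]) ** 2)
        for s, idx in enumerate(shots)
    )
    print(f"iteration {it}: data-consistency cost = {cost:.3e}")

# Shifts are identifiable only up to a global translation, so compare de-meaned values
print("estimated shifts (de-meaned):", np.round(d_est - d_est.mean(), 2))
print("true shifts      (de-meaned):", np.round(d_true - d_true.mean(), 2))
```

Deep learning-assisted variants of this idea typically replace or warm-start one of the two updates with a trained network; the abstract compares such a combined approach against purely deep learning- and purely physics-based baselines.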
