Abstract #3497

MoPED: Motion Parameter Estimation DenseNet for accelerating retrospective motion correction

Julian Hossbach1,2,3, Daniel Nicolas Splitthoff2, Stephen Farman Cauley4, and Andreas Maier1
1Pattern Recognition Lab, Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, Germany, 2Siemens Healthcare GmbH, Erlangen, Germany, 3Erlangen Graduate School in Advanced Optical Technologies, Erlangen, Germany, 4Martinos Center for Biomedical Imaging, Charlestown, MA, United States

We exploit the data redundancy and the locality of motion effects in k-space to estimate motion parameters with a Deep Learning approach. The exploratory Motion Parameter Estimation DenseNet (MoPED) extracts the in-plane motion parameters between echo trains of a TSE sequence. As input, the network receives the central k-space patch from multiple coils; its output can serve multiple purposes. While the motion estimate can trigger image rejection or reacquisition, we show that motion-aware reconstruction can be accelerated using MoPED.
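The link between in-plane motion parameters and k-space that the abstract alludes to can be made concrete for the translational case: a rigid in-plane shift of the image multiplies its k-space by a linear phase ramp, which is why motion leaves a localized, learnable signature in the acquired lines. The sketch below illustrates this Fourier shift property with NumPy; it is an illustrative model only, not the MoPED architecture or the authors' implementation, and all names and array sizes are assumptions.

```python
import numpy as np

def translate_via_kspace(img, dx, dy):
    """Circularly shift a 2D image by (dx, dy) pixels by applying
    the corresponding linear phase ramp in k-space.

    A shift by d in image space multiplies k-space by exp(-2*pi*i * k.d),
    with k in cycles per pixel (the standard Fourier shift theorem).
    """
    ny, nx = img.shape
    k = np.fft.fft2(img)
    ky = np.fft.fftfreq(ny)[:, None]   # cycles/pixel along rows
    kx = np.fft.fftfreq(nx)[None, :]   # cycles/pixel along columns
    ramp = np.exp(-2j * np.pi * (ky * dy + kx * dx))
    return np.fft.ifft2(k * ramp).real

# A point source at (row=2, col=3) moves to (row=3, col=5)
# after a shift of dy=1, dx=2.
img = np.zeros((8, 8))
img[2, 3] = 1.0
shifted = translate_via_kspace(img, dx=2, dy=1)
```

Rotations, the other in-plane parameter, rotate k-space itself rather than adding a phase ramp, which is why estimating them requires a learned model such as MoPED instead of a closed-form correction.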
