Abstract #3497

MoPED: Motion Parameter Estimation DenseNet for accelerating retrospective motion correction

Julian Hossbach1,2,3, Daniel Nicolas Splitthoff2, Stephen Farman Cauley4, and Andreas Maier1
1Pattern Recognition Lab, Friedrich-Alexander-University Erlangen-Nuremberg, Erlangen, Germany, 2Siemens Healthcare GmbH, Erlangen, Germany, 3Erlangen Graduate School in Advanced Optical Technologies, Erlangen, Germany, 4Martinos Center for Biomedical Imaging, Charlestown, MA, United States

We exploit data redundancy and the locality of motion effects in k-space to estimate motion parameters with a deep learning approach. The exploratory Motion Parameter Estimation DenseNet (MoPED) extracts the in-plane motion parameters between echo trains of a TSE sequence. As input, the network receives a central k-space patch from multiple coils; its output can serve multiple purposes. While the motion estimate can trigger image rejection or reacquisition, we show that it can also accelerate motion-aware reconstruction.
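To make the setup concrete, the following is a minimal sketch of the kind of network described above: a small DenseNet-style regressor that maps a multi-coil k-space center patch (real and imaginary parts stacked as channels) to three rigid in-plane motion parameters per echo train. All layer sizes, the growth rate, and the class names are illustrative assumptions, not the published MoPED architecture.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Minimal dense block: each layer receives the concatenation
    of all preceding feature maps (DenseNet connectivity)."""
    def __init__(self, in_ch, growth, n_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(ch), nn.ReLU(inplace=True),
                nn.Conv2d(ch, growth, kernel_size=3, padding=1)))
            ch += growth
        self.out_channels = ch

    def forward(self, x):
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)
        return x

class MoPEDSketch(nn.Module):
    """Hypothetical stand-in for MoPED: regresses 3 rigid in-plane
    motion parameters (dx, dy, rotation) from a k-space center patch."""
    def __init__(self, n_coils=8, n_params=3):
        super().__init__()
        in_ch = 2 * n_coils  # real + imaginary parts of each coil as channels
        self.dense = DenseBlock(in_ch, growth=12, n_layers=4)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(self.dense.out_channels, n_params))

    def forward(self, kspace_patch):
        # kspace_patch: (batch, 2 * n_coils, H, W)
        return self.head(self.dense(kspace_patch))

model = MoPEDSketch(n_coils=8)
# 2 echo trains, 8 coils x (re, im) = 16 channels, 32x32 center patch
patch = torch.randn(2, 16, 32, 32)
params = model(patch)
print(params.shape)  # torch.Size([2, 3])
```

One motion estimate per echo train reflects the TSE acquisition: within an echo train the patient is assumed static, so motion is parameterized between trains.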
