Abstract #3343

Evaluation of motion correction capability in retrospective motion correction with MoCo MedGAN

Thomas Küstner1,2,3, Friederike Gänzle3, Tobias Hepp2, Martin Schwartz3,4, Konstantin Nikolaou5, Bin Yang3, Karim Armanious2,3, and Sergios Gatidis2,5
1Biomedical Engineering Department, School of Biomedical Engineering and Imaging Sciences, King's College London, London, United Kingdom, 2Medical Image and Data Analysis (MIDAS), University Hospital Tübingen, Tübingen, Germany, 3Institute of Signal Processing and System Theory, University of Stuttgart, Stuttgart, Germany, 4Section on Experimental Radiology, University Hospital Tübingen, Tübingen, Germany, 5Department of Radiology, University Hospital Tübingen, Tübingen, Germany

Motion is the main extrinsic source of imaging artifacts, which can strongly deteriorate image quality and thus impair diagnostic accuracy. Numerous motion correction strategies have been proposed to capture the motion or mitigate the resulting artifacts. These methods require some a priori knowledge about the expected motion type and appearance. We have recently proposed a deep neural network (MoCo MedGAN) to perform retrospective motion correction in a reference-free setting, i.e. without requiring any a priori motion information. In this work, we propose a confidence check and evaluate the correction capability of MoCo MedGAN with respect to different motion patterns in healthy subjects and patients.
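The abstract does not include implementation details, but the following minimal sketch illustrates what reference-free retrospective correction means operationally: a pretrained image-to-image generator is applied directly to a motion-corrupted magnitude image, with no motion model, navigator, or artifact-free reference needed at inference time. This is not the authors' code; the generator architecture, residual formulation, checkpoint file name, and image size are assumptions made for illustration.

import torch
import torch.nn as nn


class ToyGenerator(nn.Module):
    """Placeholder encoder-decoder generator standing in for the MoCo MedGAN generator."""

    def __init__(self, channels: int = 1, features: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual formulation (an assumption): predict the artifact component and subtract it.
        return x - self.net(x)


@torch.no_grad()
def correct_slice(generator: nn.Module, corrupted: torch.Tensor) -> torch.Tensor:
    """Run retrospective correction on a single 2D slice (H, W) with values in [0, 1]."""
    generator.eval()
    x = corrupted[None, None]      # add batch and channel dims -> (1, 1, H, W)
    return generator(x)[0, 0]      # corrected slice, same shape as the input


if __name__ == "__main__":
    gen = ToyGenerator()
    # In practice, weights trained on paired motion-corrupted/motion-free data would be loaded,
    # e.g. gen.load_state_dict(torch.load("moco_medgan_gen.pt"))  # hypothetical checkpoint name
    motion_corrupted = torch.rand(256, 256)   # stand-in for a corrupted MR magnitude image
    corrected = correct_slice(gen, motion_corrupted)
    print(corrected.shape)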


Keywords