Abstract #3241

Turing Testing the Realism of U-Net, GAN, and Real Images for Virtual Contrast-Enhanced Breast MRI

Aju George1, Hannes Schreiter1, Julian Hossbach1,2, Tri-Thien Nguyen1,2, Ihor Horishnyi1, Chris Ehring1, Shirin Heidarikahkesh1, Lorenz A Kapsner1,3, Frederik B Laun1, Sabine Ohlmeyer1, Michael Uder1, Sebastian Bickelhaupt1,4, and Andrzej Liebert1
1Institute of Radiology, Uniklinikum Erlangen, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany, 2Pattern Recognition Lab, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany, 3Lehrstuhl für Medizinische Informatik, Friedrich-Alexander-Universität Erlangen-Nürnberg, Erlangen, Germany, 4German Cancer Research Center, Heidelberg, Germany

Synopsis

Keywords: Analysis/Processing, Machine Learning/Artificial Intelligence

Motivation: Contrast agent-based breast MRI is limited by cost, time, and contraindications for certain patients.

Goal(s): Assess the realism of virtual contrast-enhanced (vCE) images compared to real contrast-enhanced images.

Approach: Compare GAN and U-Net models for generating vCE images from unenhanced sequences, evaluating the results with quantitative metrics and a reader-based Turing test.
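
As a purely illustrative aside (not part of the abstract), the quantitative comparison could look like the minimal Python sketch below; the specific metrics (SSIM, PSNR), the scikit-image implementation, and the array names are assumptions, since the abstract does not state which metrics were used.

```python
# Minimal illustrative sketch (assumptions: SSIM/PSNR as the quantitative
# metrics, scikit-image as the implementation, 2D float slices as input).
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio


def compare_vce_to_reference(vce: np.ndarray, real_ce: np.ndarray) -> dict:
    """Return SSIM and PSNR between a virtual and a real CE image slice."""
    data_range = float(real_ce.max() - real_ce.min())
    ssim = structural_similarity(real_ce, vce, data_range=data_range)
    psnr = peak_signal_noise_ratio(real_ce, vce, data_range=data_range)
    return {"ssim": ssim, "psnr": psnr}


# Placeholder arrays standing in for co-registered MRI slices.
rng = np.random.default_rng(0)
real_ce = rng.random((256, 256)).astype(np.float32)
vce = (real_ce + 0.05 * rng.standard_normal((256, 256))).astype(np.float32)
print(compare_vce_to_reference(vce, real_ce))
```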

Results: The U-Net outperformed the GAN on quantitative metrics. The Turing test showed similar classification rates for real and GAN images within each reader, though there were disagreements between readers. In contrast, U-Net images showed greater variability, with some readers rating them as real at notably high or low rates.
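
A hedged sketch of how per-reader "rated as real" rates and inter-reader disagreement could be summarised is shown below; the long-format table layout, column names, and the use of Cohen's kappa are illustrative assumptions, not the authors' reported analysis.

```python
# Minimal illustrative sketch (assumed data layout and analysis choices;
# not the authors' code): per-reader "rated as real" rates by image source
# and pairwise inter-reader agreement via Cohen's kappa.
from itertools import combinations

import pandas as pd
from sklearn.metrics import cohen_kappa_score

# Hypothetical long-format Turing-test responses: one row per reader/image.
responses = pd.DataFrame({
    "reader":     ["R1", "R1", "R1", "R2", "R2", "R2"],
    "image_id":   [1, 2, 3, 1, 2, 3],
    "source":     ["real", "GAN", "U-Net", "real", "GAN", "U-Net"],
    "rated_real": [1, 1, 0, 1, 0, 1],
})

# Rate at which each reader labelled each image source as real.
print(responses.groupby(["reader", "source"])["rated_real"].mean())

# Pairwise Cohen's kappa between readers on the images both rated.
wide = responses.pivot(index="image_id", columns="reader", values="rated_real")
for r1, r2 in combinations(wide.columns, 2):
    pair = wide[[r1, r2]].dropna()
    print(f"kappa({r1}, {r2}) = {cohen_kappa_score(pair[r1], pair[r2]):.2f}")
```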

Impact: DL-based vCE offers an effective alternative that can substantially reduce reliance on contrast agents, addressing concerns about cost, time, and contraindications. Incorporating adversarial training enables the model to learn intricate details of contrast-enhanced images, yielding more realistic outputs.
