Abstract #0874

Deep-learning-enabled general MRI denoising at 0.55T

Zheren Zhu1, Azaan Rehman2, Michael Ohliger1, Yoo Jin Lee1, Hui Xue2,3, and Yang Yang1
1Radiology and Biomedical Imaging, University of California, San Francisco, CA, United States, 2National Institutes of Health, Bethesda, MD, United States, 3National Heart Lung and Blood Institute, Bethesda, MD, United States

Synopsis

Keywords: AI/ML Image Reconstruction, Visualization, Mid-Field MRI, Denoising

Motivation: Recent advancements in 0.55T MRI systems present promising opportunities for affordable and accessible MRI. Enhancing SNR to mitigate the inherent limitations of mid-field strength is a crucial step in advancing this technology.

Goal(s): In this study, we aim to advance 0.55T MRI in speed and quality through a deep-learning-driven general denoising method that processes low-SNR scans of various body parts and sequences.

Approach: We constructed a model with a spatial-temporal attention mechanism and trained it on a large corpus of complex-valued image data.
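The abstract does not specify the network internals; as a rough illustration only, the PyTorch sketch below shows one way a spatial-temporal attention block could operate on complex images represented as real/imaginary channels. Every class name, tensor shape, and hyperparameter here is an assumption made for illustration, not the authors' architecture.

```python
# Minimal sketch of a spatial-temporal attention block for complex MRI data.
# Assumptions: complex images stored as two channels (real, imaginary),
# input shape [batch, 2, frames, height, width]. Not the authors' model.

import torch
import torch.nn as nn


class SpatialTemporalAttention(nn.Module):
    def __init__(self, channels: int = 32, num_heads: int = 4):
        super().__init__()
        self.embed = nn.Conv3d(2, channels, kernel_size=3, padding=1)   # lift 2-channel complex input
        self.spatial_attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.temporal_attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(channels)
        self.norm2 = nn.LayerNorm(channels)
        self.out = nn.Conv3d(channels, 2, kernel_size=3, padding=1)     # map back to real/imag channels

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, _, t, h, w = x.shape
        f = self.embed(x)                                               # [b, c, t, h, w]
        c = f.shape[1]

        # Spatial attention: tokens are pixels, attended within each frame.
        s = f.permute(0, 2, 3, 4, 1).reshape(b * t, h * w, c)
        s, _ = self.spatial_attn(*(self.norm1(s),) * 3)
        f = f + s.reshape(b, t, h, w, c).permute(0, 4, 1, 2, 3)

        # Temporal attention: tokens are frames/repetitions, attended per pixel location.
        q = f.permute(0, 3, 4, 2, 1).reshape(b * h * w, t, c)
        q, _ = self.temporal_attn(*(self.norm2(q),) * 3)
        f = f + q.reshape(b, h, w, t, c).permute(0, 4, 3, 1, 2)

        return self.out(f)                                              # denoised complex image stack


if __name__ == "__main__":
    noisy = torch.randn(1, 2, 4, 32, 32)        # toy single-repetition complex image stack
    denoised = SpatialTemporalAttention()(noisy)
    print(denoised.shape)                       # torch.Size([1, 2, 4, 32, 32])
```

Splitting attention into a spatial pass and a temporal pass, rather than attending over all voxels jointly, keeps the attention matrices tractable for volumetric or multi-repetition data; whether the authors use this exact factorization is an assumption.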

Results: The proposed method significantly improves low-SNR single-repetition images at 0.55T, making the results comparable or superior to multi-repetition averages.

Impact: With robust denoising on mid-field systems, enhanced image quality and faster scans can be expected, enabling more accurate diagnoses and an improved patient experience. New sequences can be developed and paired with this method to further advance the system.

