Abstract #2654

Joint MRI Reconstruction and Denoising Using Noise-Adaptive Self-Supervised Learning

Nikola Janjusevic1,2, Amirhossein Khalilian-Gourtani3, Yao Wang4, and Li Feng1,2
1Radiology, Bernard and Irene Schwartz Center for Biomedical Imaging, New York University Grossman School of Medicine, New York, NY, United States, 2Radiology, Center for Advanced Imaging Innovation and Research (CAI2R), New York University Grossman School of Medicine, New York, NY, United States, 3Neurology, New York University Grossman School of Medicine, New York, NY, United States, 4Electrical and Computer Engineering, New York University Tandon School of Engineering, Brooklyn, NY, United States

Synopsis

Keywords: AI/ML Image Reconstruction, joint reconstruction and denoising

Motivation: Deep learning methods are state-of-the-art in accelerated MRI reconstruction. However, they often lack robustness to mismatched SNR levels between training and inference, which can hinder their deployment in low-SNR regimes, particularly on low-field scanners.

Goal(s): To introduce LPDSNet, a novel approach for robust deep-learning-based joint MRI reconstruction and denoising without ground-truth data.

Approach: LPDSNet directly parameterizes an unrolled primal-dual splitting algorithm and achieves noise robustness via learned noise-adaptive clipping.
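
To make the approach concrete, below is a minimal, hypothetical sketch of one way an unrolled primal-dual splitting network with noise-adaptive clipping could look. The operator names (A, A_adj for the MRI forward model and its adjoint; W, W_adj for a learned analysis/synthesis transform) and the scaling of the clip threshold by the noise standard deviation are illustrative assumptions, not the authors' LPDSNet implementation.

```python
import torch
import torch.nn as nn

class UnrolledPDS(nn.Module):
    """Illustrative unrolled primal-dual splitting with noise-adaptive clipping.

    NOTE: hypothetical sketch only; operator interfaces and parameterization
    are assumptions, not the published LPDSNet architecture.
    """

    def __init__(self, n_iters=8, n_filters=32):
        super().__init__()
        self.n_iters = n_iters
        # learned analysis/synthesis transforms (assumed convolutional,
        # acting on 2-channel real/imaginary images)
        self.W = nn.Conv2d(2, n_filters, 3, padding=1, bias=False)
        self.W_adj = nn.Conv2d(n_filters, 2, 3, padding=1, bias=False)
        # per-iteration primal/dual step sizes and per-channel noise scaling
        self.tau = nn.Parameter(torch.full((n_iters,), 0.5))
        self.sigma_d = nn.Parameter(torch.full((n_iters,), 0.5))
        self.alpha = nn.Parameter(torch.ones(n_iters, n_filters, 1, 1))

    @staticmethod
    def clip(z, t):
        # projection onto an l-infinity ball of radius t: the "clipping" step,
        # i.e. the prox of the conjugate of a weighted l1 penalty
        return torch.minimum(torch.maximum(z, -t), t)

    def forward(self, y, A, A_adj, noise_std):
        # y: measured k-space; A / A_adj: forward operator and adjoint (callables)
        # noise_std: per-sample noise level, shape (batch,)
        x = A_adj(y)                       # zero-filled initialization
        z = torch.zeros_like(self.W(x))    # dual variable
        for k in range(self.n_iters):
            # dual ascent + noise-adaptive clipping: threshold scales with sigma
            t = self.alpha[k] * noise_std.view(-1, 1, 1, 1)
            z = self.clip(z + self.sigma_d[k] * self.W(x), t)
            # primal descent: gradient step on data consistency plus synthesis of z
            grad = A_adj(A(x) - y) + self.W_adj(z)
            x = x - self.tau[k] * grad
        return x
```

Scaling the clipping threshold by the input noise level is what would let such a network adapt its effective regularization strength at inference time, rather than baking a single training-time noise level into the learned weights.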

Results: LPDSNet outperforms state-of-the-art networks in both supervised and self-supervised learning. Additionally, it shows novel noise-level robustness in self-supervised joint MRI reconstruction and denoising, where competing methods fail.

Impact: LPDSNet surpasses current methods, especially under mismatched noise-level conditions between training and testing, making it highly effective for noisy, limited-sample MRI datasets and promising for low-SNR, low-field MRI applications.

