Abstract #1373

Locally Adaptive Low Rank Regularization with Collaborative Data Selection for Arterial Spin Labeling MRI Denoising

Hangfan Liu1, Bo Li1, Yiran Li1, John A Detre2, and Ze Wang1
1University of Maryland School of Medicine, Baltimore, MD, United States, 2Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA, United States

Synopsis

Keywords: Sparse & Low-Rank Models, Arterial spin labelling, Denoising, MRI

Motivation: Address the challenge of low SNR in arterial spin labeling (ASL) MRI that hinders its clinical and research potential.

Goal(s): Develop an advanced ASL denoising algorithm that enhances image quality and overcomes limitations in ASL due to low SNR.

Approach: Propose a Locally Adaptive low-rank regularization with Collaborative data Selection (LACS) scheme that exploits the structural characteristics of ASL images to collaboratively select data for low-rank modeling. The proposed low-rank regularization essentially performs locally adaptive PCA without explicit training.
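To illustrate the general idea of collaborative patch selection followed by locally adaptive PCA (low-rank) shrinkage, here is a minimal sketch in NumPy. It is not the authors' LACS implementation: the function names, patch size, search radius, number of selected patches, and threshold scaling are all illustrative assumptions.

```python
import numpy as np

def _patches_at(img, centers, half):
    """Stack square patches centered at the given (row, col) positions."""
    return np.stack([img[r - half:r + half + 1, c - half:c + half + 1].ravel()
                     for r, c in centers])

def denoise_slice(img, sigma, patch=5, search=10, n_similar=32, step=3):
    """Group similar patches (collaborative selection) and hard-threshold the
    singular values of each group (locally adaptive PCA / low-rank shrinkage).
    Parameters and the threshold factor below are illustrative choices."""
    half = patch // 2
    H, W = img.shape
    acc = np.zeros_like(img, dtype=np.float64)
    wgt = np.zeros_like(img, dtype=np.float64)
    thr = 2.7 * sigma * np.sqrt(patch * patch)   # assumed threshold scaling

    for r in range(half, H - half, step):
        for c in range(half, W - half, step):
            # candidate patch centers inside a local search window
            r0, r1 = max(half, r - search), min(H - half - 1, r + search)
            c0, c1 = max(half, c - search), min(W - half - 1, c + search)
            centers = [(rr, cc) for rr in range(r0, r1 + 1)
                                for cc in range(c0, c1 + 1)]
            cand = _patches_at(img, centers, half)
            ref = img[r - half:r + half + 1, c - half:c + half + 1].ravel()

            # collaborative data selection: keep the patches most similar
            # to the reference patch
            order = np.argsort(np.sum((cand - ref) ** 2, axis=1))[:n_similar]
            group = cand[order]

            # locally adaptive PCA: center the group and zero out small
            # singular values, suppressing noise while preserving structure
            mean = group.mean(axis=0)
            U, S, Vt = np.linalg.svd(group - mean, full_matrices=False)
            S = np.where(S > thr, S, 0.0)
            den = (U * S) @ Vt + mean

            # aggregate denoised patches back into the image
            for k, idx in enumerate(order):
                rr, cc = centers[idx]
                acc[rr - half:rr + half + 1, cc - half:cc + half + 1] += \
                    den[k].reshape(patch, patch)
                wgt[rr - half:rr + half + 1, cc - half:cc + half + 1] += 1.0

    wgt[wgt == 0] = 1.0
    return acc / wgt
```

In this sketch, each local group defines its own PCA basis, so the shrinkage adapts to local image structure rather than relying on a globally trained dictionary, which mirrors the "locally adaptive PCA without explicit training" described above.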

Results: Using a single ASL image pair, LACS significantly outperformed state-of-the-art MRI denoising methods and the standard pipeline.

Impact: The proposed scheme has the potential to benefit researchers, clinicians, and patients by setting a new benchmark for ASL MRI denoising. It opens doors to exploring ASL's full clinical potential and offers opportunities for innovative research.
