Abstract #0930

Multidimensional MR Image Reconstruction Using A Disentangled Representation

Ruiyang Zhao1,2 and Fan Lam1,2,3
1Department of Electrical and Computer Engineering, University of Illinois Urbana-Champaign, Champaign, IL, United States, 2Beckman Institute for Advanced Science and Technology, University of Illinois Urbana-Champaign, Champaign, IL, United States, 3Department of Bioengineering, University of Illinois Urbana-Champaign, Champaign, IL, United States

Synopsis

Keywords: AI/ML Image Reconstruction

Motivation: Learning-based high-dimensional image reconstruction may benefit from pre-training strategies and from representations that better exploit multidimensional correlations.

Goal(s): To develop a disentangled representation that can effectively model and constrain different types of features in multidimensional images for reconstruction.

Approach: The representation was learned via image-transfer autoencoder training. A new formulation was proposed to incorporate the pre-learned representation into constrained reconstruction.
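The abstract does not provide implementation details, but the image-transfer idea can be illustrated with a toy sketch: two encoders split an image into separate "contrast" and "geometry" codes, a decoder recombines them, and swapping the contrast code between two images of the same anatomy produces a contrast-transferred image. All names, dimensions, and the linear encoders below are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions: image vector, contrast code, geometry code.
D, C, G = 64, 8, 8
Ec = rng.standard_normal((C, D)) * 0.1       # "contrast" encoder (linear, for illustration)
Eg = rng.standard_normal((G, D)) * 0.1       # "geometry" encoder
Dec = rng.standard_normal((D, C + G)) * 0.1  # decoder over the concatenated codes

def encode(x):
    """Split an image into a (contrast, geometry) code pair."""
    return Ec @ x, Eg @ x

def decode(zc, zg):
    """Recombine a contrast code and a geometry code into an image."""
    return Dec @ np.concatenate([zc, zg])

x1 = rng.standard_normal(D)  # stand-in for image with contrast 1
x2 = rng.standard_normal(D)  # stand-in for image with contrast 2
zc1, zg1 = encode(x1)
zc2, zg2 = encode(x2)

# "Image transfer": image 1's geometry rendered with image 2's contrast.
x_transfer = decode(zc2, zg1)
print(x_transfer.shape)  # (64,)
```

In actual disentangled-representation training the encoders and decoder would be deep networks trained so that such swapped reconstructions match the corresponding ground-truth contrast, which is what forces the two codes to separate.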

Results: Our model disentangles contrast and geometry features in multi-contrast MR images, and the proposed algorithm improved quantitative MRI reconstruction.
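The abstract does not state the exact formulation, but a pre-learned disentangled representation is typically incorporated as a regularizer. As a generic, assumed sketch, with encoders $E_c$ (contrast) and $E_g$ (geometry), decoder $D_\theta$, forward (encoding) operator $A$, and measured data $y$:

```latex
\hat{x} = \arg\min_{x} \; \|Ax - y\|_2^2
          + \lambda \,\bigl\| x - D_\theta\bigl(E_c(x), E_g(x)\bigr) \bigr\|_2^2
```

Here $\lambda$ trades data consistency against consistency with the pre-learned representation; the authors' actual formulation may differ.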

Impact: The proposed method may provide a new perspective for learning-based, high-dimensional MRI reconstruction in settings where little or even no data are available for problem-specific supervised training.

