Abstract #0833

SVD Compression for Nonlinear Encoding Imaging with Model-based Deep Learning Reconstruction

Zhehong Zhang1, Kartiga Selvaganesan1, Yonghyun Ha2, Chenhao Sun2, Anja Samardzija1, Heng Sun1, Gigi Galiana1,2, and R. Todd Constable1,2,3,4
1Department of Biomedical Engineering, School of Engineering and Applied Science, Yale University, New Haven, CT, United States, 2Department of Radiology and Biomedical Imaging, School of Medicine, Yale University, New Haven, CT, United States, 3Interdepartmental Neuroscience Program, School of Medicine, Yale University, New Haven, CT, United States, 4Department of Neurosurgery, School of Medicine, Yale University, New Haven, CT, United States

Synopsis

Keywords: Image Reconstruction, Signal Representations, Nonlinear Encoding

Model-based deep learning reconstruction with a nonlinear encoding matrix poses unique challenges to GPU memory because of the densely connected computational graph nodes in the physics-model portion of the network. In this work, SVD compression is demonstrated to be necessary for such networks and is applied to the highly nonlinear case of Bloch-Siegert encoding on a low-field MR scanner. The redundancy across all nonlinear encoding dimensions is exploited for compression. With the compressed encoding matrix, the model-based network becomes feasible to implement; it outperforms the traditional reconstruction at all levels of simulated Gaussian noise and has advantages over commonly used regularization terms.
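To illustrate the general idea, the following is a minimal sketch (not the authors' implementation) of truncated-SVD compression of a dense encoding matrix, assuming the matrix fits in host memory as a NumPy array and that the retained rank is chosen from the singular-value spectrum; the helper names and the energy threshold are hypothetical.

```python
import numpy as np

def compress_encoding_matrix(E, energy=0.99):
    """Truncated-SVD compression of a dense encoding matrix E (n_meas x n_voxels).

    Illustrative sketch only: the rank r is chosen so the retained singular values
    capture `energy` of the total spectral energy; the compressed factors can then
    stand in for E inside a model-based reconstruction to reduce GPU memory.
    """
    U, s, Vh = np.linalg.svd(E, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1
    # Keep only the leading r components.
    return U[:, :r], s[:r], Vh[:r, :], r

def apply_compressed(U_r, s_r, Vh_r, x):
    """Approximate forward model E @ x via (U_r * s_r) @ (Vh_r @ x)."""
    return (U_r * s_r) @ (Vh_r @ x)

if __name__ == "__main__":
    # A small random complex matrix stands in for the (much larger) nonlinear
    # Bloch-Siegert encoding matrix; real data would be loaded here instead.
    rng = np.random.default_rng(0)
    E = rng.standard_normal((512, 256)) + 1j * rng.standard_normal((512, 256))
    x = rng.standard_normal(256) + 1j * rng.standard_normal(256)
    U_r, s_r, Vh_r, r = compress_encoding_matrix(E, energy=0.95)
    err = np.linalg.norm(apply_compressed(U_r, s_r, Vh_r, x) - E @ x) / np.linalg.norm(E @ x)
    print(f"retained rank {r}/{min(E.shape)}, relative forward-model error {err:.3e}")
```

In a model-based network, only the compressed factors would be stored on the GPU, which is what makes the physics-model layers tractable in memory.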

