Abstract #4054

DEMO: Deep MR Parametric Mapping using Unsupervised Multi-tasking Framework

Jing Cheng1, Yuanyuan Liu1, Xin Liu1, Hairong Zheng1, Yanjie Zhu1, and Dong Liang1
1Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China

In this work, we propose DEMO, a novel deep learning-based framework for fast and robust MR parametric mapping. Unlike current deep learning-based methods, DEMO trains the network in an unsupervised manner: a CS-based loss function removes the need for fully sampled k-space data as labels. DEMO reconstructs the parametric-weighted images and generates the parametric map simultaneously, enabling multi-task learning. Experimental results demonstrate the promising performance of the proposed DEMO framework for quantitative MR T1ρ mapping.
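To illustrate the idea of an unsupervised, CS-style multi-task loss described above, a minimal sketch follows. The abstract does not specify the exact loss terms, signal model, or network outputs, so the mono-exponential T1ρ decay model, the data-consistency and sparsity terms, the weighting parameters, and all variable names here are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch (NOT the authors' code): a CS-style unsupervised loss for
# multi-task parametric mapping. Assumes the network outputs both the
# T1rho-weighted images and the parametric maps (S0, T1rho), and that the
# weighted signal follows S(TSL) = S0 * exp(-TSL / T1rho).
import torch


def cs_multitask_loss(weighted_imgs, s0, t1rho, tsl, kspace_meas, mask,
                      lam_model=1.0, lam_sparse=1e-3):
    """weighted_imgs: (N_TSL, H, W) complex network reconstructions
    s0, t1rho:       (H, W) real network-predicted parametric maps
    tsl:             (N_TSL,) spin-lock times
    kspace_meas:     (N_TSL, H, W) complex undersampled k-space
    mask:            (N_TSL, H, W) binary sampling mask
    """
    # 1) k-space data consistency at acquired locations only; this replaces
    #    the fully sampled label required by supervised training.
    kspace_pred = torch.fft.fft2(weighted_imgs, norm="ortho")
    dc = torch.mean(torch.abs(mask * (kspace_pred - kspace_meas)) ** 2)

    # 2) signal-model consistency couples the two tasks: the reconstructed
    #    weighted images should follow the decay given by the maps.
    model_imgs = s0[None] * torch.exp(-tsl[:, None, None] / (t1rho[None] + 1e-6))
    model = torch.mean((torch.abs(weighted_imgs) - model_imgs) ** 2)

    # 3) a simple finite-difference sparsity prior as the CS regularizer.
    dx = weighted_imgs[..., :, 1:] - weighted_imgs[..., :, :-1]
    dy = weighted_imgs[..., 1:, :] - weighted_imgs[..., :-1, :]
    sparse = torch.mean(torch.abs(dx)) + torch.mean(torch.abs(dy))

    return dc + lam_model * model + lam_sparse * sparse
```

Minimizing such a loss requires only the acquired undersampled k-space and the sampling mask, which is what makes the training unsupervised; the model-consistency term is what ties image reconstruction and parameter estimation into a single multi-task objective.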

