Abstract #5193

Constraint function for IVIM quantification using unsupervised learning

Wonil Lee1, Beomgu Kang1, Jongyeon Lee1, Georges El Fakhri2, Chao Ma2, Yeji Han3, Jun-Young Chung4, Young Noh5, and HyunWook Park1
1The School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Yuseong-gu, Korea, Republic of, 2Massachusetts General Hospital, Boston, MA, United States, 3Department of Biomedical Engineering, Gachon University, Incheon, Korea, Republic of, 4Department of Neuroscience, College of Medicine, Gachon University, Incheon, Korea, Republic of, 5Department of Neurology, Gil Medical Center, Gachon University College of Medicine, Incheon, Korea, Republic of

Synopsis

Keywords: Machine Learning/Artificial Intelligence, Quantitative Imaging, Intravoxel incoherent motion

Recently, various methods have been proposed to quantify intravoxel incoherent motion (IVIM) parameters. Many studies have shown that deep-learning-based quantification methods can accurately estimate IVIM parameters. Unsupervised learning is particularly useful for quantifying IVIM parameters from in-vivo data because it does not require labeled data. In some cases, however, the loss function fails to converge as the number of training iterations increases. A constraint function can address this problem by limiting the range of the estimated outputs. In this study, we investigated the effect of a constraint function that limits the range of the estimated outputs.
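A minimal sketch of the idea described above, assuming a PyTorch-style fully connected network and illustrative parameter bounds (the architecture and the specific ranges are assumptions, not taken from the abstract): the network maps multi-b-value signals to IVIM parameters, a sigmoid-based constraint function confines each parameter to a fixed interval, and training is unsupervised because the loss compares the reconstructed and measured signals rather than parameter labels.

```python
import torch
import torch.nn as nn

class ConstrainedIVIMNet(nn.Module):
    """Unsupervised IVIM fitting with a constraint function on the outputs.

    The network predicts (f, D*, D) from normalized signals S(b)/S(0); a
    sigmoid constraint keeps each parameter inside a chosen range.
    """
    def __init__(self, b_values, bounds=None):
        super().__init__()
        self.register_buffer("b", torch.as_tensor(b_values, dtype=torch.float32))
        n_b = len(b_values)
        self.encoder = nn.Sequential(
            nn.Linear(n_b, n_b), nn.ELU(),
            nn.Linear(n_b, n_b), nn.ELU(),
            nn.Linear(n_b, 3),  # raw, unconstrained outputs for f, D*, D
        )
        # Illustrative physiological bounds (assumption, not from the abstract):
        # f in [0, 0.7], D* in [0.005, 0.2] mm^2/s, D in [0, 0.005] mm^2/s
        self.bounds = bounds or {"f": (0.0, 0.7),
                                 "Dp": (0.005, 0.2),
                                 "Dt": (0.0, 0.005)}

    @staticmethod
    def _constrain(x, lo, hi):
        # Constraint function: sigmoid maps any real-valued output into (lo, hi)
        return lo + (hi - lo) * torch.sigmoid(x)

    def forward(self, signal):                  # signal: (batch, n_b)
        raw = self.encoder(signal)
        f  = self._constrain(raw[:, 0:1], *self.bounds["f"])
        Dp = self._constrain(raw[:, 1:2], *self.bounds["Dp"])
        Dt = self._constrain(raw[:, 2:3], *self.bounds["Dt"])
        # Bi-exponential IVIM model reconstructs the signal from the parameters
        pred = f * torch.exp(-self.b * Dp) + (1 - f) * torch.exp(-self.b * Dt)
        return pred, (f, Dp, Dt)

# Unsupervised training loop: no ground-truth parameter maps are required,
# only the measured signals themselves (random data here as a stand-in).
if __name__ == "__main__":
    b_values = [0, 10, 20, 50, 100, 200, 400, 800]
    net = ConstrainedIVIMNet(b_values)
    optim = torch.optim.Adam(net.parameters(), lr=1e-3)
    signals = torch.rand(32, len(b_values))
    for _ in range(10):
        pred, _ = net(signals)
        loss = nn.functional.mse_loss(pred, signals)
        optim.zero_grad()
        loss.backward()
        optim.step()
```

Without the sigmoid constraint, the raw network outputs are unbounded, which is one way the self-supervised signal-fitting loss can drift or fail to converge; bounding the parameters keeps every candidate solution physically plausible during optimization.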

