Abstract #3231

Imputing Longitudinal Infant Brain MRI Features Using a Self-Supervised Transformer Model

Chenglin Ning1, Ruoke Zhao1, Yiwei Chen1, Haoan Xu1, Mingyang Li1, Tianshu Zheng1, Xinyi Xu1, Ruike Chen1, Yuqi Zhang1, Li Zhao1, and Dan Wu1
1Key Laboratory for Biomedical Engineering of Ministry of Education, Department of Biomedical Engineering, College of Biomedical Engineering & Instrument Science, Zhejiang University, Hangzhou, China

Synopsis

Keywords: Diagnosis/Prediction, Machine Learning/Artificial Intelligence

Motivation: Morphological changes of the infant brain provide critical information about brain development and disease progression. However, inevitable data loss at follow-up visits hinders longitudinal studies of neural development.

Goal(s): To develop a robust deep learning method that can impute missing MRI features in longitudinal studies.

Approach: A transformer-based model was proposed within a self-supervised learning framework, trained with a time-level imputation loss.

Results: The proposed model demonstrated superior performance, reducing the MAE on four cortical features by 55.7%, 55.8%, 28.1%, 12.9%, and 14.6% compared with the BRITS, USGAN, TimesNet, SAITS, and vanilla Transformer benchmarks, respectively. It also improved a downstream longitudinal prediction task by 19.1% in MSE.

Impact: The proposed model can impute missing data in longitudinal studies of infants, which may enrich the information along developmental trajectories and benefit downstream analyses.
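The self-supervised training scheme with a time-level imputation loss can be sketched as follows. This is a minimal NumPy illustration under assumed details: the masking rate, array shapes, and exact loss form are illustrative choices, not the authors' implementation, and the identity "model" is a stand-in for the transformer.

```python
import numpy as np

rng = np.random.default_rng(0)

def time_level_imputation_loss(pred, target, artificial_mask):
    """Mean absolute error computed only at positions that were
    artificially masked for self-supervision (hypothetical form;
    the abstract does not specify the exact loss)."""
    diff = np.abs(pred - target) * artificial_mask
    return diff.sum() / max(artificial_mask.sum(), 1)

# Toy longitudinal cortical features: (subjects, timepoints, features)
x = rng.normal(size=(4, 6, 3))
observed = np.ones_like(x)               # all entries observed in this toy

# Self-supervised trick: hide entire observed time points, then ask the
# model to reconstruct them (time-level masking, assumed 30% rate)
mask_t = rng.random((4, 6, 1)) < 0.3     # one mask decision per time point
artificial = observed * mask_t           # broadcast to feature dimension
x_input = x * (1 - artificial)           # model only sees the masked input

pred = x_input.copy()                    # stand-in for the transformer output
loss = time_level_imputation_loss(pred, x, artificial)
```

A perfect reconstruction drives the loss to zero, while the model is never penalized at positions that were missing to begin with, which is what lets the scheme train without complete follow-up data.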

