Abstract #4524

Using ResNet for 4-class T2-FLAIR slice classification based on the Cholinergic Pathways Hyperintensities Scale for pathological aging

Wei-Chun Kevin Tsai1, Yi-Chien Liu2, Ming-Chun Yu2, Chia-Ju Chou2, Sui-Hing Yan3, Yang-Teng Fan4, Yan-Hsiang Huang3, Yen-Ling Chiu5, Yi-Fang Chuang6,7, Ran-Zan Wang1, and Yao-Chia Shih4
1Department of Computer Science and Engineering, Yuan Ze University, Taoyuan City, Taiwan, 2Department of Neurology, Cardinal Tien Hospital, New Taipei City, Taiwan, 3Department of Neurology, Far Eastern Memorial Hospital, New Taipei City, Taiwan, 4Graduate Institute of Medicine, Yuan Ze University, Taoyuan City, Taiwan, 5Department of Medical Research, Far Eastern Memorial Hospital, New Taipei City, Taiwan, 6Department of Psychiatry, Far Eastern Memorial Hospital, New Taipei City, Taiwan, 7Institute of Public Health, National Yang Ming Chiao Tung University, Taipei, Taiwan

Synopsis

Keywords: Analysis/Processing, Machine Learning/Artificial Intelligence, T2-FLAIR, white matter hyperintensity, dementia, cholinergic pathway

Motivation: The Cholinergic Pathways Hyperintensities Scale (CHIPS) is a visual rating scale for evaluating the burden of cholinergic white matter hyperintensities on T2-FLAIR images, which reflects the severity of dementia. However, screening slices throughout the whole brain to select the 4 specific slices required for rating remains time-consuming.

Goal(s): To develop a deep-learning-based model that automatically selects the 4 slices required for CHIPS rating.

Approach: We used an ADNI T2-FLAIR dataset (N=150) to train a ResNet-based 4-class slice classification model (BSCA) and a local dataset (N=30) to test its performance.
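
A minimal sketch of what such a ResNet-based 4-class slice classifier could look like is given below, assuming a PyTorch/torchvision setup; the abstract does not specify the framework, ResNet depth, input size, or preprocessing, so the choice of ResNet-18, the single-channel first convolution, and all hyperparameters are illustrative assumptions rather than the authors' implementation.

import torch
import torch.nn as nn
from torchvision import models

def build_bsca_model(num_classes: int = 4) -> nn.Module:
    # Hypothetical configuration: adapt a ResNet-18 to classify single-channel
    # T2-FLAIR slices into the 4 CHIPS landmark classes.
    model = models.resnet18(weights=None)  # pretrained weights could be loaded instead
    # Accept 1-channel FLAIR slices instead of 3-channel RGB input.
    model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
    # Replace the 1000-way ImageNet head with a 4-way classification head.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

if __name__ == "__main__":
    model = build_bsca_model()
    dummy_batch = torch.randn(8, 1, 224, 224)   # batch of resized FLAIR slices
    logits = model(dummy_batch)                 # shape: (8, 4)
    loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 4, (8,)))
    print(logits.shape, loss.item())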

Results: Our model achieved an accuracy of 99.82% and an F1-score of 99.83%.
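
For clarity, the sketch below shows one way these metrics might be computed for a 4-class problem with scikit-learn; the abstract does not state how the F1-score is averaged across classes, so the macro-average used here is an assumption.

from sklearn.metrics import accuracy_score, f1_score

def evaluate(y_true, y_pred):
    # Accuracy and class-averaged F1 for the 4 CHIPS slice classes (labels 0-3).
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred, average="macro"),  # averaging choice assumed
    }

# Toy example with dummy labels:
print(evaluate([0, 1, 2, 3, 2, 0], [0, 1, 2, 3, 1, 0]))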

Impact: BSCA can serve as an automatic screening tool that efficiently provides the 4 T2-FLAIR slices covering the white matter landmarks along the cholinergic pathways, helping clinicians evaluate whether patients are at high risk of developing clinical dementia.
