Abstract #3387

Large Language Model Based Identification of Brain MRI Sequences

Radhika Bhalerao1, Harshita Kukreja2, and Andreas Rauschecker2
1UC Berkeley and UCSF, Berkeley, CA, United States, 2UCSF, San Francisco, CA, United States

Synopsis

Keywords: Language Models, Sequence Identification, AI

Motivation: Classifying MRI sequences automatically is critical for developing labeled datasets for deep learning applications in medical imaging.

Goal(s): This study evaluates the performance of large language models (LLMs) in classifying MRI sequences, comparing them to current methods such as CNNs and string matching, with a focus on accuracy and interpretability.

Approach: We applied a GPT-4-based LLM to classify 1490 brain MRI sequences from UCSF and compared it to CNN and string-matching classifiers using sensitivity, specificity, and accuracy.
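
As a rough illustration of this approach (a minimal sketch, not the authors' exact pipeline: the prompt wording, label set, and use of the DICOM series description as the model input are all assumptions), a GPT-4-based classification step might look like the following Python sketch:

    # Minimal sketch: map a free-text MRI series description to a sequence type
    # using the OpenAI chat API. Prompt and label set are illustrative assumptions.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    SEQUENCE_TYPES = ["T1", "T1 post-contrast", "T2", "FLAIR", "DWI", "ADC", "SWI", "Other"]

    def classify_series(series_description: str) -> str:
        """Ask the LLM to assign a series description to exactly one sequence type."""
        prompt = (
            "You are an expert neuroradiology technologist. "
            "Classify the following brain MRI series description into exactly one of: "
            f"{', '.join(SEQUENCE_TYPES)}.\n"
            f"Series description: {series_description}\n"
            "Answer with the label only."
        )
        response = client.chat.completions.create(
            model="gpt-4",  # model name is an assumption; the abstract only says "GPT-4-based"
            messages=[{"role": "user", "content": prompt}],
            temperature=0,
        )
        return response.choices[0].message.content.strip()

    print(classify_series("AX T2 FLAIR FS"))  # expected: "FLAIR"

One appeal of this formulation, consistent with the interpretability claim below, is that the same prompt can also ask the model to explain its choice, which a CNN or string matcher cannot do directly.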

Results: The LLM classifier outperformed both the CNN and string-matching methods, achieving 0.83 accuracy with high sensitivity and specificity across sequence types. Its interpretability offered additional insight, improving classification transparency and minimizing false positives.
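
For reference, the per-sequence sensitivity and specificity and the overall accuracy reported here can be derived from a multiclass confusion matrix, as sketched below (the labels and predictions are placeholders, not study data):

    # Sketch of the evaluation metrics named above, computed per class
    # in a one-vs-rest fashion from a confusion matrix.
    from sklearn.metrics import confusion_matrix, accuracy_score

    y_true = ["FLAIR", "T1", "T2", "FLAIR", "DWI", "T1"]
    y_pred = ["FLAIR", "T1", "T2", "T1", "DWI", "T1"]

    labels = sorted(set(y_true) | set(y_pred))
    cm = confusion_matrix(y_true, y_pred, labels=labels)

    for i, label in enumerate(labels):
        tp = cm[i, i]
        fn = cm[i, :].sum() - tp
        fp = cm[:, i].sum() - tp
        tn = cm.sum() - tp - fn - fp
        sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
        specificity = tn / (tn + fp) if (tn + fp) else float("nan")
        print(f"{label}: sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")

    print(f"overall accuracy={accuracy_score(y_true, y_pred):.2f}")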

Impact: LLMs provide a more accurate and interpretable approach for MRI sequence classification, offering clinicians and researchers a more reliable tool. This could enhance research workflows, reduce manual labeling time, and allow for more robust deep learning models in medical imaging.
