Abstract #2099

Automatic segmentation of spinal cord nerve rootlets

Jan Valosek1,2,3,4, Theo Mathieu1, Raphaëlle Schlienger5, Olivia Kowalczyk6,7, and Julien Cohen-Adad1,2,8,9
1NeuroPoly Lab, Polytechnique Montreal, Montreal, QC, Canada, 2Mila - Quebec AI Institute, Montreal, QC, Canada, 3Department of Neurosurgery, Faculty of Medicine and Dentistry, Palacký University Olomouc, Olomouc, Czech Republic, 4Department of Neurology, Faculty of Medicine and Dentistry, Palacký University Olomouc, Olomouc, Czech Republic, 5Laboratoire de Neurosciences Cognitives (UMR 7291), CNRS – Aix Marseille Université, Marseille, France, 6Department of Neuroimaging, Institute of Psychiatry, Psychology & Neuroscience, King’s College London, London, United Kingdom, 7Wellcome Centre for Human Neuroimaging, University College London, London, United Kingdom, 8Functional Neuroimaging Unit, CRIUGM, Université de Montréal, Montreal, QC, Canada, 9Centre de Recherche du CHU Sainte-Justine, Université de Montréal, Montreal, QC, Canada

Synopsis

Keywords: Analysis/Processing, Spinal Cord, Deep Learning, Nerve Rootlets, Segmentation

Motivation: Precise identification of spinal nerve rootlets is relevant for studying functional activity in the spinal cord.

Goal(s): Our goal was to develop a deep learning-based tool for the automatic segmentation of spinal nerve rootlets from multi-site T2-w images coupled with a method for the automatic identification of spinal levels.

Approach: Active learning was employed to iteratively train an nnUNet model to perform multi-class segmentation of spinal nerve rootlets.
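The spinal levels can then be identified from the multi-class rootlet segmentation by locating the superior-inferior extent of each labeled rootlet class. A minimal sketch of that step, assuming a 3D label volume in which each voxel value encodes the spinal level of the rootlet it belongs to (0 = background) and the last array axis is superior-inferior (the function name and array convention are illustrative, not the authors' actual implementation):

```python
import numpy as np

def spinal_levels_from_rootlets(rootlet_seg):
    """Return, for each spinal level present in a multi-class rootlet
    segmentation, the (min, max) slice range it covers along the last
    (superior-inferior) axis. Voxel value = spinal level, 0 = background."""
    levels = {}
    for level in np.unique(rootlet_seg):
        if level == 0:
            continue  # skip background
        # Indices along the S-I axis where this level's rootlets appear
        z = np.nonzero(rootlet_seg == level)[-1]
        levels[int(level)] = (int(z.min()), int(z.max()))
    return levels

# Toy volume: rootlets of levels 2 and 3 occupy distinct slice ranges
seg = np.zeros((4, 4, 10), dtype=np.uint8)
seg[1, 1, 2:5] = 2
seg[1, 1, 6:9] = 3
print(spinal_levels_from_rootlets(seg))  # {2: (2, 4), 3: (6, 8)}
```

In practice these per-level slice ranges would be projected onto the spinal cord segmentation to define spinal levels for groupwise registration, replacing the vertebral-level approximation.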

Results: The code/model is available on GitHub and is currently being validated by several laboratories worldwide.

Impact: Currently, most spinal cord fMRI studies use vertebral levels for groupwise registration, which is inaccurate because vertebral levels do not correspond directly to spinal levels. This new tool enables researchers to identify spinal levels via the automatic segmentation of nerve rootlets, improving the accuracy of fMRI analysis pipelines.

