Abstract #1290

Exploring the possibilities with deep learning to compute shape measures of the brain's white matter connections

Yui Lo1,2,3, Yuqian Chen1,2, Dongnan Liu3, Jon Haitz Legarreta1,2, Leo Zekelman2,4, Jarrett Rushmore5,6, Fan Zhang7, Yogesh Rathi1,2, Nikos Makris1,5, Alexandra J. Golby1,2, Weidong Cai3, and Lauren J. O'Donnell1,2
1Harvard Medical School, Boston, MA, United States, 2Brigham and Women’s Hospital, Boston, MA, United States, 3The University of Sydney, Sydney, Australia, 4Harvard University, Cambridge, MA, United States, 5Massachusetts General Hospital, Boston, MA, United States, 6Boston University, Boston, MA, United States, 7University of Electronic Science and Technology of China, Chengdu, China

Synopsis

Keywords: Analysis/Processing, Tractography, Shape

Motivation: Studies have shown the potential of tractography shape measures to provide insight into the brain’s structural connections and their relationship to human cognition. However, existing shape computation methods can be highly time-consuming when dealing with large-scale tractography datasets.

Goal(s): We investigate the possibility of using deep learning to compute shape measures of the brain's white matter connections.

Approach: We propose a novel framework that leverages a point cloud representation of tractography to compute shape measures.
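To make the approach concrete, the sketch below illustrates the general idea of treating a tractography bundle as a point cloud and regressing shape measures with a small point-cloud network. This is a minimal illustration, not the authors' released TractShapeNet code: the function names, network sizes, subsampling strategy, and the set of output measures are all illustrative assumptions.

```python
# Hedged sketch (assumed details, not the authors' implementation): represent a
# tractography bundle as a fixed-size point cloud and regress shape measures
# with a PointNet-style network (shared per-point MLP + global max pooling).
import numpy as np
import torch
import torch.nn as nn

def bundle_to_point_cloud(streamlines, n_points=2048, rng=None):
    """Pool all streamline coordinates and randomly subsample a fixed-size cloud."""
    if rng is None:
        rng = np.random.default_rng(0)
    pts = np.concatenate(streamlines, axis=0)            # (total_points, 3)
    idx = rng.choice(len(pts), size=n_points, replace=len(pts) < n_points)
    return pts[idx].astype(np.float32)                    # (n_points, 3)

class PointCloudShapeRegressor(nn.Module):
    """Permutation-invariant point-cloud regressor for bundle shape measures."""
    def __init__(self, n_measures=5):                     # n_measures is an assumption
        super().__init__()
        self.point_mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, n_measures),                    # e.g. length, span, volume, ...
        )

    def forward(self, x):                                  # x: (batch, n_points, 3)
        feat = self.point_mlp(x)                           # per-point features
        global_feat = feat.max(dim=1).values               # global max pooling
        return self.head(global_feat)                      # (batch, n_measures)

if __name__ == "__main__":
    # Synthetic bundle: 100 streamlines with varying numbers of 3-D points.
    rng = np.random.default_rng(0)
    streamlines = [rng.normal(size=(rng.integers(20, 80), 3)) for _ in range(100)]
    cloud = torch.from_numpy(bundle_to_point_cloud(streamlines)).unsqueeze(0)
    model = PointCloudShapeRegressor()
    print(model(cloud).shape)                              # torch.Size([1, 5])
```

A point-cloud formulation like this avoids rasterizing streamlines onto a voxel grid before computing shape measures, which is the intermediate step the abstract describes bypassing.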

Results: TractShapeNet outperforms other point cloud-based models for shape computation. Results demonstrate that our approach enables faster and more efficient shape-measure computation than the conventional DSI-Studio approach.

Impact: With our novel framework, TractShapeNet, we investigate the possibility of using deep learning to compute shape measures of the brain's white matter connections without intermediate steps that convert geometric tractography streamline data to a voxel-grid image representation.

