Abstract #1027

Deep Learning-based Fetal-Uterine Motion Modeling from Volumetric EPI Time Series

Muheng Li1, Yi Xiao1, Tingyin Liu2, Junshen Xu3, Esra Turk4, Borjan Gagoski4,5, Karen Ying1, Polina Golland2,3, P. Ellen Grant4,5, and Elfar Adalsteinsson3,6
1Department of Engineering Physics, Tsinghua University, Beijing, China, 2Computer Science and Artificial Intelligence Laboratory (CSAIL), Massachusetts Institute of Technology, Cambridge, MA, United States, 3Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA, United States, 4Fetal-Neonatal Neuroimaging and Developmental Science Center, Boston Children’s Hospital, Boston, MA, United States, 5Harvard Medical School, Boston, MA, United States, 6Institute for Medical Engineering and Science, Massachusetts Institute of Technology, Cambridge, MA, United States

We propose a three-dimensional convolutional neural network applied to echo planar imaging (EPI) time series of pregnant women for the automatic segmentation of the uterus (placenta excluded) and the fetal body. The segmentation results are used to build a dynamic model of the fetus for retrospective analyses. The 3D dynamic fetal-uterine motion model provides quantitative information on fetal motion characteristics for diagnostic purposes and may guide future fetal imaging strategies in which adaptive, online slice prescription is used to mitigate motion artifacts.
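As a rough illustration of the described pipeline, the following is a minimal sketch of a 3D convolutional segmentation network applied volume-by-volume to an EPI time series, from which per-time-point uterus and fetal-body masks are extracted. The abstract does not specify the architecture; the encoder-decoder layout, channel counts, class labels, and volume dimensions below are illustrative assumptions only.

    # Minimal sketch (assumptions): small 3D CNN producing per-voxel labels
    # {background, uterus, fetal body} from a single EPI volume.
    import torch
    import torch.nn as nn

    class TinySeg3D(nn.Module):
        def __init__(self, in_channels=1, num_classes=3, base=16):
            super().__init__()
            self.enc1 = nn.Sequential(
                nn.Conv3d(in_channels, base, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv3d(base, base, 3, padding=1), nn.ReLU(inplace=True),
            )
            self.down = nn.MaxPool3d(2)
            self.enc2 = nn.Sequential(
                nn.Conv3d(base, base * 2, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv3d(base * 2, base * 2, 3, padding=1), nn.ReLU(inplace=True),
            )
            self.up = nn.ConvTranspose3d(base * 2, base, 2, stride=2)
            self.dec1 = nn.Sequential(
                nn.Conv3d(base * 2, base, 3, padding=1), nn.ReLU(inplace=True),
            )
            self.head = nn.Conv3d(base, num_classes, 1)  # per-voxel class logits

        def forward(self, x):
            e1 = self.enc1(x)                     # full-resolution features
            e2 = self.enc2(self.down(e1))         # half-resolution features
            d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))  # skip connection
            return self.head(d1)

    # Segment each volume of the EPI time series independently, then derive
    # binary uterus / fetal-body masks per time point (hypothetical labels).
    if __name__ == "__main__":
        model = TinySeg3D().eval()
        epi_series = torch.randn(10, 1, 64, 64, 32)  # (time, channel, x, y, z), synthetic
        with torch.no_grad():
            labels = model(epi_series).argmax(dim=1)  # (time, x, y, z) label maps
        uterus_masks = labels == 1
        fetal_masks = labels == 2

The sequence of per-time-point masks could then be aligned or tracked over time to form the dynamic fetal-uterine motion model described above; that step is not shown here.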

