Abstract #1718

Automated Multi-Organ Segmentation in Fetal MRI

Adam Lim1,2, Matthias W. Wagner3,4, Birgit Ertl-Wagner3,4, Logi Vidarsson3,4, and Dafna Sussman1,2,5
1Department of Electrical, Computer and Biomedical Engineering, Toronto Metropolitan University, Toronto, ON, Canada, 2Institute for Biomedical Engineering, Science and Technology (iBEST) at Toronto Metropolitan University and St. Michael’s Hospital, Toronto, ON, Canada, 3Department of Diagnostic Imaging, Division of Neuroradiology, The Hospital for Sick Children, Toronto, ON, Canada, 4Department of Medical Imaging, University of Toronto, Toronto, ON, Canada, 5Department of Obstetrics and Gynecology, University of Toronto, Toronto, ON, Canada

Synopsis

Keywords: Analysis/Processing, Segmentation

Motivation: Fetal MRI is essential for monitoring development and clarifying inconclusive ultrasound results. Biometrics like estimated fetal weight, amniotic fluid volume, and placental volume indicate fetal and maternal health, yet assessing these currently requires time-intensive manual segmentation.

Goal(s): Create an automated 3D deep-learning model for accurate fetal MRI segmentation, enabling precise volume and weight estimations for timely diagnosis and monitoring.
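The volume estimates mentioned above follow directly from a segmentation mask: each labeled voxel contributes one voxel's worth of physical volume. A minimal sketch (not the authors' code; the function name and inputs are illustrative assumptions):

```python
def mask_volume_ml(mask, spacing_mm):
    """Estimate the volume of a binary segmentation mask in millilitres.

    mask: flat iterable of 0/1 voxel labels for one organ/region.
    spacing_mm: (dx, dy, dz) voxel dimensions in millimetres,
                taken from the MRI acquisition metadata.
    """
    dx, dy, dz = spacing_mm
    voxel_mm3 = dx * dy * dz          # physical volume of one voxel
    n_voxels = sum(1 for v in mask if v)
    return n_voxels * voxel_mm3 / 1000.0  # 1 mL = 1000 mm^3
```

For example, 1000 labeled voxels at 1 mm isotropic spacing correspond to 1 mL; fetal weight is then typically derived from body volume via an assumed tissue density.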

Approach: Developed a U-Net-based neural network with attention mechanisms and multi-level feature extraction, trained on a dataset of 58 healthy pregnancies.

Results: Achieved Dice similarity coefficient (DSC) scores of 95.06% for the fetal body, 95.50% for amniotic fluid, and 88.27% for the placenta, outperforming other segmentation networks.
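The DSC used above measures overlap between a predicted mask and a manual reference: twice the intersection divided by the sum of the two mask sizes. A minimal sketch of the standard metric (illustrative only, not the authors' evaluation code):

```python
def dice_score(pred, truth):
    """Dice similarity coefficient between two binary masks.

    pred, truth: equal-length flat sequences of 0/1 voxel labels.
    Returns a value in [0, 1]; 1.0 means perfect overlap.
    """
    intersection = sum(1 for p, t in zip(pred, truth) if p and t)
    total = sum(pred) + sum(truth)
    # Convention: two empty masks count as a perfect match.
    return 2.0 * intersection / total if total else 1.0
```

A DSC of 95% thus means the predicted and manual masks agree on the large majority of voxels, with disagreement confined to roughly the boundary region.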

Impact: The Fetal MRI Segmentation Network (FetSegNet) enables precise segmentation of the fetal body, amniotic fluid, and placenta. It improves clinical efficiency, supports more accurate pregnancy monitoring, and paves the way for better maternal-fetal health diagnostics and a deeper understanding of fetal development.

