Abstract #5188

Comparison of activation functions for optimizing deep learning models solving QSM-based dipole inversion

Simon Graf¹, Nora Küchler¹, Walter Wohlgemuth¹, and Andreas Deistung¹
¹University Hospital Halle (Saale), Halle (Saale), Germany

Synopsis

Keywords: Machine Learning/Artificial Intelligence, Quantitative Susceptibility Mapping

Deploying deep learning models for quantitative susceptibility mapping (QSM) depends on optimizing hyper-parameters and choosing suitable network architectures. We investigated the impact of activation functions on network training. ELU, leaky ReLU, and ReLU models with 16 and 32 initial channels were tested for solving dipole inversion on synthetic susceptibility data. All models converged after 100 training epochs. However, the 16-channel ELU model already achieved low losses after only 20 training epochs and showed reconstruction performance similar to the 32-channel ELU model. Using the ELU activation thus permits smaller network models, reducing memory requirements and training time.
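The abstract does not specify the network implementation. As a minimal sketch of the comparison it describes, the block below builds a PyTorch-style 3D convolutional stage whose activation function (ELU, leaky ReLU, or ReLU) and initial channel count (16 or 32) are configurable; the layer composition, normalization, and input size here are assumptions for illustration, not the authors' architecture.

```python
# Minimal sketch (not the authors' code): a conv block whose activation
# and channel width are the hyper-parameters compared in the abstract.
import torch
import torch.nn as nn

ACTIVATIONS = {
    "elu": nn.ELU,                # smooth for x < 0, saturates at -1
    "leaky_relu": nn.LeakyReLU,   # small non-zero slope for x < 0
    "relu": nn.ReLU,              # zero gradient for x < 0
}

def conv_block(in_ch: int, out_ch: int, activation: str) -> nn.Sequential:
    """3D convolution -> normalization -> activation; a typical building
    block for dipole-inversion networks (details assumed)."""
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.InstanceNorm3d(out_ch),
        ACTIVATIONS[activation](),
    )

# First encoder stage of a 16-channel ELU model vs. a 32-channel ReLU
# model, applied to a single-channel local field patch.
elu_16 = conv_block(1, 16, "elu")
relu_32 = conv_block(1, 32, "relu")

x = torch.randn(1, 1, 64, 64, 64)   # dummy field-map patch
print(elu_16(x).shape)    # torch.Size([1, 16, 64, 64, 64])
print(relu_32(x).shape)   # torch.Size([1, 32, 64, 64, 64])
```

One plausible reading of the reported result: because ELU passes a smooth, non-zero gradient for negative inputs, it avoids the dead-unit problem of ReLU, which may explain why the narrower 16-channel ELU model trained faster without losing reconstruction quality.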
