Abstract #1952

Breast tumor segmentation network based on local attention

Binze Han1,2, Long Yang2, Heng Zhang2,3, Zhou Liu4, Meng Wang4, Ya Ren4, Qian Yang4, Wei Cui5, Ye Li2,6,7, Dong Liang2,6,7, Xin Liu2,6,7, Hairong Zheng2,6,7, and Na Zhang2,6,7
1Southern University of Science and Technology (SUSTech), Shenzhen, China, 2Paul C. Lauterbur Research Center for Biomedical Imaging, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China, 3Faculty of Robot Science and Engineering, Northeastern University, Shenyang, China, 4National Cancer Center/National Clinical Research Center for Cancer/Cancer Hospital & Shenzhen Hospital, Chinese Academy of Medical Sciences and Peking Union Medical College, Shenzhen, China, 5MR Research, GE Healthcare, Beijing, China, 6Key Laboratory of Biomedical Imaging Science and System, Chinese Academy of Sciences, Shenzhen, China, 7United Imaging Research Institute of Innovative Medical Equipment, Shenzhen, China

Synopsis

Keywords: Analysis/Processing, Cancer, Breast

Motivation: Accurate segmentation of the lesion region is the first step toward early diagnosis. Transformers offer highly competitive segmentation performance, but at extremely high computational cost.

Goal(s): To find an efficient, computationally inexpensive way to apply transformers to medical image segmentation, which remains a major challenge.

Approach: We adopt a shifted local self-attention mechanism for feature extraction, which reduces computational complexity while achieving high segmentation accuracy.
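
The abstract does not give implementation details, so the following PyTorch sketch only illustrates the general shifted-window local self-attention idea (as popularized by Swin Transformer), not the authors' exact network. The class and function names (WindowAttention, shifted_window_attention), the layout assumptions, and the hyperparameters are illustrative; the cross-window attention mask used in full Swin-style implementations is omitted for brevity.

```python
import torch
import torch.nn as nn


class WindowAttention(nn.Module):
    """Multi-head self-attention restricted to non-overlapping local windows."""

    def __init__(self, dim, window_size, num_heads):
        super().__init__()
        self.window_size = window_size
        self.num_heads = num_heads
        self.scale = (dim // num_heads) ** -0.5
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (B, H, W, C) feature map; H and W assumed divisible by window_size
        B, H, W, C = x.shape
        ws = self.window_size

        # Partition the feature map into (B * num_windows, ws*ws, C) local windows
        x = x.view(B, H // ws, ws, W // ws, ws, C)
        windows = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)

        # Scaled dot-product attention computed only within each window,
        # so cost scales with the window area rather than the full image area.
        qkv = self.qkv(windows).reshape(-1, ws * ws, 3, self.num_heads, C // self.num_heads)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)  # each: (B*nW, heads, ws*ws, head_dim)
        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(-1, ws * ws, C)
        out = self.proj(out)

        # Reverse the window partition back to (B, H, W, C)
        out = out.view(B, H // ws, W // ws, ws, ws, C)
        return out.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)


def shifted_window_attention(x, attn: WindowAttention, shift: int):
    """Cyclically shift the feature map before windowing so that information
    can flow across window boundaries, then shift it back afterwards."""
    if shift > 0:
        x = torch.roll(x, shifts=(-shift, -shift), dims=(1, 2))
    x = attn(x)
    if shift > 0:
        x = torch.roll(x, shifts=(shift, shift), dims=(1, 2))
    return x
```

With a window of size ws, attention is computed over ws*ws tokens per window, so the cost grows roughly linearly with the number of windows (and hence with image size) instead of quadratically with the total number of tokens as in global self-attention; alternating shifted and non-shifted windows restores cross-window interaction.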

Results: Experimental results on a dataset comprising 130 breast tumor cases demonstrate that the proposed network accurately segments breast tumors, surpassing the accuracy of many other convolution-based or transformer-based networks.

Impact: This study may inspire the design of simpler, more efficient components that reduce the computational cost of self-attention while preserving long-range modeling. High-precision automatic segmentation can ease clinicians' workload by reducing manual image annotation.

