Edge-guided multi-scale adaptive feature fusion network for liver tumor segmentation

Bibliographic Details
Published in: Scientific Reports, Vol. 14, No. 1, Article 28370 (14 pages)
Main Authors: Zhang, Tiange; Liu, Yuefeng; Zhao, Qiyan; Xue, Guoyue; Shen, Hongyu
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 17-11-2024
Description
Summary: Automated segmentation of liver tumors on CT scans is essential for aiding diagnosis and assessing treatment. Computer-aided diagnosis can reduce the costs and errors associated with manual processes and ensure accurate, reliable clinical assessments. However, liver tumors in CT images vary significantly in size and have fuzzy boundaries, making accurate segmentation difficult for existing methods. Therefore, this paper proposes MAEG-Net, a multi-scale adaptive feature fusion liver tumor segmentation network based on edge guidance. Specifically, we design a multi-scale adaptive feature fusion module that incorporates multi-scale information to better guide the segmentation of tumors of different sizes. Additionally, to address blurred tumor boundaries, we introduce an edge-aware guidance module to improve the model's feature learning under these conditions. Evaluation results on the liver tumor dataset (LiTS2017) show that our method achieves a Dice coefficient of 71.84% and a VOE of 38.64%, the best performance among the compared methods for liver tumor segmentation in CT images.
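
The summary reports two standard volumetric overlap metrics: the Dice coefficient and the volumetric overlap error (VOE). For reference, below is a minimal NumPy sketch of how these metrics are conventionally computed from binary segmentation masks; the function names and the eps smoothing term are illustrative choices, not code from the paper.

import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    # Dice = 2|P & T| / (|P| + |T|); 1.0 means perfect overlap.
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

def volumetric_overlap_error(pred, target, eps=1e-7):
    # VOE = 1 - |P & T| / |P | T|; 0.0 means perfect overlap (lower is better).
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return 1.0 - intersection / (union + eps)

# Example: a prediction covering the ground-truth voxel plus one extra voxel.
pred = np.array([[1, 1], [0, 0]])
target = np.array([[1, 0], [0, 0]])
print(f"Dice: {dice_coefficient(pred, target):.4f}")          # 0.6667
print(f"VOE:  {volumetric_overlap_error(pred, target):.4f}")  # 0.5000

Note that Dice and VOE both derive from the same set overlaps, so the reported pair (Dice 71.84%, VOE 38.64%) is internally consistent: a higher Dice implies a lower VOE.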
ISSN: 2045-2322
DOI: 10.1038/s41598-024-79379-y