HMT-UNet: A Hybrid Mamba-Transformer Vision UNet for Medical Image Segmentation

Main Authors: | , , , |
---|---|
Format: | Journal Article |
Language: | English |
Published: | 20-08-2024 |
Subjects: | |
Summary: | In the field of medical image segmentation, models based on both CNNs and Transformers have been thoroughly investigated. However, CNNs have limited capacity for modeling long-range dependencies, making it difficult to fully exploit the semantic information within images, while Transformers suffer from quadratic computational complexity. State Space Models (SSMs) such as Mamba have been recognized as a promising alternative: they not only model long-range interactions well but also preserve linear computational complexity. A carefully designed hybrid of SSM and Transformer mechanisms can further enhance the efficient modeling of visual features, and extensive experiments have demonstrated that integrating self-attention into the hybrid part behind the Mamba layers greatly improves the capacity to capture long-range spatial dependencies. In this paper, leveraging this hybrid SSM mechanism, we propose a U-shaped architecture for medical image segmentation, named Hybrid Mamba-Transformer Vision UNet (HMT-UNet). We conduct comprehensive experiments on the public ISIC17, ISIC18, CVC-300, CVC-ClinicDB, Kvasir, CVC-ColonDB, and ETIS-Larib PolypDB datasets, as well as on the private ZD-LCI-GIM dataset. The results indicate that HMT-UNet achieves competitive performance on medical image segmentation tasks. Our code is available at https://github.com/simzhangbest/HMT-Unet. |
DOI: | 10.48550/arxiv.2408.11289 |
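
As a rough illustration of the hybrid mechanism described in the summary (a Mamba-style state-space mixer followed by self-attention within one block), here is a minimal, hypothetical PyTorch sketch. It is not the authors' HMT-UNet implementation; the module names, the simplified diagonal SSM, and all hyperparameters are assumptions made for illustration only.

```python
# Hypothetical sketch only -- not the authors' HMT-UNet code. It illustrates the
# idea from the abstract: a linear-time, Mamba-style state-space mixer over the
# token sequence, followed by multi-head self-attention in the same block.
import torch
import torch.nn as nn


class SimpleSSM(nn.Module):
    """Toy diagonal state-space scan with input-dependent B/C (a stand-in for Mamba)."""

    def __init__(self, dim: int, state: int = 16):
        super().__init__()
        self.in_proj = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        # Log-decay per (channel, state); initialised negative so exp() < 1 (stable scan).
        self.A = nn.Parameter(-torch.rand(dim, state))
        self.B = nn.Linear(dim, state)
        self.C = nn.Linear(dim, state)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, tokens, dim)
        u = self.in_proj(x)
        g = torch.sigmoid(self.gate(x))          # simple gating, as in gated SSM blocks
        decay = torch.exp(self.A)                # (dim, state)
        Bx, Cx = self.B(x), self.C(x)            # input-dependent projections: (b, t, state)
        h = x.new_zeros(x.size(0), x.size(2), self.A.size(1))  # hidden state (b, dim, state)
        ys = []
        for t in range(x.size(1)):               # sequential scan: O(tokens), not O(tokens^2)
            h = decay * h + u[:, t].unsqueeze(-1) * Bx[:, t].unsqueeze(1)
            ys.append((h * Cx[:, t].unsqueeze(1)).sum(-1))   # read-out: (b, dim)
        y = torch.stack(ys, dim=1)               # (b, tokens, dim)
        return self.out_proj(y * g)


class HybridBlock(nn.Module):
    """SSM mixing first, then self-attention: the 'attention behind the Mamba layers' idea."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.ssm = SimpleSSM(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.ssm(self.norm1(x))          # linear-complexity long-range mixing
        a = self.norm2(x)
        x = x + self.attn(a, a, a, need_weights=False)[0]  # self-attention on top
        return x


if __name__ == "__main__":
    tokens = torch.randn(2, 196, 64)             # e.g. 14x14 patch tokens with 64 channels
    print(HybridBlock(64)(tokens).shape)         # torch.Size([2, 196, 64])
```

In a full U-shaped segmentation network, blocks of this kind would sit in the encoder and decoder stages with skip connections between matching resolutions; for the actual architecture and training details, see the authors' repository linked above.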