A Novel Multi-Task Learning Network Based on Melanoma Segmentation and Classification with Skin Lesion Images

Bibliographic Details
Published in: Diagnostics (Basel), Vol. 13, No. 2, p. 262
Main Authors: Alenezi, Fayadh; Armghan, Ammar; Polat, Kemal
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 10-01-2023
Description
Summary: Melanoma is known worldwide as a malignant tumor and the fastest-growing type of skin cancer; it is a life-threatening disease with a high mortality rate. Automatic melanoma detection improves early diagnosis and, with it, the survival rate. To this end, we present a multi-task learning approach to melanoma recognition from dermoscopy images. First, an effective pre-processing step based on max pooling, contrast, and shape filters removes hair details and enhances the images. Next, the lesion region is segmented from the enhanced images with a VGGNet-based fully convolutional network (FCN) architecture, and the detected lesion is cropped. The cropped images are then converted to the classifier's input size with a very deep super-resolution (VDSR) neural network, minimizing the loss of image resolution. Finally, a deep learning network based on pre-trained convolutional neural networks is developed for melanoma classification. The experiments use the publicly available International Skin Imaging Collaboration (ISIC) dermoscopic skin lesion dataset. For segmentation of the lesion region, the accuracy, specificity, precision, and sensitivity were 96.99%, 92.53%, 97.65%, and 98.41%, respectively; for classification, the corresponding values were 97.73%, 99.83%, 99.83%, and 95.67%.
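The abstract describes a staged pipeline: pre-processing, VGGNet/FCN lesion segmentation, cropping, super-resolution resizing, and CNN classification. The following Python/PyTorch sketch illustrates one plausible wiring of these stages. It is not the authors' implementation: the VGG16 encoder with a single FCN-32s-style head, the ResNet-50 stand-in classifier, the bicubic resize in place of the trained VDSR network, and all layer sizes and thresholds are assumptions made only to show how the stages connect.

# Minimal sketch of the segmentation-then-classification pipeline (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

class VGGFCNSegmenter(nn.Module):
    """VGG16 encoder with a small FCN-style head producing a 1-channel lesion mask."""
    def __init__(self):
        super().__init__()
        self.encoder = models.vgg16(weights="IMAGENET1K_V1").features  # 1/32-resolution features
        self.head = nn.Sequential(
            nn.Conv2d(512, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, 1, kernel_size=1),
        )

    def forward(self, x):
        scores = self.head(self.encoder(x))
        # Upsample the coarse score map back to the input resolution (FCN-32s style).
        return torch.sigmoid(F.interpolate(scores, size=x.shape[-2:],
                                           mode="bilinear", align_corners=False))

def crop_lesion(image, mask, threshold=0.5):
    """Crop the image to the bounding box of the thresholded lesion mask."""
    ys, xs = torch.where(mask[0, 0] > threshold)
    if ys.numel() == 0:                       # nothing detected: keep the full image
        return image
    y0, y1 = ys.min().item(), ys.max().item() + 1
    x0, x1 = xs.min().item(), xs.max().item() + 1
    return image[..., y0:y1, x0:x1]

segmenter = VGGFCNSegmenter().eval()
classifier = models.resnet50(weights="IMAGENET1K_V2")     # stand-in pre-trained classifier
classifier.fc = nn.Linear(classifier.fc.in_features, 2)   # melanoma vs. benign head
classifier.eval()

with torch.no_grad():
    image = torch.rand(1, 3, 512, 512)        # a pre-processed (hair-free, enhanced) image
    mask = segmenter(image)                   # step 1: lesion segmentation
    lesion = crop_lesion(image, mask)         # step 2: crop the detected lesion
    # Step 3: the paper upscales crops with a VDSR network; plain bicubic
    # interpolation is used here only as a placeholder for that step.
    lesion = F.interpolate(lesion, size=(224, 224), mode="bicubic", align_corners=False)
    logits = classifier(lesion)               # step 4: melanoma classification

In the paper itself the classifier is fine-tuned from pre-trained CNN weights and the resizing step is a trained VDSR model; the sketch only fixes the order of operations between segmentation, cropping, resizing, and classification.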
ISSN: 2075-4418
DOI: 10.3390/diagnostics13020262