Feature fusion and Ensemble learning-based CNN model for mammographic image classification
Published in: Journal of King Saud University - Computer and Information Sciences, Vol. 34, No. 6, pp. 3310-3318
Main Authors: , , , ,
Format: Journal Article
Language: English
Published: Elsevier B.V., 01-06-2022
Summary: In recent times, the world has faced an alarming situation regarding breast cancer patients. Early diagnosis of this deadly disease can make treatment more accessible and effective. In this regard, a Computer-Aided Diagnosis (CAD) system can assist radiologists in distinguishing normal from abnormal tissue and diagnosing pathological stages. The classification task is challenging in CAD systems because of noisy, low-contrast mammogram images, variations in tumor shape and location, and the high resemblance between normal and tumor regions of interest (ROIs). We propose a novel deep convolutional neural network (DCNN) approach based on feature fusion and ensemble learning strategies to improve the detection and classification of abnormalities in mammographic scans. Feature fusion helps to properly detect discriminative features between the classes, while ensemble learning in the last block classifies the normal and tumor ROIs more reliably. Moreover, the role of spatial dropout and depthwise separable convolution is investigated for mammogram classification to better address overfitting and the small-dataset problem common in medical imaging. The proposed model is evaluated on two publicly available datasets, MIAS and BCDR, achieving high sensitivity, specificity, and accuracy of 0.995, 0.994, and 0.994, respectively, on the MIAS dataset.
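The paper's full architecture is not reproduced in this record. As a minimal illustration of one building block the abstract mentions, the following NumPy sketch implements a depthwise separable convolution (a per-channel depthwise stage followed by a 1x1 pointwise stage) and compares its parameter count with a standard convolution; the kernel sizes and channel counts are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def depthwise_separable_conv(x, depthwise_k, pointwise_w):
    """Depthwise separable convolution: per-channel spatial filtering
    followed by a 1x1 pointwise mix across channels.

    x:            (H, W, C_in) input feature map
    depthwise_k:  (k, k, C_in) one spatial kernel per input channel
    pointwise_w:  (C_in, C_out) 1x1 convolution weights
    Returns a (H-k+1, W-k+1, C_out) map ('valid' padding, stride 1).
    """
    H, W, C_in = x.shape
    k = depthwise_k.shape[0]
    Ho, Wo = H - k + 1, W - k + 1
    # Depthwise stage: each input channel is convolved with its own kernel.
    dw = np.zeros((Ho, Wo, C_in))
    for c in range(C_in):
        for i in range(Ho):
            for j in range(Wo):
                dw[i, j, c] = np.sum(x[i:i+k, j:j+k, c] * depthwise_k[:, :, c])
    # Pointwise stage: a 1x1 conv mixes channels at every spatial location.
    return dw @ pointwise_w

# Why this reduces parameters (illustrative numbers, not from the paper):
k, C_in, C_out = 3, 32, 64
standard_params = k * k * C_in * C_out          # full k x k x C_in filter per output channel
separable_params = k * k * C_in + C_in * C_out  # depthwise kernels + pointwise weights
```

The parameter saving (here 2,336 weights versus 18,432) is one reason depthwise separable convolutions are attractive for small medical-imaging datasets, where fewer weights mean less risk of overfitting.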
ISSN: 1319-1578, 2213-1248
DOI: 10.1016/j.jksuci.2022.03.023