Fully automatic model‐based segmentation and classification approach for MRI brain tumor using artificial neural networks

Bibliographic Details
Published in: Concurrency and Computation Vol. 32; no. 1
Main Authors: Arunkumar, N., Mohammed, Mazin Abed, Mostafa, Salama A., Ibrahim, Dheyaa Ahmed, Rodrigues, Joel J.P.C., Albuquerque, Victor Hugo C.
Format: Journal Article
Language: English
Published: Hoboken: Wiley Subscription Services, Inc., 10-01-2020
Description
Summary: The accuracy of brain tumor diagnosis based on medical images is greatly affected by the segmentation process, which determines the tumor's shape, location, size, and texture. In this study, we propose a new segmentation approach for brain tissues using MR images. The method includes three computer vision strategies: image enhancement, image segmentation, and filtering out non-ROI regions based on texture and HOG features. A fully automatic, model-based, trainable segmentation and classification approach for MRI brain tumors using artificial neural networks precisely identifies the location of the ROI. The non-ROI filtering step uses histogram analysis to discard non-ROI regions and select the correct object in the brain MRI, and the tumor type is then identified from the texture features. A total of 200 MRI cases are used to compare the automatic and manual segmentation procedures. The analysis of the outcomes shows that the fully automatic, model-based, trainable segmentation outperforms the manual method, as does tumor identification using the ROI texture features. The recorded identification accuracy is 92.14%, with 89% sensitivity and 94% specificity.
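The abstract's histogram-analysis step for separating ROI from non-ROI can be illustrated with a minimal intensity-thresholding sketch. This is not the authors' exact pipeline, only a common histogram-based approach (Otsu's method) under the assumption of a grayscale MR slice; the function and variable names are illustrative.

```python
import numpy as np

def otsu_threshold(img):
    """Pick the intensity threshold that maximizes between-class variance
    of the image histogram (Otsu's method)."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    prob = hist / img.size
    global_mean = np.sum(np.arange(256) * prob)
    best_t, best_var = 0, 0.0
    cum_p, cum_mean = 0.0, 0.0
    for t in range(256):
        cum_p += prob[t]        # probability mass of the "background" class
        cum_mean += t * prob[t]  # cumulative intensity mean up to t
        if cum_p == 0.0 or cum_p == 1.0:
            continue
        between = (global_mean * cum_p - cum_mean) ** 2 / (cum_p * (1.0 - cum_p))
        if between > best_var:
            best_var, best_t = between, t
    return best_t

def segment_roi(img):
    """Binary mask of candidate ROI pixels (brighter than the Otsu threshold)."""
    return (img > otsu_threshold(img)).astype(np.uint8)

# Synthetic example: a dark background with a bright square "lesion".
img = np.full((64, 64), 40, dtype=np.uint8)
img[20:40, 20:40] = 200
mask = segment_roi(img)
```

In the paper's pipeline such a mask would only be a candidate: the texture and HOG features of each connected region would then be fed to the neural network to keep the true ROI and reject the rest.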
ISSN:1532-0626
1532-0634
DOI:10.1002/cpe.4962