Fully automatized parallel segmentation of the optic disc in retinal fundus images

Bibliographic Details
Published in:Pattern Recognition Letters, Vol. 83, No. 1, pp. 99–107
Main Authors: Díaz-Pernil, Daniel; Fondón, Irene; Peña-Cantillana, Francisco; Gutiérrez-Naranjo, Miguel A.
Format: Journal Article
Language:English
Published: Amsterdam: Elsevier B.V., 01-11-2016
Elsevier Science Ltd
Description
Summary:
•This paper presents fully automatic software for the localization of the optic disc in retinal fundus color images.
•The developed method is a hybrid of well-known algorithms (Hough transform) with new ideas (the AGP-color segmentator, based on Membrane Computing).
•It has been programmed in parallel and implemented with Graphics Processing Unit (GPU) technology.
•A comparison with 13 state-of-the-art algorithms shows a significant improvement in terms of quality.
•A drastic improvement is obtained in terms of accuracy and consumed time.

This paper presents fully automatic parallel software for the localization of the optic disc (OD) in retinal fundus color images. A new method has been implemented with Graphics Processing Unit (GPU) technology. Image edges are extracted using a new operator, called the AGP-color segmentator. The resulting image is binarized with Hamadani's technique and, finally, a new algorithm called the Hough circle cloud is applied for the detection of the OD. The reliability of the tool has been tested with 129 images from the public databases DRIVE and DIARETDB1, obtaining an average accuracy of 99.6% and a mean consumed time per image of 7.6 and 16.3 s, respectively. A comparison with several state-of-the-art algorithms shows that our algorithm represents a significant improvement in terms of accuracy and efficiency.
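The abstract describes a three-stage pipeline: edge extraction, binarization of the edge map, and circular Hough detection of the OD. The sketch below illustrates a pipeline of that same shape using standard OpenCV operators (Sobel gradients, Otsu thresholding, cv2.HoughCircles) as stand-ins; it is not the authors' AGP-color segmentator, Hamadani binarization, or Hough circle cloud, it is not GPU-parallel, and all parameter values are illustrative assumptions.

```python
# Illustrative stand-in pipeline: edge map -> binarization -> circular Hough.
# NOT the paper's method; parameters are hypothetical values for a typical
# fundus image and would need tuning per database (e.g., DRIVE, DIARETDB1).
import cv2
import numpy as np

def detect_optic_disc(path):
    bgr = cv2.imread(path)                            # retinal fundus color image
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                    # suppress vessel noise

    # 1) Edge extraction (gradient magnitude stands in for the AGP-color segmentator).
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)
    edges = cv2.convertScaleAbs(cv2.magnitude(gx, gy))

    # 2) Binarization (Otsu's threshold stands in for Hamadani's technique).
    _, binary = cv2.threshold(edges, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)

    # 3) Circular Hough transform (a single HoughCircles call stands in for the
    #    Hough circle cloud). OpenCV's gradient method works on the grayscale
    #    image; the binary edge map above only mirrors the pipeline's shape.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=2, minDist=200,
                               param1=150, param2=60, minRadius=30, maxRadius=120)
    if circles is None:
        return None                                   # no OD candidate found
    x, y, r = np.round(circles[0, 0]).astype(int)     # strongest circle = OD estimate
    return (x, y), r
```

In the paper itself, the edge and binarization stages run in parallel on the GPU and the circle detection aggregates many Hough circles (the "cloud"), which is what yields the reported accuracy and timing; the sketch only conveys the overall structure.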
ISSN:0167-8655
1872-7344
DOI:10.1016/j.patrec.2016.04.025