A Region-Growing Segmentation Algorithm for GPUs
Published in: IEEE Geoscience and Remote Sensing Letters, Vol. 10, No. 6, pp. 1612-1616
Main Authors: , , ,
Format: Journal Article
Language: English
Published: IEEE, 01-11-2013
Subjects:
Summary: This letter proposes a parallel version, for graphics processing units (GPUs), of a region-growing image segmentation algorithm widely used by the geographic object-based image analysis (GEOBIA) community. Initially, all image pixels are considered seeds, or primitive segments. Fine-grained parallel threads assigned to individual pixels merge adjacent segments iteratively, always minimizing the overall increase in heterogeneity. Besides spectral features, the merging criterion considers morphological features that can be computed efficiently on the underlying GPU architecture. Two alternatives using different merging criteria are proposed and tested. An experimental analysis on five test images shows that the parallel algorithm runs up to 19 times faster than its sequential counterpart.
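The region-growing idea described in the summary can be illustrated with a minimal sequential sketch. This is a hypothetical illustration only, not the authors' algorithm: the heterogeneity measure (area-weighted standard deviation of intensities), the stopping threshold `max_cost`, and all function names are assumptions, and the GPU parallelization and morphological features of the paper are omitted.

```python
import numpy as np

def segment(image, max_cost=5.0):
    """Greedy region growing (illustrative sketch): every pixel starts as its
    own segment; repeatedly merge the 4-adjacent segment pair whose merge
    least increases total heterogeneity, until the cheapest merge exceeds
    max_cost (an assumed threshold)."""
    h, w = image.shape
    labels = np.arange(h * w).reshape(h, w)                # one segment per pixel
    values = {i: [v] for i, v in enumerate(image.ravel().astype(float))}

    def hetero(vals):
        # assumed spectral criterion: area-weighted standard deviation
        return len(vals) * float(np.std(vals))

    while True:
        # scan all 4-adjacent segment pairs for the cheapest merge
        best = None
        for y in range(h):
            for x in range(w):
                for dy, dx in ((0, 1), (1, 0)):
                    ny, nx = y + dy, x + dx
                    if ny < h and nx < w and labels[y, x] != labels[ny, nx]:
                        a, b = labels[y, x], labels[ny, nx]
                        cost = (hetero(values[a] + values[b])
                                - hetero(values[a]) - hetero(values[b]))
                        if best is None or cost < best[0]:
                            best = (cost, a, b)
        if best is None or best[0] > max_cost:
            break                                          # no affordable merge left
        _, a, b = best
        values[a].extend(values.pop(b))                    # absorb segment b into a
        labels[labels == b] = a
    return labels
```

In the paper's GPU version, by contrast, one thread per pixel evaluates candidate merges concurrently; this sketch only shows the sequential merge criterion, e.g. an image with two flat intensity regions collapses to exactly two segments.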
ISSN: 1545-598X, 1558-0571
DOI: 10.1109/LGRS.2013.2272665