Spectral–Spatial Classification of Hyperspectral Imagery with 3D Convolutional Neural Network

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), Vol. 9, No. 1, p. 67
Main Authors: Li, Ying; Zhang, Haokui; Shen, Qiang
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01-01-2017
Description
Summary: Recent research has shown that exploiting spectral-spatial information can considerably improve the performance of hyperspectral image (HSI) classification. HSI data are typically presented as 3D cubes, so 3D spatial filtering naturally offers a simple and effective way to extract spectral and spatial features from such images simultaneously. In this paper, a 3D convolutional neural network (3D-CNN) framework is proposed for accurate HSI classification. The proposed method operates on the full HSI cube directly, without relying on any preprocessing or post-processing, and effectively extracts deep, combined spectral-spatial features. In addition, it requires fewer parameters than other deep learning based methods, so the model is lighter, less prone to over-fitting, and easier to train. For comparison and validation, the proposed method is tested alongside three other deep learning based HSI classification methods, namely stacked autoencoder (SAE), deep belief network (DBN), and 2D-CNN based methods, on three real-world HSI datasets captured by different sensors. Experimental results demonstrate that the 3D-CNN based method outperforms these state-of-the-art methods and sets a new record.
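The abstract describes learning joint spectral-spatial features by convolving 3D kernels over the HSI cube, with the spectral axis treated as the depth dimension. Below is a minimal sketch of that idea in PyTorch; the layer counts, kernel sizes, 5x5 patch size, and the 103-band / 9-class setting (borrowed from the Pavia University scene purely for illustration) are assumptions for this example, not the architecture reported in the paper.

```python
# Minimal 3D-CNN sketch for spectral-spatial HSI classification.
# Illustrative only: layer widths, kernel sizes, and the band/class
# counts are assumptions, not the paper's reported configuration.
import torch
import torch.nn as nn

class Simple3DCNN(nn.Module):
    def __init__(self, num_classes: int = 9):
        super().__init__()
        # Input shape: (batch, 1, bands, height, width). The spectral
        # axis is the depth dimension, so each 3D kernel is a joint
        # spectral-spatial filter.
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3)),
            nn.ReLU(inplace=True),
            nn.Conv3d(8, 16, kernel_size=(5, 3, 3)),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),  # collapse each cube to one feature vector
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: classify a batch of 5x5 spatial patches with 103 spectral bands.
model = Simple3DCNN(num_classes=9)
patches = torch.randn(4, 1, 103, 5, 5)  # (batch, channel, bands, h, w)
logits = model(patches)
print(logits.shape)  # torch.Size([4, 9])
```

One design point this sketch reflects from the abstract: because a small 3D kernel is shared across both the spatial and spectral dimensions, the parameter count stays far below that of a 2D-CNN that treats all bands as input channels, which is consistent with the claimed lighter, easier-to-train model.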
ISSN: 2072-4292
DOI: 10.3390/rs9010067