Fast cropping method for proper input size of convolutional neural networks in underwater photography
Published in: Journal of the Society for Information Display, Vol. 28, No. 11, pp. 872-881
Main Authors: , ,
Format: Journal Article
Language: English
Published: Campbell: Wiley Subscription Services, Inc, 01-11-2020
Summary: The convolutional neural network (CNN) is widely used in object detection and classification and shows promising results. However, a CNN has the limitation of a fixed input size. If the input image size of the CNN differs from the image size of the system to which the CNN is applied, additional processes, such as cropping, warping, or padding, are necessary. These processes take extra time, so fast cropping methods are required for systems that need real-time processing. The system to which our CNN model will be applied is intended to classify fish species in real time, using cameras installed in a shallow stream. Therefore, in this paper, we propose a straightforward real-time image cropping method that quickly crops frames to the proper input size of a CNN. In the experiments, we evaluate the proposed method using several CNNs (AlexNet, VGG 16, VGG 9, and GoogLeNet).
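The abstract does not describe the cropping procedure itself, so the sketch below only illustrates the general idea: taking a fixed-size crop from a camera frame so it matches a CNN's expected input resolution without resizing. The helper name `crop_to_cnn_input`, the 224x224 target size, and the use of a region-of-interest centre are assumptions for illustration, not details from the paper.

```python
import numpy as np

def crop_to_cnn_input(frame, size=224, center=None):
    """Return a size x size crop of `frame` (H, W, C) around `center`.

    Uses only NumPy slicing, so the cost is negligible compared with
    resizing or warping; `center` defaults to the image centre.
    """
    h, w = frame.shape[:2]
    cy, cx = center if center is not None else (h // 2, w // 2)
    # Clamp the crop window so it stays inside the frame.
    top = min(max(cy - size // 2, 0), max(h - size, 0))
    left = min(max(cx - size // 2, 0), max(w - size, 0))
    crop = frame[top:top + size, left:left + size]
    # If the frame is smaller than the CNN input, zero-pad instead.
    if crop.shape[0] < size or crop.shape[1] < size:
        pad_h = size - crop.shape[0]
        pad_w = size - crop.shape[1]
        crop = np.pad(crop, ((0, pad_h), (0, pad_w), (0, 0)), mode="constant")
    return crop

# Example: crop a 1080p underwater frame to a 224x224 patch around an
# assumed region of interest before passing it to the classifier.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
patch = crop_to_cnn_input(frame, size=224, center=(540, 960))
assert patch.shape == (224, 224, 3)
```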
ISSN: 1071-0922, 1938-3657
DOI: 10.1002/jsid.911