Improved Convolutional Neural Network Based on Fast Exponentially Linear Unit Activation Function


Bibliographic Details
Published in:IEEE Access, Vol. 7, pp. 151359-151367
Main Authors: Qiumei, Zheng, Dan, Tan, Fenghua, Wang
Format: Journal Article
Language:English
Published: Piscataway IEEE 2019
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Description
Summary:Activation functions play an increasingly important role in deep convolutional neural networks. Traditional activation functions suffer from problems such as vanishing gradients, neuron death, and output offset. To address these problems, we propose a new activation function, the Fast Exponentially Linear Unit (FELU), which speeds up exponential linear calculations and reduces network running time. FELU combines the advantages of the Rectified Linear Unit (ReLU) and the Exponential Linear Unit (ELU), yielding better classification accuracy and faster computation. We compare five existing activation functions (ReLU, ELU, SLU, MPELU, and TReLU) with our new activation function on the CIFAR-10, CIFAR-100, and GTSRB data sets. Experiments show that FELU not only speeds up the exponential calculation, reducing the running time of the convolutional neural network, but also enhances the network's noise robustness, improving classification accuracy.
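The record does not include FELU's exact formula, but the summary's claim (ELU-like behavior with a faster exponential) suggests the following sketch: replace ELU's natural exponential e^x with the mathematically equivalent base-2 form 2^(x/ln 2), since base-2 exponentiation is cheap on most hardware. The function names and the alpha parameter below are illustrative assumptions, not the paper's definitive implementation.

```python
import numpy as np

def relu(x):
    # ReLU: max(x, 0); simple but "dead" for all negative inputs
    return np.maximum(x, 0.0)

def elu(x, alpha=1.0):
    # ELU: identity for x >= 0, alpha * (e^x - 1) for x < 0
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def felu(x, alpha=1.0):
    # FELU-like sketch (assumption): same shape as ELU, but the
    # exponential is computed as 2^(x / ln 2), which equals e^x
    # exactly while mapping onto fast base-2 exponentiation
    return np.where(x >= 0, x, alpha * (np.exp2(x / np.log(2.0)) - 1.0))
```

Because 2^(x/ln 2) = e^x, this variant matches ELU's output for any input; the intended benefit is purely computational, which is consistent with the summary's emphasis on reducing network running time.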
ISSN:2169-3536
DOI:10.1109/ACCESS.2019.2948112