Hybrid no-propagation learning for multilayer neural networks

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 321, pp. 28-35
Main Authors: Adhikari, Shyam Prasad, Yang, Changju, Slot, Krzysztof, Strzelecki, Michal, Kim, Hyongsuk
Format: Journal Article
Language: English
Published: Elsevier B.V., 10-12-2018
Description
Summary: A hybrid learning algorithm suitable for hardware implementation of multilayer neural networks is proposed. Although backpropagation is a powerful learning method for multilayer neural networks, its hardware implementation is difficult because of the complexity of the neural synapses and the operations involved in propagating the error backward. We propose a learning algorithm whose performance is comparable to backpropagation but which is easier to implement in hardware for on-chip learning of multilayer neural networks. In the proposed algorithm, a multilayer neural network is trained with a hybrid of the gradient-based delta rule and a stochastic algorithm called Random Weight Change. The parameters of the output layer are learned using the delta rule, whereas the inner-layer parameters are learned using Random Weight Change, so the overall multilayer network is trained without error backpropagation. Experimental results are presented showing that the proposed hybrid learning rule performs better than either of its constituent learning algorithms and comparably to backpropagation on the benchmark MNIST dataset. A hardware architecture illustrating the ease of implementing the proposed learning rule in analog hardware, compared with the backpropagation algorithm, is also presented.
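
As a rough illustration of the split described in the abstract, the Python sketch below trains a tiny two-layer network on XOR, updating the output-layer weights with the gradient-based delta rule and the hidden-layer weights with a Random Weight Change (RWC) loop, so no error is propagated backward. The network size, XOR task, learning rate, perturbation magnitude, and the exact accept/re-randomize rule are illustrative assumptions, not values taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy two-layer network (2 inputs -> 4 hidden -> 1 output) on the XOR task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
W1 = rng.normal(scale=0.5, size=(4, 2))   # hidden layer, trained by Random Weight Change
W2 = rng.normal(scale=0.5, size=(1, 4))   # output layer, trained by the delta rule

eta = 0.5    # delta-rule learning rate (assumed)
dw = 0.05    # RWC perturbation magnitude (assumed)
dW1 = dw * rng.choice([-1.0, 1.0], size=W1.shape)   # current random perturbation

def forward(W1, W2):
    H = sigmoid(X @ W1.T)                 # hidden activations
    Y = sigmoid(H @ W2.T)                 # network output
    return H, Y, np.mean((T - Y) ** 2)    # mean squared error

_, _, prev_err = forward(W1, W2)
for step in range(20000):
    # Hidden layer (RWC): if the last perturbation lowered the error, repeat it;
    # otherwise draw a fresh random +/- dw perturbation. No gradient is needed.
    _, _, err = forward(W1, W2)
    if err >= prev_err:
        dW1 = dw * rng.choice([-1.0, 1.0], size=W1.shape)
    prev_err = err
    W1 = W1 + dW1

    # Output layer (delta rule): a local gradient step using only the output error,
    # so no error signal has to be propagated back through the network.
    H, Y, _ = forward(W1, W2)
    delta = (T - Y) * Y * (1.0 - Y)
    W2 = W2 + eta * (delta.T @ H) / len(X)

_, Y, err = forward(W1, W2)
print("final MSE:", round(err, 4))
print("outputs:", Y.ravel().round(2))

In this sketch the only per-layer information needed is the output-layer error and a scalar comparison of successive loss values, which is what makes the rule attractive for analog on-chip learning.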
ISSN: 0925-2312
1872-8286
DOI: 10.1016/j.neucom.2018.08.034