Further improved stability results for generalized neural networks with time-varying delays

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 367, pp. 308-318
Main Authors: Feng, Zongying; Shao, Hanyong; Shao, Lin
Format: Journal Article
Language: English
Published: Elsevier B.V., 20-11-2019
Subjects:
Description
Summary: This paper is concerned with a new Lyapunov–Krasovskii functional (LKF) approach to delay-dependent stability for generalized neural networks with time-varying delays (DNN). A new LKF is constructed by exploiting more information about the DNN: the state, the activation function, and their ramifications are introduced, and more cross terms of the activation function and its ramifications are included in the LKF. Moreover, the new LKF makes the best use of the characteristics of the activation function. On the other hand, when estimating the derivative of the LKF, we take advantage of equations and inequalities that reveal the relationship among the state, the activation function, and their ramifications, and employ advanced inequalities to handle the integrals arising from the derivative of the LKF, thus obtaining a tight upper bound on the derivative of the LKF. By checking the negative definiteness of this upper bound, which is a quadratic function of the time delay, a novel delay-dependent stability result is derived. Finally, three examples are given to illustrate that the stability result is less conservative than some recently reported ones.
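The final step of the summary reduces stability to checking that a matrix-valued quadratic function of the time delay remains negative definite over the admissible delay interval. The sketch below is a hypothetical numeric illustration of that kind of check, not the paper's actual conditions or matrices: it evaluates F(h) = h²·A2 + h·A1 + A0 on a grid of delays and tests that every sample is negative definite. The matrices A2, A1, A0 here are toy values chosen for illustration only.

```python
import numpy as np

def is_negative_definite(M):
    # A matrix is negative definite iff the eigenvalues of its
    # symmetric part are all strictly negative.
    S = (M + M.T) / 2.0
    return np.max(np.linalg.eigvalsh(S)) < 0

def quadratic_nd_on_interval(A2, A1, A0, h_max, samples=200):
    # Grid check of F(h) = h^2*A2 + h*A1 + A0 for h in [0, h_max].
    # (Delay-dependent results typically replace this sampling with
    # sufficient endpoint/LMI conditions; the grid is only a sketch.)
    for h in np.linspace(0.0, h_max, samples):
        if not is_negative_definite(h**2 * A2 + h * A1 + A0):
            return False
    return True

# Toy matrices (assumed for illustration) for which F(h) stays
# negative definite on the whole interval [0, 1].
A2 = np.array([[-0.1, 0.0], [0.0, -0.1]])
A1 = np.array([[ 0.2, 0.0], [0.0,  0.1]])
A0 = np.array([[-1.0, 0.1], [0.1, -1.0]])
print(quadratic_nd_on_interval(A2, A1, A0, 1.0))
```

In the paper's setting the negative definiteness of such a quadratic is certified analytically (e.g. via conditions at the interval endpoints), rather than by sampling as in this sketch.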
ISSN:0925-2312
1872-8286
DOI:10.1016/j.neucom.2019.07.019