A One-Layer Projection Neural Network for Nonsmooth Optimization Subject to Linear Equalities and Bound Constraints

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, Vol. 24, No. 5, pp. 812-824
Main Authors: Liu, Qingshan; Wang, Jun
Format: Journal Article
Language: English
Published: New York, NY: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01-05-2013
Description
Summary: This paper presents a one-layer projection neural network for solving nonsmooth optimization problems with generalized convex objective functions subject to linear equality and bound constraints. The proposed neural network is designed based on two projection operators: one for the linear equality constraints and one for the bound constraints. The objective function can be any nonsmooth function that is not restricted to be convex on the whole space, but is required to be convex (or at least pseudoconvex) on the set defined by the constraints. Compared with existing recurrent neural networks for nonsmooth optimization, the proposed model has no design parameters, which makes it more convenient to design and implement. It is proved that the output variables of the proposed neural network converge globally to the optimal solutions provided that the objective function is at least pseudoconvex. Simulation results on numerical examples are discussed to demonstrate the effectiveness and characteristics of the proposed neural network.
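
The record does not reproduce the network equations, but the two projection operators mentioned in the summary have standard closed forms: clipping for the bound (box) constraints and an affine projection for the linear equalities. The following is a minimal numerical sketch, assuming a simple Euler-discretized subgradient step followed by the two projections; the function names, step size, and example problem are illustrative assumptions, not the paper's exact model.

# Illustrative sketch (not the paper's exact dynamics): the two projection
# operators referred to in the abstract, combined with a discretized
# subgradient-projection iteration for a nonsmooth objective.
import numpy as np

def project_box(x, lower, upper):
    """Projection onto the bound constraints lower <= x <= upper."""
    return np.clip(x, lower, upper)

def project_affine(x, A, b):
    """Projection onto the affine set {x : A x = b} (A assumed full row rank):
    x - A^T (A A^T)^{-1} (A x - b)."""
    correction = A.T @ np.linalg.solve(A @ A.T, A @ x - b)
    return x - correction

def projection_iteration(subgrad, A, b, lower, upper, x0, step=1e-2, iters=5000):
    """Hypothetical stand-in for the continuous-time network: take a
    subgradient step, then apply the two projections in sequence. The
    equality is enforced exactly; the bounds only approximately between
    iterations, which is enough for this illustrative example."""
    x = project_affine(project_box(x0, lower, upper), A, b)
    for _ in range(iters):
        x = x - step * subgrad(x)
        x = project_box(x, lower, upper)
        x = project_affine(x, A, b)
    return x

# Example (assumed, not from the paper): minimize the nonsmooth convex
# objective f(x) = |x1 - 1| + |x2 + 2| + x3**2
# subject to x1 + x2 + x3 = 1 and -2 <= xi <= 2.
if __name__ == "__main__":
    A = np.array([[1.0, 1.0, 1.0]])
    b = np.array([1.0])
    lower, upper = -2.0, 2.0

    def subgrad(x):
        # One valid subgradient of f at x.
        return np.array([np.sign(x[0] - 1.0), np.sign(x[1] + 2.0), 2.0 * x[2]])

    x_star = projection_iteration(subgrad, A, b, lower, upper, x0=np.zeros(3))
    print("approximate minimizer:", x_star)

In this sketch the iterate oscillates within roughly one step size of a minimizer because the step is fixed; the paper's continuous-time network avoids such tuning, which is the "no design parameter" property highlighted in the summary.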
ISSN: 2162-237X
2162-2388
DOI: 10.1109/TNNLS.2013.2244908