Exponential stability of a neural network for bound-constrained quadratic optimisation

Abdesselam Bouzerdoum*, Tim R. Pattison

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)

Abstract

A recurrent neural network is presented which performs quadratic optimisation subject to bound constraints on each of the optimisation variables. The optimisation strategy employed by the neural network falls into the general class of gradient methods for constrained nonlinear optimisation, and is compared briefly with the strategies employed by conventional techniques for bound-constrained quadratic optimisation. Conditions on the quadratic problem and the network parameters are established under which exponential asymptotic stability is achieved. These conditions are shown to provide a tighter bound on the degree of exponential stability than that previously established for this network. Through suitable choice of the network parameters, the system of differential equations governing the network activations is preconditioned in order to reduce its sensitivity to noise and roundoff errors and to accelerate convergence.
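The general class of gradient methods the abstract refers to can be illustrated, in discretised form, by projected gradient descent on a bound-constrained quadratic programme: take a step along the negative gradient of the quadratic cost, then project back onto the box defined by the bounds. This is a minimal sketch of that generic strategy, not the paper's network dynamics; the matrices `Q`, `c` and the bounds `l`, `u` are hypothetical example data.

```python
import numpy as np

# Bound-constrained QP: minimise 0.5 * x^T Q x + c^T x  subject to  l <= x <= u.
# All problem data here is illustrative, not taken from the paper.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # symmetric positive definite
c = np.array([-1.0, -2.0])
l = np.array([0.0, 0.0])            # lower bounds
u = np.array([0.4, 0.4])            # upper bounds

x = np.zeros(2)                     # feasible starting point
eta = 0.1                           # step size; must satisfy eta < 2 / lambda_max(Q)
for _ in range(500):
    grad = Q @ x + c                # gradient of the quadratic cost
    x = np.clip(x - eta * grad, l, u)   # gradient step, then projection onto the box

print(x)                            # approximate constrained minimiser
```

For this example the unconstrained minimiser (1/11, 7/11) violates the upper bound on the second variable, so the iteration converges to a point with that variable clamped at 0.4. The continuous-time network in the paper can be viewed as the limit of such a scheme as the step size shrinks, with the preconditioning discussed in the abstract corresponding to a rescaling of the dynamics that improves conditioning and hence convergence speed.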

Original language: English
Pages: 918-923
Number of pages: 6
Publication status: Published - 1994
Externally published: Yes
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: 27 Jun 1994 - 29 Jun 1994

Conference

Conference: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7)
City: Orlando, FL, USA
Period: 27/06/94 - 29/06/94
