Acceleration of back propagation through initial weight pre-training with delta rule

Gang Li*, Hussein Alnuweiri, Yuejian Wu, Hongbing Li

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

39 Citations (Scopus)

Abstract

A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the new strategy is to pre-train the initial weights of a BP network using the Delta rule, instead of starting from random values. After pre-training, the normal BP training procedure is carried out to complete network training. With DPT, the convergence rate of BP network training can be significantly improved. With regard to on-chip learning in VLSI implementations, only a small amount of additional circuitry is required for the pre-training phase with DPT. Simulation results using the proposed training method show its superiority over previous methods.
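The abstract does not spell out the pre-training procedure in detail. One plausible reading is that the single-layer Delta rule is applied to a network's output weights (treating the hidden activations as fixed inputs) before standard BP takes over. The NumPy sketch below illustrates that assumed two-phase scheme on a toy XOR task; the architecture, learning rate, and epoch counts are illustrative choices, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: XOR
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Random initial weights for a 2-3-1 network (sizes are illustrative)
W1 = rng.normal(scale=0.5, size=(2, 3))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden -> output

mse_init = np.mean((T - sigmoid(sigmoid(X @ W1) @ W2)) ** 2)

eta = 0.5  # learning rate (assumed value)

# --- Phase 1: Delta-rule pre-training (assumed interpretation of DPT) ---
# Update only the output weights with the single-layer Delta rule,
# treating the hidden activations produced by the random W1 as inputs.
for _ in range(50):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    delta = (T - Y) * Y * (1 - Y)      # Delta-rule error term
    W2 += eta * H.T @ delta

# --- Phase 2: normal BP training from the pre-trained weights ---
for _ in range(2000):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    d2 = (T - Y) * Y * (1 - Y)         # output-layer delta
    d1 = (d2 @ W2.T) * H * (1 - H)     # back-propagated hidden delta
    W2 += eta * H.T @ d2
    W1 += eta * X.T @ d1

Y_final = sigmoid(sigmoid(X @ W1) @ W2)
mse_final = np.mean((T - Y_final) ** 2)
```

The claimed benefit of DPT is that Phase 1 is cheap (no backward pass through hidden layers, hence little extra VLSI circuitry) yet leaves BP starting from output weights already adapted to the data rather than from random values.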

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks
Editors: Anon
Publisher: Publ by IEEE
Pages: 580-585
Number of pages: 6
ISBN (Print): 0780312007
Publication status: Published - 1993
Externally published: Yes
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, CA, USA
Duration: 28 Mar 1993 - 1 Apr 1993

Publication series

Name: 1993 IEEE International Conference on Neural Networks

Conference

Conference: 1993 IEEE International Conference on Neural Networks
City: San Francisco, CA, USA
Period: 28/03/93 - 01/04/93
