Sensitivity and learning of two digital artificial neural network structures

V. E. DeBrunner*, S. C. Li, S. Lewandowsky

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding > Conference Contribution (Conference Proceeding)

2 Citations (Scopus)

Abstract

We extend the analysis of parameter sensitivity and interdependence to two digital artificial neural network structures: backpropagation and ALCOVE. We compare the two networks and show, more generally, that a highly sensitive weight contributes more to the network's prediction than an insensitive parameter does. This suggests that the information structure of an input pattern can be determined by examining the sensitivity of the interconnection weights, which has ramifications for network design. Additionally, results from a separate set of simulations indicate that information about weight sensitivity and interdependence is predictive of the learning behavior of the networks.
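As a concrete illustration of the kind of weight sensitivity the abstract refers to (the paper's exact measure is not reproduced here), the sketch below estimates, by central finite differences, how strongly each hidden-to-output weight of a small feedforward network influences the network's output. The network architecture, its random weights, and the finite-difference scheme are all illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 feedforward network with fixed random weights, standing in
# for a trained backpropagation network (illustrative assumption).
W1 = rng.normal(size=(4, 2))   # input -> hidden weights
W2 = rng.normal(size=(1, 4))   # hidden -> output weights

def forward(x, W2=W2):
    h = np.tanh(W1 @ x)        # hidden-layer activations
    return float(W2 @ h)       # single linear output unit

def weight_sensitivity(x, eps=1e-4):
    """Estimate |d output / d w| for each hidden-to-output weight
    via a central finite difference."""
    sens = np.zeros(W2.shape[1])
    for j in range(W2.shape[1]):
        Wp, Wm = W2.copy(), W2.copy()
        Wp[0, j] += eps
        Wm[0, j] -= eps
        sens[j] = abs(forward(x, Wp) - forward(x, Wm)) / (2 * eps)
    return sens

x = np.array([0.5, -1.0])
s = weight_sensitivity(x)
print(s)
```

Because the output unit here is linear in `W2`, the sensitivity of each hidden-to-output weight reduces to the magnitude of the corresponding hidden activation, so ranking weights by sensitivity also ranks hidden units by how much of the input pattern's structure they carry, which is the flavor of the design implication the abstract draws.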

Original language: English
Title of host publication: Proceedings - IEEE International Symposium on Circuits and Systems
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 445-448
Number of pages: 4
Volume: 3
Publication status: Published - 1996
Event: Proceedings of the 1996 IEEE International Symposium on Circuits and Systems, ISCAS. Part 1 (of 4) - Atlanta, GA, USA
Duration: 12 May 1996 - 15 May 1996

Conference

Conference: Proceedings of the 1996 IEEE International Symposium on Circuits and Systems, ISCAS. Part 1 (of 4)
City: Atlanta, GA, USA
Period: 12/05/96 - 15/05/96

Structured keywords

  • Memory
