The learning properties of a universal approximator, a normalized committee machine with adjustable biases, are studied for on-line back-propagation learning. Within a statistical mechanics framework, numerical studies show that this model exhibits features absent from previously studied two-layer network models without adjustable biases, for example, attractive suboptimal symmetric phases, even for realizable tasks and noiseless data.
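The model and learning rule in the abstract can be sketched as follows: a normalized soft committee machine whose hidden units have adjustable biases, trained by on-line (one-example-at-a-time) gradient descent on a squared error. This is a minimal illustrative sketch, not the paper's exact setup; the tanh activation, the learning rate, and all names here are assumptions for illustration (the paper's model may use a different activation and normalization).

```python
import numpy as np

class CommitteeMachine:
    """Normalized soft committee machine with adjustable hidden biases.

    Output is the average of K hidden-unit activations:
        f(x) = (1/K) * sum_k tanh(w_k . x + b_k)
    tanh is an illustrative stand-in for the paper's sigmoidal activation.
    """

    def __init__(self, n_in, k_hidden, rng):
        # Small random weights; biases start at zero but are trainable.
        self.w = rng.normal(scale=1.0 / np.sqrt(n_in), size=(k_hidden, n_in))
        self.b = np.zeros(k_hidden)

    def forward(self, x):
        self.h = np.tanh(self.w @ x + self.b)  # hidden activations (cached)
        return self.h.mean()

    def online_step(self, x, y, lr=0.1):
        """One on-line back-propagation update from a single example (x, y)."""
        out = self.forward(x)
        err = out - y
        # Backpropagate squared error through the normalized sum and tanh.
        dh = err * (1.0 - self.h ** 2) / len(self.b)
        self.w -= lr * np.outer(dh, x)
        self.b -= lr * dh          # biases are adjusted alongside the weights
        return 0.5 * err ** 2

# Teacher-student scenario: the student learns noiseless data generated
# by a teacher of the same architecture (a realizable task).
rng = np.random.default_rng(0)
teacher = CommitteeMachine(n_in=10, k_hidden=2, rng=rng)
teacher.b = rng.normal(size=2)     # give the teacher nontrivial biases
student = CommitteeMachine(n_in=10, k_hidden=2, rng=rng)

losses = []
for _ in range(3000):
    x = rng.normal(size=10)
    losses.append(student.online_step(x, teacher.forward(x)))
```

In this on-line setting each example is used once and discarded, which is what makes the statistical mechanics analysis of the learning dynamics (including plateaus at symmetric fixed points, where hidden units remain interchangeable) tractable.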
Number of pages: 7
Journal: Advances in Neural Information Processing Systems
Publication status: Published - 1 May 1997
Bibliographical note: Copyright of the Massachusetts Institute of Technology Press (MIT Press)
Keywords: approximator, back-propagation, symmetric phases, realizable cases, noiseless data