In the past decade, the methods of statistical mechanics have been successfully applied to the theory of neural networks. This workshop reviewed the current status of the statistical physics of neural networks and examined prospects for the future. Recent interdisciplinary developments involving related fields such as computational learning theory, statistics, information theory, and nonlinear dynamics were also presented. These proceedings include contributions by S Amari, H Sompolinsky, Y LeCun, I Kanter, D Haussler, H S Seung, M Opper, M Kearns and M Biehl.
Contents:
- Learning Curves:
  - Statistical Theory of Learning Curves (S Amari et al.)
  - Generalization in Two-Layer Neural Networks (J-H Oh et al.)
- Dynamics:
  - On-Line Learning of Dichotomies: Algorithms and Learning Curves (H Sompolinsky et al.)
  - The Bit-Generator and Time-Series Prediction (E Eisenstein et al.)
- Associative Memory and Other Topics:
  - The Cavity Method: Applications to Learning and Retrieval in Neural Networks (K Y M Wong)
  - Storage Capacity of a Fully Connected Committee Machine (C Kwon et al.)
- Applications:
  - Learning Algorithms for Classification: A Comparison on Handwritten Digit Recognition (Y LeCun et al.)
  - On the Consequences of the Statistical Mechanics Theory of Learning Curves for the Model Selection Problem (M J Kearns)
- and other papers
Readership: Graduate students and researchers in physics and neural networks.