Parallel Implementations of Backpropagation Neural Networks on Transputers: A Study of Training Set Parallelism

This book presents a systematic approach to the parallel implementation of feedforward neural networks on an array of transputers. The emphasis is on backpropagation learning and training set parallelism. Through systematic analysis, a theoretical model of the parallel implementation is developed and used to find the mapping that minimizes the training time of large backpropagation neural networks. The model is validated experimentally on several well-known benchmark problems. The use of genetic algorithms to optimize the performance of the parallel implementations is described, and guidelines for efficient parallel implementation are highlighted.
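As an illustration of the training set parallelism studied in the book, the sketch below shows the scheme in miniature using NumPy rather than the book's transputer implementation: the training patterns are split among simulated workers, each worker computes the backpropagation gradients for its subset, and the partial gradients are summed before a single batch weight update. The two-layer tanh network, the learning rate, and the worker count are illustrative assumptions, not the book's parameters.

```python
# Minimal sketch of training set parallelism for batch backpropagation.
# Workers are simulated sequentially; on a real transputer array each slice
# of patterns would be processed on a separate processor and the partial
# gradients gathered by a master before the weight update.
import numpy as np

def forward_backward(W1, W2, X, T):
    """Gradients and squared error for one subset of training patterns."""
    H = np.tanh(X @ W1)                  # hidden-layer activations
    Y = np.tanh(H @ W2)                  # output-layer activations
    dY = (Y - T) * (1.0 - Y ** 2)        # output-layer deltas
    dH = (dY @ W2.T) * (1.0 - H ** 2)    # hidden-layer deltas
    gW1 = X.T @ dH                       # gradient w.r.t. first weight layer
    gW2 = H.T @ dY                       # gradient w.r.t. second weight layer
    err = 0.5 * np.sum((Y - T) ** 2)
    return gW1, gW2, err

def train_epoch_parallel(W1, W2, X, T, n_workers=4, lr=0.1):
    """One epoch: equal distribution of patterns among workers, gradients summed."""
    gW1 = np.zeros_like(W1)
    gW2 = np.zeros_like(W2)
    total_err = 0.0
    for Xs, Ts in zip(np.array_split(X, n_workers), np.array_split(T, n_workers)):
        g1, g2, err = forward_backward(W1, W2, Xs, Ts)  # work done by one worker
        gW1 += g1; gW2 += g2; total_err += err          # accumulation by the master
    W1 -= lr * gW1                                      # single batch weight update
    W2 -= lr * gW2
    return W1, W2, total_err

if __name__ == "__main__":
    # Toy usage: XOR with tanh-scaled targets, two simulated workers.
    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float) * 2 - 1
    W1 = rng.normal(scale=0.5, size=(2, 4))
    W2 = rng.normal(scale=0.5, size=(4, 1))
    for _ in range(2000):
        W1, W2, err = train_epoch_parallel(W1, W2, X, T, n_workers=2, lr=0.1)
    print("final squared error:", err)
```

Because the gradients from all pattern subsets are summed before the update, the result of one epoch is identical to sequential batch backpropagation; the interesting questions, which the book's theoretical model addresses, are how to distribute the patterns and organize communication so that the wall-clock training time is minimized.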


Contents:
  • Introduction
  • Transputer Topologies for Parallel Implementation
  • Development of a Theoretical Model for Training Set Parallelism in a Homogeneous Array of Transputers
  • Equal Distribution of Patterns Amongst a Homogeneous Array of Transputers
  • Optimization Model for Unequal Distribution of Patterns in a Homogeneous Array of Transputers
  • Pattern Allocation Schemes Using Genetic Algorithm
  • Bibliography
  • Index

Readership: Graduate students, research scientists and practising engineers in artificial neural networks and parallel computing.