Speeding Up Backpropagation Algorithms by Using Cross-Entropy Combined with Pattern Normalization

https://doi.org/10.1142/S0218488598000100
Cited by: 15 (Source: Crossref)

This paper demonstrates how the backpropagation algorithm (BP) and its variants can be accelerated significantly while the quality of the trained nets increases. Two modifications are proposed: first, the cross entropy is used as the error function instead of the usual quadratic error, and second, the input patterns are normalized. The first modification eliminates the so-called sigmoid prime factor from the update rule for the output units. The normalization serves to balance the dynamic range of the inputs. The combination of both modifications is called CEN–Optimization (Cross Entropy combined with Pattern Normalization). As our simulation results show, CEN–Optimization improves not only online BP but also RPROP, the most sophisticated BP variant known today. Even though RPROP usually yields much better results than online BP, the performance gap between CEN–BP and CEN–RPROP is smaller than between the standard versions of those algorithms. With CEN–RPROP, an error of zero (with respect to the training set) is nearly guaranteed. Simultaneously, the generalization performance of the trained nets can be increased, because less complex networks suffice to fit the training set. Compared to the usual SSE (summed squared error), lower training errors can be achieved with fewer weights.
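
A minimal sketch of why the cross-entropy error removes the sigmoid prime factor, assuming a single logistic output unit with target t and output y = σ(net) (notation chosen here for illustration; the paper's own symbols may differ):

    % Quadratic (SSE) error: the sigmoid prime factor sigma'(net) = y(1 - y)
    % remains in the output-unit delta and vanishes when y saturates near 0 or 1.
    E_{\mathrm{SSE}} = \tfrac{1}{2}(t - y)^2,
    \qquad
    \frac{\partial E_{\mathrm{SSE}}}{\partial \mathit{net}}
      = (y - t)\,\sigma'(\mathit{net}) = (y - t)\,y\,(1 - y).

    % Cross-entropy error: the factor y(1 - y) cancels against sigma'(net),
    % so the output-unit delta is simply the difference (y - t).
    E_{\mathrm{CE}} = -\,t \ln y - (1 - t)\ln(1 - y),
    \qquad
    \frac{\partial E_{\mathrm{CE}}}{\partial \mathit{net}}
      = \frac{y - t}{y\,(1 - y)}\,\sigma'(\mathit{net}) = y - t.

Because the delta no longer contains y(1 - y), the output-layer gradient does not collapse for saturated units, which is one plausible reason for the faster convergence reported here.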