
  • article (No Access)

    FRACTAL CHARACTERIZATION OF BPN WEIGHTS EVOLUTION

    The training methodology of the Back Propagation Network (BPN) is well documented. One aspect of the BPN that still requires investigation is whether the network will train successfully for a given training data set and architecture. In this paper, the behavior of the BPN is analyzed during its training phase for both convergent and divergent training data sets. The evolution of the weights was monitored during training for the purpose of analysis, plotted as a return map, and characterized by its fractal dimension. This fractal dimensional analysis of the weight-evolution trajectories provides new insight into the behavior of the BPN and the dynamics of weight evolution.
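
    The abstract does not give the estimator used, but a standard way to assign a fractal dimension to a return map is box counting. Below is a minimal sketch: the function name, the box scales, and the stand-in logistic-map weight trajectory are all illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    def box_counting_dimension(points, scales=(2, 4, 8, 16, 32, 64)):
        """Estimate the box-counting (fractal) dimension of a 2-D point set
        by counting occupied grid cells at several resolutions and fitting
        log(count) against log(1/box_size)."""
        pts = np.asarray(points, dtype=float)
        # Normalize the point cloud into the unit square.
        mins, maxs = pts.min(axis=0), pts.max(axis=0)
        span = np.where(maxs - mins > 0, maxs - mins, 1.0)
        unit = (pts - mins) / span

        log_inv_eps, log_counts = [], []
        for n in scales:                      # n boxes per axis, box size 1/n
            cells = np.floor(unit * n).astype(int)
            cells = np.clip(cells, 0, n - 1)  # points on the upper edge
            occupied = len({tuple(c) for c in cells})
            log_inv_eps.append(np.log(n))
            log_counts.append(np.log(occupied))

        slope, _ = np.polyfit(log_inv_eps, log_counts, 1)
        return slope

    # Illustrative use: a return map (w_t, w_{t+1}) built from a recorded
    # weight trajectory `w` (here a stand-in chaotic logistic sequence).
    w = np.empty(5000)
    w[0] = 0.3
    for t in range(4999):
        w[t + 1] = 4.0 * w[t] * (1.0 - w[t])
    return_map = np.column_stack([w[:-1], w[1:]])
    print(f"estimated fractal dimension: {box_counting_dimension(return_map):.3f}")
    ```

    In practice one would record each network weight at every training epoch and apply the same estimator to its (w_t, w_{t+1}) return map.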

  • article (No Access)

    A NOVEL CONNECTIONIST FRAMEWORK FOR COMPUTATION OF AN APPROXIMATE CONVEX-HULL OF A SET OF PLANAR POINTS, CIRCLES AND ELLIPSES

    We propose a two-layer neural network for computing an approximate convex hull of a set of points or a set of circles/ellipses of different sizes. The algorithm is based on an elegant concept: shrinking a rubber band surrounding the set of planar objects. Logically, a set of neurons is placed on a circle (the rubber band) surrounding the objects. Each neuron has a parameter vector associated with it, which may be viewed as the neuron's current position. The given set of points/objects exerts a force of attraction on every neuron, which determines how its current position is updated (as if the force determined the direction of movement of the neuron lying on the rubber band). As the network evolves, the neurons (parameter vectors) approximate the convex hull more and more accurately. The scheme can be applied to find the convex hull of a planar set of circles, ellipses, or a mixture of the two. Some properties related to the evolution of the algorithm are also presented.
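
    A minimal sketch of the rubber-band idea for planar points follows. The force law used here, attraction toward the data point most extreme along each neuron's outward direction, is an illustrative stand-in for the paper's actual update rule, and the neuron count, learning rate, and iteration budget are assumed parameters.

    ```python
    import numpy as np

    def approximate_hull(points, n_neurons=32, n_iters=200, lr=0.1):
        """Rubber-band sketch: neurons start on a circle enclosing the point
        set and shrink inward, settling near convex-hull vertices."""
        pts = np.asarray(points, dtype=float)
        center = pts.mean(axis=0)
        radius = 1.5 * np.max(np.linalg.norm(pts - center, axis=1))
        angles = np.linspace(0.0, 2.0 * np.pi, n_neurons, endpoint=False)
        directions = np.column_stack([np.cos(angles), np.sin(angles)])
        # Initial neuron positions: a circle (the "rubber band") around the set.
        neurons = center + radius * directions

        for _ in range(n_iters):
            for i in range(n_neurons):
                # Assumed attraction: the support point of the set along this
                # neuron's outward direction (always a convex-hull vertex).
                target = pts[np.argmax(pts @ directions[i])]
                neurons[i] += lr * (target - neurons[i])  # move toward it
        return neurons

    rng = np.random.default_rng(0)
    cloud = rng.normal(size=(200, 2))
    band = approximate_hull(cloud)
    print(band[:5])  # neurons now lie near convex-hull vertices of the cloud
    ```

    Connecting the final neuron positions in angular order yields the approximate hull polygon; more neurons give a finer approximation.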

  • article (No Access)

    FASTER TRAINING USING FUSION OF ACTIVATION FUNCTIONS FOR FEED FORWARD NEURAL NETWORKS

    Multilayer feed-forward neural networks are widely used and are typically trained by minimizing an error function. Back propagation (BP) is a well-known training method for multilayer networks, but it often suffers from slow convergence. To make learning faster, we propose 'Fusion of Activation Functions' (FAF), in which different conventional activation functions (AFs) are combined to compute the final activation. This approach has not yet been studied extensively. One of the sub-goals of the paper is to examine the role of linear AFs in the combination. We investigate whether FAF can speed up learning. The validity of the proposed method is examined through simulations on nine challenging real benchmark classification and time-series prediction problems. FAF has been applied to the 2-bit, 3-bit and 4-bit parity, Breast Cancer, Diabetes, Heart Disease, Iris, Wine, Glass and Soybean classification problems. The algorithm is also tested on the Mackey-Glass chaotic time-series prediction problem. It is shown to work better than the individual AFs used in BP, such as the sigmoid (SIG), arctangent (ATAN) and logarithmic (LOG) functions.
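
    The abstract names the constituent AFs but not the exact combination rule, so the sketch below assumes a simple weighted sum of the sigmoid, arctangent, and logarithmic activations; the fixed mixing weights and the odd-symmetric form of the logarithmic AF are illustrative assumptions.

    ```python
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def atan_act(x):
        return np.arctan(x)

    def log_act(x):
        # A common "logarithmic" activation: odd-symmetric log compression.
        return np.sign(x) * np.log1p(np.abs(x))

    def fused_activation(x, weights=(0.4, 0.3, 0.3)):
        """FAF sketch: a weighted combination of sigmoid, arctangent, and
        logarithmic activations. The mixing weights are illustrative; in
        training they could be fixed or learned per neuron."""
        w_sig, w_atan, w_log = weights
        return w_sig * sigmoid(x) + w_atan * atan_act(x) + w_log * log_act(x)

    x = np.linspace(-3.0, 3.0, 7)
    print(fused_activation(x))
    ```

    Replacing each hidden neuron's single AF with such a fusion is the kind of change the paper evaluates against SIG, ATAN, and LOG used individually.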