The storage capacity of the extremely diluted Hopfield Model is studied using Monte Carlo techniques. In this work, instead of diluting the synapses according to a given distribution, the dilution is obtained systematically by retaining only the synapses with dominant contributions. With this dilution method, the critical storage capacity of the system increases with decreasing number of synapses per neuron, almost reaching the value obtained from mean-field calculations. It is also shown that the increase in the storage capacity of the diluted system depends on the storage capacity of the fully connected Hopfield Model and on the fraction of diluted synapses.
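A minimal sketch of this kind of dilution (the sizes, the keep fraction, and the symmetric treatment of couplings are illustrative assumptions, not details from the paper): build the Hebbian couplings of the fully connected model, then zero out all but the largest-magnitude fraction of them.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                         # illustrative sizes (assumption)
xi = rng.choice([-1, 1], size=(P, N))  # P random binary patterns

# Hebbian couplings of the fully connected Hopfield model
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)

def dilute_dominant(J, keep_fraction):
    """Retain only the synapses with dominant contributions: keep the
    largest-|J_ij| fraction of the upper triangle, mirrored so the
    diluted coupling matrix stays symmetric."""
    N = J.shape[0]
    iu = np.triu_indices(N, k=1)
    vals = np.abs(J[iu])
    k = max(1, int(keep_fraction * vals.size))
    keep = np.argpartition(vals, vals.size - k)[-k:]  # k largest |J| couplings
    Jd = np.zeros_like(J)
    rows, cols = iu[0][keep], iu[1][keep]
    Jd[rows, cols] = J[rows, cols]
    Jd[cols, rows] = J[cols, rows]
    return Jd

Jd = dilute_dominant(J, keep_fraction=0.2)  # retain 20% of the synapses
```

Selecting by rank rather than by a magnitude threshold keeps exactly the prescribed fraction even when many couplings share the same value, which happens often since Hebbian couplings take discrete values.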
In this work, the Hopfield neural network model with infinite-range interactions is simulated using the multicanonical algorithm. All simulations and measurements are done in the spin glass states of the model, with discrete ±1 values of the random variables. Physical quantities such as the energy density, the ground-state entropy and the order parameters are evaluated at all temperatures. Our results in the spin glass region show multiple degenerate ground states and good agreement with the replica-symmetric mean-field solutions.
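At sizes far smaller than those reachable by multicanonical sampling, ground-state degeneracy can be checked directly by exhaustive enumeration. The sketch below (the sizes and the load P/N are illustrative assumptions) enumerates all 2^N states of a small Hopfield spin glass and collects those at the minimum energy; since E(s) = E(-s), ground states always come in at least one ± pair.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(3)
N, P = 10, 7                           # small system; load P/N = 0.7 (spin glass regime)
xi = rng.choice([-1, 1], size=(P, N))  # discrete +-1 random patterns
J = (xi.T @ xi).astype(float) / N      # Hebbian couplings
np.fill_diagonal(J, 0.0)

def energy_density(s):
    """Hopfield energy per spin, E/N = -(1/2N) s.J.s"""
    return -0.5 * (s @ J @ s) / N

# Exhaustive enumeration of all 2^N spin configurations
states = np.array(list(product([-1, 1], repeat=N)))
E = np.array([energy_density(s) for s in states])
ground = states[np.isclose(E, E.min())]  # all degenerate ground states
```

The spin-flip symmetry guarantees an even count of ground states; any degeneracy beyond a single ± pair reflects the frustrated, spin-glass structure of the couplings.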
We study numerically the nature of the retrieval attractors in an asymmetrically diluted Hopfield neural network through the damage propagation technique. We consider the damage evolution of two replicas, initially very close to each other and both having a finite projection onto one memorized pattern. By analyzing the asymptotic behavior of the damage, we characterize the dynamical nature of the retrieval attractors. In the recognition phase, we find a dynamical phase transition separating chaotic and fixed-point retrieval trajectories. We also introduce a conjugate field h associated with the damage, which destroys the dynamical transition and whose corresponding susceptibility presents a sharp peak at the critical parameter.
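A hedged sketch of the damage propagation measurement (the sizes, connectivity, and zero-temperature parallel dynamics are assumptions for illustration, not the paper's exact setup): two replicas with identical couplings start a small Hamming distance apart, both overlapping one stored pattern, and the distance between them is tracked under identical update rules.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, c = 300, 15, 0.5                 # illustrative sizes; c = connectivity (assumptions)
xi = rng.choice([-1, 1], size=(P, N))

# Asymmetrically diluted Hebbian couplings: each directed synapse kept with prob. c,
# so J_ij and J_ji are cut independently of each other
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)
J *= rng.random((N, N)) < c

def damage_evolution(steps=50, n_flip=3):
    """Hamming distance ('damage') between two replicas evolving under
    identical zero-temperature parallel dynamics."""
    s1 = xi[0].copy()                  # replica 1 starts on pattern 0
    s2 = s1.copy()
    s2[rng.choice(N, size=n_flip, replace=False)] *= -1  # replica 2 slightly damaged
    dist = []
    for _ in range(steps):
        s1 = np.where(J @ s1 >= 0, 1, -1)
        s2 = np.where(J @ s2 >= 0, 1, -1)
        dist.append(np.mean(s1 != s2))
    return dist

d = damage_evolution()
```

If the asymptotic damage vanishes, both trajectories merge into the same fixed-point retrieval attractor; if it saturates at a finite value, the retrieval trajectory is chaotic.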
A Hopfield-type neural network has content-addressable memory which emerges from its collective properties. I reinvestigate the controversial question of its critical storage capacity at zero temperature. To locate the discontinuous transition from good to bad retrieval in infinite systems, the decreasing average quality of retrieved information is traced until it falls below a threshold. The cutoff points found for different system sizes are extrapolated to the infinite-size limit and yield αc = 0.143 ± 0.002.
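The retrieval-quality criterion can be sketched at finite size (the system size and sweep count below are illustrative assumptions): start zero-temperature dynamics on a stored pattern and measure the final overlap m with that pattern. Below the critical capacity the overlap stays close to 1; well above it, retrieval degrades sharply.

```python
import numpy as np

def retrieval_overlap(N, P, seed=0, sweeps=20):
    """Overlap m between a stored pattern and the state reached by
    zero-temperature asynchronous dynamics started at that pattern."""
    rng = np.random.default_rng(seed)
    xi = rng.choice([-1, 1], size=(P, N))
    J = (xi.T @ xi).astype(float) / N  # Hebbian couplings
    np.fill_diagonal(J, 0.0)
    s = xi[0].copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):   # asynchronous single-spin updates
            s[i] = 1 if J[i] @ s >= 0 else -1
    return abs(xi[0] @ s) / N

m_low  = retrieval_overlap(N=400, P=20)   # alpha = 0.05, well below alpha_c
m_high = retrieval_overlap(N=400, P=120)  # alpha = 0.30, well above alpha_c
```

Sweeping alpha = P/N on a grid, recording where m falls below a fixed threshold, and extrapolating those cutoffs in N is the finite-size procedure the abstract describes.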
A speaker-independent isolated word recognizer is proposed. It is obtained by concatenating a Bayesian neural network and a Hopfield time-alignment network. In this system, the Bayesian network outputs the a posteriori probability for each speech frame, and the Hopfield network is then concatenated for time warping. A proposed splitting Learning Vector Quantization (LVQ) algorithm, derived from the LBG clustering algorithm and the Kohonen LVQ algorithm, is first used to train the Bayesian network. The LVQ2 algorithm is subsequently adopted as a final refinement step. A continuous mixture of Gaussian densities for each frame and multiple templates for each word are employed to characterize each word pattern. Experimental evaluation of this system with four templates/word and five mixtures/frame, using databases of 53 speakers (28 male, 25 female) and isolated words (10 digits and 30 city names), gave average recognition accuracies of 97.3% for the speaker-trained mode and 95.7% for the speaker-independent mode. Comparisons with the K-means and DTW algorithms show that the integration of the splitting LVQ and LVQ2 algorithms makes this system well suited to speaker-independent isolated word recognition. A cookbook approach for determining the parameters of the Hopfield time-alignment network is also described.
An autoassociative memory is a device which accepts an input pattern and generates as output the stored pattern most closely associated with the input. In this paper, we propose an autoassociative memory cellular neural network, which consists of one-dimensional cells with spatial derivative inputs, thresholds and memories. Computer simulations show that it exhibits good performance in face recognition: the network can retrieve the whole from a part of a face image, and can reproduce a clear version of a face image from a noisy one. For human memory, research on "visual illusions" and on "brain-damaged visual perception", such as the Thatcher illusion, the hemispatial neglect syndrome, the split brain, and the hemispheric differences in face recognition, is of fundamental importance. We simulate these phenomena in this paper using an autoassociative memory cellular neural network. Furthermore, we generate many composite face images with spurious patterns by applying genetic algorithms to this network. We also simulate a morphing between two faces using the autoassociative memory.
In this paper we present an approximation method based on a discrete Hopfield neural network (DHNN) for solving temporal constraint satisfaction problems. This method is of interest for problems involving numeric and symbolic temporal constraints where a solution satisfying the constraints needs to be found within a given deadline. More precisely, the method can provide a solution whose quality is proportional to the allocated processing time, where the quality of a solution corresponds to the number of satisfied constraints. This property is very important for real-world applications, including reactive scheduling and planning, and also for over-constrained problems where a complete solution cannot be found. An experimental study of the proposed DHNN-based method, in terms of time cost and solution quality, yields promising results compared with exact methods based on branch and bound and with approximation methods based on stochastic local search.
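The anytime behaviour can be illustrated with a toy discrete Hopfield network (the constraint encoding and sizes below are assumptions, not the paper's formulation): equality/inequality constraints between binary variables become symmetric couplings, and because asynchronous updates never increase the network energy, the number of satisfied constraints is non-decreasing sweep by sweep, so the run can be stopped at any deadline with the best quality reached so far.

```python
import numpy as np

rng = np.random.default_rng(2)
n_vars, n_cons = 30, 120               # toy sizes (assumptions)
# Random binary constraints: (i, j, want_equal)
cons = [(int(i), int(j), bool(e)) for i, j, e in
        zip(rng.integers(0, n_vars, n_cons), rng.integers(0, n_vars, n_cons),
            rng.random(n_cons) < 0.5) if i != j]

# Symmetric Hopfield couplings: +1 for "equal", -1 for "different"
J = np.zeros((n_vars, n_vars))
for i, j, eq in cons:
    w = 1.0 if eq else -1.0
    J[i, j] += w
    J[j, i] += w

def n_satisfied(s):
    """Solution quality = number of satisfied constraints."""
    return sum((s[i] == s[j]) == eq for i, j, eq in cons)

s = rng.choice([-1, 1], size=n_vars)
quality = [n_satisfied(s)]
for _ in range(20):                    # each sweep can only keep or improve quality
    for i in rng.permutation(n_vars):  # asynchronous updates: energy never increases
        s[i] = 1 if J[i] @ s >= 0 else -1
    quality.append(n_satisfied(s))
```

The monotonicity follows because the satisfied-constraint count is a linear function of the negated Hopfield energy, which asynchronous updates on a symmetric, zero-diagonal coupling matrix can never increase.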
We analytically study the dynamical behavior of a two-neuron network with a time-delayed self-connection. The effects of the time delay on the stability of the trivial solution and on the existence of self-sustained periodic solutions are investigated. These results could be applied to understanding the temporal activity observed in the olfactory bulb.
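A minimal numerical sketch of such a system (the specific equations, gains, and delay below are illustrative assumptions, not the paper's model): two Hopfield-type neurons with delayed self-connections, integrated with the Euler method on a stored history.

```python
import numpy as np

def simulate(a=2.0, b=1.0, tau=1.5, T=60.0, dt=0.01):
    """Euler integration of a generic delayed two-neuron system (assumed form):
        x1'(t) = -x1(t) + a*tanh(x1(t - tau)) + b*tanh(x2(t))
        x2'(t) = -x2(t) + a*tanh(x2(t - tau)) - b*tanh(x1(t))
    A constant history on [-tau, 0] initializes the delayed terms."""
    steps = int(T / dt)
    lag = int(tau / dt)
    x = np.zeros((steps + lag, 2))
    x[:lag] = 0.1                        # constant initial history
    for t in range(lag, steps + lag - 1):
        f_del = np.tanh(x[t - lag])      # delayed self-connection terms
        f_now = np.tanh(x[t])            # instantaneous cross-connection terms
        dx1 = -x[t, 0] + a * f_del[0] + b * f_now[1]
        dx2 = -x[t, 1] + a * f_del[1] - b * f_now[0]
        x[t + 1] = x[t] + dt * np.array([dx1, dx2])
    return x[lag:]

traj = simulate()
```

Scanning the delay tau while watching whether the trajectory settles to a fixed point or to a sustained oscillation is the numerical counterpart of the stability and periodic-solution analysis in the abstract.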
In this paper, we review some properties of the "belief propagation" iterative map, used to perform Bayesian inference in a distributed way. We use this algorithm as a starting point to address the inverse problem of encoding observation data into a probabilistic model, and focus on the situation where the data have many different statistical components representing a variety of independent patterns. Asymptotic analysis reveals a connection with a Hopfield model. We then discuss the relevance of these results to the problem of reconstructing and predicting traffic states from floating car data, and show some experiments based on artificial and real data.