The learning process of Kohonen's Learning Vector Quantization classification algorithm, an artificial neural network, is treated in the hydrodynamic limit. The hydrodynamic modes are specified. Transport coefficients are calculated and found to obey an Einstein-type relation, offering an explanation for the sharpness of classification. Neurons representing one class, if initialized in an input range dominated by the opposite class, start learning by a process similar to spinodal decomposition.
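For concreteness, here is a minimal sketch of the standard LVQ1 update rule underlying the algorithm analysed above. The abstract gives no code; the function name, learning rate, and data layout below are illustrative assumptions.

```python
import numpy as np

def lvq1_step(prototypes, proto_labels, x, y, lr=0.05):
    """One LVQ1 update (illustrative sketch, not the paper's code).

    prototypes   : (M, d) array of class prototypes ("neurons")
    proto_labels : (M,) array giving the class each prototype represents
    x, y         : one training input and its class label
    """
    # The winner is the prototype closest to the input.
    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    # Attract the winner if its class matches, repel it otherwise;
    # the repulsive term is what drives prototypes out of input
    # regions dominated by the opposite class.
    sign = 1.0 if proto_labels[winner] == y else -1.0
    prototypes[winner] += sign * lr * (x - prototypes[winner])
    return winner
```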
Have you ever thought about what an enormous amount of computing it takes to see something? There is no optical communication system in your brain: the images received in your eyes are coded by your nervous system in a fantastic richness of detail, then processed and restored in your consciousness, correcting for changes of illumination, mood and many other circumstances…
Now we want to show how a collective dynamical network, as described in the previous chapter, can store and retrieve information. Learning and memory are indeed functions of outstanding importance in our nervous system: not only does our ability to adapt to unforeseen environmental variations depend on them, but even such permanent tasks as visual processing are too complicated to be fully genetically encoded and must be acquired by learning in early infancy…
The Hopfield model, by its simplicity and the possibility it offers for an in-depth theoretical analysis, became a point of reference for numerous studies. In this chapter we review the first generation of these developments. Their distinguishing feature is the way they treat the learning problem: they use a closed formula for the synaptic modification required to store a new pattern. The second generation, using slower but more powerful iterative learning algorithms, will be treated in Chapter 6.
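The classic example of such a closed formula is the Hebb rule of the original Hopfield model, J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ. A minimal sketch, assuming P patterns of ±1 bits stored one per row (illustrative code, not taken from the book):

```python
import numpy as np

def hebb_store(patterns):
    """One-shot Hebbian storage: J_ij = (1/N) * sum_mu xi_i^mu * xi_j^mu.

    patterns : (P, N) array of +/-1 bits, one stored pattern per row.
    Returns the (N, N) symmetric synaptic matrix with zero diagonal.
    """
    P, N = patterns.shape
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)  # no self-coupling
    return J
```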
A detailed knowledge of how fast retrieval happens is of obvious importance for any application. Since associative retrieval is expected, one would also like to know something about the size and structure of the basins of attraction: how close to a pattern one has to start in order to retrieve it, how much the chance of retrieval depends on random noise, and so on…
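One standard way to probe these questions numerically is zero-temperature asynchronous dynamics: start from a corrupted copy of a stored pattern and track its overlap with the target. A minimal sketch under those assumptions (a generic illustration, not the book's own code; the function names and sweep budget are assumptions):

```python
import numpy as np

def retrieve(J, s0, xi, max_sweeps=50, seed=0):
    """Asynchronous zero-temperature retrieval: align each spin with its
    local field until a fixed point is reached; return the final state
    and its overlap m = (1/N) * sum_i xi_i * s_i with the target pattern."""
    rng = np.random.default_rng(seed)
    s, N = s0.copy(), len(s0)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(N):   # random update order
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i], changed = new, True
        if not changed:                # fixed point: an attractor reached
            break
    return s, xi @ s / N

# Demo: store 10 random patterns in 200 neurons via the Hebb rule,
# flip 10% of one pattern's bits, and check whether the dynamics
# pulls the state back (overlap near 1 means the corrupted start
# was inside the basin of attraction).
rng = np.random.default_rng(1)
N, P = 200, 10
xi = rng.choice([-1, 1], size=(P, N))
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)
s0 = xi[0].copy()
flipped = rng.choice(N, size=N // 10, replace=False)
s0[flipped] *= -1
s, m = retrieve(J, s0, xi[0])
print(f"final overlap with stored pattern: {m:.3f}")
```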
The dynamical aspects of the Hopfield model are very attractive; our brain certainly does have dynamics, and a Hamiltonian model with simple attractors stands a good chance of revealing important mechanisms, as it did for the simple task of associative memory…
Neural network modelling is a lot of fun for those who do it. What else? From here on, opinions diverge…