A GENERAL UPDATING RULE FOR DISCRETE HOPFIELD-TYPE NEURAL NETWORK WITH TIME-DELAY AND THE CORRESPONDING SEARCH ALGORITHM
Abstract
In this paper, the Hopfield neural network with delay (HNND) is studied from the standpoint of regarding it as an optimizing computational model. Two general updating rules for networks with delay (GURD), characterized by dynamic thresholds, are given for Hopfield-type neural networks with delay applied to optimization problems. It is proved that under any sequence of updating modes the GURD converges monotonically to a stable state of the network. The diagonal elements of the connection matrix are shown to have an important influence on the convergence process, and they determine the relationship between local maxima of the energy function and the stable states of the network. All ordinary discrete Hopfield neural network (DHNN) algorithms are instances of the GURD. It is shown that the convergence conditions of the GURD may be relaxed in practical applications; for instance, the requirement that the diagonal elements of the connection matrix be nonnegative can be removed from the original convergence theorem. A new updating mode together with restrictive conditions guarantees that the network reaches a local maximum of the energy function by means of a step-by-step algorithm, and the convergence rate improves markedly in comparison with other methods. When the delay term is treated as a noise disturbance, the step-by-step algorithm demonstrates its efficiency and high convergence rate. Experimental results support the proposed algorithm.
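The abstract does not state the GURD explicitly. As a rough illustration of the kind of rule discussed, the following minimal sketch assumes a conventional discrete Hopfield-type asynchronous update with bipolar states x in {-1,+1}^n, a symmetric connection matrix W, a separate delayed connection matrix D acting on the state from tau steps earlier (treated here as a disturbance, as described above), and a threshold vector theta. The function names dhnn_delay_step, energy, and run are hypothetical and not taken from the paper.

```python
import numpy as np

def dhnn_delay_step(x, x_delayed, W, D, theta, i):
    """One asynchronous update of unit i for a discrete Hopfield-type
    network with a delayed connection matrix D (assumed form:
    x_i <- sign((W x(t) + D x(t - tau) - theta)_i))."""
    h = W[i] @ x + D[i] @ x_delayed - theta[i]
    x_new = x.copy()
    x_new[i] = 1 if h >= 0 else -1
    return x_new

def energy(x, W, theta):
    """Instantaneous Hopfield energy; the delay term is regarded as a
    noise disturbance and left out of the energy, as in the abstract."""
    return -0.5 * x @ W @ x + theta @ x

def run(W, D, theta, x0, tau=1, max_sweeps=100, rng=None):
    """Asynchronous sweeps until no unit changes (a stable state)."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x0)
    history = [x0.copy() for _ in range(tau + 1)]  # buffer of past states
    x = x0.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):               # one sweep, random order
            x_delayed = history[-(tau + 1)]        # state tau updates ago
            x_new = dhnn_delay_step(x, x_delayed, W, D, theta, i)
            if x_new[i] != x[i]:
                changed = True
            x = x_new
            history.append(x.copy())
            history = history[-(tau + 1):]
        if not changed:                            # stable state reached
            break
    return x
```

With D set to zero this reduces to an ordinary DHNN asynchronous update, consistent with the claim that ordinary DHNN algorithms are special cases of the more general rule.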