World Scientific
A Feed-Forward Neural Network for Increasing the Hopfield-Network Storage Capacity

https://doi.org/10.1142/S0129065722500277 · Cited by: 1 (Source: Crossref)

In the hippocampal dentate gyrus (DG), pattern separation is thought to depend mainly on 'expansion recoding', i.e. the random mixing of different DG input channels. However, recent advances in neurophysiology have challenged this account of pattern separation. In this study, we propose a novel feed-forward neural network, inspired by the structure of the DG and by neural oscillatory analysis, to increase the storage capacity of the Hopfield network. Unlike previously published feed-forward neural networks, our bio-inspired network is designed to exploit both the biological structure and the functions of the DG. To better understand the computational principles of pattern separation in the DG, we established a mouse model of environmental enrichment and, using neural oscillatory analysis, derived a candidate computational model of the DG associated with improved pattern separation. Furthermore, we developed a new algorithm, based on Hebbian learning and the coupling direction of neural oscillations, to train the proposed network. Simulation results show that the proposed network significantly expands the storage capacity of the Hopfield network and achieves more effective pattern separation: with 10% overlap between patterns, the storage capacity rises from 0.13 for the standard Hopfield network to 0.32 for our model.
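For context, the standard Hopfield baseline the abstract compares against (capacity of roughly 0.13–0.138 patterns per neuron under Hebbian learning) can be sketched as follows. This is a minimal illustration of the classical model only, not the paper's proposed DG-inspired network; the network size, pattern count, and corruption level are illustrative choices, not values from the study:

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian (outer-product) weights for +/-1 patterns; no self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=50):
    """Iterate synchronous sign updates until a fixed point or the step limit."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0          # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(0)
n, p = 200, 10                       # load p/n = 0.05, well below ~0.138
patterns = rng.choice([-1.0, 1.0], size=(p, n))
W = train_hebbian(patterns)

# Corrupt 10% of one stored pattern's bits and try to recover it.
probe = patterns[0].copy()
flip = rng.choice(n, size=n // 10, replace=False)
probe[flip] *= -1.0
recovered = recall(W, probe)
overlap = float(recovered @ patterns[0]) / n  # 1.0 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

Well below the capacity limit, recall from a mildly corrupted probe lands back on (or very near) the stored pattern; as the load p/n approaches ~0.138, spurious minima proliferate and recall degrades, which is the limitation the paper's proposed network targets.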
