
FRACTAL TRANSITION IN CONTINUOUS RECURRENT NEURAL NETWORKS

https://doi.org/10.1142/S0218127401002158

A theory for continuous dynamical systems stochastically excited by temporal external inputs has been presented. The theory suggests that the dynamics of continuous-time recurrent neural networks (RNNs) are generally characterized by a set of continuous trajectories with a fractal-like structure in hyper-cylindrical phase space. We refer to this dynamics as the fractal transition. In this paper, three types of numerical experiments are discussed to investigate the learning process and noise effects in terms of the fractal transition. First, to analyze how an RNN learns desired input–output transformations, a simple example with a single state was examined in detail. A fractal structure similar to a Cantor set was clearly observed during learning. This finding sheds light on the learning of RNNs: it suggests that learning is a process of adjusting the fractal dimension. Second, the effects of input noise on the fractal structure were investigated. The results show that small-scale hierarchical structures are broken by noise. Third, using a network with twenty states, we show that the fractal transition is a universal characteristic of RNNs driven by switching inputs.
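
The switching-input setup described above can be illustrated with a short numerical sketch. The following Python snippet is not the authors' code: the recurrent weight, the two input values, and the holding time are illustrative assumptions. It integrates a single-state continuous-time RNN, dx/dt = -x + w tanh(x) + u(t), where u(t) switches at random between two constant values, and samples the state at each switching time; the sampled states accumulate on a Cantor-like set of the kind the abstract describes.

import numpy as np

# Minimal sketch (not the paper's code): a single-state continuous-time RNN
#   dx/dt = -x + w*tanh(x) + u(t)
# driven by an input u(t) that switches at random between two constant
# values. Sampling x at each switching time yields points that accumulate
# on a Cantor-like set. All parameter values below are assumptions.

rng = np.random.default_rng(0)

w = 0.5                  # recurrent weight (assumed)
inputs = (-1.0, 1.0)     # two switching input values (assumed)
dt = 0.01                # forward-Euler integration step
hold = 2.0               # time each input is held before switching (assumed)
steps = int(hold / dt)

x = 0.0
samples = []
for _ in range(5000):
    u = inputs[rng.integers(2)]          # choose the next input at random
    for _ in range(steps):               # integrate over one holding period
        x += dt * (-x + w * np.tanh(x) + u)
    samples.append(x)                    # sample at the switching time

# Coarse occupancy check: with sufficient contraction between switches the
# sampled states leave gaps at many scales, the signature of a Cantor-like set.
hist, _ = np.histogram(samples, bins=200)
print("occupied bins:", np.count_nonzero(hist), "of 200")

Lengthening the holding time strengthens the contraction between switches, making the gaps in the sampled set more pronounced, while adding noise to u(t) blurs the small-scale hierarchical structure, consistent with the second experiment summarized above.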