World Scientific

A Bi-level Alternating Meta-learning for Recurring Concept on Evolving Data Streams

    https://doi.org/10.1142/S0218488525500084 · Cited by: 0 (Source: Crossref)

    Human learning involves remembering patterns from the past to better understand recurring concepts as knowledge grows. In deep neural networks, however, previously acquired knowledge can be gradually forgotten when the model is trained on a new concept. We address this problem by learning a general representation that retains previous information and promotes future learning. To this end, we introduce a controller, trained via a meta-learning strategy, that guides the network to balance previously learned concepts against the new concept, thereby avoiding catastrophic forgetting. Unlike previous online incremental learning methods for evolving data streams, our approach is dedicated to handling recurring concepts: when a concept recurs, the model recalls the corresponding prior knowledge and quickly adapts to the change. Specifically, we propose a Bi-level Alternating Meta-learning approach for recurring concepts (BLAML), which emphasizes hidden-representation learning of different concepts at the model level and obtains a set of shared parameters through a global meta-learning strategy. Extensive experiments demonstrate the effectiveness of the proposed method.
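The abstract does not give the paper's algorithm, but the bi-level idea it describes — an inner level that adapts to the current concept and an outer level that maintains shared parameters from which any recurring concept is quickly recoverable — can be sketched with a first-order (Reptile-style) meta-update on a toy linear model. Everything below (the linear model, learning rates, and the two synthetic "concepts") is an illustrative assumption, not the BLAML implementation.

```python
import numpy as np

def task_loss(w, X, y):
    """Mean squared error of a linear model w on one concept's data."""
    return np.mean((X @ w - y) ** 2)

def task_grad(w, X, y):
    """Analytic MSE gradient for the linear model."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def inner_adapt(w_shared, X, y, lr=0.1, steps=5):
    """Inner level: adapt the shared parameters to one concept."""
    w = w_shared.copy()
    for _ in range(steps):
        w -= lr * task_grad(w, X, y)
    return w

def outer_meta_update(w_shared, concepts, meta_lr=0.5):
    """Outer level (first-order meta-update): move the shared parameters
    toward each concept's adapted solution, alternating over concepts so
    that recurring concepts stay cheap to re-adapt to."""
    for X, y in concepts:
        w_adapted = inner_adapt(w_shared, X, y)
        w_shared = w_shared + meta_lr * (w_adapted - w_shared)
    return w_shared

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
# Two recurring concepts: different linear relations on the same inputs.
concepts = [(X, X @ np.array([1.0, -1.0])),
            (X, X @ np.array([0.5, 0.5]))]

w = np.zeros(2)
for _ in range(20):                  # alternate outer updates over concepts
    w = outer_meta_update(w, concepts)

# When concept 0 recurs, a few inner steps from the shared parameters
# should fit it far better than adapting from scratch would at that budget.
w_fast = inner_adapt(w, *concepts[0], steps=5)
```

The shared parameters settle between the two concept solutions, so neither concept is catastrophically forgotten; the paper's controller plays the role that the fixed `meta_lr` plays in this simplified sketch.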