Abstract
Human learning involves remembering patterns from the past to better understand recurring concepts as knowledge grows. Deep neural networks, in contrast, tend to gradually forget previously acquired knowledge when they are trained on a new concept. We address this problem by learning a general representation that retains previous information and promotes future learning. To this end, a new controller is introduced through a meta-learning strategy; it guides the network to keep a balance between previously learned concepts and the new concept, thereby avoiding catastrophic forgetting. Unlike previous online incremental learning methods for evolving data streams, our approach is dedicated to handling recurring concepts: when a concept recurs, the model recalls the relevant prior knowledge and quickly adapts to the change. In this paper, we propose a Bi-level Alternating Meta-learning approach for recurring concepts (BLAML), which emphasizes hidden representation learning of different concepts at the model level and obtains a set of shared parameters through a global meta-learning strategy. Extensive experiments demonstrate the effectiveness of the proposed method.
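To make the bi-level idea sketched above concrete, the snippet below gives a minimal, illustrative PyTorch example of a MAML-style inner/outer update in which a small controller weighs the loss on the new concept against the loss on previously seen concepts. This is not the authors' BLAML implementation; the names (`learner`, `controller`, `inner_lr`) and the convex mixing of the two losses are assumptions made purely for illustration.

```python
# Minimal sketch of a bi-level (inner/outer) meta-update with a controller
# balancing new and previously seen concepts. Illustrative only; not the
# paper's BLAML algorithm. All module names and the loss mixing are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

learner = nn.Linear(8, 2)              # model-level learner (hidden representation)
controller = nn.Sequential(            # meta-learned controller producing a balance weight
    nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid()
)
meta_opt = torch.optim.Adam(
    list(learner.parameters()) + list(controller.parameters()), lr=1e-3
)
inner_lr = 0.1                          # assumed inner-loop step size

def inner_adapt(x_new, y_new):
    """Inner loop: one gradient step of the learner on the new concept."""
    loss_new = F.cross_entropy(learner(x_new), y_new)
    grads = torch.autograd.grad(loss_new, learner.parameters(), create_graph=True)
    # Fast (adapted) weights used functionally so the outer step can backprop through them.
    w, b = [p - inner_lr * g for p, g in zip(learner.parameters(), grads)]
    return lambda x: F.linear(x, w, b)

def outer_step(x_new, y_new, x_old, y_old):
    """Outer loop: update shared parameters and controller so that adapting
    to the new concept does not erase the old (recurring) concepts."""
    adapted = inner_adapt(x_new, y_new)
    loss_new = F.cross_entropy(adapted(x_new), y_new)
    loss_old = F.cross_entropy(adapted(x_old), y_old)
    # Controller reads the two losses and outputs a balance weight in (0, 1).
    alpha = controller(
        torch.stack([loss_new.detach(), loss_old.detach()]).unsqueeze(0)
    ).squeeze()
    meta_loss = alpha * loss_new + (1 - alpha) * loss_old
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()
    return meta_loss.item()

# Toy usage: alternate over batches drawn from a "new" concept and from
# previously seen (recurring) concepts in a data stream.
for _ in range(3):
    x_new, y_new = torch.randn(16, 8), torch.randint(0, 2, (16,))
    x_old, y_old = torch.randn(16, 8), torch.randint(0, 2, (16,))
    outer_step(x_new, y_new, x_old, y_old)
```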