Optimised Continually Evolved Classifier for Few-Shot Learning Overcoming Catastrophic Forgetting
Abstract
Catastrophic forgetting (CF) poses one of the most important challenges for neural networks in continual learning. During training, many current approaches replay previously seen data, which violates the constraints of an ideal continual learning system. In this work, an optimisation-enabled continually evolved classifier is used to address CF. Moreover, a few-shot continual learning model is exploited to mitigate CF. Initially, images undergo pre-processing using Contrast Limited Adaptive Histogram Equalisation (CLAHE). Then, the pre-processed outputs are classified by continually evolved classifiers based on few-shot incremental learning. Here, the initial training is carried out using a Convolutional Neural Network (CNN) model, after which a pseudo-incremental learning phase is performed. Furthermore, to enhance performance, an optimisation approach called Serial Exponential Sand Cat Swarm Optimisation (SExpSCSO) is developed. SExpSCSO modifies Sand Cat Swarm Optimisation by incorporating the serial exponential weighted moving average concept. The proposed SExpSCSO is applied to train the continually evolved classifier by optimising its weights, thereby improving the classifier's performance. Finally, the experimental analysis reveals that the adopted system achieved a maximal accuracy of 0.677, maximal specificity of 0.592, maximal precision of 0.638, recall of 0.716 and F-measure of 0.675.
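The abstract names CLAHE as the pre-processing step. The following is a minimal illustrative sketch of such a step, assuming OpenCV and equalisation of the luminance channel only; the clip limit and tile grid size are assumptions for illustration, not values taken from the paper.

```python
# Minimal sketch of CLAHE pre-processing (assumed OpenCV implementation).
# clip_limit and tile_grid_size are illustrative defaults, not the paper's settings.
import cv2
import numpy as np

def clahe_preprocess(image_bgr: np.ndarray,
                     clip_limit: float = 2.0,
                     tile_grid_size: tuple = (8, 8)) -> np.ndarray:
    """Apply CLAHE to the luminance channel of a BGR image."""
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid_size)
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```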
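The abstract states that SExpSCSO augments Sand Cat Swarm Optimisation with a serial exponential weighted moving average. The sketch below shows only the generic exponentially weighted moving average update applied to a candidate weight vector; the smoothing coefficient and the way the smoothed value is combined with the swarm's position update are assumptions for illustration, not the paper's definition of SExpSCSO.

```python
# Generic exponentially weighted moving average (EWMA) update, the smoothing
# concept the abstract says SExpSCSO incorporates. alpha and its use on
# candidate weight vectors are illustrative assumptions.
import numpy as np

def ewma_update(previous: np.ndarray, current: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """Blend the current candidate with its smoothed history: s_t = alpha*x_t + (1-alpha)*s_{t-1}."""
    return alpha * current + (1.0 - alpha) * previous

# Hypothetical usage: smooth successive candidate weight vectors over iterations.
smoothed = np.zeros(10)
for step in range(5):
    candidate = np.random.randn(10)   # stand-in for an SCSO position update
    smoothed = ewma_update(smoothed, candidate)
```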