Guilin University of Electronic Technology
To address the catastrophic forgetting caused by class-incremental training of neural network models, a class-incremental learning method based on a variational pseudo-sample generator with classification-feature constraints is proposed. First, a pseudo-sample generator is constructed to memorize old-class samples, and it is used to train a new classifier and a new pseudo-sample generator. The pseudo-sample generator is based on a variational autoencoder and uses classification features to constrain the generated samples, so that the classifier's performance on the old classes is better preserved. Then, the output of the old classifier serves as the distillation label for each pseudo sample, further retaining the knowledge learned from the old classes. Finally, to balance the number of generated samples across the old classes, pseudo-sample selection based on classifier scores picks the more representative samples of each old class while keeping the per-class pseudo-sample counts equal. Experimental results on the MNIST, FASHION, E-MNIST and SVHN datasets show that the proposed method effectively mitigates catastrophic forgetting and improves image-classification accuracy.
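The score-based pseudo-sample selection described above can be sketched as follows. This is a minimal NumPy illustration under our own assumptions, not the paper's implementation: the function name `select_balanced` and its inputs (pseudo-samples with predicted old-class labels and classifier confidence scores) are hypothetical, and it simply keeps the same number of highest-scoring pseudo-samples per old class.

```python
import numpy as np

def select_balanced(samples, labels, scores, per_class):
    """Keep the `per_class` highest-scoring pseudo-samples of each old class,
    so every old class contributes equally to the replay set."""
    keep = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        # rank this class's pseudo-samples by classifier score, best first
        ranked = idx[np.argsort(scores[idx])[::-1]]
        keep.extend(ranked[:per_class].tolist())
    keep = np.array(sorted(keep))
    return samples[keep], labels[keep]

# toy usage: class 0 has four candidate pseudo-samples, class 1 has two
samples = np.arange(6, dtype=float).reshape(6, 1)
labels = np.array([0, 0, 0, 0, 1, 1])
scores = np.array([0.9, 0.2, 0.8, 0.5, 0.7, 0.6])
sel, sel_labels = select_balanced(samples, labels, scores, per_class=2)
```

After selection, each old class is represented by exactly two pseudo-samples, with the lower-scoring (less representative) candidates discarded.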