This is the official code for FeTrIL (WACV 2023): Feature Translation for Exemplar-Free Class-Incremental Learning

Abstract: Exemplar-free class-incremental learning is very challenging due to the negative effect of catastrophic forgetting. A balance between stability and plasticity of the incremental proc...
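The abstract above breaks off before the method is described. Going only by the title, FeTrIL produces pseudo-features for past classes by translating features of new classes toward stored old-class centroids. The sketch below is an illustrative reconstruction of that geometric translation, not the official code; the function name, the cosine-similarity criterion for picking a source class, and all data are assumptions:

```python
import numpy as np

def pseudo_features_for_old_class(old_centroid, new_feats_by_class, new_centroids):
    """Translate features of the most similar new class so that their
    mean coincides with a stored old-class centroid (illustrative)."""
    # pick the new class whose centroid is closest by cosine similarity
    sims = [
        c @ old_centroid / (np.linalg.norm(c) * np.linalg.norm(old_centroid))
        for c in new_centroids
    ]
    j = int(np.argmax(sims))
    # geometric translation: f_old = f_new - mu_new + mu_old
    return new_feats_by_class[j] - new_centroids[j] + old_centroid

# toy demo: two "new" classes in an 8-d feature space
rng = np.random.default_rng(0)
feats = [rng.normal(loc=m, size=(50, 8)) for m in (0.0, 2.0)]
centroids = [f.mean(axis=0) for f in feats]
old_mu = rng.normal(loc=1.9, size=8)  # stored centroid of a past class
pseudo = pseudo_features_for_old_class(old_mu, feats, centroids)
```

By construction the translated features have exactly the old class's mean while inheriting the shape of the source class's feature cloud, so a linear classifier can be trained on real new-class features plus these pseudo-features without storing any old images.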
Unlike exemplar-based class-incremental learning (EBCIL), which allows storing some old samples, exemplar-free class-incremental learning (EFCIL) faces a more severe forgetting problem because access to old data is completely prohibited. Some previous methods freeze the feature extractor after the ...
Code for the NeurIPS 2023 paper - FeCAM: Exploiting the Heterogeneity of Class Distributions in Exemplar-Free Continual Learning

This work studies Class-Incremental Learning (CIL) both in the standard supervised setting with sufficient training samples (which we call Many-Shot CIL, MSCIL) and in fe...
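The snippet cuts off before FeCAM's method is stated. The title's "heterogeneity of class distributions" suggests classifying with per-class (anisotropic) statistics rather than a single shared metric; the sketch below shows one hedged way to do that, a nearest-class-mean classifier using per-class Mahalanobis distances with a simple covariance shrinkage term. Function names, the shrinkage constant, and the toy data are illustrative assumptions, not FeCAM's exact recipe:

```python
import numpy as np

def fit_class_stats(feats_by_class, shrink=1.0):
    """Per-class mean and inverse of a shrunk covariance matrix."""
    stats = []
    for X in feats_by_class:
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        # shrinkage keeps the covariance well-conditioned and invertible
        cov = cov + shrink * np.eye(cov.shape[0])
        stats.append((mu, np.linalg.inv(cov)))
    return stats

def predict(x, stats):
    """Assign x to the class with the smallest squared Mahalanobis distance."""
    dists = [(x - mu) @ prec @ (x - mu) for mu, prec in stats]
    return int(np.argmin(dists))

# toy demo: two Gaussian classes with different means in 4-d
rng = np.random.default_rng(1)
classes = [rng.normal(loc=0.0, size=(200, 4)), rng.normal(loc=3.0, size=(200, 4))]
stats = fit_class_stats(classes)
```

Because each class keeps its own covariance, the decision boundary adapts to how each class's features spread, which is the kind of heterogeneity a shared Euclidean metric would ignore.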
Exemplar-free class-incremental learning (EFCIL) presents a significant challenge because old-class samples are unavailable when learning new tasks. Due to the severe imbalance between old and new class samples, the learned classifiers are easily biased toward the new classes. Moreover, continually ...