Multi-Level Knowledge Distillation and Dynamic Self-Supervised Learning for Continual Learning

📰 ArXiv cs.AI

arXiv:2508.12692v3 Announce Type: replace-cross Abstract: Class-incremental learning with repetition (CIR), in which previously trained classes are repeatedly introduced in future tasks, is a more realistic scenario than the traditional class-incremental setup, which assumes that each task contains only unseen classes. CIR assumes that we can easily access abundant unlabeled data from external sources, such as the Internet. Therefore, we propose two components that efficiently use the unlabeled data to ensure the hi…
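To make the distinction concrete, here is a minimal sketch (not from the paper; function names and parameters are illustrative) contrasting a CIR task stream, where classes can reappear in later tasks, with a traditional class-incremental stream, where each task introduces only unseen classes:

```python
import random
from collections import Counter

def make_cir_stream(num_tasks, classes_per_task, num_classes, seed=0):
    """Class-incremental-with-repetition (CIR) stream: each task samples
    its classes from the full class pool, so previously trained classes
    may be introduced again in future tasks."""
    rng = random.Random(seed)
    return [sorted(rng.sample(range(num_classes), classes_per_task))
            for _ in range(num_tasks)]

def make_ci_stream(num_tasks, classes_per_task):
    """Traditional class-incremental (CI) stream: the label space is
    partitioned into disjoint chunks, one per task, so no class repeats."""
    return [list(range(t * classes_per_task, (t + 1) * classes_per_task))
            for t in range(num_tasks)]

cir = make_cir_stream(num_tasks=4, classes_per_task=3, num_classes=6)
ci = make_ci_stream(num_tasks=4, classes_per_task=3)
```

With 4 tasks of 3 classes drawn from a pool of only 6, some class necessarily appears in more than one CIR task, whereas the CI stream keeps every task's classes disjoint.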

Published 1 Apr 2026