Paper Title

Dynamic Knowledge Distillation Strategies for Continual Learning in Lifelong Autonomous Systems Without Catastrophic Forgetting

Keywords

  • continual learning
  • catastrophic forgetting
  • knowledge distillation
  • lifelong learning
  • autonomous systems
  • memory replay
  • teacher-student network

Article Type

Research Article

Issue

Volume: 6 | Issue: 2 | Page No.: 1-6

Published On

March 2025

Abstract

Lifelong learning in autonomous systems demands the ability to acquire new knowledge over time without compromising previously learned information, a challenge known as catastrophic forgetting. This paper explores dynamic knowledge distillation strategies that enable continual learning in neural models deployed in autonomous systems. By leveraging teacher-student architectures, selective memory replay, and adaptive regularization, the proposed framework retains prior knowledge while adapting effectively to new tasks. In comparative evaluations on benchmark datasets, the approach demonstrates marked improvements in accuracy and task retention over existing lifelong learning techniques.
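The paper's exact framework is not reproduced in this abstract, but the general pattern it describes, a frozen teacher distilled into a student while samples from earlier tasks are replayed, can be sketched in a few lines. The following is a minimal illustrative sketch in PyTorch, not the authors' implementation: the names `continual_step` and `distillation_loss`, the replay buffer, and the weighting parameters `alpha` and `temperature` are all assumptions introduced here for clarity.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target KL divergence between teacher and student (Hinton-style)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scale by T^2 so gradients are comparable to the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def continual_step(student, teacher, new_x, new_y, replay_x,
                   optimizer, alpha=0.5, temperature=2.0):
    """One update: hard-label loss on new-task data plus distillation on replayed data."""
    optimizer.zero_grad()
    # Supervised loss on the current task's batch.
    task_loss = F.cross_entropy(student(new_x), new_y)
    # Distill the frozen teacher's outputs on replayed old-task samples,
    # penalizing the student for drifting away from prior behaviour.
    with torch.no_grad():
        teacher_logits = teacher(replay_x)
    kd_loss = distillation_loss(student(replay_x), teacher_logits, temperature)
    loss = (1 - alpha) * task_loss + alpha * kd_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch `alpha` is a fixed trade-off between plasticity (learning the new task) and stability (matching the teacher); a dynamic strategy of the kind the abstract alludes to would adjust such a weight per task or per sample rather than keeping it constant.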
