
Continual Learning Methods

Classification #1

  • Model growing: Increase the model capacity for every new task
    • PNN: Progressive Neural Networks
      • Problems: the model grows linearly with the number of trained tasks, and task labels must be known at test time
  • Parameter isolation: Explicitly identify important parameters for each task
    • PackNet
  • Regularization: Penalize (some) parameter variations
    • EWC: Elastic Weight Consolidation
  • Knowledge distillation: Use the model in a previous training state as a teacher
    • LwF: Learning without Forgetting
  • Rehearsal: Store old inputs and replay them to the model
    • GEM: Gradient Episodic Memory
    • A-GEM: Average GEM
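As a concrete example of the regularization family above, EWC adds a quadratic penalty that anchors the parameters that were important for old tasks. A minimal numpy sketch, assuming a diagonal Fisher information estimate; the function name and arguments are illustrative, not from the survey:

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    theta      -- current parameter vector
    theta_star -- parameters learned on the previous task
    fisher     -- diagonal Fisher estimate (per-parameter importance)
    lam        -- strength of the penalty
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)
```

During training on a new task, this penalty is added to the task loss, so parameters with high Fisher values (important for old tasks) are changed the least.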
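The rehearsal entry A-GEM can also be sketched briefly: the episodic memory is used only to compute an average reference gradient, and the current task's gradient is projected whenever the two conflict. A minimal numpy sketch, with illustrative names:

```python
import numpy as np

def agem_project(g, g_ref):
    """A-GEM gradient projection.

    g     -- gradient on the current task batch
    g_ref -- average gradient on a batch sampled from episodic memory
    If g conflicts with g_ref (negative dot product), project g onto the
    half-space where it no longer increases the memory loss.
    """
    dot = np.dot(g, g_ref)
    if dot >= 0:
        return g  # no conflict, use the gradient as-is
    return g - (dot / np.dot(g_ref, g_ref)) * g_ref
```

The projected gradient is then used for the parameter update, so a single reference gradient replaces the per-task quadratic program of the original GEM.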

Classification #2

Classification #3

Reference

De Lange et al., "A Continual Learning Survey: Defying Forgetting in Classification Tasks," IEEE TPAMI, 2022

Last updated on 8/21/2023